Support Request: I’m playing a sim-style game, and the non-player characters come with certain skills, weaknesses, likes, and dislikes. Sometimes I put them in situations that I know will make them uncomfortable, like sending a man with a fear of space to work in an asteroid mine. The results can be funny. But I also feel a little anxious, because I’m not letting them live their best lives. Am I immoral?
Games of this kind allow ordinary humans to live out the fantasy of playing God. You become the presiding deity of your own digital universe, dictating the fates of characters whose lives are, as you note, subject to your whims. Playing them tends to raise the kinds of questions that have long been addressed in theological and tragic literature.
Ever since we humans started writing, it seems, we’ve suspected that we might be pawns in the games of higher beings. In the Iliad, when Hector realizes he is facing death, he complains of capricious gods whose wills change from day to day. It is a sentiment echoed by Gloucester in King Lear, wandering the heath after being mercilessly blinded: “As flies to wanton boys are we to the gods. / They kill us for their sport.”
In the Book of Job, Satan and God wager on whether Job, a righteous man, will curse God if enough suffering and hardship befalls him. With God’s permission, Satan kills Job’s children, servants, and livestock, and afflicts his body with boils. Job, who has no way of knowing that his sufferings are merely the subject of a gentleman’s bet, can only assume that his hardships are divine punishment. He cries out: “My flesh is clothed with worms and clods of dust; my skin is broken, and become loathsome…my life is wind.”
It is difficult to read such passages without sympathizing with the human victims. And I imagine that the uneasiness you feel when you torment your characters means that you suspect you are likewise making them suffer for your entertainment. Of course, NPCs (non-player characters) are just algorithms with no brains or feelings, and therefore no capacity for pain or discomfort. That, at least, is the consensus. But humans, as you probably know, have a long record of underestimating the inner lives of other creatures (Descartes thought animals were mere machines that could not feel pain), so it’s worth taking a moment to really consider the possibility of algorithmic suffering.
Many NPCs rely on behavior trees, algorithms that follow if-then condition rules, or, in more advanced characters, machine learning models that develop adaptive strategies of their own. The capacity for physical pain is usually associated with things like pain receptors, prostaglandins, and opioid neurotransmitters, so video game characters seem to lack the neural systems needed to register pain. Emotional distress (our capacity for fear, anxiety, and discomfort) is more complex, from a neurological standpoint, although emotion in humans and other animals generally depends to some extent on external stimuli processed by the senses. Given that these algorithms have no sensory access to the world (they cannot see, feel, or hear) it is unlikely that they can experience negative emotions.
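To make the if-then idea concrete, here is a minimal sketch of the kind of condition-rule logic a behavior tree boils down to. All the names here (`tick`, `fears`, `energy`, the miner character) are invented for illustration and do not come from any real game engine; an actual behavior tree would be a full node hierarchy, not a flat list.

```python
# Hypothetical sketch of behavior-tree-style logic: evaluate
# condition-action rules in priority order, run the first match.

def tick(npc, environment):
    """Return the first action whose condition holds for this NPC."""
    rules = [
        (lambda: environment in npc["fears"], "panic"),  # highest priority
        (lambda: npc["energy"] < 20, "rest"),
        (lambda: True, "work"),  # default fallback always matches
    ]
    for condition, action in rules:
        if condition():
            return action

# An NPC afraid of asteroids, placed in an asteroid mine:
miner = {"fears": {"asteroid"}, "energy": 80}
print(tick(miner, "asteroid"))  # the fear rule fires first: "panic"
```

The point of the sketch is that the character’s “discomfort” is just the first branch of a priority list: the designers decided that fear preempts tiredness, which preempts the default.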
However, when it comes to the ethics of suffering, neuroscience is not the only relevant consideration. Some moral philosophers have argued that the capacity to hold preferences, that is, the ability to see the world in terms of positive and negative outcomes and to organize decision-making around them, is the ultimate criterion for true suffering. One advantage of talking about preferences rather than pain is that while pain is entirely subjective, felt only by the sufferer, preferences can be observed. We know that cats have preferences because they recoil from bathwater and bristle when dogs approach. The fact that your NPCs have, as you say, “certain skills, weaknesses, likes and dislikes” suggests that they do indeed have preferences, though this is also something you can test by simple observation. When you put them in unwanted situations, do they resist or struggle? Do they show facial expressions or movements that you associate with fear? You might object that such behavior was simply programmed by their designers, but animal preferences can likewise be thought of as a kind of algorithm programmed by evolutionary history.
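The observational test the paragraph describes can itself be sketched in code. The preference table and the `reaction` function below are hypothetical, invented for illustration: the idea is that a preference is just a score, and what you “observe” in the game is the behavior that score produces.

```python
# Invented example: an NPC's likes and dislikes as signed scores,
# and the observable behavior each score produces.

PREFERENCES = {"garden": 2, "asteroid": -3, "library": 1}

def reaction(location):
    """Map a preference score to the behavior a player would observe."""
    score = PREFERENCES.get(location, 0)  # unknown places are neutral
    if score < 0:
        return "resists"
    if score > 0:
        return "approaches"
    return "indifferent"

print(reaction("asteroid"))  # "resists": the disliked outcome
```

On this view, the philosophical question is whether a negative score that reliably produces resistance counts as a preference being frustrated, or is merely a number being compared to zero.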