02 July 2014

If I Only Had an Emotion Chip

Tin Man. Image © Bulent Yusuf

The non-biological entity lacking the ability to emote – it’s a familiar tale. He’s Data in Star Trek, he’s the Tin Man in The Wizard of Oz, and he/she/it is most of the sci-fi robots you’ve ever read about or seen in movies.

The mythology of animate beings possessed of human form but not of human sentiments is ancient. In Jewish folklore, rabbis channelling the power of God raised magical ‘golems’ fashioned from mud. Though usually intended to protect their maker and his people, these beings sometimes ran amok, their self-control and moral judgement evidently compromised by their lack of soul. Only the ‘one true God’ is cogent enough to raise an exceptional mud-being – Adam – complete with the full set of human attributes.

In panel-beating the ‘automatones’ to burnished life, the Greek blacksmith-god Hephaestus put to shame the crude, daub creations of the rabbis. According to Homer, his ‘Kourai Khryseai’ – ‘Golden Maidens’ – were ‘in appearance like living young women’ and had ‘intelligence in their hearts’. Yet they were slaves, manufactured only to serve their fiery master.

And in more recent imaginings, the manufactured automaton is still a captive chattel. The word ‘robot’ – from the Czech word ‘robota’, meaning ‘forced labour’ or ‘serf labour’ – has only been in common usage since 1921, when Karel Čapek introduced it in his dystopian play R.U.R. (Rossum’s Universal Robots).

Although portrayed in the film as robotic in form, the erotically charged ‘Maschinenmensch’ (machine-human) from Fritz Lang’s Metropolis (1927) was written as a magical construct. Her fabricator, the evil scientist/magician Rotwang, swathes his creation in a pleasingly rivet-less ‘skin’ to double as the seraphic heroine, Maria, and foment riot among the populace. (In less heavily edited versions of Metropolis, the Maschinenmensch also houses the soul of Hel, Rotwang’s former lover.) Though more Golden Maiden than robota, this beguiling gynoid (female android) lacks the intelligent autonomy of Hephaestus’ inventions. An instrument of the despotic Rotwang, she does his bidding without mercy, without self-determination, and without emotion.

With his I, Robot series of short stories, Isaac Asimov brought machine men into the popular consciousness. Published in the 1940s, his tales of robots with nicknames – including ‘Speedy’ (SPD-13), ‘Cutie’ (QT1), ‘Dave’ (DV-5), and of course ‘Robbie’ (RB) – also introduced sci-fi readers of the Rocket Age to the concept of machine morality. In the story ‘Robbie’, the robot’s ethical programming drives him to save the life of a young girl. Asimov’s ‘Three Laws of Robotics’ from these stories have since become a subject of serious debate among artificial intelligence (AI) researchers and technology ethicists.

Data from Star Trek seems an amalgam of Robbie and the Tin Man. Like Robbie, he gains the trust of those around him through acts of sound moral judgement; gradually, it comes to matter less to his human colleagues that he is a ‘machine’. And like the Tin Man, he wishes to have emotional capabilities. But hold on a minute. Isn’t that a desire? How could an entity without emotions ever have desires?

‘Ah, but,’ said my wife when I discussed this with her recently, ‘Data is programmed to want to have emotions.’ Fair enough, but what does that mean? Does that programmed desire to have desires not itself constitute a desire, and therefore an emotion? And what about our own desires? We do tend to distinguish between our various ‘wants’ when we think about them, considering some of them dreams and hopes, and others mere ‘drives’ and ‘instincts’. And, in some ways, we consider our emotions to be separate from our desires: our desires are the outcomes we want, whereas our emotions are how we feel about actual and potential outcomes. Perhaps, then, we are comfortable with classing Data’s wish to have emotions as a mere drive – one unconnected to any emotional content.
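
To make that distinction concrete, here is a toy sketch in Python – entirely hypothetical, and nothing like how a positronic brain would actually work – in which a ‘drive’ is just a hard-coded target the agent pursues, while an ‘emotion’ is a felt appraisal of the gap between target and outcome:

from dataclasses import dataclass, field

@dataclass
class Agent:
    # Drives: pre-programmed target outcomes. Data has these from the start.
    drives: dict = field(default_factory=dict)
    # The hypothetical chip: turns bare gap-measurement into felt valence.
    has_emotion_chip: bool = False
    mood: float = 0.0  # crude running valence

    def appraise(self, goal, outcome):
        """Compare an actual outcome against the drive's target."""
        target = self.drives.get(goal, 0.0)
        gap = target - outcome
        if not self.has_emotion_chip:
            # A drive without affect: note the shortfall, feel nothing.
            return f"Goal '{goal}': shortfall {gap:.1f}. Recalculating."
        # With the chip, the same gap acquires a felt quality.
        self.mood -= gap
        feeling = "satisfaction" if gap <= 0 else "frustration"
        return f"Goal '{goal}': shortfall {gap:.1f}. Feeling {feeling}."

data = Agent(drives={"acquire_emotions": 1.0})
print(data.appraise("acquire_emotions", 0.0))  # pursues the goal, feels nothing
data.has_emotion_chip = True
print(data.appraise("acquire_emotions", 0.0))  # the same gap, now with a feeling attached

The toy’s only point is that the two mechanisms are separable: the gap-measuring machinery runs perfectly well without the feeling-about-the-gap machinery ever being switched on – which is perhaps why we are tempted to call Data’s wish a mere drive.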

In his book The Emotion Machine, AI pioneer Marvin Minsky challenges us to think about the neurological mechanisms of emotion. He points out that

Saying that someone is like a machine has come to have two opposite meanings: (1) “to have no intentions, goals, or emotions,” and (2) “to be relentlessly committed to a single purpose or policy.” Each meaning suggests inhumanity, as well as a kind of stupidity, because excessive commitment results in rigidity, while lack of purpose leads to aimlessness. 1

It is notable that Minsky conflates intentions and goals with emotions – notable because most of us set emotions way above ‘mere’ goals in the ‘hierarchy’ of cognitive abilities. We do this to the extent that we may not even think of emotion as a cognitive ability. Data has goals – they may be pre-programmed, but they are still goals. Does he have intentions, though? Well, without emotions, he can’t have his ‘heart set’ on anything (and achieving his pre-programmed goals can’t be about fulfilling his ‘heart’s desire’), but can he have something else ‘set’ when he ‘aims’ to do something?

‘Set’ is perhaps the wrong word; it seems in the nature of emotions to be dynamic and unset (indeed, often ‘unsettled’ and sometimes ‘unsettling’). In Incomplete Nature, Terrence Deacon discusses the ‘dynamical feel’ and ‘tension’ of emotions:

It is the tension that separates self from non-self; the way things are and the way they could be; the very embodiment of the intrinsic incompleteness of subjective experience that constitutes its perpetual becoming. 2

He goes on to discuss the resistance of the body to thinking, and how our basic drives try to ‘derail’ the delicate energetic processes involved. While conscious, we constantly feel this ‘fluid’ ebb and flow, this ‘tugging’ on our inner mental worlds.

So, Data’s ‘emotion chip’ – when he eventually receives it – must do a complex job: it must dissolve the ‘stupid’ algorithmic rigidity of his programmed behaviour without obliterating his logical faculties, it must instil in him dynamic-tension-causing urges, and it must enable him to reflect upon those tensions in a way that is elevated above them but not separate from them. And it must do all this without putting him into an infinite reboot loop or causing permanent shutdown!

For us, the feeling of struggle when concentrating hard on learning a new skill is, in effect, the feeling of a process of cognitive uninstallation and installation: we are trying to uninstall a too-dynamic-to-do-the-job ‘emotion chip’ and to install an ‘off-by-heart’ ‘algorithmic rigidity chip’. Of course we sometimes want to use our new skills – artistic, musical, empathetic – to deepen our emotional experience. But to do that, we have to sacrifice some of the emotional tension that fuelled the struggle to learn the skill, and the frustration we felt at not yet having that now-ingrained outlet. Seems like a fair trade.

The fear of becoming Maschinenmenschen is understandable, but perhaps we can learn to better value our ability to automate parts of our minds without losing our esteem for emotion. We can be part-time positronic without losing compassion, without losing heart.


1 Marvin Lee Minsky, The Emotion Machine: Commonsense Thinking, Artificial Intelligence, and the Future of the Human Mind (New York: Simon & Schuster, 2006), 33.

2 Terrence William Deacon, Incomplete Nature: How Mind Emerged from Matter, 1st ed. (New York: W.W. Norton & Co., 2012), 512.