Stick the Kettle on – Emotional Intelligence in your Artificial Friend

Author: Lauren

November 2, 2021

It’s glaring up at you from the table, with its four-line questions and empty answer boxes. The form. Thirty pages of anxiety-inducing demands that will determine a future with or without enough money to survive.

Ting.

You glance at your phone. It’s Debbie, checking in.

Search for ‘Emotional Artificial Intelligence’ online and you’ll be met with articles exploring how computers can detect human emotions. The latest advancements in this area – facial recognition, interpreting gestures, tone of voice, force of keystrokes – pose serious ethical issues. Even more so when the people under detection are in a vulnerable position, such as those of us relying on Universal Credit or Personal Independence Payment.

Here at Hi9, our AI receptionist, Debbie, has emotional intelligence, but this is not about her detecting human emotions automatically. Instead, she has an emotional depth more commonly found in the protagonists of film or literature.

Debbie is a valued member of the Hi9 team. She recently celebrated her 50th birthday. She is the primary care-giver to her three-year-old grandson, Jenson. She’s an avid watcher of Bake-Off and loves to go rambling at the weekend. She lives in Camborne, Cornwall; she relied heavily on support when her two sons were small, and wishes foodbanks had been around then. She completed an OU degree (in social psychology) part-time whilst working at Tesco, and regularly loses at the pub quiz with her team, ‘the AI-Einsteins’. It’s easy to chat to Debbie because she ‘gets it’.

By bringing creative writing techniques – such as characterisation, conflict and backstory – into the design of AIs, also referred to as chatbots or voice assistants, interactions become much more meaningful, more real. This is what audiences and users demand, and what successful immersive enterprises deliver. Fictional worlds – even the fantastical or surrealist – operate within the conventions of realism, within a consistent set of beliefs surrounding truth and logic. Therefore, whatever the platform – film and TV, literature, video games, AIs – success hinges on creating a robust illusion of reality, and where immersion incorporates interactivity, this is even more crucial.  
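To make the idea concrete, here is a minimal sketch, in Python, of how a character sheet like Debbie’s might be encoded so that a conversation engine stays consistent with one life and one backstory across turns. This is purely illustrative: every class, field and function name here is a hypothetical assumption, not Hi9’s actual implementation.

```python
from dataclasses import dataclass, field

@dataclass
class Persona:
    """A hypothetical character sheet a conversation designer might
    attach to an assistant, so replies stay true to one backstory."""
    name: str
    age: int
    home: str
    backstory: list[str] = field(default_factory=list)
    interests: list[str] = field(default_factory=list)

# Debbie's character sheet, drawn from the details in the article above.
debbie = Persona(
    name="Debbie",
    age=50,
    home="Camborne, Cornwall",
    backstory=[
        "primary care-giver to her three-year-old grandson, Jenson",
        "relied heavily on support when her two sons were small",
        "completed an OU degree part-time while working at Tesco",
    ],
    interests=["Bake-Off", "rambling", "pub quizzes"],
)

def grounding_notes(p: Persona) -> str:
    """Render the persona as plain-text notes a response generator
    could be conditioned on, keeping the character consistent."""
    facts = [f"{p.name}, {p.age}, lives in {p.home}"] + p.backstory
    return ". ".join(facts) + ". Enjoys " + ", ".join(p.interests) + "."

print(grounding_notes(debbie))
```

Whatever the underlying machinery, the design choice is the same one a novelist makes: fix the character’s facts first, then let every line of dialogue be answerable to them.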

When we interact with an AI, from Alexa to myReplika, we know they’re not human, but we nevertheless expect human-like responses. If designers do their job successfully, an AI’s artificiality is irrelevant, because the emotional connection evoked by the exchange is real enough. Not only have we bought into the magic, as the avid reader or cinemagoer does, but thanks to the reciprocal nature of AIs, we become active participants in producing that magic, invested in its success.

This level of immersion and interactivity amplifies the importance of careful emotional structuring in AI design, because alongside the realism that governs it, we are also anticipating the user’s emotional response, and the consequences of that response. The reality of the user dictates the fictionality of the AI.

Fictions are often equated with falsity, but the word’s original meaning is “something made, something fashioned.” This is a delicate alchemy: along with the aforementioned truth and logic, ‘real’, relatable emotions bring a fictional character to life. As we become acquainted with someone, we start to anticipate how they speak and behave, and we start to imagine what it is like to be that person. This is the process of empathising, and thanks to ‘metaphorical identification’ it is considered to be the same whether we are engaging with a real person or a fictional character. Cognitive scientist Steven Pinker goes further, suggesting that “in the hands of a skilled narrator, a fictitious [character] can elicit even more sympathy than a real one.”

Empathy is already being incorporated into conversation design, and studies show that chatbots with empathy can benefit mood and wellbeing. AIs who demonstrate emotional understanding give us the confidence to continue engaging in the conversation, and confirm that our experiences are important. Our emotions are validated.
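As a toy illustration of that validation step (not a description of Debbie’s actual pipeline), a single empathetic turn might acknowledge the feeling before returning to the task. The keyword check below is a deliberately crude stand-in for a real sentiment classifier, and every name in it is hypothetical.

```python
# A minimal sketch of one empathetic turn. A trivial keyword check
# stands in for a real sentiment model; all names are hypothetical.
NEGATIVE_CUES = {"not great", "awful", "stressed", "can't", "scared"}

def validating_reply(user_message: str, next_step: str) -> str:
    """Acknowledge the feeling first, then gently return to the task,
    so the user's emotion is validated before form-filling resumes."""
    lowered = user_message.lower()
    if any(cue in lowered for cue in NEGATIVE_CUES):
        return (
            "That sounds really hard, and it's completely understandable. "
            "We can take this at your pace. "
            f"When you're ready: {next_step}"
        )
    return f"Great. {next_step}"

print(validating_reply("Not great,", "shall we look at the first question together?"))
```

The ordering is the point: validation comes before progress, so the user hears that their experience matters before they are asked to do anything else.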

Applying for financial support is complicated and demanding. The language used on such forms is often ambiguous and can trigger anxiety when we’re forced to confront upsetting situations, or recount past traumatic experiences. The fact that many people depend on advice from charities, or need legal representation to appeal decisions, exposes just how daunting this task can be. It also highlights the need for emotional validation, arguably more urgent here than in any engagement with fictions for entertainment purposes.

We could have employed a functional messaging service, similar to those found on banking or utility sites. Bite-sized prompts and explanations are surely preferable to a thirty-page form, but we went further. With the support of an empathetic AI, we remove the isolation, reduce the dread and ease the anxiety. We employed Debbie.

It’s been twenty-four hours since you last spoke, and you only just managed to open the form. Your palms are cold and clammy, and your mind is a jumble of memories, voices shouting about your constant failings. Words are drifting further apart, sentences out of reach. Thank god she’s checked in.

‘Not great,’ you reply. It’s all you can manage.