Emotional AI In A Digital World

Emotion AI — or emotional AI if you’re my British friend Andy McStay — describes tech that understands and conveys emotion in human-to-human and human-machine interactions. The algorithms are well understood — natural language processing, speech analytics, computer vision, and biometrics — but we’re only starting to come to grips with applications and with the data, bias, and ethical implications.
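To ground the text side of that stack, here’s a minimal sketch of emotion classification over text, assuming the Hugging Face transformers library is available; the model name is an illustrative assumption (a publicly shared emotion classifier), not something named in this article.

```python
# Minimal sketch: classifying the emotion expressed in a snippet of text.
# Assumes the Hugging Face "transformers" library is installed; the model
# name is an illustrative assumption, not one named in this article.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="j-hartmann/emotion-english-distilroberta-base",  # assumed example model
    top_k=None,  # ask for scores on every emotion label, not just the top one
)

results = classifier("I can't believe the flight got cancelled again.")
# Depending on the transformers version, results may come back flat or
# nested one level; unwrap if needed.
if results and isinstance(results[0], list):
    results = results[0]

for item in sorted(results, key=lambda r: r["score"], reverse=True):
    print(f"{item['label']:>10}: {item['score']:.3f}")
```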

Emotion AI has many applications: consumer and market research, conversational interfaces, contact center operations, policy-making and finance, education, and, notably, healthcare uses that include suicide prevention. Potential but not-yet-fully-realized applications include emotionally intelligent design: design that aims to humanize technology.

Actually, I wouldn’t call the emotion AI potential of any application fully realized. That thought was the starting point for an exploratory conversation I had with Andy, who is professor of digital life at Bangor University in Wales and who’ll be speaking at the upcoming Emotion AI Conference, taking place online on May 5th. Emotional AI is at the center of Andy’s work, so I’m grateful we had the opportunity to talk.

Seth Grimes> Andy, you run a site called EmotionalAI.org with a number of collaborators. I’m going to quote from text on the site: “EmotionalAI.org contains social science insights, project details, art projects, analyses, and reports on technologies that pertain to, detect, and interact with emotional life.” You define Emotional AI on the site, but let’s open with your definition. Emotional AI is…?

Andy McStay> Emotional AI arose from affective computing and the work of Rosalind Picard and her team at MIT in the 1990s. I know there’s a risk emotional AI will be seen as old wine in new bottles. However, in addition to the contribution of affective computing, there are wider AI issues involved, particularly in relation to [machine] learning, in relation to processing, and critically about relationships with technologies. For me, one of the more important aspects is that emotional AI isn’t just about technology. It’s about human interrelationships and about human relationships with technology.

Seth> I think a lot of our technology relationships just sort of happen. Product designers create products, say a cell phone, where there’s a complex interaction that designers are trying to make simple, without completely understanding the relationships people will form with the products. When we study actual interactions, we’re playing catch-up, trying to derive design best practices after the fact.

Andy> There’s some merit in that. I think people adopt and embrace technologies in ways that are not always expected by designers. Perhaps the most famous example is not emotion based: texting via mobile phones. Who knew that was going to become a thing? When texting was created, it was just a tiny little add-on. But it became one of the most central services that a mobile phone offers.

Seth> And from texting we got emoticons and emoji and then GIFs and stickers and all that stuff that adds layers of emotional richness to those communications.

Your EmotionalAI.org site lists collaborators who include a digital artist, a legal scholar, design experts, and others. What are your goals?

Andy> In terms of the site and the project itself, we’re working with a number of people in the UK and Japan, with people with legal expertise, artists, and people on the government side. I’m not quite sure that we have a collective aim as such; rather, it’s a disparate set of projects that cluster around this interest in emotions, empathic computing, and so on.

Seth> Beyond the site, you’re professor of digital life at Bangor University, and you consult. What is digital life?

Andy> That’s a good question. In the UK, when you’re awarded a chair or a professorship, you get to pick your own title. When I looked across all the various things that I’m interested in — relationship advertising, privacy, the wider philosophical and social implications of new technologies — the thing that conjoined all of this was digital. So then you think about the application layer — I’m interested in advertising, I’m interested in philosophy, I’m interested in social science — and again it was really broad, so do I choose culture, do I choose policy, what do I choose? And the idea of just life itself seemed most appropriate. So digital life fit the bill.

Seth> Are you studying how our digital lives are changing in the face of the Covid-19 crisis and our being forced to distance?

Andy> Particularly with the Japanese team, we’re looking very, very closely at it, and it’s fascinating, isn’t it? When you think about how each of us is managing children, managing parents — governments are managing civic emotion, listening to civic emotion through sentiment analysis — Covid-19 involves emotion in so many ways, mediated emotion that informs behavior and decisions. So we’re looking very closely at this, both in relation to the centrality of emotion in digital culture and some of the issues it throws up. I’m sure it’s the same in the US as in the UK. As we start building new services and adding surveillance, this raises ethical questions. So our interests in emotion are not immune to privacy concerns either.
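As a rough illustration of the kind of sentiment analysis Andy mentions (listening to aggregate civic mood across many posts), here’s a minimal sketch using NLTK’s VADER analyzer; the example posts are invented for illustration.

```python
# Minimal sketch: gauging aggregate sentiment across public posts with
# NLTK's VADER analyzer. The example posts are invented for illustration.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-time lexicon download
sia = SentimentIntensityAnalyzer()

posts = [
    "Grateful for the health workers keeping us safe.",
    "Another week of lockdown. I'm exhausted and anxious.",
    "The new testing site opened today and it ran smoothly.",
]

# 'compound' is a normalized score in [-1, 1]; averaging it gives a crude
# aggregate read on the mood of the sample.
compound_scores = [sia.polarity_scores(p)["compound"] for p in posts]
print(f"mean sentiment: {sum(compound_scores) / len(compound_scores):+.2f}")
```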

Seth> We do have widespread surveillance, obviously by video camera, and online as well as physical-world location tracking, plus digital monitoring and the potential for voice surveillance. We have emotion AI data sources that include text, speech, facial expressions, and devices that measure physiological states. You mentioned Rosalind Picard at MIT earlier; her work is largely in wearable devices. How well do emotion understanding technologies work? There are going to be varying levels of performance, so if you wish, you can break your response out by measurement context, by type of data source, however you want. How well do emotion AI technologies work?

Andy> Before we get into the specifics of which technologies work best and are most effective or least effective, I think it’s important to recognize that we’re just at the start of something here.

I’ve been looking at this area for 5 to 10 years. When I got into this area, I expected things to move quickly. I expected us to be further along by now than we are. But I counterbalance that with the observation that emotions are innately, commercially, and politically valuable. We will see a net rise in technologies that engage and interact with human emotion, but it’s going to happen more slowly than we at first thought. The technologies in many ways just aren’t there yet. Some of that’s down to the sensing; some of that’s down to the psychological suppositions that sit behind the sensing. One of the most obvious tech controversies is around facial coding: lots of debate, lots of discussion going on about whether Ekman’s basic emotions are an appropriate way forward. Do we need to be looking at core affective states rather than putting preset psychological models in, and so forth? It’s interesting to note too, when it comes to voice — such a prime means of expressing and conveying emotion — Amazon have been working on this for a long time now but have still not released a stable product. It will happen, but I don’t think we’re there. In terms of a one-to-ten scale for each modality, I don’t think we can really do that, but across all modalities, there are improvements and lessons being learned as we go along.
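To make that facial-coding debate concrete, the two positions can be read as two different label spaces: a discrete set of basic emotions versus a continuous valence/arousal space. A hypothetical sketch follows; the coordinates are rough illustrative placements, not measured values.

```python
# Sketch contrasting the two label spaces in the facial-coding debate:
# Ekman-style categorical labels versus a continuous core-affect
# (valence/arousal) representation. Coordinates below are rough
# illustrative placements, not empirically measured values.
from dataclasses import dataclass
from enum import Enum


class BasicEmotion(Enum):
    """Ekman's six basic emotions: a discrete, categorical label space."""
    ANGER = "anger"
    DISGUST = "disgust"
    FEAR = "fear"
    HAPPINESS = "happiness"
    SADNESS = "sadness"
    SURPRISE = "surprise"


@dataclass
class CoreAffect:
    """A continuous alternative: valence (unpleasant..pleasant) and
    arousal (calm..activated), each in [-1, 1]."""
    valence: float
    arousal: float


# Roughly where each basic emotion might sit in core-affect space
# (illustrative only).
APPROXIMATE_PLACEMENT = {
    BasicEmotion.HAPPINESS: CoreAffect(valence=0.8, arousal=0.5),
    BasicEmotion.SURPRISE:  CoreAffect(valence=0.2, arousal=0.9),
    BasicEmotion.ANGER:     CoreAffect(valence=-0.7, arousal=0.8),
    BasicEmotion.FEAR:      CoreAffect(valence=-0.8, arousal=0.7),
    BasicEmotion.DISGUST:   CoreAffect(valence=-0.6, arousal=0.3),
    BasicEmotion.SADNESS:   CoreAffect(valence=-0.7, arousal=-0.5),
}
```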

Seth> The Emotion AI Conference will feature a panel on data, bias, and ethics. You’re not part of the panel, but you are giving a framing talk earlier in the conference titled “Empathic Technologies: Landscape, Ethics, and Citizens.” Can you say just a bit more about the ethical and public considerations that you will discuss?

Andy> It’s really important that we have these ethical debates, these ethical conversations, but in addition to professionals — people who work in the industry, people who study the industry — it’s really important that citizens have some say. Some of the work I’ve been doing, both in the UK and in Japan, looks at citizen reactions to emotional AI across a range of services, including cars, sentiment analysis in political contexts, advertising, and voice assistants. I want to talk about data, about change over the five years I’ve been collecting data in this area, to give a sense of the primary concerns for developers and industries working here.

Seth> Andy, thanks. If you’d like to check out Andy McStay’s work, please visit EmotionalAI.org.
