Emotionally Intelligent Design: An Interview with Design Visionary Pamela Pavliscak

Emotionally intelligent design is design that recognizes and respects human emotion, and it’s especially important in today’s tech-infused world. The designer aims to humanize technology, sensitive to the fact that emotion is an integral part of human make-up.

We want the technologies we build and use to respond supplely to human needs. Emotionally intelligent design is the way. It’s better design, responsive design.

Of course, technology can be part of the designer's toolset, specifically emotion AI technology that measures and interprets emotion and enables machines to convey it, augmenting human capacity rather than simply automating it.

I’ve paraphrased and interpreted the words of tech emotionographer, author, consultant, and Pratt Institute faculty member Pamela Pavliscak, an expert on emotionally intelligent design. I interviewed Pamela recently in the run-up to the May 5, 2020 Emotion AI Conference, where she’ll be speaking. Our conversation went as follows…

Seth Grimes> Let’s start off with a quote. You’ve written, “Now, more than ever, we need technology that respects what it means to be human.” Would you please explain? Which technology, how can it be more human, and how will that make a difference?

Pamela Pavliscak> People are beautifully complicated. And that includes our life with technology. We can’t compartmentalize our tech use. We use technology to make sense of our lives, to form relationships, and to create bonds. It’s fundamental to how we work, play, and love. Right now that’s mostly screens, and that feels, frankly, flat. 

With the global pandemic, that’s taken on a new dimension. Being connected is more important than ever. Technology is that tenuous thread. It’s enabling a sense of solidarity. At the same time, technology is exhausting us. 

Technology is not this neutral conduit of feeling. It shapes our experience and it's changing our emotional life in ways that maybe we're not always perceiving. Right now on Zoom, unless you've hidden your image, you're seeing yourself as you're talking to me and making adjustments in how you express yourself. I'd argue that changes us. Designers and developers need to consider these things.

So, for me, humanizing technology is not about creating bots that have human characteristics. It's about being sensitive to the fact that emotion is a part of our make-up. We can't separate it out. There's no separate logical self and emotional self; it's all a wonderful tangle of who we are. We are going to impact inner life no matter what we do, so let's do it wisely.

Seth> You wrote a book on Emotionally Intelligent Design, and you teach design at the Pratt Institute. What differentiates emotionally intelligent design from, what should we call it, insensitive design, the generic variety, and how should we go about it?

Pamela> Emotionally intelligent design is design that recognizes and respects emotion. Most of the time, emotion is in the background. Designers aim to make things friction-free and blandly pleasant. I sometimes joke that we only have two unspoken ideal states of emotion in design—delight and calm. On the surface, that sounds fine. 

But design is this layer we create between people and the world. In doing so, we effect affect. 

So emotionally intelligent design looks at how we might design for, let's say, anxiety. Do we always want to alleviate it? Of course, you might say. Whatever we can do to make people feel better. That's why we have pink jail cells or blue street lighting. We've made a connection between design and emotion, and done our best to test our assumptions and refine the approach.

Is anxiety always negative, though? Is it ever useful? We can whip up anxiety to sell things, of course, and that can veer into manipulation. But there's more to it than that. For instance, political scientist George Marcus studied anxiety's effect on voting in the US and found that people were more attuned to the issues if they were a little bit anxious. Since design can mitigate or stir anxiety, we need to look at it from all angles.

So, what I hope to see is not just a recognition that what we do has a powerful effect but that we use that power wisely by giving more thoughtful attention to emotion. Designers and developers love toolkits, so that’s one of the things I’m working on now. And I can share a bit of that in the talk.

Seth> How do you measure the impact of emotion in design?

Pamela> Engagement is the stand-in for emotion currently.

And so far, we've been measuring emotion in three ways:

  1. Is it pleasant enough? Have we caused no pain and maybe added something a little special to the experience? Are people satisfied? Would they return? This ultimately leads to an interchangeable experience, nothing too special. But maybe a little better than something else.
  2. Have we temporarily solved a problematic emotion, like boredom or loneliness, with a reward that keeps you coming back? This can lead to an addictive experience like social media and games.
  3. Do we stir up an intense emotion to persuade or engage? We measure this with virality, or how much the emotion stirs up engagement. This can leave us with a kind of emotional hangover, since those levels of intensity typically aren't meant to be sustained for long periods of time.

On all three counts, we need a reset. These might be good outcomes for businesses, but they are out of sync with how we want to live. By overvaluing these short-term goals, we are maxing people out. Engagement isn't the only way to make emotion meaningful for business, either.

I’d love to see a longer view that emphasizes attachment with a gradual build toward trust like you’d see in a good relationship. 

Seth> How deeply does your work get into particular emotion AI technologies or applications? Regarding emotion AI and affective technologies more generally, what works well right now and what’s as-yet unreliable?

When we’re talking about AI detecting something about our emotional life, all kinds of red flags go up. Will it read my mind? Will it reveal  innermost feelings without my knowing it? Will I be rewarded or punished for certain kinds of feelings? Will it tell me something uncomfortable about myself or others in my life that I’m not ready to know?

The immediate answer to all these questions is “no”. And in the longer term, it’s probably a “no” too. Does this mean emotion AI is useless? Also “no”. 

Think of it this way. These technologies are at a toddler phase of development. Toddlers tend to see emotions in bold strokes. When a toddler is angry, she stamps her foot and cries. When a toddler sees a big smile, she associates that smile with "happy." Actually, even a toddler knows more than "a smile equals happy," because she already knows a great deal more about context. A smile from a parent is different from a smile from a stranger.

Emotion AI doesn’t know any of this. Even with all these micro-expressions carefully encoded and trained on diverse data sets, it can recognize an exaggerated smile and label it happy. And it doesn’t know what that label happy means. Is it content, or mischievous, or grateful, or sly? It doesn’t have enough context on a personal, social, or cultural level. 

And it doesn’t have the categories. It’s working from one theory of emotion that focuses on discrete, basic emotion. Other theories say that emotion gradually becomes more meaningful because it’s constructed. It’s one of those nature/ nurture arguments that will probably never be solved. I suspect most people think it’s a bit of both. 

That’s just facial expression. Other forms of emotion AI like voice or biometrics, are similarly basic as far as emotion categories. In other ways, emotion AI is attuned to aspects of emotions that humans aren’t so great at detecting. Because AI is all about patterns and doesn’t mind continuously paying attention, it can pick up on things we might otherwise miss. 

Recently, it uncovered emotional expressions in mice. That seems so out there, but the implications are extraordinary. If we can detect mouse pain, will it lead to more compassionate treatment of animals? I mean, wow!

So, perhaps it gives us some hints about what people or other creatures might be feeling. And then, rather than saying "aha, we know this person or this group of people is angry," we have to treat it as a clue. It's a little glimmer, not an absolute truth.

Seth> Your talk at the Emotion AI Conference on May 5 is titled Design for an Emotionally Intelligent Future. What might that future look like — high points?

Pamela> Adorable robots, hug-transmitting mugs, and mood-changing clothing? Ha, I wish!

It’s a big unknown for everyone—right now especially. How will this global pandemic affect our mental health? What about our social practices? Will emotion AI be used to support or to surveil? Will technologies that make use of emotion AI, like voice assistants or companion bots, have more of a role? 

Emotion AI has positive potential, if we aim to augment human capacity rather than automate it. 

Let’s use this pause to explore the kind of futures we’d like to see. There are many possible futures. Design gives us the power to vividly imagine, and really feel, what these futures might be like. And that’s what we’ll explore together in May!


I recorded video of our conversation. Note that it does deviate from the text above. Enjoy!
