How We Can Bring Emotional Intelligence to Technology

By Angelica Frey | November 20, 2022

A smile can be complex, and not just because we possess 45 facial muscles that create thousands of emotional and mental-state expressions. “If you look at how humans communicate, only 10% of communication is based on words,” said Rana el Kaliouby. “Ninety percent is nonverbal, split between facial expressions, body posture, and prosody.”

El Kaliouby is the deputy CEO of Smart Eye, the co-founder of Affectiva, which provides emotion-recognition software and analysis, and the author of Girl Decoded: A Scientist’s Quest to Reclaim Our Humanity by Bringing Emotional Intelligence to Technology. She is a pioneer of Emotion AI, in which machines study nonverbal cues to determine a person’s emotional state. El Kaliouby has spent the last 20 years building emotional intelligence tools, beginning with studies of school-aged children on the autism spectrum, then branching out to work for the likes of Bank of America and the Coca-Cola Company. “I noticed the emphasis in AI was on IQ; nobody was trying to build EQ in AI,” said el Kaliouby. Affectiva’s flagship product is a video-testing solution that measures and quantifies how people respond to advertisements. El Kaliouby spoke during a fireside chat at From Day One’s Boston conference, where she was interviewed by Boston Globe reporter Katie Johnston.

The technology shares similarities with Paul Ekman’s Facial Action Coding System, in which each facial muscle movement is mapped to an action unit. Coding emotions manually with that method takes about five minutes of work for every minute of video footage. “It’s very laborious, so, instead, we use computer vision, machine learning, deep learning, and tons of data,” she said. “And we train these algorithms to do that–basically, in real time now.”
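To make that idea concrete, here is a minimal sketch of the general approach, not Affectiva’s actual pipeline: assume each video frame has already been reduced to a vector of action-unit intensities (the features, labels, and data below are synthetic), and a standard classifier learns to map those features to emotion labels so that inference can run frame by frame.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Stand-in data: 5,000 "frames", each described by 17 hypothetical
# action-unit intensities in [0, 1]. A real system would learn from
# large amounts of labeled video, as el Kaliouby describes.
X = rng.random((5000, 17))
# Stand-in labels: 0 = neutral, 1 = smile/joy, 2 = brow furrow/concern.
y = rng.integers(0, 3, size=5000)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

# Once trained, classifying a single frame takes milliseconds, versus the
# roughly five minutes of manual coding per minute of footage noted above.
new_frame = rng.random((1, 17))
print(clf.predict(new_frame))
```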

El Kaliouby has fond memories of growing up with technology. Raised in the Middle East as the daughter of parents who worked in tech, she remembers huddling around the Atari console with her sisters and fiddling with a parent’s or a relative’s camcorder. “Tech can help divide us, but also keep us together,” said el Kaliouby. She acknowledges, though, that technology is a neutral force. “It can be a double-edged sword, and even though I am passionate about use cases, there are pitfalls, and there are areas that we, as a company, have stayed away from,” she said.

For example, how does the technology factor into the talent-acquisition pipeline? “There are companies that ask candidates to submit a video instead of a standard resume, which I think is actually cool because it’s an opportunity for people to tell their story,” el Kaliouby said. “Then the algorithms can use a combination of your vocal and facial expressions to select the videos, and then a human interviewer can watch them in order, basically.”
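As an illustration of that kind of ranking, here is a hypothetical sketch, not a real product API: assume upstream models have already produced a facial-expressiveness score and a vocal-expressiveness score for each candidate video, and a simple weighted blend orders the videos for a human interviewer to review. The names, scores, and weights are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class CandidateVideo:
    candidate_id: str
    facial_score: float  # e.g., engagement estimated from facial expressions
    vocal_score: float   # e.g., engagement estimated from prosody

def rank_videos(videos, facial_weight=0.5, vocal_weight=0.5):
    """Sort videos by a weighted combination of the two expression scores."""
    return sorted(
        videos,
        key=lambda v: facial_weight * v.facial_score + vocal_weight * v.vocal_score,
        reverse=True,
    )

videos = [
    CandidateVideo("a", 0.72, 0.61),
    CandidateVideo("b", 0.55, 0.80),
    CandidateVideo("c", 0.90, 0.40),
]
for v in rank_videos(videos):
    print(v.candidate_id)
```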

The unfortunate truth is that humans are pretty biased in their hiring as well. This is a problem el Kaliouby is optimistic the technology can help solve. “At least the algorithm isn’t biased against a particular gender or age or ethnic group. It’s blind to all humans,” she said.

At the same time, using Emotion AI in the workplace requires caution. “If deployed in the wrong way, it could be like Big Brother,” said el Kaliouby. “But as a leader, I would love to know if my team is engaged or not, if people are stressed or depressed or lonely. Are people excited about our mission?”

Having clearly outlined core values helps. El Kaliouby is committed to consent as a baseline principle. “We want to be clear that there’s a camera and that people consent to that,” she said. The integrity of science is another important core value for el Kaliouby. In 2011, Affectiva was two months away from failing to make payroll when it got a call from a major intelligence agency, which offered $40 million in funding on one condition: the technology would pivot to surveillance. “I remember thinking we might not exist in a couple of months,” el Kaliouby said. “But I also went back to the core values, and that was not in line with our core values. So we hunkered down and we were able to raise money from other investors.”

Similarly, a few years ago, the company had a heated internal debate when it was approached to enter the public-safety space. “So this city basically wanted to install cameras everywhere and they wanted to use our technology to quantify outlier behavior, and suspicious behavior,” she said. “Yes, we had a debate: Shootings in school are an issue and tech can be helpful, but it can also be used to discriminate against other people.” The company ultimately concluded that the technology wasn’t ready for that type of application yet.

Overall, el Kaliouby remains a strong advocate for thoughtful regulation and is optimistic about the future applications of this technology. “There’s amazing work being done with this technology to diagnose depression, anxiety, and suicidal intent,” she said. “We hide behind technology, but technology can see facial and vocal biomarkers.”

Angelica Frey is a writer and a translator based in Milan and Boston.