In April, startup Hume AI released a demo of what it calls the first emotionally intelligent artificial intelligence voice, “EVI” (short for “Empathic Voice Interface”).
Talking to EVI is similar to chatting with a new generation of AI-powered voice assistants. Think of it as Siri or Alexa with ChatGPT-style conversational abilities added. Ask EVI to read a poem or explain the causes of the French Revolution, and a soft, gentle voice will recite a haiku or launch into a short lecture about 18th-century France. EVI also suffers from the bugs and lags that often occur with AI voice bots.
What makes EVI different is that when a user requests a poem, a screen appears with a list of subtle emotions that the AI has detected in the user’s voice, ranging from awkwardness and confusion to disdain and surprise. Hume AI says it can analyze up to 48 different emotions.
“These 48 emotion models go far beyond what emotion scientists have explored before because we are able to collect much larger sets of data to train these models and they are able to capture a much greater variety of nuances,” said Alan Cowen, founder of Hume AI.
Hume’s empathic AI is trained on recordings from podcasts and other media, as well as psychology experiments the company conducts. The company also sells AI that analyzes facial expressions; it says these tools help businesses and chatbots better understand their customers. Hume charges about 10 cents per minute for EVI, though pricing varies by client.
“We have clients in the customer service space like Lawyer.com, as well as large technology companies that are using various technologies that we develop,” Cowen said.
Lawyer.com uses Hume to improve its 1-800 lines. Call centers are a natural fit for technology that recognizes human frustration.
But Cowen has bigger ambitions: a personal AI assistant that truly understands what you want and is optimized for your happiness.
“As it learns from you over time, it becomes more personalized for you,” Cowen said.
An AI that has learned your voice and facial expressions might one day ask you, “Hey, have you noticed that you get tired and irritable around 3 p.m. every day?” That sounds helpful, but so would a charming robotic voice gently reminding you that Starbucks Frappuccinos are half price until 4 p.m.
“We are more likely to make unnecessary purchases when we are in certain emotional states or at certain times of the day,” says Ben Bland, who helped develop industry ethical standards for empathic AI at the Institute of Electrical and Electronics Engineers.
He also worries that AI will have the same effect on our emotions that smartphones have had on our attention spans.
“If you have a computer game that adjusts settings to guess how excited you are or how much you’re enjoying the game, you could develop an addiction to that game,” Bland says. “It could cause emotional desensitization.”
Of course, all of Bland’s nightmare scenarios assume that machines can actually infer emotions in a scientifically reliable way, which is still up for debate.
“I’m not convinced these technologies work as well as marketers make them out to be,” says Andrew McStay, director of the Emotion AI Lab at Bangor University in the UK.
McStay noted that research has found emotion recognition technology assigning more negative emotions to Black men’s faces than to white men’s faces, perpetuating harmful stereotypes.
But there are also broader questions about what emotions actually are and how different people and cultures express them, he said.
“So when it comes to the common assertion about these technologies — that there is an identifiable biological program and we’re not looking at society and culture — that’s problematic,” McStay said.