Amazon Testing Emotion Recognition Gadget

The Alexa voice software team is developing a health and wellness device that recognizes human emotions through microphones and voice-discerning software. The voice-activated gadget is worn on the wrist and works with a smartphone app to detect the wearer’s emotional state. The eventual mission of the project, according to internal documents reviewed by Bloomberg, is to improve the wearer’s interactions with others. Amazon has not commented on the project, code-named Dylan, but beta testing is underway in collaboration with Lab126, the developers behind Amazon’s Fire and Echo.

Image Source: Thrive Global

Artificial Emotional Intelligence Not Going Anywhere

Advanced emotion AI technology detects, analyzes, and processes emotional states through facial coding, gestures, body language, and tone of voice. As Voicebot reported in March, emotion recognition technology is on the rise. Affectiva, an emotion measurement technology company founded in 2009, analyzes speech through changes in paralinguistics, tone, loudness, tempo, and voice quality to provide deeper insight into the human expression of emotion. The company’s facial algorithm works on any optical sensor, even a simple webcam, and measures seven emotion metrics: anger, contempt, disgust, fear, joy, sadness, and surprise. Annette Zimmermann, research VP at Gartner, believes emotionally intelligent devices will continue to be a trend: “By 2022, your personal device will know more about your emotional state than your own family.”
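As a rough illustration of the paralinguistic cues mentioned above (loudness, tempo, voice quality), the sketch below derives three simple features from a raw waveform in pure Python. The function and its feature choices are hypothetical and purely illustrative; commercial systems like Affectiva’s rely on far richer acoustic models.

```python
import math

def paralinguistic_features(signal, sample_rate):
    """Toy sketch: extract simple paralinguistic cues from a waveform
    (a list of float samples). Illustrative only."""
    n = len(signal)

    # Loudness: root-mean-square energy of the signal
    rms = math.sqrt(sum(x * x for x in signal) / n)

    # Rough pitch estimate: autocorrelation peak within the
    # typical human voice range (50-400 Hz)
    lo, hi = sample_rate // 400, sample_rate // 50

    def autocorr(lag):
        return sum(signal[i] * signal[i + lag] for i in range(n - lag))

    best_lag = max(range(lo, hi), key=autocorr)
    pitch_hz = sample_rate / best_lag

    # Speaking-rate proxy: zero-crossing rate
    zcr = sum(1 for i in range(1, n) if signal[i - 1] * signal[i] < 0) / n

    return {"rms": rms, "pitch_hz": pitch_hz, "zcr": zcr}

# Sanity check on a synthetic 220 Hz tone sampled at 8 kHz
sr = 8000
tone = [math.sin(2 * math.pi * 220 * i / sr) for i in range(2000)]
features = paralinguistic_features(tone, sr)
```

For the pure tone, the pitch estimate lands near 220 Hz and the RMS near 1/√2, confirming the features behave as expected on a known signal.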

Putting the Data to Use

Emotional findings are used in market research to better understand consumer behavior for marketing campaigns, improved ad targeting, and product recommendations. Political pollsters are even using emotion AI to forecast campaign outcomes. ZimGo Polling is the first election-analytics AI cloud-computing tool to use Artificial Emotional Intelligence (AEI). ZimGo was released in the US last year after proving successful in the recent South Korean presidential election. “In today’s connected world, data analytics must get smarter and faster to serve the real-time demands of election campaigning,” said Oh, Sanggyoon, creator of ZimGo. “Next generation artificial intelligence must assess human emotional sentiment. People vote on emotion.”

Today, many voice assistants can customize user interactions based on time of day, geography, account profile, and past usage. Emotional state is one more signal that may enable assistants to better align their responses with both the user’s mood and needs. Nuance Automotive demonstrated emotion recognition of drivers at CES 2019. In that instance, a facial recognition solution scanned the driver’s face to assess their emotional state. The voice assistant would then adjust its verbal responses: terser and more to the point if the driver appeared frustrated or angry, more colorful and verbose if the driver appeared happy.
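To illustrate the kind of mood-conditioned response selection described above, here is a minimal hypothetical sketch. The function name and emotion-to-style mapping are invented for illustration and do not reflect Nuance’s actual implementation.

```python
def style_response(fact, emotion):
    """Toy sketch: adapt an assistant's reply to a detected emotional
    state. The mapping is hypothetical and illustrative only."""
    if emotion in ("angry", "frustrated"):
        # Terse and to the point for a stressed driver
        return fact
    if emotion == "happy":
        # More colorful and verbose for a relaxed driver
        return f"Good news! {fact} Anything else I can help with?"
    # Neutral default
    return fact

# Example: the same underlying information, styled two ways
terse = style_response("Next charging station: 12 miles ahead.", "frustrated")
chatty = style_response("Next charging station: 12 miles ahead.", "happy")
```

The key design point is separating *what* the assistant says (the fact) from *how* it says it (the styling), so the emotion signal only influences presentation.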
