Analyzing Emotion in Customers’ Voices: Rosbank and AI Startup Neurodata Lab

Photo Credit: Neurodata Lab

Rosbank, a Russian universal bank and part of the French multinational banking and financial services group Société Générale, announced last week that it is testing emotion recognition technology at its call center, with plans to deploy the technology as a pilot project. The technology was developed by Neurodata Lab, a startup that builds AI-based tools for real-time emotion analytics and consumer behavior analysis, as well as for collecting and processing behavioral data. Rosbank serves more than 4 million private customers.

Automatically Detecting Customer Satisfaction by How They Speak

The press release states that Neurodata Lab’s technology recognizes customer emotion from parameters such as the number of pauses in the operator’s speech, changes in voice volume, and total conversation time, and uses them to calculate a Customer Satisfaction Index in real time (a simple sketch of such an index follows the quote below). The approach applies to both voice and speech-to-text services. According to the announcement, for every call managers will receive statistics on the recognized customer emotions, the dynamics of the Customer Satisfaction Index over time, and comparative indicators of the effectiveness of the service provided. Vasiliy Voronov, acting Innovations Director of Rosbank, commented on the technology:

Today, biometrics technologies are becoming more and more popular in various fields, including banking. Rosbank is constantly working to improve the quality of customer service, and we hope that the introduction of emotion analytics in the customers’ voices will help us to bring the service up to a new level.
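
Neurodata Lab has not disclosed how its index is actually computed. As a purely illustrative sketch, the Python below combines the three call parameters named in the press release into a 0-100 score; the weights and thresholds are invented for the example.

```python
# Toy Customer Satisfaction Index built from the three parameters the
# press release names: operator pauses, voice-volume change, and total
# conversation time. The weights are invented, not Neurodata Lab's model.

def customer_satisfaction_index(pause_count: int,
                                volume_change_db: float,
                                duration_sec: float) -> float:
    """Return a toy satisfaction score clamped to the range 0-100."""
    score = 100.0
    score -= 4.0 * pause_count                      # frequent pauses suggest friction
    score -= 2.0 * abs(volume_change_db)            # big volume swings suggest tension
    score -= 0.05 * max(0.0, duration_sec - 180.0)  # penalize calls running past 3 min
    return max(0.0, min(100.0, score))

# A short, steady call scores high; a long, loud one scores low.
print(customer_satisfaction_index(2, 1.5, 150))   # 89.0
print(customer_satisfaction_index(10, 8.0, 600))  # 23.0
```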

This announcement comes after Neurodata Lab presented its Emotion AI solutions for CX Management and for Robotics and IoT at CES 2019. A Neurodata Lab press release from CES stated:

Neurodata Lab’s technology can be used in banking, insurance, retail, HoReCa to gather reliable real-time analytics and quickly manage customer experience and quality of service. It can be used in call centers to monitor customer satisfaction based on voice analysis.

The partnership with Rosbank is an example of how Neurodata Lab’s Emotion AI can be used in banking and other call center operations. The Emotion AI technology can also analyze real-time video. A clear use case is analyzing streaming video from security cameras to track the quality of customer service and how customers respond to changes in it.

Emotion Recognition Technology is on the Rise

Neurodata Lab’s Emotion AI is applicable to a number of fields. At CES, Neurodata Lab demonstrated Promobot, a robot that can recognize emotions and react accordingly, as well as measure how satisfied a user is with the interaction. The robot accurately recognizes seven states: happiness, sadness, surprise, anger, anxiety, disgust, and neutral. It also offers two emotion recognition applications, one for business use and one for personal use: in business mode Promobot calculates a Customer Satisfaction Index, while in personal mode it calculates a Smile Index. In either scenario, Promobot adapts its answers and reactions to the calculated index, a pattern sketched in code below.

Image of Promobot. Photo Credit: Neurodata Lab
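
Promobot’s actual scoring and dialogue logic is proprietary; the hypothetical sketch below only illustrates the pattern described above: classify an interaction into one of the seven states, update a running index, and adapt the reply to it. The class, weights, thresholds, and replies are all invented.

```python
# Hypothetical index-driven behavior in the style the article describes.
# The emotion labels match the seven states Promobot recognizes; the
# weights, thresholds, and replies are invented for illustration.

EMOTIONS = {"happiness": +10, "surprise": +3, "neutral": 0,
            "sadness": -5, "anxiety": -5, "disgust": -8, "anger": -10}

class EmotionAwareRobot:
    def __init__(self, mode="personal"):
        self.mode = mode    # "business" -> CSI, "personal" -> Smile Index
        self.index = 50.0   # running score on a 0-100 scale

    def observe(self, emotion):
        """Nudge the running index by the detected emotion's weight."""
        self.index = max(0.0, min(100.0, self.index + EMOTIONS[emotion]))

    def respond(self):
        """Adapt the reply to the current index."""
        if self.index >= 70:
            return "Glad you're enjoying this! Want to see more?"
        if self.index >= 40:
            return "How can I help you today?"
        return "Sorry this isn't going well. Let me find a human to assist."

robot = EmotionAwareRobot()
for emotion in ["neutral", "happiness", "happiness"]:
    robot.observe(emotion)
print(robot.index, robot.respond())  # 70.0 -> the upbeat reply
```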

The concept of using artificial intelligence to recognize and analyze emotion in speech is not new. Consider a post on the IBM Cloud Blog from November 2016, written by Anton McConville, on measuring emotion with IBM Watson speech to text and tone analysis. The post details combining Watson Speech to Text and Watson Tone Analyzer to produce a tool that is applicable to a variety of fields.
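
For readers who want to try the same pipeline, here is a minimal sketch using the ibm-watson Python SDK, the successor to the SDK available when the post was written (IBM has since deprecated Tone Analyzer, so treat this as a historical illustration). The API keys, service URLs, and audio file are placeholders you must supply.

```python
# Transcribe call audio with Watson Speech to Text, then score the
# transcript's emotional tone with Watson Tone Analyzer.
from ibm_watson import SpeechToTextV1, ToneAnalyzerV3
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

stt = SpeechToTextV1(authenticator=IAMAuthenticator("YOUR_STT_APIKEY"))
stt.set_service_url("YOUR_STT_URL")

tone = ToneAnalyzerV3(version="2017-09-21",
                      authenticator=IAMAuthenticator("YOUR_TONE_APIKEY"))
tone.set_service_url("YOUR_TONE_URL")

# 1. Speech to text: transcribe the recorded call.
with open("call.wav", "rb") as audio:
    stt_result = stt.recognize(audio=audio,
                               content_type="audio/wav").get_result()
transcript = " ".join(r["alternatives"][0]["transcript"]
                      for r in stt_result["results"])

# 2. Tone analysis: score emotions such as joy, sadness, and anger.
analysis = tone.tone(tone_input={"text": transcript},
                     content_type="application/json").get_result()
for t in analysis["document_tone"]["tones"]:
    print(f'{t["tone_name"]}: {t["score"]:.2f}')
```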

Affectiva is a company that spun out of the MIT Media Lab in 2009, producing Emotion AI technology that is now used in gaming, automotive, robotics, education, healthcare, experiential marketing, retail, human resources, video communication, and more. The company’s technology is also used by market research firms to measure consumers’ emotional responses to digital content, such as ads and TV programming. Rosbank’s implementation is the latest in a series of enterprise experiments around emotion detection and how it can be used to improve service.

Voice Assistant Technology to Include Emotion AI in the Future

In October of 2018, Amazon filed a patent with the U.S. Patent and Trademark Office related to detecting the physical and emotional wellbeing of users based on interactions captured in voice assistant data. The filing, titled “Voice-Based Determination of Physical and Emotional Characteristics of Users,” gives detecting illness as its primary example use case. The patent application depicts a user coughing and telling Alexa they are hungry; Alexa detects the cough, suggests chicken noodle soup, and offers to order cough drops.
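
The patent describes acoustic models rather than code, but its example interaction reduces to a simple rule. In the hypothetical sketch below, the cough flag is assumed to come from some such model, which the sketch does not implement.

```python
# Toy version of the patent's example flow: a detected cough plus the
# user's request shapes the assistant's suggestions. The cough flag is
# assumed to come from an acoustic model that is not implemented here.

def respond(utterance, cough_detected):
    suggestions = []
    if "hungry" in utterance.lower():
        if cough_detected:
            suggestions.append("Would you like a recipe for chicken noodle soup?")
        else:
            suggestions.append("Here are some dinner ideas.")
    if cough_detected:
        suggestions.append("Would you also like to order cough drops?")
    return suggestions

for line in respond("I'm hungry", cough_detected=True):
    print(line)
```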

At CES 2019, Nuance demonstrated its inclusion of Emotion AI in the car. The dashboard-housed voice assistant adjusted its dialogue by analyzing driver emotions from facial expression cues: it spoke formally and concisely when it detected a frown, and less formally and more verbosely when it detected a smile.
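
Nuance has not published how this works internally; the sketch below simply illustrates the expression-to-style mapping shown in the demo, with invented expression labels and replies.

```python
# Invented illustration of expression-conditioned dialogue style:
# frown -> formal and concise, smile -> informal and verbose.

def style_reply(expression, destination):
    if expression == "frown":
        return f"Routing to {destination}. Arrival in 12 minutes."
    if expression == "smile":
        return (f"Great news! I found a nice route to {destination} with "
                "light traffic the whole way, so you'll be there in about "
                "12 minutes. Want me to queue up your driving playlist?")
    return f"Heading to {destination}, arriving in roughly 12 minutes."

print(style_reply("frown", "the office"))
print(style_reply("smile", "the office"))
```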

One way to make voice assistants more personalized and humanlike is to enable them to detect and adjust to a user’s emotional state, just as humans do automatically in daily interactions. Deployment of these features is largely experimental today, but expect emotion recognition to become more commonplace as the technology matures, in everything from customer service and the in-car experience to smart speaker personalization.

Related Articles

Nuance Automotive Demonstrates Driver Emotion Detection at CES and Shows How Virtual Assistants Can Become Proactive

Amazon Files for Patent to Detect User Illness and Emotional State by Analyzing Voice Data