Rana Gujral is the CEO of Behavioral Signals, a company bridging the communication gap between humans and machines by introducing emotional intelligence, derived from speech, into conversations with AI.
Behavioral Signals is a relatively young startup, having launched in 2016. Could you share the genesis story?
Driven by a passion to bring the company’s ground-breaking patented speech-to-emotion and speech-to-behaviors technologies to market, CTO Alex Potamianos and Chief Scientist Shri Narayanan founded Behavioral Signals in 2016. Shri is the Andrew J. Viterbi Professor of Engineering at the University of Southern California (USC). He founded and currently directs the Signal Analysis and Interpretation Laboratory (SAIL) at USC. Alex is a well-regarded innovator in the fields of speech and natural language processing, interactive voice response systems, and behavioral informatics. He has over 20 years of leadership experience on both the corporate and entrepreneurial sides of business, and his background includes working at AT&T Labs-Research, Bell Labs, and Lucent Technologies.
With the goal of enhancing and forever changing the world of business, we believe technology is at the core of what can be achieved. Behavioral Signals’ algorithms analyze human emotions and behaviors, transform data into usable information, and lead to better business decisions and increased profits. Until now, human emotion has been considered impossible to quantify or measure. With our patented analytics engine, we measure and interpret the “how” part of human interactions.
Behavioral Signals relies on a type of machine learning affective computing (also known as Emotion AI). Could you explain what this is?
Emotional artificial intelligence, also called Emotion AI or affective computing, is being used to develop machines that are capable of reading, interpreting, responding to, and imitating human affect—the way we, as humans, experience and express emotions. What does this mean for consumers? It means that your devices, such as your smartphone or smart speakers, will be able to offer you interaction that feels more natural than ever before, all by simply reading the emotional cues in your voice.
As our reliance on AI grows, so does the need for emotionally intelligent AI. It’s one thing to ask your virtual assistant to read you today’s game scores, but it’s entirely another thing to entrust your aging parents to the care of an AI-driven bot. Currently, AI may be able to do incredible things, like diagnose medical conditions and outline treatments, but it still needs emotional intelligence to communicate with patients in a more humane way.
What other types of machine learning technologies are used?
When it comes to machine learning, we mainly use deep learning and NLP in our Behavioral Signal Processing analytics models. To explain this a little better: we’ve pioneered a field, Behavioral Signal Processing, based on over a decade’s worth of award-winning and patented research, to automatically detect information encoded in the human voice and measure the quality of human interaction. It is an emerging discipline that bridges engineering with the behavioral sciences, aiming to quantify and interpret human interaction and communication through engineering and computing innovations. Deep learning is the tool that helps us create better predictive models.
What type of data do you collect from the tone of voice?
Our deep learning AI technology analyzes what is being said and how it is being said, on both sides of a conversation, measuring emotions and behaviors. The range of emotions is quite diverse, but what really matters is the aggregate intelligence of this analysis. To give you an example, consider a conversation between a bank employee and a customer: we can capture and measure politeness, composure (calm vs. agitated), empathy towards the customer, the customer’s reactions, and overall speaking style (slow or fast, engaged or disengaged) in order to calculate a conversation quality score, the effectiveness of the outcome, and the performance of the employee.
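To make the “how” concrete: paralinguistic analysis works on acoustic properties of the signal rather than on the words themselves. The sketch below is a hypothetical illustration, not Behavioral Signals’ actual pipeline; it uses only NumPy to estimate two simple prosodic cues from a mono audio frame, loudness via RMS energy and pitch via the autocorrelation peak.

```python
import numpy as np

def prosodic_features(signal: np.ndarray, sample_rate: int) -> dict:
    """Estimate two simple prosodic cues from a mono audio frame.

    Hypothetical illustration only; real emotion-AI pipelines use far
    richer feature sets (temporal, spectral, learned embeddings).
    """
    # Loudness proxy: root-mean-square energy of the waveform.
    rms = float(np.sqrt(np.mean(signal ** 2)))

    # Pitch proxy: lag of the autocorrelation peak within a plausible
    # speech range (~60-400 Hz), converted back to a frequency.
    autocorr = np.correlate(signal, signal, mode="full")
    autocorr = autocorr[len(autocorr) // 2:]   # keep non-negative lags
    min_lag = sample_rate // 400               # 400 Hz upper bound
    max_lag = sample_rate // 60                # 60 Hz lower bound
    peak_lag = min_lag + int(np.argmax(autocorr[min_lag:max_lag]))
    pitch_hz = sample_rate / peak_lag

    return {"rms_energy": rms, "pitch_hz": pitch_hz}

# Quick check on a synthetic 220 Hz tone sampled at 16 kHz.
sr = 16000
t = np.arange(4096) / sr
tone = 0.5 * np.sin(2 * np.pi * 220.0 * t)
feats = prosodic_features(tone, sr)
```

A production system would track many more cues over time (speaking rate, pauses, spectral shape) and feed them into trained models; these two just show what “how something is said” looks like as numbers.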
What type of data analysis is done to predict intent?
Intent prediction is very similar to what has already been mentioned. We use behavioral signals in the voice to predict a customer’s intent to purchase a product or renew a subscription, or whether a debtor will pay their debt. Intent prediction can help companies increase their sales and collection ratios, lower their costs, and ultimately improve customer satisfaction.
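In machine-learning terms, this kind of intent prediction can be framed as a binary classifier over the behavioral scores extracted from a call. Here is a minimal sketch in plain Python; the feature names, weights, and bias are invented for illustration, whereas a real model would learn them from labeled call outcomes:

```python
import math

# Hypothetical weights for a logistic-regression-style intent model;
# a production model would be trained on labeled outcomes, not hand-set.
WEIGHTS = {"politeness": 0.9, "engagement": 1.4, "agitation": -1.1}
BIAS = -0.4

def purchase_intent(features: dict) -> float:
    """Map behavioral scores (each in [0, 1]) to an intent probability."""
    z = BIAS + sum(WEIGHTS[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))  # logistic squashing into (0, 1)

engaged_caller = {"politeness": 0.8, "engagement": 0.9, "agitation": 0.1}
irritated_caller = {"politeness": 0.3, "engagement": 0.2, "agitation": 0.9}
```

The design point worth noting is that the inputs are behavioral measurements rather than transcript words, which is what lets the same approach transfer across sales, renewals, and collections use cases.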
Behavioral Signals has been a six-time Gold winner of the INTERSPEECH quality of human interactions & computational paralinguistics challenge. What is this challenge, and how significant an achievement is this?
Interspeech is the world’s largest technical conference focused on speech processing and its applications. It boasts the largest attendance in this domain and a significant number of research papers. The conference emphasizes interdisciplinary approaches that address all aspects of speech science and technology, ranging from basic theories to advanced applications. Its challenges are considered the Turing Award of the speech recognition and natural language processing disciplines. Winning it is an important recognition of our scientific work and our unique ability to detect signals in audio data associated with the behaviors and traits that drive human decision making.
How quickly can Behavioral Signals adapt to different languages, and how big of a dataset is needed?
Our technology is language agnostic. We listen to how something is being said rather than what is actually said: the expressed emotions, which are fairly universal across languages. Of course, each language has its own unique traits that may call for some tweaking of our algorithms, but the impact on our predictive models is generally small.
Can you discuss Behavioral Signals’ latest solution, AI-Mediated Conversations?
AI-Mediated Conversations (AI-MC) is an automated call routing solution that uses emotion AI and voice data to match the customer to the best-suited employee to handle a specific call. If we go back to the above-mentioned example of the bank employee and customer, our technology can guide the conversation dynamic with the ultimate goal of improving the outcome, whether that has to do with better customer experience, increased collections, or faster resolution times. Whatever the objective, there is always a catalyst that allows both sides to reach the desired result. That contributing factor is usually a simple and naturally occurring human process: the affinity or rapport developed between people. Regardless of the type of business communication (sales call, support, collection), it will always be an interaction between real humans, and the affinity is rarely identical between any two pairs of people. We have specific behaviors and traits that help us get along with some people better than with others. This match is based on profile data and our algorithms, developed from years of research and experience in NLP and Behavioral Signal Processing.
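Mechanically, the routing step can be thought of as scoring every available agent against the caller’s behavioral profile and picking the best match. The sketch below is a toy version under that assumption; the trait names and the dot-product affinity measure are placeholders, since the actual matching model is proprietary and learned from outcome data:

```python
def affinity(caller: dict, agent: dict) -> float:
    """Placeholder affinity score: dot product over the caller's traits
    and the agent's matching traits (missing traits count as 0)."""
    return sum(caller[k] * agent.get(k, 0.0) for k in caller)

def route_call(caller: dict, agents: dict) -> str:
    """Return the id of the available agent with the highest affinity."""
    return max(agents, key=lambda agent_id: affinity(caller, agents[agent_id]))

# Illustrative profiles: behavioral scores in [0, 1].
caller_profile = {"agitated": 0.9, "fast_speaker": 0.7}
available_agents = {
    "agent_a": {"calming": 0.9, "agitated": 0.1, "fast_speaker": 0.2},
    "agent_b": {"agitated": 0.0, "fast_speaker": 0.9},  # matches the pace
}
```

In practice, the affinity function would be a trained model predicting the expected outcome of each specific caller-agent pairing rather than a fixed similarity score.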
We recently implemented Behavioral Signals’ AI-MC solution to boost the effectiveness and efficiency of an EU bank’s call center. The case study was recognized by Gartner and included in its Emotion AI Adoption Report. The solution demonstrated a significant ROI, with a 20% increase in active debt-restructuring applications. In addition, this improvement was achieved with 7.6% fewer calls, leading to additional cost reductions. In absolute numbers, these results corresponded to a $300M upside for the bank.
Is there anything else that you would like to share about Behavioral Signals?
While we take a lot of pride in our research accomplishments, we’re equally thankful for the industry accolades. In the fall of 2019, our technology was listed as a use-case leader in Gartner’s coveted Maverick Research, which profiles bleeding-edge technologies. Earlier this year, we were included in Gartner’s Hype Cycle, where our technology was rated ‘transformational’. Last month, we were listed as a Gartner Cool Vendor for 2020.
Thank you for the great interview, readers who wish to learn more should visit Behavioral Signals.