The sales rep may not be the only one watching you at your next virtual meeting. Some companies are employing ‘emotion AI’, a subset of AI that claims to detect human emotions, to monitor people on sales calls, give reps feedback on participants’ reactions and highlight the most compelling talking points.
The market for such sales enablement platforms is growing, with consultancy Verified Market Research estimating it could be worth $7.3 billion globally by 2028.
Zoom is one of the latest entrants into this burgeoning market, launching Zoom IQ for Sales last month. Zoom describes it as conversational intelligence software that delivers “meaningful and actionable insights into customer interactions to improve salesperson performance and enhance customer experiences”.
However, the announcement was met with opposition from various human rights groups and privacy advocates. An open letter to Zoom CEO Eric Yuan, co-signed by the American Civil Liberties Union and the Electronic Privacy Information Center, called on the video communications company to halt the feature’s rollout. “This move to extract emotional data points from users based on the false idea that AI can track and analyze human emotions is a violation of privacy and human rights,” the letter stated. Zoom declined to be interviewed for this article.
According to Uniphore, one of the companies behind the technology, its ‘Q for Sales’ emotion AI assistant can “help salespeople ‘read the room,’ detect emotional cues and improve engagement.”
Patrick Ehlen, vice president of AI at the company, thinks fears about emotion AI stem from a misunderstanding of the technology.
“Personally, I’m not a big fan of the term emotion AI,” he says. “There seems to be a concern that when people talk about emotion AI, they are talking about a computer system that can actually read your internal emotional state with a high degree of accuracy. The truth of the matter is that there is no AI system that can do that.”
Making AI understand context
Psychological research supports this claim. The way people communicate emotion is not universal, and facial expressions are far more context-dependent than previously thought, according to a 2019 paper by professors of psychology at Northeastern University, the California Institute of Technology, the University of Glasgow and the University of Wisconsin-Madison.
Lisa Feldman Barrett, a psychology professor at Northeastern University and one of the paper’s authors, believes the findings “call into question the scientific justification used by some emotion AI technologies.”
For this reason, Ehlen is reluctant to make bold claims about the technology’s capabilities. He argues that “it is an unreliable technology for making very important and critical decisions”. However, he thinks it could come in handy on sales calls, where a salesperson may be talking to several people at once and it can be difficult to determine who is engaged and who is not.
“You’re looking at people in this little window, so it’s not as easy to read the room and see people’s facial expressions,” he says. “If you can have a machine that gauges their reactions and determines what point the CFO seems to be interested in, that can be a useful tool.”
To do this, the AI software analyzes a number of elements of the conversation, including people’s facial expressions, tone of voice and gestures. “From that information, we can get much closer to having a 360-degree view of what people are doing when they’re having a conversation to better understand them,” adds Ehlen.
The platform can then give sellers real-time feedback on sentiment and engagement to help them tailor their responses and, in theory, improve their sales conversions.
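As a rough illustration of the general approach (this is not Uniphore’s actual method: the signal names, weights and data below are invented for the example), per-participant signals from each modality could be fused into a single engagement estimate:

```python
from dataclasses import dataclass

@dataclass
class ModalitySignals:
    """Per-participant signals for one time window, each scaled to [0, 1]."""
    facial_attention: float   # e.g. gaze toward camera, expression activity
    vocal_energy: float       # e.g. pitch variation, speaking rate
    gesture_activity: float   # e.g. nodding, leaning in

# Hypothetical weights; a real system would learn these from labeled calls.
WEIGHTS = {"facial": 0.5, "vocal": 0.3, "gesture": 0.2}

def engagement_score(s: ModalitySignals) -> float:
    """Fuse the three modalities into one engagement estimate in [0, 1]."""
    return (WEIGHTS["facial"] * s.facial_attention
            + WEIGHTS["vocal"] * s.vocal_energy
            + WEIGHTS["gesture"] * s.gesture_activity)

# Example: surface the most engaged participant in the current window.
participants = {
    "cfo": ModalitySignals(0.8, 0.4, 0.7),
    "analyst": ModalitySignals(0.3, 0.2, 0.1),
}
most_engaged = max(participants, key=lambda p: engagement_score(participants[p]))
print(most_engaged, round(engagement_score(participants[most_engaged]), 2))
```

A production system would presumably learn such weights from data rather than hard-coding them, but the basic shape, per-modality signals fused into one score per participant, is what this kind of real-time feedback rests on.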
Behavioral AI
Sybill is another platform that claims to use emotional intelligence to speed up the sales process. However, co-founder and CEO Gorish Aggarwal is reluctant to say that the program can identify people’s emotional state. “We think of it as behavioral AI, which is different from emotion because emotions are very subjective,” he says. “You can’t tell if a person is fearful, angry, or dismissive just by looking at their face.”
Instead, the software seeks to identify aspects of an individual’s body language or facial expression to highlight key moments, says Aggarwal: for example, when someone nods or smiles during a conversation. Sellers can then review the recording to see when people were most engaged and which parts they should follow up on.
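To make that concrete, here is a minimal sketch of how timestamped highlights could be pulled from per-second behavior labels. It is not Sybill’s implementation; the labels, timestamps and events of interest are illustrative assumptions:

```python
# Per-second behavior labels from a recorded call (hypothetical data).
frame_labels = [
    (12, "neutral"), (13, "nod"), (14, "nod"),
    (45, "smile"), (46, "neutral"), (90, "nod"),
]

# Behaviors treated as engagement cues worth surfacing to the seller.
HIGHLIGHT_EVENTS = {"nod", "smile"}

def key_moments(labels, events=HIGHLIGHT_EVENTS):
    """Collapse consecutive frames of the same cue into one timestamped highlight."""
    moments, prev = [], None
    for timestamp, label in labels:
        if label in events and label != prev:
            moments.append((timestamp, label))
        prev = label if label in events else None
    return moments

print(key_moments(frame_labels))  # [(13, 'nod'), (45, 'smile'), (90, 'nod')]
```

The seller could then jump straight to those timestamps in the recording instead of rewatching the whole call.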
Although Aggarwal claims his AI can detect whether someone is nodding with up to 95% accuracy, the technology’s impact in a sales context has yet to be proven. Both platforms launched earlier this year, and Uniphore and Sybill are currently conducting benchmark studies to determine how well their AI programs can improve sales performance.
Jason Bell is an associate professor of marketing at Saïd Business School who works on AI models for computer vision and natural language processing. He believes there is a lot of overpromising in the market for emotion AI.
“A lot of webcams and front-facing smartphone cameras are low-quality, so even if you have a great predictive model, the signal you get isn’t great,” he says. “In principle it is possible, but current technology has not convinced me.”
Relying on this technology in a sales context could lead people to dead ends, according to Bell. His biggest concern is with the accuracy of the technology. “It dramatically simplifies emotional states,” he adds. “If you’re monitoring one or both parties involved in the sales process and categorizing their emotions, it can be pretty raw… It could create more complications for the sales process without adding a ton of value.”
Privacy concerns, such as those raised in the open letter to Zoom, have also made people wary of the technology. Uniphore and Sybill are aware of these concerns.
“We’re very aware that some people see AI as creepy,” says Ehlen. “So we’ve tried to build in as many protections and safeguards as possible to make people feel comfortable.”
As a result, Q for Sales is opt-in. With Sybill, participants are notified that “the call is being recorded and notes are being taken”, though there is no mention of the AI unless the user chooses to reveal it.
The iterative process of using AI
Aggarwal does not recommend using the program for internal calls or in situations where there is an imbalance of power. Using it as part of an interview process, for example, would be “ethically wrong as [candidates] could not refuse.”
There is also a risk that people will change their behavior when they realize they are being recorded. Ehlen admits this could mean “the AI won’t work as well as it used to” and will have to go through a retraining process.
“It’s an iterative process,” he says. “As these technologies make their way into the mainstream, people will become more comfortable with them and this won’t be a problem.”
Whether these conversational AI programs become a standard part of the sales process remains to be seen, but convincing customers to be monitored by AI could be the hardest sell.