How AI sentiment analysis can improve the customer experience

Machine learning and AI sentiment analysis have improved. That’s good news for customer service leaders and people managers.

AI sentiment analysis
  • Machine learning techniques are expanding the possibilities of sentiment analysis and affective computing
  • New cognitive recognition tools can monitor drivers’ facial expressions for signs of drowsiness or stress
  • Emotion‑recognition tools can identify early signs of Parkinson’s disease

After 10 hours on the road, your attention is wavering, your eyelids are drooping, and your yawns are growing more frequent. Just before you drift off, the car’s autopilot flicks on. “You appear drowsy. Let’s take a break,” the dashboard computer announces, as it takes the wheel and pulls over to the side of the road.

That scenario isn’t far off. Affectiva, a Boston‑based AI startup, has spent the last several years developing AI sentiment analysis technology that can detect the cognitive and emotional states of drivers.

Equipped with cameras and machine learning algorithms that find patterns in the minutiae of eye movement and facial expressions, these systems can include safety functions such as drowsiness monitoring or entertainment features that can keep drivers engaged.

Kia, for example, recently tested a feature that offers different music, podcast, or lighting options in response to someone’s expressions. If, for instance, the driver appears to be having a good time, the system plays livelier songs and cranks up the volume.

These in‑car systems belong to an emerging computer‑science field called affective computing, which expands on the concept of traditional sentiment analysis. Whereas sentiment analysis focuses on detecting attitudinal signals in text, affective computing can analyze images, video and audio, and detect a broader range of human feelings like excitement, anger, or, in the case of the car, drowsiness.

Breakthroughs in affective computing—combined with AI‑fueled advances in traditional text‑based sentiment analysis—are now leading to a surge in marketplace applications, designed to meet rising demand for tools that can discern the true feelings of customers and employees.

“If you know how to understand sentiment, it can be used across a bunch of areas, like marketing, sales, or service,” says Richard Socher, an AI researcher who is now chief scientist at Salesforce. “People today talk to algorithms as much as they talk to people. It’s increasingly useful to figure out the sentiments expressed.”

Sentiment analysis evolved

Sentiment analysis has a long track record in marketing. Early generations of sentiment software relied on scanning large volumes of text for keywords that indicated a positive or negative reaction, a blunt approach known as “bag of words.”

A given piece of text may include positive terms (awesome, cool, love) or negative ones (painful, annoyed, gross). By identifying those words, sentiment analysis tools could often, but far from always, identify whether a piece of prose, such as a product description or review, was a hit or a dud with consumers.

The bag of words approach has well‑known flaws. Text without obviously positive or negative words often gets classified mistakenly as neutral, and negated understatements such as “not bad,” along with euphemisms that soften negatives, can flip the meaning of short phrases.
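The weakness is easy to see in a minimal lexicon-based scorer (a toy sketch; the word lists are illustrative, not from any real sentiment lexicon):

```python
# Minimal bag-of-words sentiment scorer: it counts positive and negative
# keywords and ignores word order entirely.
POSITIVE = {"awesome", "cool", "love", "great"}
NEGATIVE = {"painful", "annoyed", "gross", "bad"}

def bag_of_words_score(text: str) -> int:
    """Positive score means positive sentiment; negative means negative."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

print(bag_of_words_score("I love this awesome phone"))  # 2: correctly positive
print(bag_of_words_score("not bad at all"))             # -1: litotes misread as negative
```

Because word order is discarded, the scorer sees only the keyword “bad” in “not bad at all” and misclassifies a mildly positive remark as negative.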

Consider a phrase like “This is a great movie for frat boys.” A comment like that is likely meant ironically, playing off negative stereotypes about fraternity members. Today’s sentiment analysis tools can often pick up that level of linguistic nuance.

Researchers are now using neural networks to teach algorithms to expand their focus beyond words, starting with more complex grammatical structures like sentence fragments. As a result, the accuracy of sentiment analysis has improved and is now pushing 90%, experts say.
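A crude way to see what looking beyond single words buys you, without any neural network, is to handle negation over adjacent word pairs (a hypothetical illustration, far simpler than the models described above):

```python
# Toy scorer that looks at word pairs: a negator flips the polarity
# of the word that immediately follows it.
POSITIVE = {"awesome", "cool", "love", "great", "good"}
NEGATIVE = {"painful", "annoyed", "gross", "bad"}
NEGATORS = {"not", "never", "hardly"}

def bigram_score(text: str) -> int:
    words = text.lower().split()
    score, negate = 0, False
    for w in words:
        if w in NEGATORS:
            negate = True      # remember the negation for the next word
            continue
        polarity = (w in POSITIVE) - (w in NEGATIVE)
        score += -polarity if negate else polarity
        negate = False
    return score

print(bigram_score("not bad at all"))  # 1: negation flips "bad" to positive
```

Neural approaches generalize this idea, learning such interactions across whole phrases and sentences rather than relying on a hand-written negation rule.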

Machine learning and sentiment analysis

Modern sentiment analysis enables far more precise measurement at scale: the tools can tell you whether 10,000 reviewers genuinely liked a product, a task that would be infeasible for human analysts.

AI sentiment analysis has applications beyond marketing. Investors have long known that market optimism and pessimism influence the rise and fall of stock prices. Startups such as AlphaSense attempt to measure those positive and negative signals by applying machine learning techniques to daily trawling of market coverage, press releases, and analyst reports.

HR is also fertile ground for modern sentiment analysis tools. Big companies often struggle to capture the honest opinions of thousands of workers through surveys or other means. One sentiment analysis tool, UltiPro Perception, analyzes answers on employee questionnaires and creates a map showing collective employee sentiment about the company and its managers.

AI gets emotional

While sentiment analysis expands to new use cases, a related field is emerging: the ability of computers to recognize a wider spectrum of human emotion in text, images, audio, and video, drawing on techniques such as image recognition and natural language processing.

Affective computing (also called emotional AI) is innately more complex than sentiment analysis. Affectiva has been in the field for about a decade, building software that was first used by marketing and advertising agencies to detect how consumers in different geographies reacted to their ads. “If you do that at scale online, you get quite a lot of data. We’ve analyzed 7.5 million faces in 87 countries, and close to 40,000 ads,” says Gabi Zijderveld, Affectiva’s chief marketing officer.

That data has helped improve Affectiva’s algorithms, but moving into new industries isn’t always a fast process. The auto manufacturers implementing Affectiva’s in‑car technology won’t release those models for another three years, Zijderveld says.

Meanwhile, AI is opening up enough new possibilities to justify investment in the core technology. In healthcare, Affectiva researchers have tested emotional AI for applications like early detection of Parkinson’s disease, which has characteristic facial signifiers, and helping autistic children identify other people’s emotions.

One hope of researchers is that the tech will eventually find its way into our smartphones, or into AI voice platforms like Alexa or Cortana, which may soon be able to understand a lot more than your words.