
Why bots need backstories

Many successful AIs are relatable, not just intelligent

By Christopher Null

  • Companies are hiring creatives to build backstories for chatbots and virtual assistants
  • Users are more likely to forgive relatable AIs for slow performance or inaccurate results
  • The right backstory can help drive engagement and boost sales

Companies have long prized salespeople with engaging personalities who can put customers at ease. Increasingly, they’re prioritizing the same quality in chatbots and other virtual assistants.

That’s why leading tech companies are hiring screenwriters, anthropologists, journalists, comedians, and even poets to help craft smart, appealing AI personas that reflect their brands and business goals. The bet is that the right virtual attitude can improve the customer experience and boost bottom‑line results.

An effective persona can keep people using AI assistants, even when they aren’t perfectly accurate or fast. When Alexa or Siri can’t find the right song on a playlist, people tend to be a bit more tolerant than they are of impersonal search tools that fail at the same task.

The persona is not a panacea, however. “When it comes to expectations, the more personality there is, the more a customer will think it can react like a human,” says Chris Butler, director of AI for Philosophie, an AI design shop. An outsized AI persona, paired with poor performance or lack of transparency, can easily turn off users and destroy trust.



Creatives wanted

Making the right decision about what kind, and how much, personality to bestow on an AI bot can be the difference between success and failure. That’s why chatbot development teams often include creative professionals who can develop compelling characters, storylines and dialogue.

Google recruited a team of creative writers from Pixar, The Onion, and elsewhere to help develop early versions of the Google Assistant persona. Capital One put a former filmmaker in charge of building the character behind Eno, its virtual banking assistant.

Creatives are landing AI persona gigs throughout the tech industry. Diana Lee, a “conversation designer” for tech consultancy Wizeline, came to the field through journalism. Today she helps clients develop AI personalities.

The creative team typically starts by building a backstory for the chatbot. This usually isn’t the kind of fictional biography Hollywood screenwriters draft to flesh out their human characters. Rather, it’s a profile of the chatbot’s personality traits, values, and speech and dialogue idiosyncrasies. (Sample consideration: Should the character be more businesslike, or more warm and gregarious?)
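To make that concrete, here is a minimal sketch of what such a profile might look like if captured as structured data. The field names and sample values below are illustrative assumptions only, not drawn from any team’s actual tooling.

    # Hypothetical sketch of a chatbot "backstory" captured as structured data.
    # All field names and values are invented for illustration.
    from dataclasses import dataclass, field

    @dataclass
    class PersonaProfile:
        name: str
        role: str                                            # what the bot is for, in one line
        traits: list = field(default_factory=list)           # e.g. "warm", "businesslike"
        values: list = field(default_factory=list)           # what the character cares about
        speech_quirks: list = field(default_factory=list)    # dialogue idiosyncrasies
        formality: float = 0.5                               # 0 = gregarious, 1 = strictly businesslike

    banking_bot = PersonaProfile(
        name="Ada",
        role="Helps customers check balances and flag unusual charges",
        traits=["calm", "precise", "lightly witty"],
        values=["being upfront about what it can and cannot do"],
        speech_quirks=["short sentences", "never uses exclamation points"],
        formality=0.7,
    )

A profile like this gives writers, UX designers, and developers a single reference point for how the character should sound in every exchange.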

“Chatbots need backstories that highlight the value they provide,” says Lee.

The backstory helps developers and UX designers tackle the bot’s primary functional challenges: giving it both thorough knowledge of the customer’s needs and pain points, and the ability to resolve those issues quickly and efficiently.

Lee worked on a Wizeline team that created a chatbot for the 2018 Australian Open. The team included engineers, UX designers, dialogue specialists, and NLP trainers. They based the bot’s backstory on veteran Australian tennis star Lleyton Hewitt, imbuing it with his Aussie humor and vernacular. The team created a fun experience that delivered match results, video, and giveaway content.

Developers then worked to merge the bot’s natural‑language interface with the Open’s content database. The results were impressive: 57% of users returned daily to the bot to check for updates and get answers to their Open questions, says Lee. And 64% opted to receive real‑time tournament updates.

The backstory boost

Other companies that have long relied on human‑powered chat tools are also turning to chatbots with engaging personas.

oMelhorTrato.com, an online insurance firm based in Argentina, spent nine years using live chat to answer customer queries, says CEO Cristian Rennella. Late in 2017, the company started developing its own AI chatbot, powered by the open‑source machine learning platform TensorFlow.

The move helped drive a 24% boost in sales, mostly by directing customers more quickly and intuitively to the right insurance vendors.

oMelhorTrato.com operates in Argentina, Brazil, Colombia, and Mexico. As part of an effort to improve conversions, Rennella decided to go the extra mile and give the bot a backstory.

Actually, several backstories. “[We] gave a personality to our chatbot according to the location of the client,” Rennella says. “Our chatbot takes a specific backstory that generates synergy with them.” Localizing the chatbot’s language and behavior helps clients feel more comfortable with the experience, despite knowing that it’s not a human who is helping them.
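The mechanics behind that localization aren’t described publicly, but as a rough, hypothetical sketch, a locale-specific persona could be selected with something as simple as a lookup keyed on the client’s country. Every persona name, greeting, and fallback below is invented for illustration.

    # Hypothetical sketch: choose a persona based on the client's country.
    # The greetings and tone labels are illustrative, not oMelhorTrato.com's actual content.
    PERSONAS = {
        "AR": {"greeting": "¡Hola! ¿En qué te puedo ayudar hoy?", "tone": "informal"},
        "BR": {"greeting": "Oi! Como posso ajudar?", "tone": "informal"},
        "CO": {"greeting": "Hola, ¿cómo puedo ayudarte?", "tone": "neutral"},
        "MX": {"greeting": "¡Hola! ¿Cómo te puedo ayudar?", "tone": "informal"},
    }

    def persona_for(country_code: str) -> dict:
        # Fall back to a neutral persona if the country isn't recognized.
        return PERSONAS.get(country_code, {"greeting": "Hola, ¿cómo puedo ayudar?", "tone": "neutral"})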

The new chatbot now handles 57% of user queries without human intervention. Rennella says that adding backstories boosted sales an additional 8%.

Too much faith in the machines

It’s important for chatbot developers to recognize that AI can’t do everything. “Having a personality for AI actually isn’t appropriate in most cases,” says Philosophie’s Butler. “What a personality or personification does is increase potential trust in the system, and expectations for what it can perform. Too much trust can cause people to use the system in inappropriate ways.”

AI researcher Raja Parasuraman described this phenomenon in a 1997 paper, noting that users often trust automated systems even when they malfunction. This blind faith in technology was implicated in several plane crashes in which the flight crew failed to notice a disengaged autopilot.

“One place where personality seems to do well is when these systems are humble and upfront about the mistakes they’ll make,” says Butler. “This lowers expectations, so that people are delighted when something goes right rather than disappointed when something is slightly off.”

Christopher Null is a longtime technology and business journalist who contributes regularly to TechHive, PCWorld, Wired, and other publications.
