
AI ‘personalities’ could become even more famous than their creators

Posted: 29 June 2017 | By Charlie Moloney

“Over time Siri, Alexa, and Cortana, and their individual ‘personalities’, could become even more famous than their parent companies”, claimed three technology strategists from global management consultancy Accenture, in an article for the Harvard Business Review published on 27 June.

The trio highlighted how recognisable brand mascots are rapidly changing in nature with the introduction of intelligent AI, and set out “three new types of decisions executives face at the intersection of technology, personality, and strategy”:

What form will your AI take?

The authors said “a spectrum of intelligent personalities” and forms exists in which to present your AI technology, and brands are not limited to chatbots. They named “screens, voices, physical ‘boxes’ like Amazon Echo”, and “text” as examples of ways to deliver an AI experience to customers.

“Cognitive agents”, they said, are continually being developed into new manifestations: a “virtual person” who appears on a computer screen, hologram technology that could project an AI into a room, and corporeal AI built into a machine body with human features. These robots, the article said, could be “literal front-office brand ambassadors”.

Companies were reminded that customers form impressions of AI systems in the same way they do of human customer service reps. But the consequences of a bad customer experience with an AI, which “can theoretically interact with tens of thousands of people at once”, can be profound.

Choose your bot’s personality carefully

As an anthropomorphised AI becomes a brand ambassador, it is important that its personality embodies the values of the company it represents. “Consider”, the authors wrote, “how a technology like Siri or Alexa has already become so closely associated with the Apple and Amazon brands”.

Will your AI be “helpful, like a nerdy friend”, “confident and considerate”, or “smart and witty with a slight edge”? The article used Siri as an example of an AI which can be cutting, jealous, and cheeky, in keeping with Apple’s brand values: “individuality over conformity”.

Tech companies should learn “how best to attract and retain different types of talent”, such as creatives, because the future of AI in customer service will require bots with much richer personalities, which can even demonstrate complicated emotions like sympathy, for example when a customer misses a flight.

Keep the big decisions in-house

As certain AIs develop and become more proficient, businesses will be keen to build capabilities onto the most prominent bots and let the AI handle customer transactions on their behalf. Take Amazon Alexa, which can order pizza from Domino’s, check the status of Delta flights, and tell Capital One customers their bank balance.
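For a sense of what “building capabilities onto” an assistant like Alexa involves in practice, the sketch below shows a minimal custom skill handler written with the Alexa Skills Kit SDK for Python (ask-sdk-core). The intent name CheckBalanceIntent, the handler, and the hard-coded balance figure are illustrative assumptions for this article, not Capital One’s actual integration.

```python
# Minimal sketch of a custom Alexa skill handler using the Alexa Skills Kit
# SDK for Python (ask-sdk-core). The intent name "CheckBalanceIntent" and the
# stubbed balance lookup are illustrative placeholders only.
from ask_sdk_core.skill_builder import SkillBuilder
from ask_sdk_core.dispatch_components import AbstractRequestHandler
from ask_sdk_core.utils import is_intent_name
from ask_sdk_core.handler_input import HandlerInput
from ask_sdk_model import Response


class CheckBalanceHandler(AbstractRequestHandler):
    """Handles the hypothetical CheckBalanceIntent."""

    def can_handle(self, handler_input: HandlerInput) -> bool:
        # Route this handler only when the user asks for their balance.
        return is_intent_name("CheckBalanceIntent")(handler_input)

    def handle(self, handler_input: HandlerInput) -> Response:
        # A real skill would call the brand's own backend here;
        # the figure below is a hard-coded placeholder.
        balance = "1,234 dollars and 56 cents"
        speech = f"Your current balance is {balance}."
        return handler_input.response_builder.speak(speech).response


# Wire the handler into a skill and expose it as an AWS Lambda entry point.
sb = SkillBuilder()
sb.add_request_handler(CheckBalanceHandler())
lambda_handler = sb.lambda_handler()
```

The structure illustrates the strategists’ point: the assistant owns the conversation and the customer relationship, while the brand only supplies the fulfilment logic behind the intent.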

These companies previously “owned the entire customer experience with their customers”. One risk of letting an external AI take an active role in that process, highlighted by the Accenture strategists, is that the bot is then left to tackle sensitive, ethical issues while representing the company.

One problematic scenario the authors raised, among others, is that a bot must be able to respond appropriately if it is given an indication that a user might be struggling with a mental health issue, something a study has warned major AI bots are failing to do.

The article advised companies considering adopting a personified AI to represent them to make sure it is kept under control and does not get “ahead of businesses’ ability to address the various ethical, societal, and legal concerns involved”.

 
