February 19, 2024

Automating Customer Service Has Its Risks; Just Ask Air Canada’s Chatbot

BRITISH COLUMBIA, Canada – At a time when businesses are embracing automated solutions in a customer service context, be it driven by sophisticated artificial intelligence or a simpler form of old-school “chatbot,” a cautionary tale has arisen out of Canada.

Thankfully for the company involved (Air Canada), the liability it incurred thanks to its automated customer service representative was minor and inexpensive, but it doesn’t take much imagination to think up more dire scenarios flowing from similar circumstances.

At issue in the dispute between Air Canada customer Jake Moffatt and the airline was the fact that the company’s chatbot gave Moffatt inaccurate information regarding bereavement fares, most crucially telling him he could apply for a bereavement refund within 90 days of purchasing his ticket by completing an online form.

Air Canada didn’t dispute that the chatbot gave Moffatt the wrong information, but it did dispute whether that misinformation entitled Moffatt to the bereavement refund.

Moffatt sued Air Canada for the difference between the fares, with the issue landing in front of the Civil Resolution Tribunal, which functions as a small claims court within British Columbia’s public justice system.

Air Canada argued it wasn’t liable for the chatbot’s mistake, because “it cannot be held liable for information provided by one of its agents, servants, or representatives – including a chatbot,” as Tribunal member Christopher Rivers put it in his decision.

Rivers, perhaps unsurprisingly, was not swayed by the airline’s defense.

“(Air Canada) does not explain why it believes that is the case,” Rivers wrote. “In effect, Air Canada suggests the chatbot is a separate legal entity that is responsible for its own actions. This is a remarkable submission. While a chatbot has an interactive component, it is still just a part of Air Canada’s website. It should be obvious to Air Canada that it is responsible for all the information on its website. It makes no difference whether the information comes from a static page or a chatbot.”

Reading through the decision, it’s probably fair to say that Air Canada didn’t give much thought to how it went about arguing against Moffatt’s claims – or to whether the negative publicity that could result from the dispute might outweigh the roughly $870 that simply issuing the refund would have cost the company.

“In its boilerplate Dispute Response, Air Canada denies ‘each and every’ one of Mr. Moffatt’s allegations generally,” Rivers observed in his ruling. “However, it did not provide any evidence to the contrary.”

(Pro tip: When denying an opponent’s claims, providing evidence to the contrary is, generally speaking, not such a bad idea.)

“When a party fails to provide relevant evidence without sufficient explanation, an adjudicator is entitled to draw an adverse inference,” Rivers wrote. “An adverse inference is where an adjudicator assumes a party has failed to provide relevant evidence because the missing evidence would not support their case.”

Now, to my knowledge, there are no adult websites that offer “bereavement memberships,” but the point here isn’t that it’s bad to have chatbots offering misinformation about specific types of refunds. The point is it’s bad – and potentially liability-inducing – to have chatbots offering your customers incorrect information of any kind.

The $812.02 the company has been ordered to pay Moffatt obviously isn’t going to break Air Canada’s bank, but the hit to the company’s reputation (not least because it seems rather ghoulish to bicker with a customer over a legitimate bereavement refund to begin with) certainly feels like it would have been worth $812.02 to avoid.

So, if you use chatbots or automated systems of any other sort to assist with your customer service, what’s the moral of the story here?

First, it’s a good idea to occasionally put yourself in your customers’ shoes as literally as possible by using the automated system yourself. Ask questions to which you already know the correct answer – questions the chatbot should be able to answer correctly as well – and if it gets them wrong, you know somebody has some coding to do, as sketched below.
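If you want to make that spot-checking systematic, here’s a minimal sketch of a known-answer check you could run against a chatbot on a schedule. Everything specific in it is an assumption for illustration: the endpoint URL, the JSON request and response shapes, and the sample question are all hypothetical stand-ins for whatever your own system exposes.

```python
# Minimal known-answer check for a customer service chatbot.
# The endpoint URL and the JSON request/response shapes below are
# hypothetical; adapt them to whatever your chatbot actually exposes.
import requests

# Questions paired with a keyword the verified-correct answer must contain.
KNOWN_ANSWERS = {
    "Can I request a bereavement refund after my trip?": "bereavement",
}

def ask_bot(question: str) -> str:
    resp = requests.post(
        "https://example.com/api/chatbot",   # hypothetical endpoint
        json={"message": question},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["reply"]              # hypothetical response field

def run_known_answer_checks() -> None:
    for question, must_contain in KNOWN_ANSWERS.items():
        reply = ask_bot(question)
        # Exact matching is brittle against free-form bot output, so a
        # simple keyword check is a pragmatic starting point.
        if must_contain.lower() in reply.lower():
            print(f"OK: {question!r}")
        else:
            print(f"MISMATCH for {question!r}: got {reply!r}")

if __name__ == "__main__":
    run_known_answer_checks()
```

A check like this won’t catch every bad answer, but run regularly it can catch the kind of flatly wrong policy statement that landed Air Canada in front of the tribunal.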

Another consideration: sometimes the correct answer is itself subject to change. In that case, you need to be sure you’re updating your systems accordingly, so that your customer service representatives, robot or otherwise, always have the latest information to offer your customers.
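One simple way to keep yourself honest on that front is to record when each canned answer was last reviewed against current policy, and flag anything that has gone too long without a check. The sketch below assumes a made-up knowledge-base structure and a 90-day review interval; both are illustrative choices, not a standard.

```python
# Flag chatbot knowledge-base entries that haven't been reviewed recently.
# The entry structure and the 90-day interval are assumptions for
# illustration purposes only.
from datetime import date, timedelta

REVIEW_INTERVAL = timedelta(days=90)

# Hypothetical entries: each pairs a topic with its last review date.
KNOWLEDGE_BASE = [
    {"topic": "bereavement fares", "last_reviewed": date(2023, 11, 1)},
    {"topic": "baggage allowance", "last_reviewed": date(2024, 2, 1)},
]

def stale_entries(today: date) -> list[str]:
    """Return topics whose answers are overdue for a policy review."""
    return [
        entry["topic"]
        for entry in KNOWLEDGE_BASE
        if today - entry["last_reviewed"] > REVIEW_INTERVAL
    ]

if __name__ == "__main__":
    for topic in stale_entries(date.today()):
        print(f"Review needed: {topic}")
```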

Finally, if something does go awry because your system provided inaccurate information, whether that misinformation was offered by a human employee or a bot, simply own the mistake and do what’s needed to make things right with your customer(s). Fighting for every last nickel might seem fiscally responsible in the short term, but you may feel differently later if your miserly ways wind up going viral.

Robot image by Pavel Danilyuk from Pexels



 