Air Canada will have to compensate a customer after its chatbot provided inaccurate information about bereavement fares.
On Thursday, the Civil Resolution Tribunal of British Columbia (CRT) ordered Air Canada to honour a refund policy invented by its own chatbot, rejecting the airline's defence in what tribunal member Christopher Rivers called "a remarkable submission".
In November 2022, Jake Moffatt, a Canadian resident grieving the loss of his grandmother, inquired about bereavement fares for a flight using the airline's online chatbot. Instead of directing him to the correct information, the chatbot suggested he book a regular ticket and request a partial refund under the airline's bereavement policy within 90 days.
Moffatt followed the bot's advice and purchased a full-price ticket from Vancouver to Toronto. When he later applied for the refund, however, Air Canada employees told him the airline did not permit retroactive applications.
Moffatt provided screenshots of the interaction with the chatbot to the CRT, which read, “Air Canada offers reduced bereavement fares if you need to travel because of an imminent death or a death in your immediate family.”
“If you need to travel immediately or have already travelled and would like to submit your ticket for a reduced bereavement rate, kindly do so within 90 days of the date your ticket was issued by completing our Ticket Refund Application form.”
Air Canada acknowledged that the chatbot responded with "misleading words" but argued that Moffatt "did not follow proper procedure to request bereavement fares and cannot claim them retroactively", and that the company "cannot be held liable for information provided by one of its agents, servants, or representatives – including a chatbot."
Rivers ruled in Moffatt’s favour, stating that Air Canada did not explain why it believes it isn’t liable. “In effect, Air Canada suggests the chatbot is a separate legal entity that is responsible for its own actions.”
“While a chatbot has an interactive component, it is still just a part of Air Canada’s website. It should be obvious to Air Canada that it is responsible for all the information on its website. It makes no difference whether the information comes from a static page or a chatbot.”
This landmark decision highlights the growing accountability companies face for the actions of their AI systems, and raises questions about the reliability of customer-facing chatbots and the consequences when they go wrong.
Air Canada has since acknowledged the issue and pledged to review and improve its chatbot’s training and accuracy. The airline also stated it would work on clarifying its bereavement fare policy on its website and within the chatbot itself.
The CRT ordered Air Canada to reimburse Moffatt within 14 days of the ruling.