Air Canada has been ordered to honour the hallucination.
As airlines increasingly turn to AI to field passenger questions, alarm bells have been raised after a hallucinating Air Canada chatbot landed the carrier with a bill of nearly $1,000.
Air Canada lost the small claims court case after a passenger said they had been misled on bereavement fare rules by an automated chatbot.
After the carrier tried and failed to disavow the fare terms quoted by the AI-powered service, it was ordered to pay CA$812 ($970) in damages and court fees.
Following the death of their grandmother, the passenger had been researching whether they could apply for bereavement fares retroactively, having already purchased their tickets.
A screenshot of the chatbot’s response, presented to the tribunal, showed the bot was under the impression that they could. Its response directed the traveller to the refund claims process.
“Air Canada offers reduced bereavement fares if you need to travel because of an imminent death or a death in your immediate family … If you need to travel immediately or have already travelled and would like to submit your ticket for a reduced bereavement rate, kindly do so within 90 days of the date your ticket was issued by completing our Ticket Refund Application form.”
It failed to state that the airline’s policy, available elsewhere online, was not to provide refunds “for travel that has already happened”.
Air Canada’s claim that it was up to the passenger to verify the bot’s response was dismissed by the court, which ruled it was reasonable for the traveller to trust the information provided by the AI chatbot.
According to Forbes, the airline had previously offered the bereaved traveller a $200 flight voucher, which they refused.
The tribunal found the airline liable for “negligent misrepresentation” over the answer given by its website’s chatbot.
The court filings make worrying reading for airlines.
Tribunal member Christopher Rivers wrote that the carrier’s claim was a “remarkable submission”.
“Air Canada argues it cannot be held liable for information provided by one of its agents, servants, or representatives – including a chatbot. It does not explain why it believes that is the case. In effect, Air Canada suggests the chatbot is a separate legal entity that is responsible for its own actions.”
The tribunal ruled that information provided by a chatbot should be treated like any other information on an airline’s website.