“Air Canada offers reduced bereavement fares if you need to travel because of an imminent death or a death in your immediate family … If you need to travel immediately or have already travelled and would like to submit your ticket for a reduced bereavement rate, kindly do so within 90 days of the date your ticket was issued by completing our Ticket Refund Application form.”
It did not mention that the airline's policy, available elsewhere on its website, was not to provide refunds "for travel that has already happened".
The tribunal dismissed Air Canada's claim that it was up to the passenger to verify the chatbot's response, ruling that it was reasonable for the passenger to rely on the information the chatbot provided.
According to Forbes, the airline had previously offered the bereaved traveller a $200 flight voucher, which the traveller refused.
The tribunal found the airline liable for "negligent misrepresentation" arising from the chatbot's responses on its website.
The tribunal's written decision makes worrying reading for airlines.
Tribunal member Christopher Rivers wrote that the carrier’s claim was a “remarkable submission”.
“Air Canada argues it cannot be held liable for information provided by one of its agents, servants, or representatives – including a chatbot. It does not explain why it believes that is the case. In effect, Air Canada suggests the chatbot is a separate legal entity that is responsible for its own actions.”
The tribunal ruled that information provided by a chatbot should be treated no differently from any other information on an airline's website.