Air Canada (TSX: AC) may be the first company to try to wiggle out of a refund by attempting to make its own chatbot liable for making a mistake. Nice try, but a small claims court said no.
Artificial intelligence chatbots, as many now know, are prone to “hallucinations”: returning false, misleading, or made-up information that they present as fact. That is what happened with Air Canada’s customer service chatbot when it explained the airline’s bereavement rates to a passenger.
Science fiction writers: The legal case for robot personhood will be made when a robot goes on trial for murder.
— Chris Farnell (@thebrainofchris) February 17, 2024
Reality: The legal case for robot personhood will be made when an airline wants to get out of paying a refund. https://t.co/aTGdErEr9g pic.twitter.com/4JiVLZRhq2
Jake Moffatt, who had just lost his grandmother, was told by the chatbot that he could book a flight from Vancouver to Toronto immediately and request a refund within 90 days. That is what Moffatt did, only to find out later that this was not the case: airline policy explicitly states that it does not provide retroactive refunds for bereavement travel.
Moffatt then reached out to the airline with screenshots of his conversation with the chatbot and asked for a refund. Instead of issuing one, the airline noted that the chatbot had linked to the actual bereavement travel policy page and argued that Moffatt should have relied on that instead.
Air Canada then offered a $200 coupon for a future flight and said it would update the chatbot. Moffatt refused and brought the matter to Canada’s Civil Resolution Tribunal.
In trying to wiggle out of liability and the resulting refund, the airline suggested in its defense that its chatbot “is a separate legal entity that is responsible for its own actions.” But the airline did not explain why it believed this to be the case. Note that chatbots rely on the information they are trained on.
“I find Air Canada did not take reasonable care to ensure its chatbot was accurate,” Christopher C. Rivers, the Tribunal Member, wrote in his decision. He found Moffatt “has made out their claim of negligent misrepresentation and is entitled to damages”: a partial refund of $650.88, plus Moffatt’s tribunal fees.
Air Canada, hopefully embarrassed by the mishap, did not contest the ruling.
Information for this story was found via Ars Technica, X, and the sources and companies mentioned. The author has no securities or affiliations related to the organizations discussed. Not a recommendation to buy or sell. Always do additional research and consult a professional before purchasing a security. The author holds no licenses.