
  • Air Canada found responsible for chatbot error

    I predict this’ll be the first of many such cases:

    Air Canada has been ordered to compensate a man because its chatbot gave him inaccurate information. […] “I find Air Canada did not take reasonable care to ensure its chatbot was accurate,” [Civil Resolution Tribunal] member Christopher C. Rivers wrote, awarding $650.88 in damages for negligent misrepresentation. “Negligent misrepresentation can arise when a seller does not exercise reasonable care to ensure its representations are accurate and not misleading,” the decision explains.

    Jake Moffatt was booking a flight to Toronto and asked the bot about the airline’s bereavement rates – reduced fares provided in the event someone needs to travel due to the death of an immediate family member. Moffatt said he was told that these fares could be claimed retroactively by completing a refund application within 90 days of the date the ticket was issued, and submitted a screenshot of his conversation with the bot as evidence supporting this claim. He submitted his request, accompanied by his grandmother’s death certificate, in November of 2022 – less than a week after he purchased his ticket. But his application was denied […] The airline refused the refund because it said its policy was that bereavement fare could not, in fact, be claimed retroactively. […]

    “In effect, Air Canada suggests the chatbot is a separate legal entity that is responsible for its own actions. This is a remarkable submission. While a chatbot has an interactive component, it is still just a part of Air Canada’s website,” Rivers wrote.
    There’s no indication here that this was an LLM, but we know that LLMs routinely confabulate and make shit up with spurious authority. This is going to be a lucrative seam in the small claims courts.

    (tags: ai fail chatbots air-canada support small-claims chat)