Overview
In 2022, a passenger asked Air Canada’s website chatbot about bereavement fares. The chatbot told him he could buy a full‑fare ticket to attend his grandmother’s funeral and claim a bereavement discount after travel, even though the airline’s policy requires bereavement fares to be requested before travel. When Air Canada denied the discount, he took the airline to the British Columbia Civil Resolution Tribunal, which ruled in February 2024 that Air Canada was liable for its chatbot’s misinformation and ordered it to pay about CA$812 (roughly US$600). The case set a precedent that companies must stand behind the statements of their AI agents.
What Went Wrong
Air Canada’s chatbot had not been updated with the correct bereavement fare policy and confidently provided incorrect policy information that contradicted the airline’s own published terms. The company argued that the chatbot was a "separate legal entity" responsible for its own actions, but the tribunal rejected that argument, holding Air Canada accountable for all information presented on its website, whether it came from a static page or a chatbot.
How It Was Fixed
Following the ruling, Air Canada updated its website and chat systems to ensure that bereavement fare information is accurate and consistent with its published policy. The incident underscored the need for human oversight of customer‑facing AI, for keeping chatbot knowledge synchronized with authoritative policy sources, and for companies to recognize that disclaiming responsibility for an AI agent’s answers may not hold up legally.