Chatbot Chaos: Air Canada Ordered To Refund Grieving Customer After AI Blunder
Following months of legal proceedings, Canada's Civil Resolution Tribunal ordered Air Canada to refund a customer who had received inaccurate refund information from an AI chatbot on the company's website.
Air Canada had contended that the chatbot was a distinct legal entity, absolving the company of responsibility for its actions.
Jake Moffatt, the petitioner, visited Air Canada's website after his grandmother's death, seeking clarification from the chatbot on bereavement fares.
However, the chatbot provided misleading information, instructing him to book a flight and then request a refund within 90 days, which was not the correct procedure.
Air Canada offers a bereavement travel option for customers who need to travel due to the imminent or recent death of an immediate family member, but under this policy refunds are not permitted for travel that has already been completed.
Moffatt filed a complaint with Canada's Civil Resolution Tribunal after the airline refused to issue a refund, arguing instead that the chatbot was a separate legal entity responsible for its own actions.
The tribunal ultimately ruled: "It should be obvious to Air Canada that it is responsible for all the information on its website. It makes no difference whether the information comes from a static page or a chatbot."