Jake Moffatt, a Canadian resident, needed to book a flight urgently in 2022 after a sad event: he had to make it to his grandmother’s funeral. The airline’s AI support chatbot, which helped Mr. Moffatt choose a flight, assured him that anyone traveling because of the loss of a loved one could receive compensation of 20% of the ticket price, even retroactively.

Air Canada does offer such a bereavement discount, but not retroactively: the reduced fare had to be arranged at the time the ticket was purchased. That is exactly what the company’s employees told the surprised customer when he approached them for the money. And when Jake Moffatt sent a screenshot of his conversation with the chatbot, a customer service representative admitted that the AI assistant’s advice had been “misleading” and assured him the company would take the issue into account.

Mr. Moffatt was not satisfied with this answer and sued Air Canada, demanding compensation of 440 Canadian dollars. It was likely less a matter of money than of principle. The Civil Resolution Tribunal sided with the customer: it dismissed as meaningless Air Canada’s argument that “it is not responsible for the information provided by its chatbots, agents and employees, and that the company’s chatbot is a separate legal entity responsible for its own actions,” and ordered the airline to pay Moffatt the money he had been fighting for over almost a year and a half.


Commentary