Air Canada's argument that its AI-powered customer chatbot was solely liable for its own actions didn't hold up in civil court (thank goodness), and now the airline must refund a customer who was given incorrect information about being comped for his airfare.
The 2022 incident involved one Air Canada customer, Jake Moffatt, and the airline's chatbot, which Moffatt used to get information on how to qualify for bereavement fare for a last-minute trip to attend a funeral. The chatbot explained that Moffatt could retroactively apply for a refund of the difference between a regular ticket cost and a bereavement fare cost, as long as it was within 90 days of purchase.
SEE ALSO: Reddit has reportedly signed over its content to train AI models

But that's not the airline's policy at all. According to Air Canada's website:
Air Canada’s bereavement travel policy offers an option for our customers who need to travel because of the imminent death or death of an immediate family member. Please be aware that our Bereavement policy does not allow refunds for travel that has already happened.
When Air Canada refused to issue the reimbursement because of the misinformation mishap, Moffatt took the airline to court. Air Canada's argument against the refund included claims that it was not responsible for the "misleading words" of its chatbot. Air Canada also argued that the chatbot was a "separate legal entity" that should be held responsible for its own actions, claiming the airline is not responsible for information given by "agents, servants or representatives — including a chatbot." Whatever that means.
"While a chatbot has an interactive component, it is still just a part of Air Canada’s website," responded a Canadian tribunal member. "It should be obvious to Air Canada that it is responsible for all the information on its website. It makes no difference whether the information comes from a static page or a chatbot."
The first case of its kind, the Canadian decision may have down-the-road implications for other companies adding AI- or machine-learning-powered "agents" to their customer service offerings.