this post was submitted on 17 Feb 2024
48 points (100.0% liked)

TechTakes

[–] mozz@mbin.grits.dev 21 points 9 months ago (1 children)

This entire article is a treasure trove.

According to Air Canada, Moffatt never should have trusted the chatbot and the airline should not be liable for the chatbot's misleading information because Air Canada essentially argued that "the chatbot is a separate legal entity that is responsible for its own actions," a court order said.

Tribunal member Christopher Rivers, who decided the case in favor of Moffatt, called Air Canada's defense "remarkable." ... Rivers found that Moffatt had "no reason" to believe that one part of Air Canada's website would be accurate and another would not.

Last March, Air Canada's chief information officer Mel Crocker told the Globe and Mail that the airline had launched the chatbot as an AI "experiment." ... Over time, Crocker said, Air Canada hoped the chatbot would "gain the ability to resolve even more complex customer service issues," with the airline's ultimate goal to automate every service that did not require a "human touch."

Experts told the Vancouver Sun that Air Canada may have succeeded in avoiding liability in Moffatt's case if its chatbot had warned customers that the information that the chatbot provided may not be accurate.

[–] froztbyte@awful.systems 14 points 9 months ago (1 children)

As an experiment, it certainly produced some findings.

[–] Evinceo@awful.systems 7 points 9 months ago

Least annoying A/B test.