Air Canada Found Liable For Chatbot's Bad Advice On Plane Tickets
An anonymous reader quotes a report from CBC.ca: Air Canada has been ordered to pay compensation to a grieving grandchild who claimed they were misled into purchasing full-price flight tickets by an ill-informed chatbot. In an argument that appeared to flabbergast a small claims adjudicator in British Columbia, the airline attempted to distance itself from its own chatbot's bad advice by claiming the online tool was "a separate legal entity that is responsible for its own actions."
"This is a remarkable submission," Civil Resolution Tribunal (CRT) member Christopher Rivers wrote. "While a chatbot has an interactive component, it is still just a part of Air Canada's website. It should be obvious to Air Canada that it is responsible for all the information on its website. It makes no difference whether the information comes from a static page or a chatbot." In a decision released this week, Rivers ordered Air Canada to pay Jake Moffatt $812 to cover the difference between the airline's bereavement rates and the $1,630.36 they paid for full-price tickets to and from Toronto bought after their grandmother died.
Attorney fees (Score:3)
Wonder how much Air Canada paid in attorney fees to avoid an $812 refund?
Re:Attorney fees (Score:5, Informative)
Wonder how much Air Canada paid in attorney fees to avoid an $812 refund?
That's not the point. They didn't want to set a precedent for all the bad advice the chatbot gives in the future.
Re:Attorney fees (Score:5, Insightful)
That's not the point. They didn't want to set a precedent for all the profitable advice the chatbot gives in the future.
Re: (Score:3)
Nor for the effects of going cheap on labor. They want to hire the cheapest idiot in the room (an AI) rather than pay properly for an employee who would know better.
Re: (Score:2)
If you don't want to set a precedent, going to court is the worst thing you could do. This wasn't quite a court, so the precedent isn't binding, but it's a lot closer than giving someone you just ripped off the bereavement rate and apologizing.
Re: (Score:3, Insightful)
That's not the point. They didn't want to set a precedent for all the bad advice the chatbot gives in the future.
They also could have done that by refunding the difference before getting sued.
Not everything is a Law & Order episode. (Score:3)
Whatever legal precedent is attached to a small claims court ruling is so slight as to effectively be non-existent for all practical purposes.
Companies uniformly fight all small claims cases without so much as glancing at the specifics beforehand, because the individual cases aren't worth the corporate time it'd take to come to individual go/no-go decisions on fighting each one, and not fighting small claims at all is more expensive (both in the form of the default judgements and in the form of reputation fallout).
Re: (Score:2)
Small claims, yet here we are reading about it on /.
Now the whole world has seen their frankly idiotic argument that I would have been able to shoot down at age 13 just from idly reading my dad's law books.
Re: (Score:2)
You've got that right. No big company is willing to admit wrongdoing, even when they did it.
Like the two years, and 33% of the money to the lawyer, I spent to get 98% of my late wife's life insurance, when *they* had outsourced processing of benefits documents, and two years in a row screwed it up. And they paid for an outside lawyer all that time, so you *know* they paid much more than they would have if they'd just said "oops".
Re: (Score:2)
They probably know of some real howlers, and don't want the courts to find out about them.
Re:Attorney fees (Score:5, Informative)
Air Canada is a shitty company that will do anything it can to avoid compensating customers, including trying to persuade customers to accept much less [www.cbc.ca] or taking them to court [nationalpost.com] after a CTA ruling.
Re:Attorney fees (Score:5, Informative)
Wonder how much Air Canada paid in attorney fees to avoid an $812 refund?
From TFCourt-Document:
5. Mr. Moffat is self-represented. Air Canada is represented by an employee.
This was a small claims court, and reading through the finding, it looks like they just sent some middle manager armed with a boilerplate response, which most large companies have on hand for dealing with small claims cases. Boilerplate responses are mostly intended to be an auto-fire response that will suffice for 80% of the complete bullshit "you put ketchup on my Big Mac when I asked for no ketchup, I demand $1,000 restitution for emotional trauma" claims that come up, while not being so all-encompassing as to become tl;dr for the small claims judges. The middle manager is mostly there as a warm body and to answer any questions the judge might ask - being that this is small claims, the judge's questions are plain English sort of stuff, basically Judge Judy without the sass.
Unfortunately for Air Canada, their boilerplate response is evidently somewhat out of date, and geared more towards situations where a customer gets crappy/unclear advice from a live agent:
27. Air Canada argues it cannot be held liable for information provided by one of its agents, servants, or representatives – including a chatbot. It does not explain why it believes that is the case. In effect, Air Canada suggests the chatbot is a separate legal entity that is responsible for its own actions.
So, their thoughtless boilerplate creates an incongruity in this particular situation, which the judge had a bit of fun with. It's abundantly clear that Air Canada wasn't really consciously making the argument, it was just a byproduct of their lazy approach to small claims cases.
Re: (Score:1)
Someone got $1000 for a bad burger? Fuck, I only got $970....
Re: (Score:3)
To be fair, from my understanding of small claims court, the company treating it too seriously would be contraindicated.
1. A company sending an actual lawyer is frowned upon, it's viewed as unfair, when the whole intent is avoiding the cost of lawyers. It's one thing if, say, you're suing your neighbor and he happens to be a lawyer, or if you're suing Dewey, Cheetum, and Howe, and Howe happened to be the one involved in the event you're suing over, but if you're, say, an apartment rental company and you
Re: (Score:2)
Correct, as alluded to by "while not being so all-encompassing as to become tl;dr for the small claims judges".
Re: (Score:2, Insightful)
I almost ejected from my toilet because my ass made a seal and the internal air pressure built up.
Looks like this was not so much due to there being beans on the burrito, but more about its overall size, and the size of all the burritos before it: a lean person's ass makes no seal with a standard bowl.
That and "do you always shit with the seat up?" or the "beans" would simply have vented through the gap between seat and bowl.
Re: Attorney fees (Score:2)
People pay good money for a colon cleanse. I think you got a fantastic deal.
Re: (Score:2)
Eh, it's slashdot, I can't edit my posts, the meaning got across, and I can't see grammar errors without walking away for a while because otherwise my brain just papers over them, so it'd take too long. So the occasional mistake slips through.
Re: (Score:2)
Eh, your reasoning seems to hold water, but I can't help but remember all the times people posting here have made basically the same exact argument in regards to chatbots violating copyright laws. This was even happening here just last night. Strangely those guys are all oddly silent on this one...
Re: (Score:2)
The problem is, they ARE legally responsible for what their employees say and do as part of their employment and they should know that.
Otherwise they could go into a bar next to an AA meeting, hire the first drunk guy who says he can fly a plane and then disclaim responsibility for the inevitable crash just by saying that's all on the guy that crashed the plane.
Re: (Score:2)
Probably zero. For small claims as well as risk management and compliance reasons companies normally have legal teams on staff.
Re: (Score:2)
A party cannot be represented by a lawyer in small claims court. So they may have paid a bit for advice, but probably nothing exorbitant.
Re: Attorney fees (Score:2)
The chatbot is a separate legal entity and Air Canada should sue it to recoup the attorney fees.
Good (Score:5, Insightful)
They deserve that. And it is another nail in the coffin of the myth that "AI" is more than just software running on a computer. No idea why all these morons think AI is somehow an "entity". It clearly is not. Now, AGI may be a different question, but nobody (besides the physicalist fanatics) knows whether that is even possible to build.
Re:Good (Score:4, Insightful)
No idea why all these morons think AI is somehow an "entity". It clearly is not.
And even if it was sentient, it was clearly working for the airline. So the airline should assume responsibility for its bad advice, the same way they would (should) if one of their live agents gave bad advice.
Re: (Score:2)
Good point.
Re: (Score:2)
I wouldn't rate this as a nail, given that it was a small claims court. More like a staple. A cheap lightweight one.
Re: (Score:2)
With _that_ reporting? Surely you jest.
Re: (Score:2)
It depends what you mean by "possible". There is certainly no physical law preventing construction of AGI; humans are proof of that. Whether humans will actually succeed in constructing one is a different question. To me it seems inevitable. Either AI will continue improving indefinitely, or it will stop improving at some point short of AGI, and I cannot see any reason why the latter would occur.
Re: (Score:2)
You are just a physicalist fanatic and do not understand Science. Humans are not proof of anything, because it is not understood how a human mind works. _That_ is the actual Science here.
Re: (Score:2)
it is not understood how a human mind works.
As long as it's agreed that humans are a form of natural general intelligence, then it doesn't matter. They're proof that general intelligence is possible, and from a physics perspective, there's no difference between natural and artificial. We may end up creating AGI without ever proving how a mind works.
Re: (Score:2)
That is not understood until we know how it works. (I will gloss over your use of "natural" in an undefined, dishonest and manipulative way.) Learn how Science actually works. Arguments by elimination (which you essentially just used in the form of "What else could it be?") only works if you have a complete and totally accurate description of the system you are arguing about. We do not have that for physical reality. Stop making the same stupid mistakes the religious fuckups are using to "prove" the existen
Re: (Score:1)
> myth that "AI" is more than just software running on a computer. No idea why all these morons think AI is somehow an "entity".
I've worked with people with whom I question their sentience status. They seem mechanically driven by a few simple motivations, and only "reason" via simplistic canned slogans they've picked up at troll-sites. Vapid bastards and bastarettes.
Re: (Score:2)
I do not doubt that. The thing is however, humans get sentience status per default, because anything else has been abused far too much in the past.
That the average person is basically only in possession of miniscule amounts of general intelligence and capability for insight is also not in dispute. And then you have those less capable than average. Hence the average person basically understands almost nothing and many do understand really nothing. I remember a number that says that only roughly 20% of all pe
Who is "their?" (Score:1, Troll)
You mean Jake? Are we at the point where an obviously male name should be referred to with an ambiguous pronoun? Jake is a dude's name, unless otherwise specified. Is Jake multiple people? Otherwise, it is *incorrect* to use "their" to refer to Jake here.
Re: Who is "their?" (Score:2)
I can assure you, the only people who truly care about such things, are people who think some invisible (but tasty) dictator in space is calling all the shots.
Re: (Score:2)
I don't think the FSM gives a shit. Do you mean Jesus? Because I've eaten a chunk myself and it was unpleasantly dry. The blood was okay, but nothing special.
Re:Who is "their?" (Score:5, Informative)
There's a long history of using singular "they" when the gender of the person being discussed is in doubt or can be variable (e.g., when a person pays for their groceries).
Re: (Score:3)
It's even more fundamental, "they" is plural
no [oed.com]
Re: (Score:2, Troll)
In short: yes, we're beyond that point, and this thing is out of control. Jake is "they" until otherwise explicitly specified. I've just received an email from Greg, who has to note in a different color in his signature that he's "he/him". But by default he would be "they". What to do when you want to use the "real"/plural "they"? There's no more "they helped me" when it's multiple people; you either enumerate the persons you mean or you say "Team X helped" or similar!
Obligatory Larry David: https://www.youtube.com/w [youtube.com]
Re: (Score:2)
What's the matter, snowflake?
Do you address Nikki Haley by her given pronouns? Her birth name is Nimarata Nikki Randhawa.
Re: (Score:3)
English lacks a specific common-gender third-person singular pronoun and there are examples of "their", "they", "them" and "themselves" being used as singular third-person pronouns going back hundreds of years. It is certainly not incorrect.
Re: (Score:1)
Someone knocks at the door. You ask "Who is it?". Someone calls. You're told the phone is for you. You ask "Who is it?"
You do not think the entity knocking or calling is a dog, a chair, or an alien from outer space. You use 'it' to mean a 'human of unknown sex, age, defined characteristics'. "It" is the non-gender pronoun.
All three words used to have an apostrophe, but it disappeared over time, and to reduce confusion:
He's -> Hes - His -> male possessive
Her's -> Hers - Hers -> female
It's
Re: (Score:2)
"It" is not used to refer to a person because it has the connotation of referring to an inanimate object or, if an animate object, at most a non-human animal. Hence the singular "they" was co-opted to refer to a human in a non-gender-specific way.
Language is not logical. If you don't like singular "they", you are free to file your complaint in triplicate to the Board that Oversees English.
Re: (Score:2)
Do you need a therapy dog to help with your triggered state?
Re: (Score:2)
Your assumption is wrong. Gender can be known and it is still fine to use "their" as a pronoun.
From The Comedy of Errors, Shakespeare, first published in 1623 - Act 4 Scene 3:
There’s not a man I meet but doth salute me
As if I were their well-acquainted friend,
So feel free to claim that I'm encouraging "pronoun dog whistle games". I don't care. You are wrong, and your foghorn of false indignation is far louder.
First claim in court of chatbot personhood? (Score:5, Interesting)
Re:First claim in court of chatbot personhood? (Score:5, Interesting)
What if the airline hired a human guide who lied to customers? I don't know Canadian law, but it seems the airline would likely still be responsible.
Otherwise, the bot could lie about prices and nobody could be punished in practice. There'd be no incentive against activating a lie-bot.
"It's Bender's fault, we didn't know he was a liar when we hired him."
Re: (Score:3)
Indeed. You are responsible for the actions of your representative. In this case, a refund without any excessive damages.
However! If you *know* your representative is acting incorrectly, and do nothing, and the behaviour continues, then it can move into punitive damages! If this chatbot is on their website, and they have not taken steps to quickly correct the situation, they could be in line for 10x or more damages.
Re: (Score:1)
So let's see if I'm interpreting this correctly. A company is only liable for the actual customer losses of a rogue employee (representative), UNLESS they A) didn't do sufficient background checks before hiring sensitive positions*, and/or B) ignored warning signs during employment, and/or C) encouraged the employee to lie/misbehave.
If A, B, and/or C, the company is also subject to punitive damages, not just customer compensation costs.
* Probably wouldn't be at issue here.
Re: (Score:2)
They didn't claim that the chatbot is a person. They claimed that they weren't responsible for what their agents say. The judge then inferred that that meant they were calling the chatbot a person.
It's a fair inference under the circumstances, but they didn't actually say it outright.
Re: (Score:2)
They claimed that they weren't responsible for what their agents say.
The only legal rationale for this claim is that agents are individual legal entities. Since Air Canada failed to offer any legal rationale for why they cannot be held responsible, the only possible argument they are making is that "the chatbot is a separate legal entity that is responsible for its own actions." There are no other logical alternatives.
The judge then inferred that that meant they were calling the chatbot a person.
Just because they didn't say it didn't mean they didn't imply it. Likewise, if you label someone a "bastard child" then you are implying their mother became pre
Interesting (Score:2)
Lawyers are paid to try it on (Score:3)
In this case, trying to establish the idea that the AI is not the responsibility of the site owner. They got smacked down, but don't be surprised they tried it...
Re: (Score:2)
Lawyers are generally forbidden in small claims court.
Who else will a company send to a court? (Score:2)
If it is defending a claim, then it's got to have someone to represent it. Inevitably this will be a lawyer!
Be careful what you wish for Air Canada (Score:3)
Re:Be careful what you wish for Air Canada (Score:4, Insightful)
Governments and courts don't allow private citizens to take advantage of those kinds of shenanigans, only corporations.
No wonder companies are excited about AI (Score:2)
Good! Now stop being lazy and hire staff. (Score:2)
Remember that truck which got sold for $