48 Advocacy Groups Call On the FTC To Ban Amazon Surveillance (vice.com) 32
An anonymous reader quotes a report from Motherboard: On Thursday, a coalition of 48 civil rights and advocacy groups organized by Athena asked the Federal Trade Commission to exercise its rulemaking authority by banning corporate facial surveillance technology, banning continuous corporate surveillance of public spaces, and protecting the public from data abuse. "The harms caused by this widespread, unregulated corporate surveillance pose a direct threat to the public at large, especially for Black and brown people most often criminalized using surveillance," the coalition wrote in an open letter. "Given these dangers, we're calling on the Federal Trade Commission (FTC) to use its rulemaking authority to ban corporate use of facial surveillance technology, ban continuous surveillance in places of public accommodation, and stop industry-wide data abuse."
While a number of firms offer networked surveillance devices to try and make homes "smart," the coalition uses Amazon as a case study into how dangerous corporate surveillance can become (and the sorts of abuses that can emerge) when in the hands of a dominant and anti-competitive firm. From Amazon's Ring -- which has rolled out networked surveillance doorbells and car cameras that continuously surveil public and private spaces -- to Alexa, Echo, or Sidewalk, the company has launched numerous products and services to try and convince consumers to generate as much data as possible for the company to eventually capitalize on. "Pervasive surveillance entrenches Amazon's monopoly. The corporation's unprecedented data collection feeds development of new and existing artificial intelligence products, further entrenching and enhancing its monopoly power," the coalition letter argues.
From this nexus of monopolistic and unchallenged power, the coalition draws a long list of abuses committed by Amazon that have harmed consumers, communities, and total bystanders. Ring's surveillance devices have been hacked multiple times, have leaked owners' Wi-Fi passwords, and shared locations over the Neighbors App. Vulnerabilities in Alexa risked revealing personally identifiable information, and all this takes place within the context of a lack of transparency around security protocols that force consumers to opt out of surveillance conducted without their consent. On Ring's Neighbors App, racial profiling has been gamified to encourage and escalate surveillance of "suspicious" people. The company collects personal information on children -- a potential violation of the Children's Online Privacy Protection Act -- but has also seen the adoption of its various surveillance devices increase in schools, libraries, and communities across the country. Paired with Amazon's development of deeply biased facial surveillance technology and its partnerships with the police and fire departments of over 2,000 cities, the group argues the potential for abuse outstrips a threshold anyone should be comfortable with. "This type of surveillance is illegal under the FTC Act in Section 5 and in particular the section that talks about unfair and deceptive practices," said Jane Chung, the Big Tech Accountability Advocate at Public Citizen, in an interview. "There's a list of three things that have to be true in order for a practice to be unfair and deceptive according to the FTC. Number 1: it has to cause substantial injury. Number 2: the injury can't be avoidable. And number 3: the injury isn't outweighed by benefits."
"Rulemaking is needed to stop widespread systematic surveillance, discrimination, lax security, tracking of individuals, and the sharing of data. While Amazon's smart home ecosystem, facial surveillance technology, and e-learning devices provide a good case study, these rules must extend beyond this one technology corporation to include any entity collecting, using, selling, and/or sharing personal data."
Re:It's just a smart speaker, gramps, chill... (Score:5, Insightful)
Re: (Score:2)
Re: It's just a smart speaker, gramps, chill... (Score:2)
Needs proper regulation, ban is too extreme (Score:4, Insightful)
Banning is too extreme, but there should be regulations about this. For example, I never consented to 24/7 surveillance of my front door or open windows. Many of these devices are small enough that it would be impossible to even know if you're being watched.
OK, I'm a boring person with no mistresses or vices. So are my neighbors. We have nothing to hide. However, are you comfortable with Amazon having access to surveillance information and selling it to 3rd party marketing/scam firms?
"Hey, we noticed you're walking with a slight limp...have you tried Dr Scholl's orthotic inserts?"
"Hey, we noticed your light is on at 2AM, would you like some melatonin?"
"Hey, we noticed your chair shaking from the window at 10pm every night...pornhub is having a sale on premium, this weekend only!"
"We noticed you're driving a Toyota, Joe Douchebag's Honda is having a President's day sale."
"We've overheard you speaking Spanish, do you need an immigration attorney?"
I don't want Amazon selling my data without my consent because my neighbor installed a Ring doorbell or surveillance cam for his own reasons. I want strong regulations on what law enforcement and commercial entities can do with the footage. I commit no crimes. However, I also don't trust the accuracy of these devices, especially for 3rd party services. Imagine the services that could use this data:
What if a shifty facial recognition company buys the footage to stalk people? LOTS of people would pay good money for a service that catches their partners in the act of infidelity. What if it mistakes me for someone else?...and this other person is entering a woman's house when her husband isn't there? Now this service is telling her husband that I entered his house while he was on a business trip. Yeah, for law enforcement, I could get the case dismissed. However, tell that to a jealous husband who shows up on your doorstep and demands answers in front of your wife and kids...because some service said that I am the guy banging his wife. There's no regulation to forbid this. Any laws broken would require an expensive court case I cannot afford. What if an insurance company uses this info to invalidate insurance claims?...and a mistake was made? Well, that's illegal, but you have to go to court to get it sorted out. Even when the law is on your side, you can be bankrupted getting justice.
Sorry, you're using the old way of thinking. This is a new world and a new era. In the old days, we could be lax because footage was stored on video tape and only used for its intended purpose. Now it's uploaded to the cloud and monetized for reasons we could never have fathomed, which neither the camera owner nor his neighbors consented to nor understood. When used for the intended purpose, I am generally fine with it. It's fine to use a Ring doorbell to catch package thieves. It's a huge issue to sell the footage for marketing or other services. It's also a huge issue if people use this cheap, ubiquitous surveillance for blackmail or data harvesting without the consent of those being observed.
Re: Needs proper regulation, ban is too extreme (Score:2)
Re: (Score:2)
Seriously, what the fuck is happening to America where everyone feels every view they have should be enacted into law and forced on everyone else?
You do not have an expectation of privacy
So we have to abide by your definition of what the "true" expectation of privacy is?
Oh, wait, you said 'You do not have an expectation of privacy legally speaking.' I.e., we must abide by the definition imposed by the government.
fuck people who believe their views should be enforced by governmental fiat.
Emmm..
Do not come for my rights.
No need. You've already relinquished them, it appears.
Re: Needs proper regulation, ban is too extreme (Score:2)
Re: (Score:2)
the constitution of the United States is not the federal government
This is very true. But my copy of the Constitution must be missing the "Expectation of Privacy" clause somehow. Where in your copy is it defined?
Your thinking is dated (Score:2)
Take the grindr priest example. He thinks he's usin
Re: (Score:2)
"I thought I saw you going into an apartment with some lady. But that lady wasn't your wife!"
Sure, a computer might say that in 2021, but some human might have said it in 1921. So WTF does Amazon have to do with this?
scale + a service is not a random human (Score:3)
Remember 15 years ago when purchase verification sucked?...and your CC was declined for all sorts of stupi
Re: (Score:2)
Seriously, what the fuck is happening to America where everyone feels every view they have should be enacted into law and forced on everyone else?
Authoritarianism is popular. Both parties want to force their ways on you.
Wait, what? (Score:3, Insightful)
One would suspect that the behavior also has to be unfair and deceptive in the first place, yet TFS and TFA are awfully short on detail about why this kind of surveillance is deceptive or an unfair business practice.
A lot of things check those three boxes. Shooting someone during a robbery is an example. Given that the FTC does not prosecute that kind of robbery, I suspect the law requires something more than just that.
Re: (Score:2)
A lot of things check those three boxes. Shooting someone during a robbery is an example.
It seems like what they have is 48 advocacy groups whose level of paranoia and concern regarding privacy is 1000-fold that of consumers, so those organizations ignore the not insubstantial benefits of surveillance and overvalue the extent of injury.
The major barrier they have to cross is that the FTC has to meet a standard of proof before it can prohibit a practice: the FTC is disempowered from banning practice
Re: (Score:2)
Note that the unfair-or-deceptive statute does not even apply to all unfair or deceptive methods, acts or practices in commerce -- it only applies to commerce between states, to import commerce, and to export commerce when an unfair or deceptive method (but not act or practice!) has "a direct, substantial, and reasonably foreseeable effect" on someone else in the US engaging in export commerce.
The only bit of the letter that even comes close to addressing the unfairness or deception is this part:
Re: (Score:2)
The thing is, it's the benefit to "me" versus the injury to "we".
Surveillance cameras can benefit "me", but they can injure "we" if done wrong. I mean, why do we fear Facebook selling cameras? Maybe Facebook can sell cam
Simple: Accuracy and Ubiquity (Score:2)
On what grounds? Why is it legal for a security guard, a secretary, and any other employee of the corporation to recognize you — and testify about it in court, if need be — but not OK for a machine?
A human being has MUCH better vision than a device. They also have a legal obligation to tell the truth and accurately identify you, under penalty of law. Also, remember a device is fixed in location. A security guard can move around to get a better look if he's unsure who's committing the crime. So accuracy is a huge reason. We can also question our accusers as can the police.
The other factor is what if the info is wrong? The facial recognition is shitty...what if it flags you as a rapist? The po
Great cases make bad law (Score:2)
> the coalition uses Amazon as a case study into how dangerous corporate surveillance can become (and the sorts of abuses that can emerge)
And that's where we may have a problem. It's been said "great cases make bad law". They are using Amazon as the example, getting people to think "we don't want Amazon to do this on a huge scale", in an effort to make it illegal for anyone to do it on any scale.
You don't want Amazon putting cameras all over the place in public places? I agree with you. Therefore make
Re: (Score:2)
CCTV that records, keeps the footage for a week or two, never analyses or data-mines it, runs no facial or gait recognition on it, and then deletes it: that is not a problem.
The problem is when big corporations with an interest in tracking every aspect of people's lives start recording every last thing they possibly can, figuring out who's who, determining what they are doing with AI, and then storing the data in perpetuity. That is a big brother society that George O
Re: (Score:2)
On what grounds? Why is it legal for a security guard, a secretary, and any other employee of the corporation to recognize you — and testify about it in court, if need be — but not OK for a machine?
Bzz, what makes it "corporate" — a term that is clearly a dirty word for these "advocacy groups"? People install these devices willingly — and legally.
In a few years it's possible that you could have zero business relationship with Amazon, yet Amazon could have an almost complete record of your daily movements outside of your residence.
I don't know the full consequences of that state of the world, but it's not a world I want to walk into without some second thought.
What is most surprising (Score:3)
Location tracking without consent (Score:2)
Thanks to Ring doorbells and facial recognition, Amazon can see wherever you are...and I don't care much about Amazon, but I am worried about people buying the data from them. Some guy gets turned down by a woman and uses the data to stalk her...like that newspaper did to that grindr priest they reported about last week. At least the priest installed the app and can opt out, if not in the app, by disabling location services, uninstalling the app, or shutting down his phone. You can't do that with your face
Don't buy Amazon spy junk (Score:1)
Let's all hope they lose (Score:2)
And then right after that, they list a bunch of reasons why you shouldn't buy Amazon's shit. I didn't see a single example to persuade me that it should become illegal for you (even if aided by your computer) to pay attention to your own surroundings, or to ask for someone else's help (even if the "someone else" is incompetent or evil) in doing that.
Hate Amazon if you must, but hating
Re: (Score:2)
Not that they can prove there is an injury, that said injury would be unavoidable, or that it would be greater than the benefits. They list exactly one demonstrable type of injury, but
Those are some awful arguments that should fail. (Score:3)
What are the other arguments? Well, the one beginning with "the coalition uses Amazon as a case study into how dangerous corporate surveillance can become (and the sorts of abuses that can emerge) when in the hands of a dominant and anti-competitive firm" is also garbage. Why? For one, they aren't talking about the technology; they're criticizing the behavior of a company. That's like saying we have to get rid of all cars because someone drove drunk and crashed into a building. It's another garbage argument that doesn't actually support what they want.
And what exactly do they want again? Is it even what they're asking for? Let's look at that first quote again - "The harms caused by this widespread, unregulated corporate surveillance pose a direct threat to the public at large". Well, they don't actually demonstrate the claim that the technology causes any harm, just that Amazon's Ring is very successful despite having had some problems. "Pervasive surveillance entrenches Amazon's monopoly. The corporation's unprecedented data collection feeds development of new and existing artificial intelligence products, further entrenching and enhancing its monopoly power" - I wasn't aware that Amazon was the sole player in the market. Probably because they are not: Ring is just one piece of a larger market where Amazon currently has the most popular product. And that they use a popular product to leverage further R&D is hardly a bad thing the FTC should prevent. Much the opposite.
The biggest and clearest problem with their argument is again in that first quote. What happens if we accept every element of the argument - "unregulated corporate surveillance" causes black and brown people to become criminals, makes Amazon the only company in the world, and has them spending all day spying on everyone? Well, are they demanding a solution to the stated problem?
No, they aren't. They identify "unregulated corporate surveillance" as the problem, but their demand is for "the Federal Trade Commission (FTC) to use its rulemaking authority to ban corporate use of facial surveillance technology, ban continuous surveillance in places of public accommodation, and stop industry-wide data abuse." The argument and the demand do not match. According to the argument, the problem is a lack of regulation, so the demand should be the creation of regulations to prevent abuse, right? According to their demand, the technology is inherently bad and must be banned. That's an insincere argument, and it should therefore be ignored.
That someone else is quoted separately asking for what the core argument actually calls for is a good sign that someone involved isn't dishonest or stupid, but it does contradict what the "coalition" is seeking. This contradiction is stark and direct - if the FTC (or Congress, where that authority actually rests) can impose regulations to prevent harmful misuse, then clearly the technology does not present unavoidable injury as required by the 2nd prong of the FTC's test.