Lawsuits Claim Amazon's Alexa Voice Assistant Illegally Records Children Without Consent (seattletimes.com) 91

An anonymous reader quotes a report from The Seattle Times: A lawsuit filed in Seattle alleges Amazon is recording children who use its Alexa devices without their consent, in violation of laws governing recordings in at least eight states, including Washington. "Alexa routinely records and voiceprints millions of children without their consent or the consent of their parents," according to a complaint filed on behalf of a 10-year-old Massachusetts girl on Tuesday in federal court in Seattle. Another nearly identical suit was filed the same day in California Superior Court in Los Angeles, on behalf of an 8-year-old boy. The federal complaint, which seeks class-action status, describes Amazon's practice of saving "a permanent recording of the user's voice" and contrasts that with other makers of voice-controlled computing devices that delete recordings after storing them for a short time or not at all.

The complaint notes that Alexa devices record and transmit any speech captured after a "wake word" activates the device, regardless of the speaker and whether that person purchased the device or installed the associated app. It says the Alexa system is capable of identifying individual speakers based on their voices and Amazon could choose to inform users who had not previously consented that they were being recorded and ask for consent. It could also deactivate permanent recording for users who had not consented. "But Alexa does not do this," the lawsuit claims. "At no point does Amazon warn unregistered users that it is creating persistent voice recordings of their Alexa interactions, let alone obtain their consent to do so."
The lawsuit goes on to say that Amazon's failure to obtain consent violates the laws of Florida, Illinois, Michigan, Maryland, Massachusetts, New Hampshire, Pennsylvania and Washington, which require consent of all parties to a recording, regardless of age.

"The proposed class only includes minors in those states "who have used Alexa in their home and have therefore been recorded by Amazon, without consent,'" reports The Seattle Times. "The suit asks a judge to certify the class action and rule that Amazon violated state laws; require it to delete all recordings of class members; and prevent further recording without prior consent. It seeks damages to be determined at trial."
Comments Filter:
  • The Future (Score:5, Funny)

    by SuperKendall ( 25149 ) on Thursday June 13, 2019 @04:59PM (#58757766)

    "Hey network connected speaker assistant..."

    DO YOU CONSENT TO BE RECORDED FOR ALL TIME EVEN IF YOU ARE UNAWARE OR DRUNK OR YOU JUST GOT OUT OF THE SHOWER AND YOUR EARS ARE ALL PLUGGED UP AND YOU DIDN'T HEAR ME SAYING I WAS ACTIVE?

    "Em, I guess, what is the current temperature?"

    72 degrees and sunny, BUT THIS IS BY NO MEANS AN ENDORSEMENT THAT YOU LEAVE THE SAFETY OF YOUR DWELLING, OR THAT YOU STAY IN YOUR DWELLING, WHICH COULD BE FILLED WITH EXPLOSIVE GASES I CANNOT SENSE.

    • That would be really boring to listen to
  • by Anonymous Coward on Thursday June 13, 2019 @05:01PM (#58757780)

    When you bought your Amazon Echo and installed it in your house next to your children, were you not implicitly consenting to having Alexa record utterances from said children? Don't buy an Echo and don't install it anywhere near your children if you want to keep Alexa from recording them. Duh. I hope the court throws this out as fast as possible. No rewards for stupid people.

    • by TWX ( 665546 )

      What if your child has friends that come over?

      Oh wait, I forgot, your child doesn't have any friends. Sorry.

      • by Anonymous Coward

        Guess what, I have video cameras around my house too. That kid will be on my video cameras too. When my kid goes to the local convenience store they are on cameras and their voice is recorded too. As OP said, this is just another frivolous lawsuit that appeals to tin foil nutters but will mostly benefit the lawyers.

        • Speaking of which... if two underage kids started doing something intimate in a public space in front of a store, and said store is one of those chains with high-resolution security cameras and a corporate policy of never erasing its footage (nothing extraordinary), would the whole corporation already have broken multiple child pr0n-related laws that could lead to federal arrest warrants all the way up to HQ?

          In the unlikely event that lawsuit sticks, then every other random person

      • What if your child has friends that come over?

        In loco parentis [wikipedia.org]

        • by TWX ( 665546 )

          There are a lot of crazy parents.

          • There are a lot of crazy parents.

            If you don't trust them, then don't let your kid go to their house.

            When my kid is at his friend's house, his friend's parents will make in loco parentis decisions, including what snacks he can eat, what TV shows he can watch, and, yes, even what devices he can talk to. That is legal and proper as long as their decisions are reasonable and don't conflict with my stated desires.

      • by ranton ( 36917 )

        What if your child has friends that come over?

        What if you take a video of your kid playing with that friend? Can you be sued for having that video on your phone without written consent?

    • by darkain ( 749283 ) on Thursday June 13, 2019 @05:12PM (#58757822) Homepage

      I recently stayed at a hotel in Downtown Seattle. The hotel rooms are equipped with always-on Alexa speakers. I didn't consent to this, nor did I know about it before staying at the hotel, nor did I purchase the device at all!

      • I didn't consent to this, nor knew about it before staying at the hotel

        1. Unless you speak the keyword, it doesn't record anything.

        2. If you don't like it, you can unplug it.

        • Unless you speak the keyword, it doesn't record anything.

          Usually you're a lot smarter than this.

        • 1. Unless you speak the keyword, it doesn't record anything.

          I only trust Amazon as far as I can throw them.

          2. If you don't like it, you can unplug it.

          Five minutes later, two guys from Building Maintenance knock on your door and explain that there is a problem with your Alexa and that they need to take a quick look at it.

          They are able to quickly fix the problem.

          • 1. Unless you speak the keyword, it doesn't record anything.

            I only trust Amazon as far as I can throw them.

            If they were actually recording everything, hundreds of people at Amazon would be aware of it. These people would be aware that they are breaking numerous state and federal laws. They would also be exposing themselves to consumer backlash, shareholder lawsuits, and the loss of many billions in market capitalization.

            It makes no sense whatsoever for them to take these risks to record inane kitchen conversations.

            • That assumes the snooping is not mandated by (secret) bad laws and (secret) kangaroo courts. Not at all a safe assumption in Soviet America.

            • by AmiMoJo ( 196126 )

              Don't waste your time, Bill. Slashdot has decided that Amazon and Google are evil cartoon villains and there is nothing you can do to convince them that they aren't listening to everything you say.

            • If they were actually recording everything,

              Who said anything about everything? Just what they are instructed to:

              https://en.wikipedia.org/wiki/... [wikipedia.org]

              You stated that the listen mode can only be activated by a keyword.

              Systems like these have a maintenance mode, where the listen mode can be activated remotely . . . for . . . well . . . maintenance purposes. Google the Intel Management Engine for details. Hmmm . . . Alexa Management Engine . . . ?

              hundreds of people at Amazon would be aware of it.

              One of the most important operational security practices in the Big Intelligence business is compartmentalization.

      • The hotel outfitted its rooms and did not tell you; Amazon is not responsible for that.
    • by sjames ( 1099 )

      No, why would you be? Most people expect that it will interpret what they say, answer the question/obey the command, and then forget about it.

      People who know more about how voice recognition works might expect that a successful interaction would result in a TEMPORARY recording being used to update the recognizer's training, with the no-longer-needed recording then discarded to save storage space and perhaps to avoid exactly the sort of potential liability TFA is talking about.
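
      Purely as an illustration of that keep-it-only-long-enough-to-train idea (hypothetical; the names are placeholders and this has nothing to do with Amazon's real pipeline):

        # Hypothetical sketch: hold the clip only while it is used for training,
        # then delete it.
        import os
        import tempfile

        def train_then_discard(audio: bytes, update_model) -> None:
            fd, path = tempfile.mkstemp(suffix=".wav")
            try:
                with os.fdopen(fd, "wb") as f:
                    f.write(audio)     # temporary copy for the trainer
                update_model(path)     # e.g. fine-tune the recognizer on this clip
            finally:
                os.remove(path)        # the recording is gone once training ends

        # Usage (with some update function): train_then_discard(clip_bytes, update_fn)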

      When you bought your Amazon Echo and installed it in your house next to your children, were you not implicitly consenting to having Alexa record utterances from said children? Don't buy an Echo and don't install it anywhere near your children if you want to keep Alexa from recording them. Duh. I hope the court throws this out as fast as possible. No rewards for stupid people.

      NO KIDDING! That and what Kendall said above.....

      To quote a line that Eddie Murphy said one time, "SOMEONE LOOKIN' TO GET PAID!!" That is all this is...

  • by Anonymous Coward

    "Alexa routinely records and voiceprints millions of children without their consent or the consent of their parents,"

    Exactly what do people think these devices do?

    Parents gave consent when they purchased the device, hooked it up to their network, and deposited it in the domicile interior.

    You don't want it recording you? By all means, do not put a microphone controlled by another in the middle of a house owned by you.

    Is it really so hard? Have a shred of accountability! If you put a recording device in your house, do not then be surprised when you are recorded!

    • by Calydor ( 739835 )

      Conversely, Alexa kinda has to record and sample your kid's voice to understand that it's a kid and to drop any future recordings of that voice. Alexa does not have magical knowledge.

  • Didn't the parent buy it and set it up?

  • So, Alexa, stop breaking the law.

    (side note: it's literally in our State Constitution, along with the ability to impose a flat income tax on all AIs "living" in our state, which here means present or doing business in the state)

  • "Another nearly identical suit was filed the same day in California ..."

    Yeah.

    This is some sleazebag lawyer looking to loot "big pockets" Amazon for all the booty he can grab. This is the sort of frivolous suit that should result in the lawyer's immediate disbarment.

  • Ok... WTF? Just... WTF?

    If you want Amazon to fix this, just make some noise, publish an article or something. This is just opportunism.

    You don't need to file a lawsuit for everything unless you happen to be a lawyer trying to make a buck.

    And people wonder why the court system is as backed up as it is :/
  • I can just imagine all the borderline paranoid schizophrenics screaming and ranting about listening devices, etc. The rage as they use the web browsers and phones that were already being used to "spy" on them anyway.
  • This is why we can't have nice things. As horrible an idea as I think these devices are, it's really up to the parents to decide whether or not to install them, and thereby implicitly give permission.
