Amazon Shares Data With Arkansas Prosecutor In Murder Case (ap.org) 54
An anonymous reader quotes a report from Associated Press: Amazon dropped its fight against a subpoena issued in an Arkansas murder case after the defendant said he wouldn't mind if the technology giant shared information that may have been gathered by an Amazon Echo smart speaker. James Andrew Bates has pleaded not guilty to first-degree murder in the death of Victor Collins, who was found dead in a hot tub at Bates' home. In paperwork filed Monday, Bates said Amazon could share the information and Amazon said it handed over material on Friday. The Echo "listens" for key words and may have recorded what went on before Collins was found dead in November 2015. Amazon had fought a subpoena, citing its customers' privacy rights. A hearing had been set for Wednesday on whether any information gathered was even pertinent.
NCIS and LawAndOrder thank you (Score:4, Funny)
is it 3 laws safe? (Score:2)
I think he made Alexa promise to kill him.
Wrong headline (Score:5, Insightful)
Starting with "Amazon shares..." makes it look like Amazon is the story. It's not. Headline should have started with "Defendant agrees to share...".
Not sure why it's a big deal, though - the search and seizure amendment prohibits doing that without due process. Looks like due process was being followed here.
If the Amazon Echo records voice in the home, and there is a law in California that you are not allowed to record voice without consent, are all Amazon Echo customers in California breaking the law? Their guests (the murdered one included) probably didn't know they were also being recorded.
Re:Wrong headline (Score:4, Informative)
If Amazon Echo records voice in the home
It only records the sentence following the keyword, which by default is "Alexa".
there is a law in California that you are not allowed to record voice without consent, are all Amazon Echo customers in California breaking the law?
No. It is illegal to intentionally record someone without their consent or notification. Incidental unintentional recordings are not illegal. So an inadvertent recording by a false trigger would not be illegal.
Probably their guests being murdered didn't know they were also being recorded.
If they did, they could just say "Alexa, I am being murdered. Please call 911!"
Anyway, this murder took place in Arkansas, which is a "one party" notification state. So you can record without consent or notification as long as you are a participant in the conversation.
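The gating behavior described above (always listening for the wake word, but only capturing the utterance that follows it) can be sketched as a toy simulation. To be clear, this is a purely illustrative assumption of how such gating might work on a stream of words, not Amazon's actual implementation, and the function name and trigger logic are made up:

```python
def gated_recorder(token_stream, wake_word="alexa", capture_len=5):
    """Toy wake-word gating: tokens are discarded until the wake word
    is heard, then a short utterance is captured and the gate closes."""
    captured = []          # utterances that would be "sent to the cloud"
    recording = False      # gate state: closed until the wake word
    buf = []
    for token in token_stream:
        if recording:
            buf.append(token)
            # stop capturing at end of sentence or after a fixed length
            if len(buf) >= capture_len or token.endswith((".", "?", "!")):
                captured.append(" ".join(buf))
                buf = []
                recording = False
        elif token.lower().strip(".,?!") == wake_word:
            recording = True   # everything heard before this was never stored
    if buf:  # flush a capture still in progress when the stream ends
        captured.append(" ".join(buf))
    return captured

# Only the command after "alexa" is captured; the surrounding chatter is dropped.
print(gated_recorder("random chatter alexa play some music. more chatter".split()))
```

In this model, a "false trigger" is just a stray word that matches the wake word, which would capture a short snippet of unrelated conversation, exactly the incidental-recording scenario discussed above.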
Re: Wrong headline (Score:1)
Alexa is always listening and its users know it. Therefore, unintentional recordings are impossible.
Re: Wrong headline (Score:4, Informative)
https://xkcd.com/1807/ [xkcd.com]
How could you know this? (Score:5, Insightful)
Are you making claims beyond your knowledge? The device runs on proprietary software. By default we have no idea when the device is listening (most likely all the time, otherwise how would it know when someone uttered the 'wake word'?), we have no idea if a recording is made, and we have no idea where that recording goes (users certainly don't get to control where the recordings go, or whether any such recordings are made at all).
Perhaps this is why it's a better idea to manually bring up a website and order something, or (by extension, for TVs which are now "smart") not get a TV running proprietary software with a camera and mic aimed at the user...often in their bedroom, aimed squarely at their bed.
How many unwitting porn stars are there now? Just give us a round figure, so to speak.
Re: (Score:3)
By default we have no idea when the device is listening (most likely all the time, otherwise how would it know when someone uttered the 'wake word'?)
Per my understanding, they claim that it's always listening, but only recording after it hears the 'wake word'. I tend to believe them - Not because I trust them or undervalue what it would be worth to them to have access to everything said in the home, but because I think they fear the backlash should a case like this come up and it be revealed that they were recording more than they reported.
Re: (Score:2)
And it's proper to wonder *why* people come out of the woodwork, since we know that such things as astroturfers exist. But we also know that fanbois exist, so certainty isn't available.
Re: (Score:2)
Starting with "Amazon shares..." makes it look like Amazon is the story. It's not. Headline should have started with "Defendant agrees to share...".
No, I think there are two things worth noticing here: 1) Amazon records what you say in the privacy of your home, and 2) The court felt they had to ask the defendant for permission.
Neither of these items is actually news, I think, but 1) is remarkable for not being more at the front of people's minds - I suspect most people are not really aware of it. And 2) is remarkable because it runs counter to what a lot of the loudest noise on the web seems to believe, namely that "They" (i.e. the government and the state
Re: (Score:2)
Except that they did try to get the records without asking the defendant; it would be difficult for Amazon to drop its fight against a subpoena if there was no subpoena in the first place.
It's the normal, bog-standard case of the prosecution/police asking for someone's private data held by a third party by serving a search warrant on that third party. With the slightly less typical case of the third party pushing back and challenging the warrant. And the very uncommon case of the person just saying "go ahead" - one can o
Re: (Score:2)
It's called following up. Or in its abbreviated form: FU!
/joke
And the truth comes out! (Score:3)
*playing Halo*
<VC> Alexi, have someone kill this bastard who keeps blowing me up with sticky grenades!
<Ekho> ok
*VC laughs*
And that's why you don't buy the knockoff version on eBay! ;)
Re: (Score:2)
You joke about knockoff versions, but I've been watching Schitt's Creek lately, and there's a character named Alexis on the show, and my Echo Dot wakes probably twice an episode from hearing someone say "Alexis" (usually it's when Catherine O'Hara yells it). Every once in a while it'll respond with something random; usually, though, it just says "Sorry, I couldn't understand what you are asking."
SUBJECT REQUIRED (Score:1, Interesting)
And that's why these companies need to take a leaf out of Apple's book - don't collect data if you don't want to share it with law enforcement. Siri doesn't record you, and doesn't transmit anything to a server until it locally recognises "Hey Siri", or you physically press a button.
Of course, all these other companies are busy trying to mine all your user data, so they're busy transmitting and storing everything they possibly can. Then they try to pull the "user privacy" defence when they themselves have
Re: (Score:2)
So, exactly what the article says the Echo does, then. If you believe Apple, you believe Amazon, surely.
Re: (Score:2)
How do you know Siri actually works that way? Have you examined the code? That it is claimed to work that way I believe, but that's a very different statement.
Re: (Score:2)
Re: (Score:2)
Re: (Score:1)
I think he does. For the same reason that good lawyers tell you [vice.com] to never speak to the police [popehat.com] when you've been arrested.
Re: (Score:1)
Re: (Score:2)
Yes he does have something to lose. And no it doesn't score points with a jury.
Hopefully he has a good lawyer with a plan and he's not just trying to be nice.
Re:Dumb move by defendant (Score:5, Informative)
This guy does have a lawyer though, and the lawyer knows all the tricks, and has determined that it is ok to let the recordings be released. I can't really comment wisely on that topic, but I can pretend to:
*) The lawyer may think that the recordings will help the case.
*) The lawyer may have decided that the recordings would probably be released anyway, so might as well cut to the chase.
*) The lawyer might think the openness will help the case.
*) The lawyer might think his client is guilty and hopes he goes to jail.
Re: (Score:2)
Conceding this point now closes a lot of avenues of appeal should his client be found guilty
Does it? Which avenues of appeal does it close off?
Re: (Score:2)
What if, after listening to the recordings themselves, they found that the recordings go a long way toward proving his innocence? I highly doubt they would concede this point just because "whatever" when Amazon was fighting their fight for them.
Latest XKCD is on this topic (Score:2)
Here [xkcd.com]
Are our phones safe? (Score:2)
Android phones have Google Now and iPhones have Siri.
Re: (Score:3)
Deleting it will decrease their effectiveness at recognizing what you say. They use your voice history to "learn" your specific speech patterns, helping recognize future voice queries. But as with most things Google, they leave the choice to you, unlike the other services.
Re: (Score:2)
And, as with the other services, they may be telling the truth, but you can't verify.
This is super strange... (Score:2)
...it's almost like that thing experts have been screaming about at the top of their lungs, warning people for a decade (or longer!).
You sucked on the SaaS titty, and now you get to pay for it with your butthole.
And, as always, it "begins" with the worst criminals. Pedophiles. Murderers. "Think of the children!" they'll say, as they inch closer and closer to sticking a camera inside your anus while calling it "progress" and "freedom."
Next Headline (Score:2)
The next headline we can expect:
"James Andrew Bates Faces New Charges from Alexa Disclosures"
This man is an idiot.
Rule #1: If you're guilty, it's imperative to give prosecutors nothing not absolutely required by law.
Rule #2: There are so many laws on the books, you're guilty of something; even if you have done absolutely nothing wrong.
Re: (Score:2)
Or, Echo recordings exonerate man? You do realize that the best they could hope to get from this was support for the timeline, and possibly someone saying "alexa, play katy perry roar", etc. - not nonstop recordings from inside the house. There is a security risk from these devices being exploited, but if Amazon were recording outside of being activated by the keyword, they would fold as a company overnight.
As I said before, I highly doubt they would concede this point just because "whatever" when amazon