Supreme Court To Consider Data Aggregation Suit Against Spokeo
BUL2294 writes: Consumerist and the Associated Press are reporting that the Supreme Court has taken up the case of Spokeo, Inc. v. Robins — a case in which Spokeo, as a data aggregator, faces legal liability for Fair Credit Reporting Act violations after providing information on Thomas Robins, an individual who has not suffered "a specific harm" directly attributable to the inaccurate data Spokeo collected on him.
From SCOTUSblog: "Robins, who filed a class-action lawsuit, claimed that Spokeo had provided flawed information about him, including that he had more education than he actually did, that he is married although he remains single, and that he was financially better off than he actually was. He said he was unemployed and looking for work, and contended that the inaccurate information would make it more difficult for him to get a job and to get credit and insurance." So, while Robins has not suffered a specific harm, the potential for harm based on inaccurate data exists. Companies such as Facebook and Google are closely watching this case, given the potential for billions of dollars in liability for selling inaccurate information on their customers and other people.
How about this.. (Score:3, Insightful)
Re:How about this.. (Score:5, Insightful)
All the EULAs will just get updated to include a clause saying, "And you give us the right to share." The problem seems to be that if you *can* give permission, then you will be coerced into giving permission. That implies that, just as a contract of enforced slavery is illegal in the USA even if signed willingly, any agreement that gives permission for sharing would be illegal. That would solve the problem of coercion but would wipe out a whole bunch of services that people actually like and want.
I don't have an answer, but I am pretty sure that just barring the data sharing would not actually change anything except the EULAs.
Re: (Score:2)
Re: (Score:1)
All the EULAs will just get updated to include a clause saying, "And you give us the right to share."
What!? Don't forget that they already do. It's baked into the system. "We reserve the right to share with our affiliates who may offer you products from time to time" and so forth have been mainstream for decades.
The Do Not Call loophole of being unable to STOP calls from people you already have a business relationship with is one of the issues.
Voters get robocalls from their party's congressmen just by being registered $party members.
People subscribing to magazine A get offers from Magazine J from a completely…
Re: (Score:2)
First, most people have never heard of "Spokeo", so how would Spokeo have a EULA that applies to the public at large? What, they claim that a EULA applies to anyone they collected data on?--NO WAY. Second, if a EULA can trump a right provided by federal law (in this case the Fair Credit Reporting Act, "FCRA"), then the "big 3" credit-reporting agencies (Equifax, Experian, TransUnion) would have used EULAs long ago to stamp out FCRA violation lawsuits & their need…
So, data CAN be owned? (Re:How about this...) (Score:3, Interesting)
I find it rather ironic that the same site, which shouts down any argument that an idea can be owned — and that using it without the owner's permission is illegal and immoral — would be so respectful towards other kinds of information.
If, as the opinion prevailing here holds, "information can not be stolen" because you still have your original copy, what grounds are there to prohibit somebody else from sharing what they know about you with others? On that matter, will you also outlaw gossip?
Re: (Score:2)
Re: (Score:1)
EULAs have nothing to do with this. These parasites are not "sharing" information, they are selling it, not caring who they hurt. They need to be held to the same standards as Experian, TransUnion, and Equifax, which (albeit rarely) are held responsible for bad information that they know is bad but refuse to remove.
Re: (Score:2)
Re: (Score:2)
I grant a 3rd party limited use of ONLY the data that I supply them. They cannot in any way, shape, or form share this data with any other party (without a court order).
If you want to unilaterally control what other people say and publish about you, you might have some problems working around this annoying document [wikipedia.org].
Re: (Score:2)
"Annoying documents" don't negate the concepts of libel and slander. Such documents also usually include ideas like "privacy".
Re: (Score:2, Insightful)
"Annoying documents" don't negate the concepts of libel and slander.
To qualify as libel or slander a statement must be not only false, but knowingly false, and used with malicious intent. I doubt if you can convince anyone that Google is libeling you.
Such documents also usually include ideas like "privacy".
If you type info into my website, then it is silly to later claim that it is "private". If you don't want Google to have your data, then don't use Google.
Re: (Score:3, Informative)
To qualify as libel or slander a statement must be not only false, but knowingly false, and used with malicious intent. I doubt if you can convince anyone that Google is libeling you.
The "actual malice" test you're citing is from New York Times v. Sullivan, and it only applies to public figures. SCOTUS made it clear in Gertz v. Robert Welch, Inc. that negligence is sufficient to support libel/slander/defamation/etc. in the case of a private person.
Re: (Score:3)
Actually, no. The defendant must either know it is false or show a reckless disregard for the truth of the statement. There need not be malice, only some expectation of an adverse effect.
Re: (Score:2)
I don't think it's that simple. Your next door neighbor states she knows you, and says where you live to someone, and she must pay you $100,000? Your housepainter lists you as a reference, but you own the data so he must pay you?
It bothered me when I lived in a very small town and everyone in the town was a busybody and gossiped about where I went. It bothers me that technology makes that nosiness scalable to worldwide proportions. But I can't see how to monetize privacy without all the benefit…
Re: (Score:3)
There's a difference though: your busybody neighbors still have some accountability to you. They're out in the open -- they don't get to hide behind invisible 1x1 pixels spying on your web-use habits. Further, if they got out of line, you could punch them in the nose (figuratively). Google or a CC company? They get to hide behind layers of customer service representatives and EULAs.
The sad truth is until the advertising bubble pops (and it will) we're stuck with this massive invasion of privacy, and stu…
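For anyone unfamiliar with how those invisible 1x1 pixels work, here is a minimal sketch of a tracking endpoint. It assumes Flask; the route name, cookie name, and log format are made up for illustration and do not reflect any particular ad network's implementation:

```python
import base64
from flask import Flask, Response, request

app = Flask(__name__)

# The classic 1x1 transparent GIF; the image itself carries no information.
PIXEL = base64.b64decode(
    "R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7"
)

@app.route("/pixel.gif")  # embedded in pages as an <img> pointing at the tracker's domain
def pixel():
    # The real payload is the request metadata: which page referred you,
    # from which IP address, carrying which identifying cookie.
    app.logger.info(
        "hit ip=%s referrer=%s uid=%s",
        request.remote_addr,
        request.referrer,
        request.cookies.get("uid"),
    )
    return Response(PIXEL, mimetype="image/gif")
```

Because the image is invisible and served from a third-party domain, the visitor gets no indication that the request, and the metadata attached to it, was ever made.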
Re: (Score:2)
Your data, you keep it private then. Anything you leave publicly knowable should be fair game imho.
Re: (Score:2)
I posted a reply to a post by SirGeek.
Is the above sentence my data or yours?
Re: (Score:3)
There are many more such social/white-pages/life/people-tracking websites that are started by leeches and sell information to others. Posting people's information without their consent is a violation of privacy; all of these websites should be challenged, and laws should be passed to protect the consumer.
We'll give you ten free transactions in Farmville if you "Like" us and allow us to go through your contacts list to see who else you know that likes us!
This seems backwards. (Score:3)
Um, what? All these inaccuracies would help him get a job, unless he's trying for a very low position.
Re: (Score:2)
Um, what? All these inaccuracies would help him get a job, unless he's trying for a very low position.
This was my thought, as well. The plaintiff is either a privacy advocate (something that I support in general), or someone just looking for a payday (something that I oppose in general). In either case, his reasoning is highly suspect.
Re: (Score:3)
Re:This seems backwards. (Score:5, Informative)
Or as is very likely the case, a company passes up on hiring him for something.
That's nifty and all, but that's not the actual lawsuit.
The key feature of the lawsuit is that the individual cannot show any specific harm was done, only that their legal rights were infringed. Most aspects of civil law require that the person show some sort of injury. In this case the specific law does not require damage. Damage to consumers is assumed as automatic if the company does not comply with the law. The wording of the law is only about compliance, not about harm.
The big data companies absolutely want standing denied in this case. If he could show specific harm he'd have a strong case, but it would be a different case. This is about data aggregators being compelled to follow the law.
The first court dismissed it, reasoning that since he had suffered no specific "actual or imminent harm" he couldn't sue.
The appeals court observed that the law requires specific actions by the company and ties failure to comply with those rules to statutory damages of $100-$1,000. That's even the name of the section: "Civil liability for willful noncompliance". Again, the law specifies damages for failing to comply, not damages for actual harm. The appeals court ruled that since the law as written does not require any actual damages -- the law is about compliance by the company, with damages assigned to "any consumer" affected by noncompliance -- he can sue. He qualifies under the definition of "any consumer", and the law is only about compliance, not about actual harm.
Because of the exact wording of the law, my money is on Robins on this one. The actual law does not rely on harm to the individual; the wording is based entirely on compliance, with noncompliance resulting in liability. [cornell.edu] A showing of additional harm is not required.
But let's turn it around. Frequently the courts will examine the consequences of rejecting an argument. If they turn it down, if they say consumers cannot have standing unless there is real harm, then they would effectively void sections 1681n and 1681o. There would only be civil liability for actual harm; there would not be any civil liability for noncompliance. Generally SCOTUS relies on a constitutional reason to void large chunks of law like that, but in this case there are several solid reasons for Congress to have passed the law. If he doesn't have standing, then SCOTUS is voiding the law, since no other method is available for imposing liability. The Justices tend to be careful about voiding laws, generally doing so only when a law falls outside what the Constitution allows. I'm absolutely certain that will come up in the oral arguments: if they deny standing, how else can the noncompliance provision be applied? If they deny standing, they seem to be voiding the law without a constitutional reason.
Re: (Score:2)
The key feature of the lawsuit is that the individual cannot show any specific harm was done, only that their legal rights were infringed. Most aspects of civil law require that the person show some sort of injury. In this case the specific law does not require damage. Damage to consumers is assumed as automatic if the company does not comply with the law. The wording of the law is only about compliance, not about harm.
And that is great and all, but also to the point: how do you prove you were passed over for a job because of this without the company coming straight out and telling you the reason? There is no law that says a company needs to tell you why you were not chosen for a position. Your resume may simply be tossed into the "do not trust due to lies" pile automatically by a screening system that compares the information in your resume against the information in the automated background checks…
Re: (Score:2)
Unless, of course, his resume goes into the round file because it doesn't agree with "known" information about him.
Re:This seems backwards. (Score:4, Interesting)
Unless, of course, his resume goes into the round file because it doesn't agree with "known" information about him.
I once failed a background check because a former employer gave me a more _senior_ title than the one I actually held. I had to get them to correct it before I could be hired (into my present position).
The hiring company's HR department behaved as if its job was to help the line manager get and keep the right people, so they were willing to sort out the mess. In other places, I would never have known what happened.
So this is a real problem.
Re: (Score:2)
A large amount of education in a different field could be a problem as well... information saying he is an MD is not going to help him if his actual education is a BS in Computer Science.
Re:This *seems* backwards. (Score:5, Insightful)
Think about it this way:
Mr. Robins applies for a job, filling out his resume with his accurate information. The company then goes to these 'information businesses' for a routine check and finds apparent discrepancies between his resume and the other company's records.
It is not a stretch to infer that a person's ability to find a job would be damaged by companies sharing false information about them, whether that information makes them look better or worse.
What is truly disheartening is that both the sellers and buyers of this information currently have no liability for spreading false information (and hence no reason to worry about it).
In the end, it is left to the individual to monitor the spread of information about themselves, a virtually impossible task considering the sheer number of companies that buy/sell your information.
These companies should be responsible for the accuracy of the information as they are the ones profiting from the sale/purchase of said information.
Wrong semi-positive information does harm too (Score:2, Interesting)
It doesn't work that way.
If you are listed with a title of CEO/President at a small company and then apply for a project manager job at a Fortune 50 company, HR there will think you aren't serious, or that you desperately need any job, or that some other bad thing happened at the former company/employer. What if you never worked at that company at all?
All of these things weaken your ability to negotiate for reasonable and fair compensation, if a job offer even comes. Since you didn't work there, it won't be on your resume, and if you…
Does Google actually sell this sort of data? (Score:3)
"Companies such as Facebook and Google are closely watching this case, given the potential of billions of dollars of liability for selling inaccurate information on their customers and other people."
I was under the impression, perhaps naively, that Google did not under any circumstances sell personally identifiable data or other information to third parties. I know MS has been found guilty of breaching this, but what, if anything, would Google be on the hook for here?
Re: (Score:2)
"Companies such as Facebook and Google are closely watching this case, given the potential of billions of dollars of liability for selling inaccurate information on their customers and other people."
I was under the impression, perhaps naively, that Google did not under any circumstances sell personally identifiable data or other information to third parties. I know MS has been found guilty of breaching this, but what, if anything, would Google be on the hook for here?
IANAL, and I don't know what the specifics of the FCRA are, but the summary says "providing" and not "selling." It's not a stretch to see how someone like Google could fall afoul of this (as a test case, Google "spacepimp" or your real name and see if you recognize anything "personally identifiable." My guess is the answer is an emphatic "yes").
This case is not about Spokeo or data (Score:4, Insightful)
Before everyone gets upset about data collection: This Supreme Court case is not about Spokeo's data collection. It is about who has the right to sue and under what circumstances. Even if the Supreme Court rules in favor of this individual, all it means is that the individual can continue their suit. It is not a ruling for or against Spokeo's data.
Re: (Score:2)
Spokeo is anything but free. Search your name on their site and they will show you some basic information, but they want $8/mo for three months or $5/mo for six months to access the real data. There is also no way to correct the information, which is what this lawsuit is about.
Interesting (Score:5, Interesting)
... "attributable to the inaccurate data Spokeo collected on him." If a company is in the clear for publishing inaccurate data about an individual, are they also in the clear for just fabricating data? What's to say that any of the names in their lists represent real and physical people with the same name? In theory the users entered the data at one point or another and that should be enough to tie the data to a real human. BUT any coder knows it's not rocket science to write a script to fill in a form and submit it. Consider tools like Faker https://github.com/fzaninotto/... [github.com].
I was looking up my name to see what would come up (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
It would actually be against the agreements with the credit processors for them to share anything about your credit card.
Spokeo works by aggregating public information and everything you enter on websites that it partners with. Buried in the EULAs or other agreements is something about sharing with third parties.
Re: (Score:3, Insightful)
And I remember going through this Spokeo site and a bunch of others. It had my name, correct address, and some other information, none of which I ever gave them permission to use on their site. Right off the bat, it seems to me that is breaking the law.
What many don't realize is the amount of publicly shared information data aggregators have available: phone books, home loans recorded with your county, court cases, graduation information from universities, membership listings for organizations, what you list on Facebook or LinkedIn, public DMV records, public notices in newspapers, marriage records, birth records, professional licensing. There's an assumption many people have, something like, "How would anyone access my information unless they knew where…
Re: (Score:3)
It's not illegal, but it should be. Spokeo, I think, actually has opt-outs, but others don't. Exactly how in America you don't have the right to say no to having your address published is beyond me. Don't make any enemies in this world; they sure won't have any trouble finding you. There is way too much personal info available through public records. This 1970s-era pseudo honor system we have is unprepared for the age of the internet, big data, and sophisticated cybercrime.
Data Brokers and Your Privacy [privacyrights.org]
eula? spokeo? never clicked on one (Score:1)
So exactly when did I submit to a EULA? Spokeo has my data wrong as well. I never said they could use it or make representations of it. This suit is one to watch; it has huge precedent-setting implications for privacy.
Re: (Score:2)