FTC Finalizes Rule Banning Fake Reviews, Including Those Made With AI (techcrunch.com)
TechCrunch's Lauren Forristal reports: The U.S. Federal Trade Commission (FTC) on Wednesday announced a final rule that will tackle several types of fake reviews and prohibit marketers from deceptive practices such as posting AI-generated reviews, suppressing honest negative reviews, and compensating third parties for positive reviews. The decision was the result of a 5-to-0 vote. The new rule will start being enforced 60 days after it's published in the Federal Register, the official government publication. [...]
According to the final rule, the maximum civil penalty for fake reviews is $51,744 per violation. However, the courts could impose lower penalties depending on the specific case. "Ultimately, courts will also decide how to calculate the number of violations in a given case," the Commission wrote. [...] The FTC initially proposed the rule on June 30, 2023, following an advance notice of proposed rulemaking issued in November 2022. You can read the finalized rule here (PDF), but we also included a summary of it below:
- No fake or disingenuous reviews. This includes AI-generated reviews and reviews from anyone who doesn't have experience with the actual product.
- Businesses can't sell or buy reviews, whether negative or positive.
- Company insiders writing reviews need to clearly disclose their connection to the business. Officers or managers are prohibited from giving testimonials and can't ask employees to solicit reviews from relatives.
- Company-controlled review websites that claim to be independent aren't allowed.
- No using legal threats, physical threats or intimidation to forcefully delete or prevent negative reviews. Businesses also can't misrepresent that the review portion of their website comprises all or most of the reviews when it's suppressing the negative ones.
- No selling or buying fake engagement like social media followers, likes or views obtained through bots or hacked accounts.
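The penalty arithmetic above is straightforward to illustrate. Note that the per-review counting below is purely an assumption for illustration; as the Commission says, courts will decide how violations are actually counted:

```python
# Hypothetical illustration of maximum civil-penalty exposure under the rule.
# The $51,744 figure comes from the final rule; treating each fake review as
# one violation is an assumption, since courts decide the violation count.

MAX_PENALTY_PER_VIOLATION = 51_744  # USD, per the final rule

def max_exposure(num_violations: int) -> int:
    """Upper bound on civil penalties if each fake review counts as one violation."""
    return num_violations * MAX_PENALTY_PER_VIOLATION

print(max_exposure(100))  # 100 fake reviews -> a $5,174,400 ceiling
```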
Re:nuance? (Score:5, Interesting)
Re: (Score:1)
Re: (Score:2)
Government Rules and Laws... (Score:2)
Jurisdiction (Score:2)
Re: (Score:3)
Is the store in the USA? Is the store selling to people in the USA from the USA?
Re: (Score:1)
And many of them don't have bricks and mortar on US soil.
And many of the reviews are generated out of the Middle East, where establishing a link to the original business may be somewhat problematic.
Don't get me wrong. It's nice in principle. I just can't see it being enforceable.
Re: (Score:2)
Re: (Score:2)
It's OK the WEF is loading the One World Government in 3...2...1....
Re: (Score:2)
I mean, I wish it could
Re: (Score:2)
In the specific case you laid out... the FTC would hold Amazon responsible for policing reviews on their site.
Amazon would be responsible for preventing this. And they can, via statistical analysis of the review posts. As for the individual seller's own website in bumfuckistan? Not the FTC's problem. Although they might require Amazon to not link to external sites with unverified reviews (or accept responsibility for those reviews...)
So that solves 99% of the issue. The edge cases? Who cares?
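The statistical screening this comment has in mind can be sketched in a few lines. A classic signal is review bursts: organic reviews trickle in, while paid batches land together. The threshold and data shapes below are invented for illustration, not any platform's actual method:

```python
# A minimal sketch of burst-based screening: flag products whose reviews
# cluster heavily on a single day. Threshold is an illustrative assumption;
# real detectors combine many signals (reviewer history, text similarity, ...).
from collections import Counter
from datetime import date

def burst_score(review_dates: list[date]) -> float:
    """Fraction of reviews that landed on the single busiest day."""
    if not review_dates:
        return 0.0
    counts = Counter(review_dates)
    return max(counts.values()) / len(review_dates)

def looks_suspicious(review_dates: list[date], threshold: float = 0.5) -> bool:
    # More than half the reviews on one day is a classic paid-burst pattern.
    return burst_score(review_dates) > threshold

organic = [date(2024, m, 1) for m in range(1, 11)]          # spread over months
burst   = [date(2024, 6, 15)] * 8 + [date(2024, 3, 2)] * 2  # 8 of 10 same day
print(looks_suspicious(organic), looks_suspicious(burst))   # False True
```

A screen like this does not prove anything in court; it just picks out listings worth removing or reviewing by hand, which is all the comment is claiming platforms need.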
Re: (Score:2)
Re: (Score:2)
We as techy-nerds tend to get too easily caught up in trying to solve edge cases, and forget that a simple majority solution is pretty damn good in real life.
Re: (Score:2)
Re: (Score:1)
The number quoted is the maximum fine, presumably predicated on both the cost of the product and the volume of sales.
Re: (Score:1)
Amazon, Walmart, (Score:2)
Car manufacturers, Solar/Wind power installation companies, and Rotten Tomatoes hardest hit.
What About Youtube and Google (Score:2)
Real identity (Score:4, Insightful)
At some point all of this will coalesce around, real, verifiable identity on the web for legitimate commerce sites.
Sure this is not a popular concept on /. but the reality is that it's already occurred. Takes all of 30 min to dox someone, or a couple bucks and 1 minute, so the idea of anonymity is just a convenient mirage.
If your situation is perilous, then yeah you can figure out some anonymity on the web, and you shouldn't be posting reviews on toaster ovens.
But encouraging real identity online will remove a sh.. ton of nonsense on the net. Sure your crypto/porn/dark web sites won't ask for a legit identity, but why not Amazon, Home Depot, FB, etc.? If you want to be a jack--s on the web, by all means do so. Just stand by your comments.
Re: (Score:2)
Re: (Score:2)
The only way to actually enforce this law - to prove a review was fake - is to prove WHO made the review.
Not really. That is one way to do it. But not the one that will happen.
As written, it is the responsibility of the manufacturer/seller/website to prevent false reviews. Statistical analysis will show what is probably fake. This is not a court of law, requiring proof beyond a reasonable doubt to a jury of your peers. If it is likely fake, that is good enough: remove it. This can be easily automated and will give a reasonable level of compliance on large sites (Amazon, eBay, Walmart, etc.)
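Another cheap screen in the spirit of this comment: near-duplicate review text is a strong fake signal, since paid reviewers reuse copy. The crude word-set Jaccard similarity and the 0.8 threshold below are illustrative assumptions; production systems use far richer text models:

```python
# Illustrative near-duplicate screen: flag pairs of reviews whose word sets
# overlap heavily (Jaccard similarity). Threshold is an invented assumption.

def jaccard(a: str, b: str) -> float:
    """Word-set overlap between two review texts, in [0, 1]."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    if not wa or not wb:
        return 0.0
    return len(wa & wb) / len(wa | wb)

def flag_near_duplicates(reviews: list[str], threshold: float = 0.8) -> list[tuple[int, int]]:
    """Return index pairs of reviews that are suspiciously similar."""
    pairs = []
    for i in range(len(reviews)):
        for j in range(i + 1, len(reviews)):
            if jaccard(reviews[i], reviews[j]) >= threshold:
                pairs.append((i, j))
    return pairs

reviews = [
    "great product works exactly as described five stars",
    "works exactly as described great product five stars",  # same words, reshuffled
    "battery died after two weeks very disappointed",
]
print(flag_near_duplicates(reviews))  # [(0, 1)]
```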
Re: (Score:3)
Re: (Score:2)
Re: (Score:2)
You can lie and there could be consequences for those lies.
found the MAGA
What about tongue in cheek humour? (Score:2)
Surely they wouldn't try to prosecute the author of the Nair review that was on Amazon for ages. Or maybe they would???
I like that last one (Score:2)
But they stopped short. They should also require engagement farmers to provide a detailed list in a timely manner to each platform they have targeted.
If possible under statute, with each failure to identify a bot being a separate criminal charge.
politicians too? (Score:2)
"Businesses can't sell or buy reviews, whether negative or positive."
Would that include presidential candidates buying online influencers?
Asking for a friend.
I'll just leave this here (Score:2)