FTC Finalizes Rule Banning Fake Reviews, Including Those Made With AI (techcrunch.com)

TechCrunch's Lauren Forristal reports: The U.S. Federal Trade Commission (FTC) announced on Wednesday a final rule that will tackle several types of fake reviews and prohibit marketers from using deceptive practices, such as AI-generated reviews, censoring honest negative reviews and compensating third parties for positive reviews. The decision was the result of a 5-to-0 vote. Enforcement of the new rule will begin 60 days after it's published in the Federal Register, the official government publication. [...]

According to the final rule, the maximum civil penalty for fake reviews is $51,744 per violation. However, the courts could impose lower penalties depending on the specific case. "Ultimately, courts will also decide how to calculate the number of violations in a given case," the Commission wrote. [...] The FTC initially proposed the rule on June 30, 2023, following an advance notice of proposed rulemaking issued in November 2022. You can read the finalized rule here (PDF), but we also included a summary of it below:
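As a rough illustration of how that per-violation figure compounds (the violation count below is invented for the example; as the Commission notes, courts will decide how violations are actually counted):

```python
# Hypothetical illustration of penalty exposure under the final rule.
MAX_PENALTY_PER_VIOLATION = 51_744  # maximum civil penalty stated in the rule

def max_exposure(num_violations: int) -> int:
    """Upper bound on civil penalties if each violation draws the maximum."""
    return num_violations * MAX_PENALTY_PER_VIOLATION

# If a court counted each of 100 fake reviews as a separate violation:
print(max_exposure(100))  # -> 5174400
```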

- No fake or disingenuous reviews. This includes AI-generated reviews and reviews from anyone who doesn't have experience with the actual product.
- Businesses can't sell or buy reviews, whether negative or positive.
- Company insiders writing reviews need to clearly disclose their connection to the business. Officers or managers are prohibited from giving testimonials and can't ask employees to solicit reviews from relatives.
- Company-controlled review websites that claim to be independent aren't allowed.
- No using legal threats, physical threats or intimidation to forcefully delete or prevent negative reviews. Businesses also can't misrepresent that the review portion of their website comprises all or most of the reviews when it's suppressing the negative ones.
- No selling or buying fake engagement like social media followers, likes or views obtained through bots or hacked accounts.

Comments Filter:
  • They are so good at stopping illegal and wrong activity. I bet there are enough loopholes for the right people.
  • How does the FTC intend to police reviews created outside the US?
    • Is the store in the USA? Is the store selling to people in the USA from the USA?

      • Is Temu? Is Shein? Is Amazon? Is Ebay? Is Apple? Is Meta? They all have head offices internationally.

        And many of them don't have brick and mortar on US soil.

        And many of the reviews are generated out of the middle east where establishing a link to the original business may be somewhat problematic.

        Don't get me wrong. It's nice in principle. I just can't see it being enforceable.
        • Amazon, eBay, Apple, and Meta are all American companies under the jurisdiction of the United States government.
    • by labnet ( 457441 )

      It's OK the WEF is loading the One World Government in 3...2...1....

    • That was my first thought also. It is simply too big a task even to try this with Amazon, much less all the other places online where one can buy things. As an example: a seller in another country, selling through, say, Amazon, pays a bunch of bots for fake reviews; maybe some are actual flesh-and-blood people, but none have actually used or purchased the product. They are also employed by an organization outside the USA. How would the FTC then work to stop this?

      I mean, I wish it could

      • In the specific case you laid out... the FTC would hold Amazon responsible for policing reviews on their site.

        Amazon would be responsible for preventing this. And they can, via statistical analysis of the review posts. As for an individual seller's website in bumfuckistan? Not the FTC's problem. Although they might require Amazon to not link to external sites with unverified reviews (or accept responsibility for those reviews...)

        So that solves 99% of the issue. The edge cases? Who cares?

  • Car manufacturers, Solar/Wind power installation companies, and Rotten Tomatoes hardest hit.

  • Why have they been above the law?
  • Real identity (Score:4, Insightful)

    by edi_guy ( 2225738 ) on Thursday August 15, 2024 @01:46AM (#64707600)

    At some point all of this will coalesce around real, verifiable identity on the web for legitimate commerce sites.

    Sure, this is not a popular concept on /., but the reality is that it's already occurred. It takes all of 30 minutes to dox someone, or a couple bucks and 1 minute, so the idea of anonymity is just a convenient mirage.

    If your situation is perilous, then yeah, you can figure out some anonymity on the web, and you shouldn't be posting reviews on toaster ovens.
    But encouraging real identity online will remove a sh..ton of nonsense on the net. Sure, your crypto/porn/dark web sites won't ask for a legit identity, but why not Amazon, Home Depot, FB, etc.? If you want to be a jackass on the web, by all means do so. Just stand by your comments.

    • Please upvote above. The only way to actually enforce this law - to prove a review was fake - is to prove WHO made the review. Which is impossible until you get some kind of cyber identity. So once again, China appears to be leading the way with their cyberID: https://www.voanews.com/a/chin... [voanews.com]
      • The only way to actually enforce this law - to prove a review was fake - is to prove WHO made the review.

        Not really. That is one way to do it. But not the one that will happen.

        As written, it is the responsibility of the manufacturer/seller/website to prevent false reviews. Statistical analysis will show what is probably fake. This is not a court of law requiring proof beyond a reasonable doubt to a jury of your peers. If it is likely fake, that is good enough: remove it. This can be easily automated and will give a reasonable level of compliance on large sites (Amazon, eBay, Walmart, etc.) Manufactur
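A minimal sketch of what that kind of statistical screening might look like. Everything here is hypothetical, not anything a real marketplace is known to use: the `Review` fields, the two signals (duplicated review text across accounts, and a burst of unverified reviews from one account), and the threshold are all invented for illustration.

```python
from collections import Counter
from dataclasses import dataclass


@dataclass
class Review:
    reviewer_id: str
    product_id: str
    text: str
    verified_purchase: bool


def flag_suspicious(reviews: list[Review], burst_threshold: int = 5) -> set[str]:
    """Flag reviewer IDs using two simple heuristic signals:
    1. identical review text posted by more than one account;
    2. many unverified reviews from the same account."""
    flagged: set[str] = set()

    # Signal 1: the same normalized text appearing under multiple accounts.
    by_text: dict[str, set[str]] = {}
    for r in reviews:
        by_text.setdefault(r.text.strip().lower(), set()).add(r.reviewer_id)
    for authors in by_text.values():
        if len(authors) > 1:
            flagged |= authors

    # Signal 2: a burst of reviews with no verified purchase behind them.
    unverified = Counter(r.reviewer_id for r in reviews if not r.verified_purchase)
    flagged |= {rid for rid, n in unverified.items() if n >= burst_threshold}
    return flagged
```

A real system would combine many more signals (timing, account age, rating distributions) and weigh them probabilistically, but the point stands: the platform flags what is *likely* fake, rather than proving who wrote each review.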

  • Surely they wouldn't try to prosecute the author of the Nair review that was on Amazon for ages. Or maybe they would???

  • But they stopped short. They should also require engagement farmers to provide a detailed list in a timely manner to each platform they have targeted.

    If possible under statute, with each failure to identify a bot being a separate criminal charge.

  • "Businesses can't sell or buy reviews, whether negative or positive."

    Would that include presidential candidates buying online influencers ?

    Asking for a friend.
