California Governor Signs Ban On Social Media 'Aiding or Abetting' Child Abuse (theverge.com)

Adi Robertson reports via The Verge: California Governor Gavin Newsom has signed AB 1394, a law that would punish web services for "knowingly facilitating, aiding, or abetting commercial sexual exploitation" of children. It's one of several online regulations that California has passed in recent years, some of which have been challenged as unconstitutional. Newsom's office indicated in a press release yesterday that he had signed AB 1394, which passed California's legislature in late September.

The law is set to take effect on January 1, 2025. It adds new rules and liabilities aimed at making social media services crack down on child sexual abuse material, adding punishments for sites that "knowingly" leave reported material online. More broadly, it defines "aiding or abetting" to include "deploy[ing] a system, design, feature, or affordance that is a substantial factor in causing minor users to be victims of commercial sexual exploitation." Services can limit their risks by conducting regular audits of their systems. As motivation, the bill text cites whistleblower complaints that Facebook responded inadequately to child abuse on the platform and a 2022 Forbes article alleging that TikTok Live had become a haven for adults to prey on teenage users.

  • Wtf (Score:3, Interesting)

    by backslashdot ( 95548 ) on Monday October 09, 2023 @05:25PM (#63913305)

    This was legal before?

    • This was legal before?

      It was nebulous.

      If you put up a bulletin board and let people post messages, should you be legally liable if some people use your board to solicit sex?

      At least in California, we now know the answer is "Yes". Uncensored forums are illegal.

      • by Osgeld ( 1900440 )

        Until pretty recently it was yes across the board, because you are holding that material on your private equipment.

      • Re:Wtf (Score:4, Interesting)

        by Entrope ( 68843 ) on Monday October 09, 2023 @06:48PM (#63913539) Homepage

        Do you think you accurately described the law? What you described would blatantly violate 47 U.S. Code section 230(c)(1).

        • Do you think you accurately described the law?

          I don't know since I haven't actually read it.

          What you described would blatantly violate 47 U.S. Code section 230(c)(1).

          I agree. We'll see if the courts agree.

          • by Entrope ( 68843 )

            Well, that's quite a walk-back of your earlier confident assertions.

            Here's a hint: TFS says you're wrong about what the law makes illegal. I suspect that courts will find that "knowingly facilitating, aiding, or abetting commercial sexual exploitation" of children must be read to involve more than running an uncensored forum.

            • Well, that's quite a walk-back of your earlier confident assertions.

              Here's a hint: TFS says you're wrong about what the law makes illegal. I suspect that courts will find that "knowingly facilitating, aiding, or abetting commercial sexual exploitation" of children must be read to involve more than running an uncensored forum.

              The wording is troubling. It could mean seeing exploitive material and not deleting it. But it could also mean providing the facilities that allowed them to upload those materials. You know, if the regulators feel like cutting you or your site down. I do believe there's a need to crack down on exploitive material involving minors. I don't know that giving the government more tools to deal with an issue that isn't nearly as sweepingly huge as government officials would like us to believe it is, is the right m

      • There is no way I see this passing muster if someone brings it before the Supreme Court. "Think of the children" miscarriages of justice only get you so far.
  • be that commercial or otherwise? Then: what about abuse of anyone, no matter their age, be it sexual or otherwise?

  • by Anonymous Coward

    Especially if they want to ban cartoons and "AI" where no actual children are being abused.

  • I'm confused (Score:3, Interesting)

    by ArhcAngel ( 247594 ) on Monday October 09, 2023 @05:44PM (#63913363)
    Newsom signs a law making it legal to mutilate children as young as 12 without their parents' knowledge, then immediately signs a law that makes his previous law illegal!
  • I'm all for tracking down perpetrators of actual child abuse. I doubt this bill is about that.
    • Approved censorship system must be in place or you're a criminal... for the children, and we'll call you a pedo...
      No way that could go wrong from the start, right? Twitter files certainly didn't reveal any gov't demanded censorship turned out to be political... much...
      • Approved censorship system must be in place or you're a criminal... for the children, and we'll call you a pedo...

        Don't forget:

        • If you have nothing to hide, you have nothing to fear.
        • Think of the children!
        • WE MUST RID THE WORLD OF THIS SCOURGE (that we barely actually see but talk about as if it's the biggest problem facing America today)!
        • THINK OF THE CHILDREN!
        • If you have a single argument against this, your name will be added to the suspected pedophile list. You have been warned.
        • THINK OF THE CHILDREN! I know I do! Wait. Don't look at my computer. That's research material. Yeah, that's it. I'm researching the problem.

        It

  • Honest Question: (Score:5, Insightful)

    by sursurrus ( 796632 ) on Monday October 09, 2023 @06:23PM (#63913453)

    Did Newsom suddenly sign or veto something like several dozen bills? Why has my news feed blown up with at least 10 items today? It raises many strange possibilities, including distracting from some other bill they want kept under the radar, or some kind of controversy around Feinstein's replacement.

    • Did Newsom suddenly sign or veto something like several dozen bills?

      Yes. He had a stack of bills from the most recent legislative session on his desk that he finally signed/vetoed. This was done in quick succession, so there are a bunch of new laws that are suddenly passing or failing.

      • by mjwx ( 966435 )

        Did Newsom suddenly sign or veto something like several dozen bills?

        Yes. He had a stack of bills from the most recent legislative session on his desk that he finally signed/vetoed. This was done in quick succession, so there are a bunch of new laws that are suddenly passing or failing.

        Erm... Surely a governor doesn't have the power to just disregard a law that has passed both houses?

        I thought it was like Royal Assent, where the law has already passed and it's just a bit of theatre for tradition's sake.

        • He has 3 (4?) choices.

          He can sign the bill, giving it his assent.

          He can decline to sign it, in which case it becomes law without his assent.

          He can veto it, sending it back to the state assembly/senate for revision or override (a veto can be overridden by a 2/3 vote).

          In some states (including California) the governor also has a line-item veto over appropriations, which lets him reduce or eliminate individual spending items before signing the bill. Sometimes this is a minor effect, sometimes it can complete

    • The legislature frequently argues and debates throughout the session, then sends the majority of new bills to the governor at the end of the session en masse.

      "ok, we did our part.. You deal with it all now. We are going home."

  • by markdavis ( 642305 ) on Monday October 09, 2023 @07:07PM (#63913573)

    If I've said it once, I'll say it 1,000 times more: good intentions don't matter, outcomes do. And every one of these schemes will somehow lead to:

    1) Age verification schemes
    2) Which actually means positive ID schemes
    3) Which leads to zero privacy online for EVERYONE

    And that is on top of just outright censorship. It doesn't matter whether it comes from the Left or the Right. You simply cannot "protect the children" from the server side without creating extreme negative consequences for adults and all of society.

    Protecting children is the job of parents. If you want to do something positive, then empower parents to take that responsibility seriously and make sure CHILDREN have protection on the CLIENT side: devices that CANNOT GO to places they shouldn't and cannot interact with people they SHOULD NOT (unknown or unapproved contacts, as decided by the parents). Make locked-down, white-list, child-friendly devices and capabilities easier and more prevalent. Allow sites to flag content, instead of banning it or hiding it behind ID logins, so that those flags can be picked up by client devices configured to honor them (sketched below).

    NOTHING will be perfect. But destroying privacy and freedom for adults should not be the answer to the problem.
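
    A minimal sketch of the client-side idea above, in Python: the child's device, not the server, decides what a locked-down profile can load, using a parent-managed allow-list plus any self-rating label the site exposes. The allow-list contents and profile logic here are hypothetical; the "RTA-5042-1996-1400-1577-RTA" value is an existing voluntary adult-content meta label, used only as an example of a flag a client could honor.

      # Hypothetical client-side filter: block anything not on the parent's
      # allow-list, and block even approved hosts if the page labels itself
      # as adult content via a <meta name="rating"> tag.
      from html.parser import HTMLParser
      from urllib.parse import urlparse
      from urllib.request import urlopen

      ADULT_LABELS = {"RTA-5042-1996-1400-1577-RTA", "adult", "mature"}
      PARENT_ALLOWLIST = {"en.wikipedia.org", "www.nasa.gov"}  # set by the parent (example values)

      class RatingMetaParser(HTMLParser):
          """Collect the content of any <meta name="rating" ...> tags."""
          def __init__(self):
              super().__init__()
              self.ratings = []

          def handle_starttag(self, tag, attrs):
              if tag == "meta":
                  d = dict(attrs)
                  if (d.get("name") or "").lower() == "rating" and d.get("content"):
                      self.ratings.append(d["content"].strip())

      def allowed_for_child(url: str) -> bool:
          host = urlparse(url).hostname or ""
          if host not in PARENT_ALLOWLIST:      # locked-down, white-list mode
              return False
          try:
              html = urlopen(url, timeout=10).read().decode("utf-8", errors="replace")
          except OSError:
              return False                      # fail closed on a restricted device
          parser = RatingMetaParser()
          parser.feed(html)
          # Even an approved host is blocked if the page flags itself as adult.
          return not any(r in ADULT_LABELS for r in parser.ratings)

    The point is the one this comment makes: the flag is advisory on the server side and only becomes a restriction on a device the parent has configured.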

    • There might be a technological solution that allows age verification without identity verification; I'm not sure (a toy sketch of the idea follows this comment). But it seems very unlikely the government will want such a solution, since the ability to track people's behavior is of such high value. Also, it will be difficult to convince users that they are anonymous.

      That said, the whole idea of privacy seems to be evaporating, so maybe no one will actually care that they can be ID'd (until they run for political office, of course...)
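
      A toy sketch of the attribute-without-identity idea, assuming the third-party Python "cryptography" package: a trusted issuer (say, a DMV kiosk that has already checked a physical ID) signs a bare over-18 claim plus a random nonce, and a site verifies the issuer's signature without ever seeing a name or ID number. Real proposals go further (blind signatures or zero-knowledge proofs) so the issuer cannot correlate tokens with site visits, which this simple version does not attempt.

        import base64
        import json
        import secrets

        from cryptography.exceptions import InvalidSignature
        from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

        # --- Issuer side: checked age out-of-band, never learns where the token is spent ---
        issuer_key = Ed25519PrivateKey.generate()
        issuer_public_key = issuer_key.public_key()   # published for sites to use

        def issue_age_token() -> str:
            """Sign a claim carrying only 'over_18' and a random nonce, no identity."""
            claim = json.dumps({"over_18": True, "nonce": secrets.token_hex(16)}).encode()
            sig = issuer_key.sign(claim)
            return base64.b64encode(claim).decode() + "." + base64.b64encode(sig).decode()

        # --- Website side: verify the issuer's signature, learn only the attribute ---
        def verify_age_token(token: str) -> bool:
            try:
                claim_b64, sig_b64 = token.split(".")
                claim = base64.b64decode(claim_b64)
                issuer_public_key.verify(base64.b64decode(sig_b64), claim)
            except (ValueError, InvalidSignature):
                return False
            return bool(json.loads(claim).get("over_18", False))

        good = issue_age_token()
        print(verify_age_token(good))                                        # True
        tampered = base64.b64encode(b'{"over_18": true}').decode() + "." + good.split(".")[1]
        print(verify_age_token(tampered))                                    # False: signature doesn't match

      The catch this comment points out remains: users still have to trust that the issuer and the site are not comparing notes, which is exactly what the fancier schemes are designed to make impossible.
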
    • by AmiMoJo ( 196126 )

      You don't need age verification to stop the sexual abuse of children on your platform. What you need are technical measures (blocking images with known hashes, provided by law enforcement) and moderators who can quickly intervene when material is brought to their attention (see the sketch below this comment).

      What caused PornHub to mass-delete user-uploaded videos is that they had been ignoring requests from children and their parents to remove child abuse images. Often it was revenge porn or stolen video/photos. They made unreasonable d
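
      A minimal sketch of the hash-matching measure described above, assuming a plain list of hex digests supplied by a hotline or law-enforcement partner (the file name and report hook are hypothetical). For simplicity it uses exact SHA-256 digests, which only catch byte-identical files; production systems use perceptual hashes such as PhotoDNA or PDQ so re-encoded or resized copies still match.

        import hashlib
        from pathlib import Path

        def load_blocklist(path: str) -> set[str]:
            """Read one lowercase hex SHA-256 digest per line (blank lines ignored)."""
            return {line.strip().lower() for line in Path(path).read_text().splitlines() if line.strip()}

        def is_blocked(upload: bytes, blocklist: set[str]) -> bool:
            """True if the uploaded file's digest appears on the blocklist."""
            return hashlib.sha256(upload).hexdigest() in blocklist

        def handle_upload(upload: bytes, blocklist: set[str]) -> str:
            if is_blocked(upload, blocklist):
                # Reject, quarantine for human review, and file whatever report the law requires.
                return "rejected"
            return "accepted"

        # Self-contained demo with an in-memory blocklist instead of the real file:
        data = b"example image bytes"
        demo_blocklist = {hashlib.sha256(data).hexdigest()}    # stands in for load_blocklist("known_hashes.txt")
        print(handle_upload(data, demo_blocklist))             # -> rejected
        print(handle_upload(b"other bytes", demo_blocklist))   # -> accepted

      Hash matching only covers already-known material; the moderator workflow the comment mentions is what handles newly reported content.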

  • Ohh, won't somebody please think of the children?!

  • Can parents sue TikTok for self harm videos their kid watched?

  • In California, everything not forbidden is compulsory. The only thing left to do is find the things we missed and decide whether to forbid or mandate them.
