DMCA 'Safe Harbor' Up In the Air For Online Sites That Use Moderators (arstechnica.com)

"The Digital Millennium Copyright Act's so-called 'safe harbor' defense to infringement is under fire from a paparazzi photo agency," reports Ars Technica. "A new court ruling says the defense may not always be available to websites that host content submitted by third parties." The safe harbor provision "allow[s] websites to be free from legal liability for infringing content posted by their users -- so long as the website timely removes that content at the request of the rights holder," explains Ars. From the report: [A] San Francisco-based federal appeals court is ruling that, if a website uses moderators to review content posted by third parties, the safe harbor privilege may not apply. That's according to a Friday decision in a dispute brought by Mavrix Photographs against LiveJournal, which hosts the popular celebrity fan forum "Oh No they Didn't." The site hosted Mavrix-owned photos of Beyonce Knowles, Katy Perry, and other stars without authorization. LiveJournal claimed it was immune from copyright liability because it removed the photos. Mavrix claimed that the site's use of voluntary moderators removed the safe-harbor provision. The 9th U.S. Circuit Court of Appeals sided with Mavrix to a degree, but the court wants to know how much influence the moderators had on what was and was not published. With that, the court sent the case back to a lower court in Los Angeles to figure that out, perhaps in a trial. The highly nuanced decision overturned a lower court ruling that said LiveJournal was protected by safe harbor. The lower court said LiveJournal does not solicit any specific infringing material from its users or edit the content of its users' posts.
  • by sexconker ( 1179573 ) on Monday April 10, 2017 @07:09PM (#54210927)

    Shaking in their booties! Imagine if Twitter were held responsible for the terrorist shit they allow?

    • Re: (Score:3, Insightful)

      by phantomfive ( 622387 )
      Most likely they'll make deals with content companies, much like Youtube has done. It's the smaller websites that really need to worry, because they have no negotiation power.
      • by Holi ( 250190 )
        So just don't moderate. I wonder how that would affect user moderation, though.
    • Imagine if Twitter were held responsible for the terrorist shit they allow?

      Assuming the terrorists intentionally distribute those videos on every available medium, there would be no problem.

      The DMCA deals with copyright, and if the content owners do not complain then there are no legal issues at all.

      It would have to be the original video, however. A newscast which shows the video during a segment is owned by the studio.

  • Question 1: Do you find celebrities interesting?

    If "No", proceed to Question 2.

    If "Yes", GTFO.
  • by Anonymous Coward

    If each post has to be approved by moderators, I can see the safe harbor being eroded, but if it's post facto moderation then I don't see how that changes anything. (A minimal sketch of the two workflows follows this subthread.)

    • by Calydor ( 739835 )

      Even with moderator approval required before posts go public, are they seriously expecting VOLUNTARY MODERATORS to be able to recognize if a random picture in a random post belongs to the poster or not?

      • Even with moderator approval required before posts go public, are they seriously expecting VOLUNTARY MODERATORS to be able to recognize if a random picture in a random post belongs to the poster or not?

        No, Mr. Poster. They expect you to shut up.

      • There are many images on the web, and tons of source code, for which I don't know the license. Therefore I don't use them, since I don't know whether I'm allowed to or not. (Or I first find out what the license is, such as by asking.) That's the general rule - if you don't know whether you are allowed to use some content in a particular way, either find out, or don't use it. As a general rule, that's more or less reasonable.

        If the editor of a newspaper doesn't know the license status of a particular image, they find out before running it, or they don't run it.

        • Also, in this case the watermarks on the images provide a pretty strong hint that they are professionally produced photos and that the owner cares about their copyright. The editors / moderators would have seen the watermarks before approving the submission.
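
    The pre-approval vs. post facto distinction raised above is easy to make concrete. Below is a minimal, hypothetical Python sketch (not LiveJournal's actual system; all class and method names are made up for illustration): in the first model nothing becomes public until a moderator actively approves it, which is the prior restraint at issue here, while in the second model everything publishes immediately and removal only happens after a complaint.

        from dataclasses import dataclass

        @dataclass
        class Post:
            author: str
            body: str
            published: bool = False

        class PreApprovalForum:
            # Prior restraint: nothing is visible until a moderator approves it.
            def __init__(self):
                self.queue = []  # submissions held for review
                self.live = []   # publicly visible posts

            def submit(self, post):
                self.queue.append(post)  # not public yet

            def moderate(self, approve):
                # Moderators actively select what gets published.
                for post in self.queue:
                    if approve(post):
                        post.published = True
                        self.live.append(post)
                self.queue.clear()

        class PostFactoForum:
            # Reactive moderation: everything publishes; takedowns happen on notice.
            def __init__(self):
                self.live = []

            def submit(self, post):
                post.published = True
                self.live.append(post)  # public immediately, no review

            def takedown(self, is_infringing):
                # e.g. acting on a DMCA notice after the fact
                self.live = [p for p in self.live if not is_infringing(p)]

    The legal question in the ruling is essentially whether running the first loop makes the moderators the site's agents; the second model is the classic safe-harbor posture.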

  • Last week I was like 'Don Rickles was still alive? Oh, bummer.' Now I'm like 'LiveJournal is still a thing? Oh, bummer.'

  • Isn't removing infringing content at the request of the rights holder itself an act of moderation?
    You must do that in order to qualify for safe harbor, yet having moderators means you don't get safe harbor protections.

  • by Anonymous Coward

    Active moderation - along with many other things like 'karma', community moderation (*ahem*), filters, algorithms to select featured posts, and even named user accounts - introduces social pressures into online speech, heavily restricts the range of allowable opinions, and encourages a toxic, boring, self-reinforcing groupthink to develop.

    Better IMO to follow the 4chan model where all posts are unedited, chronological, and anonymous, and nothing is deleted unless it is outright illegal (there, posts are auto-pruned off the board after a while anyway).

    • Re:Fucking good. (Score:4, Insightful)

      by iris-n ( 1276146 ) on Tuesday April 11, 2017 @07:50AM (#54212569)

      Because 4chan is such a shining example of interesting arguments made by knowledgeable parties that does not degenerate into name-calling or trivial logical fallacies.

      • by bvimo ( 780026 )

        faggot

        and some text to get around the filter

        Please try to keep posts on topic.
        Try to reply to other people's comments instead of starting new threads.
        Read other people's messages before posting your own to avoid simply duplicating what has already been said.
        Use a clear subject that describes what your message is about.
        Offtopic, Inflammatory, Inappropriate, Illegal,

    • "restricts the range of allowable opinions"

      Bullshit. You can see everything at -1. To see everything, or almost nothing, or anything in between, is a user choice, not the site's choice. Obviously, your -1 post is still viewable, so just how is the range of allowable opinions restricted? You're full of shit. And that's my opinion.
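
      The mechanics behind that point are simple enough to show. Here is a minimal sketch with made-up scores (Slashdot-style scores run from -1 to 5, and the cutoff is a per-reader preference, not a site-wide filter):

          comments = [
              {"score": -1, "text": "ranting troll post"},
              {"score": 2, "text": "ordinary comment"},
              {"score": 5, "text": "insightful comment"},
          ]

          def visible(comments, threshold):
              # Every post is still stored and reachable; the threshold only
              # changes what one particular reader chooses to see.
              return [c for c in comments if c["score"] >= threshold]

          print(len(visible(comments, -1)))  # 3 -- browsing at -1 shows everything
          print(len(visible(comments, 3)))   # 1 -- a stricter reader hides the rest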

  • by AndyKron ( 937105 ) on Monday April 10, 2017 @08:03PM (#54211137)
    Pretty soon the only thing we'll be able to post is pictures of our own penises.
  • Time is needed to be made aware of infringing content, to make a judgment as to whether or not the content actually is infringing, and to take appropriate action. Not allowing for enough time is short-sighted, vindictive, and can only serve one purpose: To harm or even shut down the forum accused of allowing infringement, silencing all legitimate communication in the process as well.
    • > Time is needed to be made aware of infringing content, to make a judgment as to whether or not the content actually is infringing, and to take appropriate action.

      In this case, LiveJournal publishes submissions only after a team of editors / moderators has reviewed and approved them. They actively approved the post before it was published on the site, and would have seen the watermarks on the images.

  • It's your loss! No one is going to care! Go ahead and tell us to wipe out your stuff. There will be a point where people don't care about your stuff. Do you think we are going to run out of fantastic things to post? You should feel honoured that we pay attention to you at all! And this will somehow stop us from making our own websites with humour and interest? I don't think so! There are far more people just walking down the street who are far more interesting, and better looking, than the so called "famous" on
  • by CanEHdian ( 1098955 ) on Monday April 10, 2017 @09:30PM (#54211395)
    See how the so-called "rightsholders" squeeze and squeeze... try to get a little bit more, and a little bit more, and a little bit more every time? And every time they get it? They learned from SOPA/PIPA: don't go for the big bang, go for it step by step. If you told Europeans 20 years ago that the Copyright Industry would have the power to block you at the ISP level from accessing websites, they would have said you were completely insane (except for the Germans; they have a very strong Copyright Industry that has made them pay levies on computer printers since forever, because you "might print out a book with it"). Yet here we are. Soon the US will be on the block too, thanks to the "international standards" etc. that they put in place themselves through endless, tireless lobbying and litigation.
  • I moderate on a website that does allow moderators to edit user posts - I wonder if that opens the site up to additional exposure over images posted.

  • by Anonymous Coward

    Just turn moderation off - completely. No issues there at all :-)

  • TL;DR (Score:4, Informative)

    by Digital Avatar ( 752673 ) on Tuesday April 11, 2017 @01:32AM (#54211935) Journal

    It's not that sites using moderators can lose their safe harbor protection per se. Rather, when a site gives too much direction as to how to moderate AND its moderators exercise prior restraint, such that no post goes up without having been reviewed, those moderators can be viewed as agents of the site they moderate for rather than uninvolved third parties, and hence the safe harbor no longer applies.

    In this case, moderators for a LiveJournal community knowingly used photographs that were clearly watermarked, such that any reasonable person would know they were copyrighted, and they had no fair use rationale for posting them. Because LiveJournal provides such explicit direction to moderators, and these moderators held posts for review, there is no way they couldn't have known what they posted, and they thus appeared to do so on behalf of LiveJournal.

    Communities that don't want to run into this problem simply need to avoid giving too much direction to moderators (since that could be viewed as exercising arbitrary control over them, such that they become your agents), avoid prior restraint or otherwise holding comments for review, and remove infringing content when they become aware of it.

    If you want the real story without the hyperbole and clickbait, try reading the actual fucking ruling [uscourts.gov].

    • Kind of like the difference between being an employee and an independent contractor. Too many constraints? You're an employee.
      • Yeah, more or less. Too many constraints = arbitrary control = you are now an agent of the principal and your actions are their actions.
    • While that's true, LJ isn't the editor or the moderator yet they're the ones being sued.

      The hosts/editors of that specific blog are the party with the potential legal liability, and they are the ones who lost their safe harbor. Re-posting user-submitted content can easily be just a thin veil over the editor being the real submitter.

      • While that's true, LJ isn't the editor or the moderator yet they're the ones being sued.

        That's the point, though: By exercising so much control over their moderators they effectively BECAME the moderators. The moderators became agents of the principal, and their actions became attributed to the principal, hence the DMCA safe harbor no longer applies because it's not a third party doing the posting.

        • You're right - both the Slashdot summary and the linked article leave out the vital information:

          In 2010, LiveJournal sought to exercise more control over ONTD so that it could generate advertising revenue from the popular community. LiveJournal hired a then active moderator, Brendan Delzer, to serve as the community’s full time “primary leader.” By hiring Delzer, LiveJournal intended to “take over” ONTD, grow the site, and run ads on it.

          As the “primary leader,” Delzer instructs ONTD moderators on the content they should approve and selects and removes moderators on the basis of their performance. Delzer also continues to perform moderator work, reviewing and approving posts alongside the other moderators whom he oversees.

          All this decision does is remove the possibility of summary judgment. They might still find in favor of LJ at trial, but it seems unlikely. Looking at this, it seems like they are trying to claim control without claiming responsibility. The act of "editing" (posts only appear when approved) is enough to make the posting an act of LJ itself.

  • I tend to agree.

    The 'safe harbor' provision essentially says "we're a provider of a service, we can't be held responsible for the people that use the service; their actions are their own".

    Once a site DOES take responsibility for the posts - filtering, banning, controlling - then they logically would become responsible for the content therein: if they leave it without deleting/moderating it, one could argue that's tacit approval.

    • Once a site DOES take responsibility for the posts - filtering, banning, controlling - then they logically would become responsible for the content therein: if they leave it without deleting/moderating it, one could argue that's tacit approval.

      So YouTube's Content ID program makes them lose their Safe Harbor? (See the fingerprint-matching sketch at the end of this subthread.)

      • IMO yes.

        Your phone company doesn't sort your calls and filter out curse words. Hell, they don't even block calls we ASK them to.
        The US Post Office doesn't sift the mail and prevent you getting letters that make you sad.
        Fedex doesn't go through your packages and remove vibrators and sex lubes.
        Those are examples of safe-harbor carriers: they care nothing for what they're transporting, thus they bear /no/ direct responsibility if it turns out to be a prank call, porn, or a mailbomb.

        Message boards and places li

        • The USPS and FedEx both reserve the right to inspect suspicious packages or even refuse to accept them for delivery.

          • I figured someone would go there. That's why I touched on it.
            They're reviewing suspicious packages for DANGER and LEGAL COMPLIANCE, not for the value of the content.

            How would you like it if the Post Office was moderating your letters, with the same powers mods have in forums - deleting communications from other people that they simply don't agree with, or whose ideas they find uncomfortable? What if you couldn't get Victoria's Secret catalogs because the USPS decided they were icky, and simply threw them away?
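
    On the Content ID question raised earlier in this subthread: YouTube's actual system is proprietary, but the technique being debated is automated fingerprint matching against works that rights holders register in advance, with no human judgment about a post's value. A rough, hypothetical sketch (exact chunk hashes for simplicity; a real system would use perceptual fingerprints that survive re-encoding, and none of these names are a real API):

        import hashlib

        def fingerprints(data, chunk=4096):
            # Hash fixed-size chunks of the work.
            return {hashlib.sha256(data[i:i + chunk]).hexdigest()
                    for i in range(0, len(data), chunk)}

        class ContentRegistry:
            def __init__(self):
                self.claims = {}  # fingerprint -> rights holder

            def register(self, owner, work):
                # Rights holders enroll their works ahead of time.
                for fp in fingerprints(work):
                    self.claims[fp] = owner

            def scan_upload(self, upload):
                # Every upload is scanned automatically on submission.
                return {self.claims[fp]
                        for fp in fingerprints(upload) if fp in self.claims}

        registry = ContentRegistry()
        registry.register("Mavrix", b"registered photo bytes" * 300)
        print(registry.scan_upload(b"registered photo bytes" * 300))  # {'Mavrix'}
        print(registry.scan_upload(b"unrelated home movies!" * 300))  # set()

    Whether that kind of indiscriminate, mechanical scan counts as "caring about the content" the way a forum moderator does is exactly the disagreement in this subthread.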

    • by Sloppy ( 14984 )

      Once a site DOES take responsibility for the posts - filtering, banning, controlling - then they logically would become responsible for the content therein

      Would you consider Slashdot to have taken responsibility for the posts here? Moderation exists, but they aren't the ones who do it. Seems like it'd be the moderators, not the host, who would lose safe harbor. (But they aren't hosting anyway, so there's no safe harbor to lose.)
