DMCA 'Safe Harbor' Up In the Air For Online Sites That Use Moderators (arstechnica.com)
"The Digital Millennium Copyright Act's so-called 'safe harbor' defense to infringement is under fire from a paparazzi photo agency," reports Ars Technica. "A new court ruling says the defense may not always be available to websites that host content submitted by third parties." The safe harbor provision "allow[s] websites to be free from legal liability for infringing content posted by their users -- so long as the website timely removes that content at the request of the rights holder," explains Ars. From the report: [A] San Francisco-based federal appeals court is ruling that, if a website uses moderators to review content posted by third parties, the safe harbor privilege may not apply. That's according to a Friday decision in a dispute brought by Mavrix Photographs against LiveJournal, which hosts the popular celebrity fan forum "Oh No they Didn't." The site hosted Mavrix-owned photos of Beyonce Knowles, Katy Perry, and other stars without authorization. LiveJournal claimed it was immune from copyright liability because it removed the photos. Mavrix claimed that the site's use of voluntary moderators removed the safe-harbor provision. The 9th U.S. Circuit Court of Appeals sided with Mavrix to a degree, but the court wants to know how much influence the moderators had on what was and was not published. With that, the court sent the case back to a lower court in Los Angeles to figure that out, perhaps in a trial. The highly nuanced decision overturned a lower court ruling that said LiveJournal was protected by safe harbor. The lower court said LiveJournal does not solicit any specific infringing material from its users or edit the content of its users' posts.
Twitter, Facebook, Etc. (Score:4, Informative)
Shaking in their booties! Imagine if Twitter were held responsible for the terrorist shit they allow?
Re: (Score:3, Insightful)
Re: (Score:2)
Re: (Score:2)
There's not much to worry about from the DMCA unless the terrorists are committing copyright infringement.
Just rapings, beheadings, crucifixions, setting people on fire...nothing near so bad as copyright infringement.
DMCA: Digital Millennium COPYRIGHT Act (Score:2)
>> There's not much to worry about from the DMCA unless the terrorists are committing copyright infringement.
> nothing near so bad as copyright infringement.
DMCA stands for Digital Millennium COPYRIGHT Act. It specifies the procedure service providers must follow to have safe harbor from copyright claims. So GP is absolutely correct: beheadings don't come under the Copyright Act. Terrorism falls under different laws.
Re: (Score:3)
So GP is absolutely correct: beheadings don't come under the Copyright Act.
Well, not under the "offenses" section, but certainly under "penalties," right?
Re: (Score:1)
Even so, I will consider it an endorsement of terrorist groups, and evidence that they are complicit with and supportive of them.
This stance will continue as long as shitter censors opinions they don't like.
Re: (Score:2)
Imagine if Twitter were held responsible for the terrorist shit they allow?
Assuming the terrorists intentionally distribute those videos on every available medium, there would be no problem.
The DMCA deals with copyright, and if the content owners do not complain then there are no legal issues at all.
It would have to be the original video, however. A newscast which shows the video during a segment is owned by the studio.
That's why the Internet needs an IQ test (Score:2)
If "No", proceed to Question 2.
If "Yes", GTFO.
Re:Good, past due (Score:5, Funny)
Good. This will be the end of "moderation" which silences one voice and amplifies another. This is a win for free speech. For freedom. If you love freedom you would also love anything that puts an end to this great injustice.
+5 insightful!
Oh, wait...
Re: (Score:2)
Re: (Score:2)
Giving everyone equal weight on private services isn't "freedom of speech", it's tyranny.
Well, I considered it anarchy. Usenet, which was the wild west and an unwitting experiment in unfettered free speech, showed the tragedy-of-the-commons effect. The trolls, flamers, and internet kooks killed it pretty well. The response to any complaint was "Use a killfile". But after you spend more time adding people to a killfile than reading and participating, you just give up and go some place else.
And Usenet died.
And that folks, is why we have moderation. I moderate several groups. I'm pretty lax for
Re: (Score:2)
Re: (Score:2)
The number of trolls wasn't an issue.
As a person who was and still is involved in Usenet on a moderator level, I gotta say we have a remarkably different perspective on it.
Re: (Score:2)
Re: (Score:1)
Without order, anarchy merely distills to warring feudal monarchies. There's no way to preserve freedom for all if by allowing all freedoms you neglect to protect the freedom of the meek from being stolen by the bold.
Re: (Score:2)
Every community has standards. If your behavior or opinions are utterly rejected, you probably ought to find another community.
Sometimes a lone actor can change a community for the better, but most trolls and outcasts are rejected for being odious, disrespectful, or ignorant.
Meta-moderation is important to prevent abuses, which some sites notably lack. I have seen brief instances of unmoderated commentary on the internet though, and even bad moderation is better than that.
Only if it's a whitelist (Score:2, Insightful)
If each post has to be approved by moderators, I can see the safe harbor being eroded, but if it's post-facto moderation then I don't see how that changes anything.
Re:Only if it's a whitelist (Score:5, Interesting)
Re: (Score:2)
You can NOT have it both ways with mass hypocrisy as the starting ante.
Says you, and you alone. The case was sent back to determine whether or not this is true.
The court will have to choose whether it rules in a way that applies to all sites with moderation or just this particular moderation system.
Even then, there will be appeals---on this case or a future case---until an appellate court or the Supreme Court decides exactly what qualifies for safe harbor.
Eventually, if lawsuits like these continue, the courts will have a comprehensive set of case law that indicates what
Re: (Score:2)
It would just be that the first is top, rather than some post-post moderation.
Because your standard for "moderated" includes explicitly unmoderated (by all other definitions on the planet), your definition and argument are simply invalid.
Re: (Score:2)
Even with moderator approval required before posts go public, are they seriously expecting VOLUNTARY MODERATORS to be able to recognize if a random picture in a random post belongs to the poster or not?
Re: (Score:2)
Even with moderator approval required before posts go public, are they seriously expecting VOLUNTARY MODERATORS to be able to recognize if a random picture in a random post belongs to the poster or not?
No, Mr. Poster. They expect you to shut up.
If you don't know, you don't post - including editors (Score:3)
There are many images on the web, and tons of source code, for which I don't know the license. Therefore I don't use them, since I don't know whether I'm allowed to. (Or I first find out what the license is, such as by asking.) That's the general rule: if you don't know whether you are allowed to use some content in a particular way, either find out or don't use it. As rules go, that's more or less reasonable.
If the editor of a newspaper doesn't know the license status of a particular image,
Also the watermarks on the images (Score:2)
Also, in this case the watermarks on the images provide a pretty strong hint that they are professionally produced photos and that the owner cares about their copyright. The editors / moderators would have seen the watermarks before approving the submission.
It's been a rough week (Score:2)
Last week I was like 'Don Rickles was still alive? Oh, bummer.' Now I'm like 'LiveJournal is still a thing? Oh, bummer.'
Re: (Score:2)
Circular (Score:2)
Isn't removing infringing content at the request of the rights holders itself an act of moderation?
You must do that in order to qualify for safe harbor, but having moderators means you don't have safe harbor protections.
Re:Circular (Score:5, Interesting)
It looks like what makes this case different from typical moderation is that "Oh No They Didn't" would review every submission before posting it, whereas typically submissions are automatically posted but can be taken down by moderators. So as I understand it, the typical default-approve moderation is fine; it's this require-approval-before-posting moderation that may be suspect.
Similarly, I imagine sites like reddit, where the moderators are also users, are also fine regardless of the method, since the company itself is (usually) not involved.
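To make the distinction concrete, here is a minimal sketch (hypothetical Python, not any real site's code) of the two moderation modes: pre-approval, where nothing is visible until a moderator acts, and post-facto, where posts go live immediately and moderators only take things down afterward.

    from enum import Enum

    class Mode(Enum):
        PRE_APPROVAL = 1   # nothing goes live until a moderator approves it
        POST_FACTO = 2     # everything goes live; moderators remove after the fact

    class Forum:
        def __init__(self, mode):
            self.mode = mode
            self.pending = []   # held for review (pre-approval mode only)
            self.visible = []   # what readers actually see

        def submit(self, post):
            if self.mode is Mode.PRE_APPROVAL:
                self.pending.append(post)   # a moderator must affirmatively publish it
            else:
                self.visible.append(post)   # live instantly, no human in the loop

        def approve(self, post):
            # Pre-approval: the moderator's act is what publishes the post.
            self.pending.remove(post)
            self.visible.append(post)

        def take_down(self, post):
            # Post-facto removal, e.g. in response to a DMCA notice.
            self.visible.remove(post)

Under the 9th Circuit's reasoning, it's the approve() path that's legally interesting: when every post must pass through it, the moderators' review arguably makes them participants in the publishing rather than passive hosts.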
Re: (Score:2)
The simple fact is that, in the absence of a specific agreement to the contrary, the mere act of accepting professional services creates a liability to pay a reasonable rate for services rendered. If moderators sent BIZX (or whatever shitty company runs this place now) invoices, BIZX would need specific terms in the TOS to point to in court or they'd be made to pay.
That sounds like a reasonable requirement. Therefore I question whether you're writing from the U.S., because that sure as shit doesn't sound like something that would get through our Congress.
Re: (Score:2)
Right, which is why 17 U.S.C. § 512(c) (the Safe Harbor provision) exists, specifically absolving service providers of liability for copyright infringement under certain circumstances. If I understand correctly, IANAL, etc.
Fucking good. (Score:1)
Active moderation - along with many other things like 'karma', community moderation (*ahem*), filters, algorithms to select featured posts, and even named user accounts - introduces social pressures into online speech, heavily restricts the range of allowable opinions, and encourages a toxic, boring, self-reinforcing groupthink to develop.
Better IMO to follow the 4chan model where all posts are unedited, chronological, and anonymous, and nothing is deleted unless it is outright illegal (there posts are auto
Re:Fucking good. (Score:4, Insightful)
Because 4chan is such a shining example of interesting arguments made by knowledgeable parties, one that never degenerates into name-calling or trivial logical fallacies.
Re: (Score:2)
faggot
and some text to get around the filter
Re: (Score:2)
"restricts the range of allowable opinions"
Bullshit. You can see everything at -1. To see everything, or almost nothing, or anything in between, is a user choice, not the site's choice. Obviously, your -1 post is still viewable, so just how is the range of allowable opinions restricted? You're full of shit. And that's my opinion.
Pretty soon (Score:3)
Re: (Score:1)
News flash: most of them are actually re-posting the penises themselves.
Re: (Score:2)
I guess the only thing worse than someone sending you a picture of their penis is no one wanting to send you a pic of their penis.
Moderation is not instantaneous (Score:1)
LiveJournal posts AFTER editors review submissions (Score:2)
> Time is needed to be made aware of infringing content, to make a judgment as to whether or not the content actually is infringing, and to take appropriate action.
In this case, LiveJournal posts submissions only after a team of editors / moderators has reviewed and approved them. They actively approved the material before it was published on the site, and would have seen the watermarks on the images.
Go ahead and take 'em down!! (Score:1)
Re: (Score:2)
Yeah, that's a little weird. LJ is little more than the web host here, and moderators are not employed by LJ. This means that ONTD might have some legal liability as a group, but not LJ - even if they provide the moderation capability.
Slowwwwly but surely... (Score:3)
Interesting point about editing... (Score:2)
I moderate on a website that does allow moderators to edit user posts - I wonder if that opens them up to additional exposure in terms of images posted.
Easy (Score:1)
Just turn moderation off - completely. No issues there at all :-)
TL;DR (Score:4, Informative)
It's not that any site that uses moderators can lose its safe harbor protection. Rather, sites that give too much direction as to how to moderate, AND whose moderators exercise prior restraint such that no post goes up without having been reviewed by a moderator, can cause said moderators to be viewed as agents of the sites they moderate for rather than uninvolved third parties (and hence the Safe Harbor no longer applies).
In this case, moderators for a LiveJournal community knowingly used photographs that were clearly watermarked, such that any reasonable person would know they were copyrighted, and they had no fair-use rationale for posting them. Because LiveJournal provides such explicit direction to moderators, and these moderators held posts for review, there is no way they couldn't have known what they posted, and they thus appeared to do so on behalf of LiveJournal.
Communities that don't want to run into this problem simply need to avoid giving too much direction to moderators (since that could be viewed as exercising enough control over them to make them agents), avoid prior restraint or otherwise holding comments for review, and remove infringing content when they become aware of it.
If you want the real story without the hyperbole and clickbait, try reading the actual fucking ruling [uscourts.gov].
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
While that's true, LJ isn't the editor or the moderator, yet they're the ones being sued.
The hosts/editors of that specific blog are the party with the potential legal liability, and they lost their safe harbor. Re-posting user-submitted content can easily be just a thin veil over the editor being the actual submitter.
Re: (Score:3)
While that's true, LJ isn't the editor or the moderator, yet they're the ones being sued.
That's the point, though: By exercising so much control over their moderators they effectively BECAME the moderators. The moderators became agents of the principal, and their actions became attributed to the principal, hence the DMCA safe harbor no longer applies because it's not a third party doing the posting.
Re: (Score:2)
You're right - both the Slashdot summary and the linked article leave out the vital information:
In 2010, LiveJournal sought to exercise more control over ONTD so that it could generate advertising revenue from the popular community. LiveJournal hired a then active moderator, Brendan Delzer, to serve as the community's full time "primary leader." By hiring Delzer, LiveJournal intended to "take over" ONTD, grow the site, and run ads on it.
As the "primary leader," Delzer instructs ONTD moderators on the content they should approve and selects and removes moderators on the basis of their performance. Delzer also continues to perform moderator work, reviewing and approving posts alongside the other moderators whom he oversees.
All this decision does is remove the possibility of summary judgment. They might still find in favor of LJ at trial, but it seems unlikely. Looking at this, it seems like they are trying to claim control without claiming responsibility. The act of "editing" (posts only appear when approved) is enough to make the posting an act of LJ itself.
Yep (Score:2)
I tend to agree.
The 'safe harbor' provision essentially says "we're a provider of a service, we can't be held responsible for the people that use the service; their actions are their own".
Once a site DOES take responsibility for the posts - filtering, banning, controlling - then they logically would become responsible for the content therein: if they leave it without deleting/moderating it, one could argue that's tacit approval.
Re: (Score:2)
Once a site DOES take responsibility for the posts - filtering, banning, controlling - then they logically would become responsible for the content therein: if they leave it without deleting/moderating it, one could argue that's tacit approval.
So YouTube's Content ID program makes them lose their Safe Harbor?
Re: (Score:2)
IMO yes.
Your phone company doesn't sort your calls and filter out curse words. Hell, they don't even block calls we ASK them to.
The US Post Office doesn't sift the mail and prevent you getting letters that make you sad.
Fedex doesn't go through your packages and remove vibrators and sex lubes.
Those are examples of safe-harbor carriers: they care nothing for what they're transporting, thus they bear /no/ direct responsibility if it turns out to be a prank call, porn, or a mailbomb.
Message boards and places li
Re: (Score:2)
The USPS and FedEx both reserve the right to inspect suspicious packages or even refuse to accept them for delivery.
Re: (Score:2)
I figured someone would go there. That's why I touched on it.
They're reviewing suspicious packages for DANGER and LEGAL COMPLIANCE, not for the value of the content.
How would you like it if the Post Office was moderating your letters, with the same powers mods have in forums - deleting communications from other people that they simply don't agree with, or whose ideas they find uncomfortable? What if you couldn't get Victoria's Secret catalogs because the USPS decided it was icky, and simply threw away
Re: (Score:2)
LEGAL COMPLIANCE == DMCA Violations
The same one that the safe harbor is a provision of...
Re: (Score:2)
Would you consider Slashdot to have taken responsibility for the posts here? Moderation exists, but they aren't the ones who do it. Seems like it'd be the moderators, not the host, who would lose safe harbor. (But they aren't hosting anyway, so there's no safe harbor to lose.)