Censorship | Your Rights Online

The Algorithmic Copyright Cops: Streaming Video's Robotic Overlords

thomst writes "Geeta Dayal of Wired's Threat Level blog posts an interesting report about bot-mediated automatic takedowns of streaming video. She mentions the interruption of Michelle Obama's speech at the DNC, and the blocking of NASA's coverage of Mars rover Curiosity's landing by a Scripps News Service bot, but the story really drills down on the abrupt disappearance of the Hugo Awards' live stream of Neil Gaiman's acceptance speech for his Doctor Who script. (Apparently the trigger was a brief clip from the Doctor Who episode itself, despite the fact that it was clearly a case of fair use.) Dayal points the finger at Vobile, whose content-blocking technology was used by Ustream, which hosted the derailed coverage of the Hugos."

  • by c0lo ( 1497653 ) on Thursday September 06, 2012 @08:13PM (#41255333)

    the trigger was a brief clip from the Doctor Who episode itself

    In itself, the tech has shown impressive quality if a brief clip could be recognized in real time.

    Would anyone blame the hammer because it's an excellent tool to drive nails under one's... well... nails?

  • by Anonymous Coward on Thursday September 06, 2012 @08:16PM (#41255363)

    But the RIAA/MPAA has already dictated the terms of fair use: Any use that brings us revenue is fair, and all others are not :)

    Reason I think we should stuff a hot poker up their asses and make copyright a flat 18 years for individuals and 5 years for corporations, with no extensions and a one-year loss in term for each transfer of copyright (be it selling the copyright or merging with/wholly owning the company).

    That would solve the current issues with it and provide revenue over the primary useful life of the material. It would sadly cut into residuals, but it would result in more long-term innovation, since not producing new material would mean bankruptcy rather than an endless stream of relicensing and remaking old material. If all actors/actresses got flat pay (same as 'staff'), however, it'd be no different than any modern non-IP-related job. (The term arithmetic is sketched below.)
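
    Below is a minimal Python sketch of the term arithmetic in that proposal; the zero floor and the naming are illustrative assumptions, not part of the comment:

        # Hypothetical term calculator for the scheme proposed above.
        def remaining_term(is_corporation: bool, transfers: int) -> int:
            """Flat 18 years for individuals, 5 for corporations, no extensions,
            minus one year for every transfer of the copyright."""
            base = 5 if is_corporation else 18
            # Assumption: the term bottoms out at zero, i.e. enough transfers
            # push the work straight into the public domain.
            return max(0, base - transfers)

        # Example: a corporate-held work that has been sold twice has 3 years left.
        print(remaining_term(is_corporation=True, transfers=2))  # -> 3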

  • Ustream Boycott (Score:0, Interesting)

    by Anonymous Coward on Thursday September 06, 2012 @08:33PM (#41255495)

    From TFA:

    Brad Hunstable, Ustream’s CEO, says the volume of content is overwhelming and content-blocking algorithms are key to keeping copyright holders happy.

    Let's boycott Ustream for a week, starting October 1st. Perhaps some Redditors can give this a boost. Who does Ustream want to listen to, the MPAA or those that they serve their ads to?

  • by Drishmung ( 458368 ) on Thursday September 06, 2012 @08:48PM (#41255647)
    The problem with that is that the *IAA don't, strictly, make a takedown request. This is a proactive service that Google/Ustream et al offer well above the DMCA requirements. So, there is no way to penalise them for what they will claim they didn't do.

    Instead, make the takedown request cost up front. It costs Google/Ustream etc. to implement the bots. It seems reasonable that those benefiting from them should pay.

    I suggest something like:

    • You put $x up front into our account, @$y per implemented block, sufficient to process n takedowns. (n = $x/$y).
    • Any takedowns in excess of 'n' will not be processed.
    • At the end of the month, you will be rebated m x $y where 'm' is the number of undisputed takedowns.
    • Disputed takedowns will not be reblocked. You must file a DMCA takedown if you wish to pursue the case.

      Still not perfect, but if the studios don't like it there is always the DMCA. (The month-end arithmetic is sketched below.)

      (What Google get out of this is essentially the interest on the money for a month---not much but enough to compensate them somewhat).
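
      Below is a minimal Python sketch of that month-end settlement; the function name, the example numbers, and the interest rate are illustrative assumptions:

          # Hypothetical settlement for the pre-paid takedown scheme sketched above.
          def settle_month(deposit: float, fee_per_block: float,
                           requested: int, disputed: int,
                           monthly_rate: float = 0.0025) -> dict:
              """The rightsholder pre-pays `deposit` ($x); each automated block costs
              `fee_per_block` ($y); undisputed blocks are rebated at month end; the
              host keeps roughly a month's interest on the float (rate is assumed)."""
              capacity = int(deposit // fee_per_block)   # n = $x / $y
              processed = min(requested, capacity)       # takedowns beyond n are refused
              undisputed = max(0, processed - disputed)  # m
              rebate = undisputed * fee_per_block        # rebate = m * $y
              return {
                  "processed": processed,
                  "rebate": rebate,
                  "forfeited": processed * fee_per_block - rebate,
                  "host_interest": deposit * monthly_rate,
              }

          # Example: $10,000 at $10/block covers 1,000 takedowns; with 950 requested
          # and 40 disputed, $9,100 is rebated and $400 stays with the host.
          print(settle_month(10_000, 10, 950, 40))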

  • by c0lo ( 1497653 ) on Thursday September 06, 2012 @09:58PM (#41256101)

    This technology was designed to find infringement. It was not designed to find cute images of puppies. There is nothing in the code to recognize fair use. The technology is intrinsically broken.

    Correction: as demonstrated, the technology is excellent (in its recognition capabilities). Also as demonstrated, the use of that technology for certain purposes (policing copyright infringement) is broken.
    That doesn't mean the same technology can't be excellent for other purposes (finding images of cute puppies included).

    My point: don't blame the "robots", blame those who use them as "overlords". Otherwise, you'd only be adopting the same position as those who would very much like to ban or criminalize a technology (e.g. encryption, or the use of Tor) only because it can be used for copyright infringement or drug trafficking.

  • by imp ( 7585 ) on Thursday September 06, 2012 @10:13PM (#41256181) Homepage

    ... skynet lives and it is testing its metal...

  • by formfeed ( 703859 ) on Thursday September 06, 2012 @10:30PM (#41256289)
    female or legitimate female?
  • by tqk ( 413719 ) <s.keeling@mail.com> on Thursday September 06, 2012 @10:34PM (#41256299)

    Reason I think we should stuff a hot poker up their asses and make copyright a flat 18 years for individuals and 5 years for corporations ...

    Nope, too generous. They're both flawed concepts, and too easily gamed. Zero years for both. Compete on your merits, damnit! Don't expect us to help you with legislative crutches. These are the rules the rest of us are expected to go by. Welcome to reality. Suck it up.

  • Re:The Solution (Score:4, Interesting)

    by mark-t ( 151149 ) <markt AT nerdflat DOT com> on Thursday September 06, 2012 @10:34PM (#41256301) Journal
    The problem could also be partially solved by simply imposing legal fines on corporations that falsely accuse somebody of infringing on copyright. There'd be no particular benefit to anyone who was wrongly accused, but if the fines were heavy enough, there could be plenty of disincentive for companies to do that to people in the first place.
  • by Mr. Shotgun ( 832121 ) on Thursday September 06, 2012 @11:58PM (#41256679)

    Unfortunately, the examples from the summary are not fair use cases; they're more like original producers vs. hangers-on. The content publishers are using bots without checking the results. They need to have someone checking the flags and doing a sanity test to verify whether each flag is correct. I mean, come on: NASA vs. some newspaper in Cincinnati, who in the fuck is more likely to have produced footage from the Curiosity rover on Mars? Or the DNC coverage: who has the copyright, the DNC or a news organization rebroadcasting what the DNC made? Some people accept what a program says as gospel truth, which leads to fuckups like this content flagging and Knight Capital. Computers are tools, not overlords, as someone else said.

  • by Anonymous Coward on Friday September 07, 2012 @12:09AM (#41256715)

    It's also because he's friendly with the Clintons. When he was President, Bill Clinton did a lot of work deregulating media monopoly/ownership laws. The big 5 or whatnot owe their current situation to that camp, so that's probably why major media tries to always downplay his mistakes, ridicule his critics, and hype up his successes.

    It's kind of a scary "special interest" when you really think about it. It's almost like legalized blackmail, if you look at it from that angle. The sad thing is that American culture as a whole will continue to degrade and stagnate, as the government continues to outlaw competition and innovation via grossly distorted Copyright legislation.

  • by curunir ( 98273 ) * on Friday September 07, 2012 @12:12AM (#41256733) Homepage Journal

    The problem with your argument is that fair use is not a right. Instead, it's a defense against infringement. Even if something is within the boundaries of fair use, no one is required to respect that...it only protects you from being liable for infringement. So when someone (or some machine) denies that fair use, there's nothing legally wrong with doing so.

    The problem isn't (yet) with the definition of fair use, it's with the lack of protection of fair use as a right. For the purpose it serves, fair use is defined well enough...it describes enough to explain the intent and purposefully leaves the interpretation to judges and juries. To protect against cases like the one in the story, we need to first make it against the law to deny fair use...then we can worry about more explicitly defining what is and isn't fair use.

  • by Anonymous Coward on Friday September 07, 2012 @02:16AM (#41257301)

    So they can do something and then sue someone else for it? That sounds like the perfect business model!

  • by confuscan ( 2541066 ) on Friday September 07, 2012 @08:08AM (#41258651)
    This is a perfect example of an arms race where one side has no governor (my hat tip to the earlier Jefferson reference). James succinctly described the rationale and process by which the DMCA came into being. However, the fatal flaw is that the DMCA imposes no penalty for excessive false positives. Consequently, as a content owner, it is in my best interest to take down first and ask questions later (actually, they never ask). Businesses are driven to maximize the value of their product. Creating scarcity is a straightforward and easily understood way to do that, and the DMCA is just one more tool by which a business can achieve this goal.

    That said, creating a monopoly was, and to a degree still is, another time-honoured tradition for creating scarcity and maximizing value. However, that was legislated as illegal. Likewise, there is an opportunity to amend the DMCA and create a governor that addresses egregious false-positive offenders. The penalties could be administrative, such as losing the privilege to use automated takedowns (i.e., take away their toys), or even financial. Effectively, you would create, to quote James, a new line that balances the process.

    Will this happen? Unlikely, unless organizations with influence and money are consistently impacted by the current law. Other than rare cases like the one we've seen here, that has not happened, and the status quo is likely to remain.
