EFF Argues For 'Empowerment, Not Censorship' Online (eff.org)
An activism director and a legislative analyst at the EFF have co-authored an essay arguing that the key to children's safety online "is user empowerment, not censorship," reporting on a recent hearing by the U.S. Senate's Judiciary Committee:
While children do face problems online, some committee members seemed bent on using those problems as an excuse to censor the Internet and undermine the legal protections for free expression that we all rely on, including kids. Don't censor users; empower them to choose... [W]hen lawmakers give online platforms the impossible task of ensuring that every post meets a certain standard, those companies have little choice but to over-censor.
During the hearing, Stephen Balkam of the Family Online Safety Institute provided an astute counterpoint to the calls for a more highly filtered Internet, calling to move the discussion "from protection to empowerment." In other words, tech companies ought to give users more control over their online experience rather than forcing all of their users into an increasingly sanitized web. We agree.
It's foolish to think that one set of standards would be appropriate for all children, let alone all Internet users. But today, social media companies frequently make censorship decisions that affect everyone. Instead, companies should empower users to make their own decisions about what they see online by letting them calibrate and customize the content filtering methods those companies use. Furthermore, tech and media companies shouldn't abuse copyright and other laws to prevent third parties from offering customization options to people who want them.
The essay also argues that Congress "should closely examine companies whose business models rely on collecting, using, and selling children's personal information..."
"We've highlighted numerous examples of students effectively being forced to share data with Google through the free or low-cost cloud services and Chromebooks it provides to cash-strapped schools. We filed a complaint with the FTC in 2015 asking it to investigate Google's student data practices, but the agency never responded."
Re: (Score:2)
Why, I might even go so far as to describe you as a hooligan and blackguard!
Some opinions... (Score:5, Funny)
...are just wrong. Companies like Google and Facebook know that and they work hard to remove the wrongthink from the Internet.
Re: (Score:2, Insightful)
And they're under zero obligation to remove any content that is not illegal. One has to ask: what are their real motivations for playing this role?
Re: Some opinions... (Score:1)
No. They see that it is profitable to create reality bubbles where any Contrary opinion is eliminated under the guise of "hate speech."
Re: (Score:2)
Is the funny mod because of a typo? The truth is obviously half opposite.
The EFF has become a rather sad joke filled with great confusion about censorship, freedom, and time.
Re: (Score:2)
True, but when they edit or filter based on content they've crossed the line from platform to publisher, which makes them little different than a newspaper.
Does this preclude them from hiding behind CDA section 230 in future legal proceedings?
47 USC 230 encourages moderation (Score:2)
when they edit or filter based on content they've crossed the line from platform to publisher, which makes them little different than a newspaper.
Does this preclude them from hiding behind CDA section 230 in future legal proceedings?
No. On the contrary, the U.S. Congress enacted 47 USC 230 to encourage platforms to moderate [theverge.com]. The key part of the statute [cornell.edu] is written as follows:
"No provider or user of an interactive computer service shall be held liable on account of any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected..."
Re: (Score:2)
..or that's just the excuse to silence ALL contrary political expression.
Re: (Score:2)
Re: (Score:2)
The problem with the EFF proposal is that it still relies on these companies to do the categorization. They want people to be empowered to block whatever content they like, but behind the "extreme political ideology" filter is a complex categorizer that has to deal with text, audio and video, and somehow apply a political scale that people can't even agree on anyway.
Even if the control is put in people's hands, people will just complain that their particular ideas were wrongly classified and shouldn't be in
Re: (Score:1)
Um, that is what "wrongthink" means. Not sure why you brought up liberal; it has nothing to do with that. We live in a corporatocracy. Wrongthink is whatever doesn't benefit the corporations.
Re: (Score:1)
...are just wrong. Companies like Google and Facebook know that and they work hard to remove the wrongthink from the Internet.
A lot of this comes from the 2016 election. By using the word wrongthink you're trying to frame the argument to make what they are doing automatically bad. The correct goal is not the removal of wrongthink; it is the removal, or more likely the down-ranking plus fact-check links, of propaganda and bullshit.
Bullshit has no value. Actually that is wrong. Bullshit is a net negative to society. Better no information than bullshit. We just have to be careful not to downrank something or label it false, if it
Re: (Score:1)
Um, what does that have to do with right/left/Trump? By the way, what do you think the old "Opinion" sections of newspapers were? It is propaganda and bullshit. The major newspapers would RECOMMEND candidates for election. You guys are just mad because other people can now purchase their own "Opinion" sections and spread their own bullshit.
Re: (Score:2)
"The correct goal is not the removal of wrongthink, it is the removal, or more likely the down ranking plus fact check links of propaganda and bullshit."
This position still cedes one's responsibility to validate information to algorithms created by mega-corporations, the people who set policy and their associated biases.
It is and should be nobody's job but the reader's to determine what is and isn't bullshit. Lazily pawning that task off on others is what creates "groupthink" in the first place.
good luck (Score:3, Insightful)
When polled, most of the EU agrees that there is already too much free speech. It's a lost cause at this point and people won't relearn history until it repeats itself again.
Re: (Score:3, Insightful)
Re: (Score:2)
There is always a limit to free speech, and a balance between it and the damage that may be caused by spreading lies, insults, or inciting violence.
Where you put that balance may differ between states, and be dependent on their context and history.
For Germany, for example, it is quite understandable, given their history, that Nazi propaganda and even artifacts are forbidden.
Whether such a prohibition is effective in the long run I cannot say, but you should not be so arrogant and think that the US history an
Re: (Score:2)
If only EFF would stick to issues like these... (Score:2)
Instead, they've branched out into issues like immigration and have lost my donations because, no matter which side of the debate they chose, they alienated (no pun intended) a large portion of their donors.
Leaving their area of expertise was a bad move.
Not workable (Score:2)
People won't bother to make the choices--they'll just take the default. They always have; exceptions to that behavior are quite rare.
Exactly (Score:1)
So many times someone complains about hate speech or something, and I'm just thinking: well, why did you read those bloggers / channels / etc. if you didn't like them?