FCC Seeks Comment on Internet Filtering Rules
"The law in question (CIPA) requires filters to be used whenever *anyone* is using the computer, although adults with "bona fide" research needs may request that they be disabled. Minors may not use unfiltered computers with Internet access, even if a parent gives permission or a teacher deems the site to be wrongly filtered.
"The FCC is seeking comment on how to go about implementing this requirement, and it is on the FAST TRACK. Comments have to be made in the next 15 days! For anyone interested in participating, the FCC makes it easy.
"How to Submit a Comment to the FCC: The FCC provides a number of ways for members of the public to comment on proceedings. The easiest is their web-based form.
"The Internet filtering proceeding docket number, which you must include in any comment to the FCC, is: 96-45."
URL changed (Score:3)
Corrected URL (Score:1)
http://www.ala.org/cipa/FCCRulemaking.pdf [ala.org]
it's a PDF.
We've always got filtering stuff at LISNews.com [lisnews.com] too, :-)
Filtering should happen (Score:1)
Talk to them, why? Except to say, "Good job," that is.
I don't understand this knee-jerk
The last thing I want my tax dollars going to is for some dude to sit around all day surfing for porn. Screw that, I pay for my own porn just like everyone else should.
Education. The drug of the nation... (Score:4)
And if they are viewing porn (which, knowing the number of 13-year-old boys I do, I find very unlikely), just hardwire their seats into the mains and give them a 'little' jolt per picture viewed.
You'll find that this aversion therapy was highly successful in curing homosexuals in the 50s, which is why it's now damn near impossible to buy chaps in downtown San Francisco...
Filtering Doesn't work (Score:4)
A better solution is to have the computer out in the open where everyone can see it.
For kids at home, this would be in the family room. [the rule of thumb at home is that, if you can buy your own, then you can have it in your room. If mom and dad buy it, it is in the family room.]
For libraries, this would be not in a side room but in the main hall, where every passer-by can see. This being a public space, maybe the illusion of "privacy" in such a very public space should be dispensed with. This idea has problems both ways, though.
But this would certainly enforce "community standards"
Now there is an issue as far as making sure that no one is hacking the system. But that is a sysadmin security issue, not a filtering issue.
Re:Filtering should happen (Score:4)
Because filters don't work, and block "legitimate" information along with what you consider "porn".
Because filters block sites based on arbitrary criteria chosen by people with agendas.
Because we all pay for public libraries, not just those of you who want to decide what we all see and learn.
And oh yeah, because it's unconstitutional. Deny it all you like, it's censorship based on content.
Courts say no! (Score:2)
Re:URL changed (Score:1)
Re:Filtering should happen (Score:3)
Open-source blocking software? (Score:5)
Re:Filtering Doesn't work (Score:3)
I was mercilessly flamed for daring to suggest that parents become involved in their children's activities on the 'net. I was seen as accusing someone of bad parenting because I had the gall to suggest practical solutions over commercial snake-oil.
Given this attitude, I would imagine this rule will fairly sail through the rule-making process. We certainly can't have parents taking on the responsibility of monitoring their kids on the web, or even instilling in them the moral values necessary to differentiate between what parents consider to be acceptable and unacceptable content.
Filtering Doesn't work (but does the community) (Score:3)
Yes it certainly does. For starters, I wouldn't want my borrowing history broadcast to everyone. Why? Wrong question. What I read is my business, and the idea that I must justify it is wrongheaded. The real question is whether it is in the best interests of the nation to know what I read. The answer: it isn't. Therefore I demand that my privacy rights continue to be respected.
How is putting the Net box in a public place any different? One example: what if I'm too poor, too old, too backwards to own a computer (maybe I just like to read paperbacks when I'm at home), but my wife comes down with breast cancer. So, being the caring guy that I am, I trek down to the library to do some research so we can ask good questions of the oncologist during the course of treatment. Again, I see no reason why I should have to justify my right to privacy, but this is a good example of a time when I really don't want the entire neighborhood looking over my shoulder enforcing 'community standards'. I just want to do my research, print some stuff out, and go home. It's my computer, my taxes paid for it, and I resent 'the man' telling me where and how I can use it just as much as I would resent him telling me what books are fit to be published.
--
What do you mean my son can't look up peacefire.or (Score:5)
You're kidding me.
You're not kidding me?
Wonderful way to protect censorware companies from having to actually update their lists. Also a rather interesting removal of individual power from parents and teachers. I guess a lot of bleating about "giving educational choice back to the parents" only extends as far as where parents can spend their money, not what their children can look at, even when the filtering software is clearly wrong in the eyes of the adults involved.
"My son needs to look up 2600.com for a project on hackers." "My daughter is doing a research project on breast cancer." "Why can't I see any sites about Wicca? I'm doing a project for religion class." Legitimate queries, all of which under law can only be met with, "I'm sorry, but I can't shut down the filter."
Consider that many filters try to block not only sexual pornography, but often sites about other religions, left- and sometimes right-wing political philosophies, gay and lesbian sites, peacefire.org [peacefire.org], and anything else that doesn't fit the political agenda of the software company's target market. Fortunately, some (most?) filters allow the degree of blocking to be tailored, but how often will the administrator in a library or school be knowledgeable enough not to just block everything out of worry or paranoia?
Yeah, it'll scare off the porn jockeys, but they can just claim to be looking up something else and have the filters turned off anyway. Not to mention that intelligent, knowledgeable students (the existence of which I'm sure will be considered dangerous to the public good, since they tend to know realfacts, not goodfacts) who already know how to get around filters will do so, and probably get accused of "hacking" by teachers who should know better.
If I were an American resident, I'd already have sent off some rather long letters to representatives about this moronic law, and I'd send a comment to the FCC now telling them that software which serves to restrict political and religious freedom should not be implemented in taxpayer-funded facilities. Find a filter that blocks only pornography, or recommend that filters be set only to block sex and extreme violence (and hope the latter doesn't block sites about the horrors of the Holocaust). Of course, I'd love to say "don't follow the law at all, it's badly conceived", but I somehow doubt it would be taken seriously.
DMCA, UCITA, CIPA - government of the people, for the people?
Re:Filtering Doesn't work (Score:1)
A better way to handle the problem is to use monitoring software which watches for certain sites and alerts a supervisor if those sites are hit. The supervisor could then take action accordingly.
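A minimal sketch of what such monitoring might look like, assuming a simple space-separated access log; the log format ("user domain path"), field order, and watch list are illustrative assumptions, not any real product's behavior:

```python
# Toy sketch: flag (not block) visits to watched domains for human review.
# The watch list and log format are assumptions for illustration only.
WATCHED = {"example-blocked.com"}

def flag_hits(log_lines):
    """Return (user, domain) pairs where the domain is on the watch list."""
    hits = []
    for line in log_lines:
        parts = line.split()
        if len(parts) >= 2 and parts[1] in WATCHED:
            hits.append((parts[0], parts[1]))
    return hits

log = [
    "alice example-blocked.com /index.html",
    "bob news.example.org /story",
]
print(flag_hits(log))  # only alice's visit is flagged
```

The point of the design is that flagging leaves the judgment call with a human supervisor, instead of letting the software block outright.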
Monitoring content... (Score:1)
Australia should take note. (Score:2)
Middle Ground (Score:1)
Questions: Filtering Software and Library Policies (Score:1)
Librarian issues (Score:2)
Re:Filtering should happen (Score:2)
IMHO (Score:1)
I don't believe that widespread filtering is a solution to anything. What would that filter actually filter out? Has image-recognition software evolved so much that it can differentiate between porn and art?
For text recognition, how does software tell the difference between acceptable and unacceptable? By using a vocabulary blacklist? Does that mean that online dictionaries will not be accessible from libraries because they contain words like "nazi", "devil", or "murder"? Will the words be taken out of context? If they're taken in context, how does the software draw the line between a site promoting heinous crimes and another simply relating them in a journalistic fashion?
Will they filter out IP addresses? If so, who decides what's on the black list? Will Slashdot be on that blacklist??? (There was an article about virtual kiddie porn yesterday that contained some "bad" words).
What is scary about all this is that if the filtering is done by machine, there will be mistakes: "good" sites won't be available, while "bad" sites will be accessible. What is even scarier is that if the filtering is done using a human-generated blacklist of websites, the censorship will be tainted and subjective, reflecting the censoring party's ideological, moral, religious and political beliefs, which is not democratic at all. And since America just "elected" a right-wing president, this can just go sour, IMHO.
I believe in a free Internet, where information flows freely, uncensored and unfiltered. The Internet is neither a babysitter nor an educator.
That will be all.
Filtering isn't necessarily bad... (Score:3)
I've said before that librarians have generally done a fine job of selecting what they do and don't want in their libraries over the centuries. Why don't we work at developing generic tools which can be set as appropriate by the librarians, if they so desire? I'm not normally an open source zealot, but this is one case where it's absolutely essential that the code be open for review, and you can bet that FCC censorware (along with its stupid rules) won't be.
Disclaimer: I'm not a citizen or resident of the USA.
Oh. I get it. (Score:2)
I get it. The government takes my money in the name of the common good and uses it to micromanage the institutions that are out there doing the good. Where have I seen this before? Oh yeah, Federal Financial aid for education (Title IV). Either way, we pay for education, but now the money filters through the Feds' hands and arrives with strings attached.
Re:Filtering should happen (Score:1)
Re:Open-source blocking software? (Score:1)
public displays (Score:1)
It got so bad that women on the otherwise largely liberal library staff complained to the board that it was creating a hostile work environment.
I'm not advocating anything here, just an observation about the effectiveness of an open floor plan as a deterrent.
Getting involved... (Score:2)
After reading about Holland, MI, USA on this site last year (or so), I decided that it would be in my best interest for this 25 yr. old h*cker to 'get involved'.
We recently moved (2000-12-01) to a new town and I read in the paper that the Mayor was taking applications to fill the city boards, one of them being the Library Board. So, even though I had zero experience and really didn't know anyone in town, I applied.
On the application I mentioned my tech background, spoke about the issues of filtering and mentioned that the city could use a few people who 'understood the technology and the issues at hand'.
After a couple of interviews with the mayor, I was appointed last week! You're looking at the newest Library Board member (3 year term) for the city of Northfield, MN.
Filtering proponents beware! You have no friend here!
Re:Librarian issues (Score:1)
This regulation simply enforces existing law (Score:4)
The CHIP act requires that Internet connections be filtered for both adults and children, in both schools and libraries that receive discounts on Internet connections & equipment. (Mind you, I'm not falling for the "we paid for it, so we get to regulate it" fallacy. After all, I paid for some of it, and I support uncensored, full access at public libraries.)
Additionally, it requires that Internet connections at schools (but not libraries) be monitored. And it requires both institutions to certify the existence of an "Internet safety" policy. In the spirit of France's "Committee of Public Safety", this apparently means that librarians are required to eavesdrop on electronic mail.
ISPs (Score:2)
Re:Filtering Doesn't work (but does the community) (Score:1)
But why can't you do such research the "old-fashioned" way - with BOOKS? Yes, the library is there to provide services to the public. But those services don't include allowing minors to view pornographic material. I don't know about your library, but my library doesn't carry porn magazines. When you go through the magazines in my public library, you don't see "Hustler" up there with "Home and Garden" or "Pussy" on the same shelf as "Scientific American". There has already been some filtering in the sense that those types of material are simply not in the library's collection to begin with. They have been filtered out.
And you don't need the web to look up articles on breast cancer or find a book that will let you read up on appropriate questions. If that material happens to be filtered out on the web, it is merely a means to justify the ends. And many libraries (mine included) charge you the same amount for printing as they do for the copy machine. So you come out the same financially, whether you photocopy info from a book or print it off the web.
I can see how the web is a helpful research tool, and why it is becoming more and more important to students. But the idea of having computers in the library is not for personal entertainment. The computers in libraries are for those who want to honestly do some research, and for those who feel they cannot go through the traditional route of research through books, magazines, film, etc. The filters are there to catch some of the nastier things out there on the net. Yes, some things not really considered "porn" will be filtered out as well. But if you have a legitimate topic (breast cancer research), you can still find legitimate, relevant, current information in books and magazines. The web is not the end-all-be-all of resources in the library. And it's not the only form of filtering that goes on at a public library.
My Comments, just Submitted (Score:5)
FCC Request for Comments
Docket 96-45, Proposed Rule Making re: CHIP Act.
I am pleased to find that the FCC is requesting public comments on docket 96-45, a rulemaking proposal related to the CHIP Act. The following are my comments on mandatory computer- based censorship, and why I feel that it would be both dangerous and unethical for our government to filter the information available to any of its citizens.
Firstly, computer-based censorship has been proven entirely unfit for the task of accurately blocking or sanctioning either text or photographic content. Computers are not able to make subtle distinctions between HIV research, breast cancer studies, images of people in clothing, or discussions of firearm safety.
A Supreme Court justice once said of pornography, "I know it when I see it." That's just about the closest we as a society have come to specific standards on what is obscene, and what is not obscene. To some people, various images are purely artistic. To others, the same images are an abomination to their religious or moral fiber. With that dichotomy in place, and that inability to arrive at a sensible and exact definition of obscenity, is it any wonder that computer software cannot be made to comply?
Put in simple terms, computers cannot be offended: it's not the censorship computer program that is doing the filtering of offensive material, it is the human creators of the censorship program who decide what other people may or may not access.
That brings us to the question: who watches the watchers? If it's not the censorware doing the censorship, then it is the proponents and designers of the censorware who choose what to hide from you. What political slant or prejudices are these people harboring as they design their filters? Whose bias or prejudice are you entrusting with the filter?
It has been demonstrated often that many software filters available on the market today block any mention of competitors, and also block any news story or website of the opponents of such content filtering. It may be the government's intent to block pictures of bestiality. It is in the best interests of those proponents to also block any dissent with their own opinions or objectives. If the same software blocks legitimate discourse about whether censorship is ethical, how can anyone learn more on the issues of freedom and democracy and the rights of citizens?
Government-mandated filtering via a commercial product means that a private company becomes a de facto government bureaucracy. Many software companies have only a couple dozen or a few hundred employees. Think of the required complexity of ensuring that several million, if not billions, of websites are blocked or allowed according to government-mandated standards. Think of the added complexity of oversight, reviews, and policy changes as different local and national standards are introduced. The world wide web (just http: pages) grows and changes by hundreds of thousands, perhaps tens of millions, of pages every day. Can the government keep up with those changes?
If a government sets the standards for what to filter, then the government opens itself to being responsible and accountable for every blocked website. There will be millions of lawsuits where website creators feel they are being censored unfairly. The infrastructure to provide accurate and daily updates and access to the operations of the filter will be required, and inaccuracies will have to be addressable.
Public libraries have a mandate to provide open and unfettered access to materials to all its patrons. Public libraries do not have a mandate to babysit our children while we run off to the market. If a parent cannot trust their child's choices in conduct, then the parent should supervise that child. It is not appropriate for the government to play nanny to these children.
In closing, I'd like to quote from a favorite portion of a famous document. It's not taken out of context; the author's points were very clear and concise.
Amendment I
Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or the right of the people peaceably to assemble, and to petition the government for a redress of grievances.
It is important for this government to conduct itself as the founders intended. A government of the people, for the people, by the people. Thank you for your attention in this matter.
Signed,
Ed Halley
Age Check Readily (Score:3)
Why not implement a log-in procedure and tie it in with the card numbers and student IDs? Anyone over the age of 18 should not be filtered at all. Under-18s could be filtered by default, but I'm a great proponent of having only the parents control the filtering option. Oh yeah, that would require parental attention.
I'm not sure why my library has to use filtering software at all, since all of the supposed types of no-no sites are already against library rules to view anyway. Penalty: no computer privileges.
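The log-in scheme above could be sketched roughly like this, assuming a patron record with hypothetical `age` and `parent_opt_out` fields (both field names are made up for illustration):

```python
# Toy sketch of age-based filtering at login. The patron record fields
# ("age", "parent_opt_out") are hypothetical names, not a real library API.
def filter_enabled(patron):
    """Adults are never filtered; minors are, unless a parent opts out."""
    if patron.get("age", 0) >= 18:
        return False
    return not patron.get("parent_opt_out", False)

print(filter_enabled({"age": 34}))                          # False: adult
print(filter_enabled({"age": 15}))                          # True: minor, default
print(filter_enabled({"age": 15, "parent_opt_out": True}))  # False: parent opted out
```

The design choice here matches the comment's point: the default errs toward filtering minors, but the override belongs to the parent, not the software vendor.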
----------------------
Okay, here's a thought . . . (Score:1)
My question is this: would the icra voluntary content rating system [icra.org] qualify as a "technology" under the bill sufficient to comply with the statute and the regulation? If so, the thing to do is to head to your local library and get them to implement ICRA, a self-rating and filtering tool based on voluntary disclosure by the page author of web site content. A quick and dirty summary of ICRA can be found here [law.com]. I'm interested to know what
--J
Filtering requirement bounding, obscenity def'n (Score:1)
Also, isn't material only legally obscene when it has been deemed so by a court? Or am I confusing US law with Canadian law again? =)
Re:Filtering should happen (Score:2)
Absolutely! Because the majority of librarians oppose this law.
Great... I'm on my way.. (Score:1)
Missing the bigger picture (Score:2)
Re:Filtering should happen (Score:1)
One small example of how well censorware works: at a public computer the other day, it blocked the course listings website of the University of Toronto's Department of Religion. I am not sure why--perhaps the page had mentions of sexuality. Check it out [utoronto.ca], and tell me if this is valid "research and education".
Of course, censorware also blocks Slashdot discussions frequently, as well.
heinzkeinz
Re:Filtering Doesn't work (but does the community) (Score:2)
Because I don't live near a medical library, just a community one. Cancer treatments change far more rapidly than my local library's book collection.
Even if I did live near a teaching hospital with a good library, the web often precedes paper publications.
When you go through the magazines in my public library, you don't see "Hustler" up there with "Home and Garden" or "Pussy" on the same shelf as "Scientific American". There has already been some filtering in the sense that those types of material are simply not in the library's collection to begin with. They have been filtered out.
You are confusing two issues here. It is one thing to actively spend money on things the community most values (Scientific American v Hustler) and quite another to actively spend money on censorship (filters aren't free).
I can see how the web is a helpful research tool, and why it is becoming more and more important to students. But the idea of having computers in the library is not for personal entertainment.
1) I see nothing entertaining about breast cancer
2) You are completely wrong. The library is there for personal entertainment as much as any other reason. That's why they have such a large fiction section.
--
Re:Great... I'm on my way.. (Score:1)
Before you submit comments to the FCC... (Score:5)
The FCC is seeking comment on fairly specific issues. The FCC does not have the authority to choose to not enforce this law at all. Their hands are tied; they cannot overturn an act of congress.
So comments like "this law is stupid and unconstitutional," while true, will have no effect on the FCC's rules. Even if you convince people at the FCC that it's unconstitutional, they must still enforce it. (Legally, a law is presumed constitutional unless and until declared unconstitutional by a court.) The place to challenge the law as a whole (rather than just the particular way in which it is implemented) is in the courts, which is already being done.
Include Senator McCain in your comments... (Score:1)
Re:Filtering should happen (Score:1)
Personally, I have no problem with filtering software provided it is implemented properly and openly, so that you could see what agenda was being pushed. In particular, I would like to see that the librarians have the capability of adding or removing any sites.
As for using a public resource any way you want, why don't you try that in a public park and find out how amused the police are?
Does it even matter? (Score:1)
This is meant as a serious question, although I can see how it might be interpreted as trolling...
--
Depends a lot on the Filter (Score:1)
I'm not saying I know the right answer - I know how I would go about dealing with that distinction, but clearly it's a different process for everyone. I find I can agree with a filter only if that filter takes into account all these different views on what is porn and what is valid research material. And only if that filter is based not on a single group's input, but the input of many organizations and libraries. Though that's probably too much to hope for.
Re:Filtering Doesn't work (but does the community) (Score:4)
That's absurd. Just because some hyper-conservative parents out there might be terrified of the idea that little Johnny might see *gasp* a bare breast, the entire adult population should happily accept censorship of an entire medium? Sorry, but no.
Say all books with *wicked naked people* (including Cancer and You and Living With Breast Cancer) are banned from public libraries. That's no big deal! Hey, you can read up on that stuff in magazines, right? What are you complaining about?
The point, which has already been made in earlier posts, is that this is
1) Unacceptable censorship of material adults (and in many cases even children) have the right to access.
2) A promotion of an ineffectual technology to solve an exaggerated problem which can really only be solved by parents getting up off their sofas and becoming involved in their children's lives.
But nobody ever got (re-)elected by telling the parents of America that their children are running wild because they're bad parents. So the rest of us have to live with these ridiculous nostrums and snake-oil cures. *sigh*
Standard letter? (Score:2)
Lets Demand "Open Filtering" (Score:4)
That way we can know what is going on when a breast cancer awareness site, or the ACLU site gets filtered out.
Re:Filtering should happen (Score:1)
Besides -- I don't know how long it has been since any of you were in school -- but a lot of classes require research into things that would be blocked. First aid pages could be blocked. Health pages [including STD information] could be blocked. Hacking/Unix pages might end up getting blocked within the year.
Or is everyone forgetting that the schools and libraries THEMSELVES are against it?
And besides -- what right does the FCC have to say that the school has to put in a Filter if I (as an ISP) give them a discount for being a school? "No sir, we can't give you an educational discount because we don't believe in Censorship." That's BS.
Re:Filtering should happen (Score:2)
Re:Filtering should happen (Score:1)
To say that filters "don't work" just isn't true. They do work. They aren't perfect, but they work well enough to stop circumvention by most casual users.
and block "legitimate" information along with what you consider "porn".
Fix this problem where possible, but running away screaming from filtering just isn't going to advance the technology at all - which is probably what you want.
Because filters block sites based on arbitrary criteria chosen by people with agendas.
I'll admit that I don't always agree with the decisions made, but to call them "arbitrary" denotes that you have an agenda as well by engaging in belittling and hyperbolic rhetoric against the ideas that you oppose.
Because we all pay for public libraries, not just those of you who want to decide what we all see and learn.
If you want the local library to get Pay Per View so you can watch Wrestlemania XXXXIV, you can go to hell. Pay for your own god damned casual entertainment. Public libraries are great institutions as repositories of knowledge available to those without the means to gain that knowledge on their own. If we subvert the legitimacy of the Public Library system by catering to every entertainment whim out there, we shall destroy society's need and support for the institutions, and we shall eventually destroy the institutions themselves.
And oh yeah, because it's unconstitutional. Deny it all you like, it's censorship based on content.
Last I looked, you can't get Hustler, Penthouse, or any other "adult" magazine at the local library. Why should available content via the Internet be any different?
A modest suggestion... (Score:1)
Good for you. (Score:1)
E-Rate is not a discount (Score:1)
When schools and libraries apply for E-Rate grants, they already agree to many restrictions on how they can use the provided Internet access. Filtering is just one more rule.
Under the current rules you cannot use E-Rate Internet access for religious education. Why is religious freedom less important than porn?
Re:Filtering should happen (Score:1)
Way to go,
Re:Filtering should happen (Score:1)
Content blocked on political views by federally-mandated tools is about as blatant as you can get.
Oh, and when you start talking about publicly funded exotic dancers being prohibited from discussing their political views in a government building, then you'll be comparing apples with apples.
Re:Open-source blocking software? (Score:1)
I'd argue that it's because censorware companies generally produce software that tries to do too much. Their default policies seem to be "deny access to this list unconditionally; deny access to other sites/pages that match these rules; allow access to anything not denied."
For libraries and the like, what you really want is filtering software that instead blocks anything that is not explicitly allowed. That way, you have to involve a human being in the decision process. Done correctly, you could reach the point where:
Yeah, it's still filtering. The point would be to put the control in the hands of the library itself, rather than in the hands of the commercial vendors. Default library policy could be "If we have a book or a periodical that references this, then it's allowed. Otherwise, use your best judgement, or get the parent's approval."
[1] Just as an aside, I went to a Catholic high school, and this sort of thing was a common type of assignment in junior/senior level religion classes. Kind of a "know your enemy" sort of thing, I guess.
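A rough sketch of that allow-by-default-deny policy, with a hypothetical approved list and review queue (all the names here are illustrative assumptions, not any vendor's product):

```python
# Toy sketch: deny-by-default filtering where a librarian, not a vendor,
# grows the allow list. ALLOWED and REVIEW_QUEUE are illustrative only.
ALLOWED = {"loc.gov", "britannica.com"}
REVIEW_QUEUE = []

def check(domain):
    """Allow only approved domains; queue everything else for human review."""
    if domain in ALLOWED:
        return "allow"
    if domain not in REVIEW_QUEUE:
        REVIEW_QUEUE.append(domain)  # a librarian decides later
    return "deny-pending-review"

print(check("loc.gov"))              # allow
print(check("unknown-site.example")) # deny-pending-review
print(REVIEW_QUEUE)
```

The review queue is the point: every blocked request becomes a decision for local staff, which is the "control in the hands of the library" the comment argues for.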
Re:Filtering isn't necessarily bad... (Score:2)
> "...librarians have generally done a fine job
> of selecting what they do and don't want in
> their libraries over the centuries".
1. I have a public library down the block, with unrestricted Internet access. I once looked at the history out of curiosity. pr0n, pr0n, free email, pr0n. If you expect our 3 over-worked librarians who wouldn't know what a URL is to be of any help in restricting pr0n usage, you're very mistaken.
2. I completely agree that the most constructive response from the community would be not bitching and moaning but writing competent open-source blocking software.
-DVK
Luckily for me, I'm a Canadian too (Score:2)
You can take away my web links - but you can't take away my freedom!
Re:Filtering should happen (Score:1)
In a situation like that, how is "local control" possible? If a provider decides to block political views it disagrees with as "obscene," how can the local library control that?
The problem isn't the technology, it's the people using it. Is "running away screaming from filtering" (speaking of belittling and hyperbolic rhetoric) the solution? Let me put it another way: Does it make sense to endorse a flawed solution on the hope that it will improve later?
Unless you can propose specific ways to solve filtering's flaws, the argument that we should just jump on board because it's the best game in town is specious at best.
My comments to the FCC (Score:2)
In regards to public filters, my suggestions:
1. Allow libraries to choose their own filter products.
2. Suggest or require that libraries use filtering products whose parent company discloses what criteria are used when blocking or allowing a site (many don't), since many sites may be blocked for poor reasons, sometimes even on a whim, or because they go against an individual's beliefs or agenda (even if the sites are not in any way profane or could be considered dangerous). Here are a few articles of interest:
http://slashdot.org/yro/00/12/08/0238239.shtml about a decrypted filtering list used by SmartFilter, which found a number of sites obviously mis-rated. As with everything else on Slashdot, there are a number of interesting comments appended to this article.
http://dfn.org/focus/censor/contest.htm A contest, apparently, to find the most ridiculous examples of filtering - such as a high school student who couldn't access his own high school's web page from its own library, since the filter automatically blocked the word "high" in domain names. The runner-up, Hillary Anne, tried to register her email address at Hotmail as hillaryanne@hotmail.com, but the filtering software spotted the word "aryan" hidden in her name... A number of other examples exist here, too. Richard "Dick" Armey's Web site is blocked by Netnanny, Surfwatch, Cybersitter, N2H2, and Wisechoice, the very same filters he promotes - because his site contains the word "dick." And, this excerpt from the article:
The conservative group Focus on the Family intends its anti-porn site Pure Intimacy to be a "resource for those struggling with sexual temptations" and the "psychological bondage" that is "a major reason why individuals go online." Jim K. observed that Cybersitter blocked this site for violating the following categories: porno, hardcoreporno, sexual, nudity, and, of course, bondage.
I could go on and on, but just visit that contest site - it does.
http://slashdot.org/yro/00/03/09/133243.shtml about Symantec's I-Gear software and a hack that revealed its true nature: reporting user information back to Symantec and unnecessarily blocking a number of sites.
About I-Gear: the high school I graduated from, Celebration, used it, but only for monitoring, since they believed it to be too unreliable. They preferred to go after students who continued to visit sites that the staff themselves deemed inappropriate. That way they could catch inappropriate sites the filtering software would miss, and ignore sites that weren't really inappropriate at all but were blocked by the software anyway. I only know of one site that was blocked: a message board for people complaining about the town of Celebration, which has a policy forbidding residents from making negative comments about the town (they sign away their First Amendment rights). Whether the site was blocked because of over-use or for more political reasons is unknown to me. I also do not know if they continue this more involved practice of checking the records themselves.
Having read the article on the contest, however, I really can't say I blame them!
4. Take steps to protect privacy. Notably, Mattel's filter software (cybernanny, I believe, but I'm not sure) has been accused of spying on the user. A good idea would be banning any software that reports back to the filtering software company; this information is often sold for profit, for uses like targeted advertising or spamming.
5. Allow institutions to selectively add or remove individual sites. Given the above information, I feel that this is more than reasonable. Researching Fibonacci's sequence tripped up one filter, for example - so much for a student's math homework.
6. Please endorse free and open-source solutions. An open product would allow anyone to suggest that a site be added or removed because of its content, and allows for debate over sites that otherwise could be blocked or allowed on a whim. With open software, a library could instead be required to go through the developer for a change to be made (improving the integrity of the filters). In addition, different "levels" of filtering might be desirable, allowing for different levels of appropriateness similar to movie ratings. For example, while no minor should access pornography, "adult language" isn't really a problem for 16-year-old students (anyone who's ridden on the back of a school bus can vouch that most elementary school students don't have much trouble with it either, but that's another story). As an added benefit, they're free, and libraries, schools, and the like could use the money. Unfortunately, I know of no open-source filters at this time (much of the tech community is anti-filter anyway, since they disagree with censorship of all but the most objectionable content).
7. Allowing users to gain higher levels of access upon proving age, in order to find more diverse material, is probably a good idea. In middle school, my required reading occasionally had some amount of profanity, which nearly all filters would immediately modify or block. Filters would also block a number of perfectly legitimate phrases. Saying "I'll just be in and out of the store really quick" would be blocked by any filter that watches for the phrase "in and out" (and I'm sure that there's a filter out there that does, or will). The upside is that a number of areas of interest could be blocked to younger users, such as a study on homosexuality that, while perfectly fine for an older teenager or adult to read, may be kept from children whose parents would prefer they didn't read it. In addition, allowing parents to selectively add and remove restrictions on various "gray areas" would be good. While it's fairly obvious that pornography, bomb-making recipes, marijuana growing tips, and the like should be blocked, areas such as human anatomy, homosexuality, religion, and discussions or even readings of banned books (Catcher In The Rye, for example, which many schools and libraries still censor but some embrace as a literary masterpiece), along with research reports on the aforementioned topics, could be considered "gray areas" that some ages could be allowed or denied access to, and that parents could control more. Giving a filter software provider full control over what users can see, and deeming everything to be either bad or good (as is done now), is obviously less than desirable. Personally, I don't see what a political campaign and hardcore bestiality have in common, but many programs will block both at once and don't discriminate. Since there is no clear-cut means of deciding what to block, allowing for more flexibility is ideal.
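The age-tiered, parent-adjustable scheme described in this item could be expressed as a simple policy table. A minimal sketch in Python - the category names, tiers, and defaults are all invented for illustration, and no real filtering product works exactly this way:

```python
# Hypothetical tiered filtering policy. Categories, tiers, and defaults
# are invented for illustration; this describes no real product.
DEFAULT_POLICY = {
    "pornography":    {"child": False, "young_teen": False, "older_teen": False, "adult": True},
    "adult_language": {"child": False, "young_teen": False, "older_teen": True,  "adult": True},
    "human_anatomy":  {"child": False, "young_teen": True,  "older_teen": True,  "adult": True},
}

def is_allowed(category, tier, overrides=None):
    """Check a category against the defaults, honoring parent/admin overrides.

    Unknown categories and unknown tiers default to blocked - that is,
    the software assumes the most restricted user unless told otherwise.
    """
    policy = dict(DEFAULT_POLICY.get(category, {}))
    if overrides:  # e.g., a parent adjusting a "gray area" for their child
        policy.update(overrides.get(category, {}))
    return policy.get(tier, False)

print(is_allowed("adult_language", "older_teen"))  # True
print(is_allowed("human_anatomy", "child"))        # False
# A parent selectively unlocks a "gray area" for their own child:
print(is_allowed("human_anatomy", "child", {"human_anatomy": {"child": True}}))  # True
```

In a real design, mandatory blocks like pornography would refuse overrides entirely; this sketch leaves that enforcement out for brevity.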
Personally, I think a standard should be defined for filtering software:
1. NO reporting back to ANYONE other than local administrators on what users have been doing. Privacy should be protected, and only necessary exceptions should be allowed. Corporations have no business knowing what a child does online. However, the local administrator and certain individuals (parents, perhaps teachers, etc.) should be able to review logs of what a user does - and all usage should be logged! This offers recourse in the event that a site that is inappropriate, but not yet blocked, is visited. No one without a legitimate need to know should be able to review an individual's usage of computer resources.
2. Material should be broken down into different "categories" by topic and objectionability, and a separate standard should be set for which categories are to be blocked for all users, which should be blocked for older teens, for younger teens, etc. Unless unlocked by an authorized operator (e.g., a librarian), the software should assume that the user at a given station is a toddler - that is, a member of the most restricted group.
3. Aside from mandatory blockings (such as porn) that are simply blocked, period, parents and local administrators should be able to define restrictions for the aforementioned "gray areas," topics that may or may not be objectionable to some. While administrators would be able to specify general restrictions by user groups, parents would have the option to selectively modify their individual child's restrictions (aside, of course, from the mandatory blocks such as bona fide pornography.) A standard should be set for what gray areas will and won't be blocked, that should be enabled by default on all software, but that is modifiable by the local administrator.
4. Administrators should be able to selectively alter site lists. As demonstrated by the previous examples, a local administrator really is no less trustworthy of what should be blocked and allowed than the software developers, so as long as site visiting is logged, there should be no problem.
5. Word and phrase filters should be avoided, with only the most extreme examples used. For example: "sex" should not be blocked, as the word is often used for gender and appears accidentally within a number of phrases; "cum" appears, among other places, in the word "cucumber"; and "breast" would prevent discussion of breast cancer (America Online once inadvertently did this).
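The false positives described in this item come from plain substring matching, and are easy to reproduce. A minimal sketch, using an invented blocklist that stands in for no particular product:

```python
# Invented blocklist for illustration only - not any real product's list.
BLOCKED_SUBSTRINGS = ["sex", "cum", "breast", "aryan"]

def naive_filter(text: str) -> bool:
    """Return True if the text would be blocked by naive substring matching."""
    lowered = text.lower()
    return any(word in lowered for word in BLOCKED_SUBSTRINGS)

# Perfectly legitimate phrases that trip the filter:
print(naive_filter("recipe with cucumber and dill"))  # True: "cum" in "cucumber"
print(naive_filter("breast cancer support group"))    # True: "breast"
print(naive_filter("hillaryanne@hotmail.com"))        # True: hidden "aryan"
print(naive_filter("Middlesex County Library"))       # True: "sex" in "Middlesex"
```

A single substring check has no notion of word boundaries or context, which is exactly why the contest examples above keep happening.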
Government-produced, multi-platform software is likely a plus for this. Since this will cost a great deal of money for each institution to set up, avoiding for-pay services will be a tremendous benefit.
I would suggest finding and endorsing an open-source effort to create filtering software, pushing to see the above features implemented - or for the government to spearhead an effort itself. A number of programmers would become involved for the opportunity to protect their children/nieces/nephews/neighbors while still minimizing censorship that is simply unnecessary. An open-source model, under government supervision and control, would enable tremendous security and quality, minimize the potential for abuse, and allow for an amazingly rapid response to newly-appearing sites.
At any rate, I would appreciate a reply on this. This hits close to home for nearly all Internet users, especially the more technologically-inclined like myself. The simple fact is, currently available solutions simply are not ready for public use and are very much unrefined, and a fair implementation is impossible.
Re:Filtering should happen (Score:1)
I'm sorry if the word "arbitrary" hurts your feelings, but that's what it is. Look it up if you like. Censorware does not use standard, narrow, and specific criteria for blocking sites. Whatever the software makers want to block, they block. They won't tell you what or why.
"belittling and hyperbolic"? Do you know what those words mean? Can you refute one thing I've stated? I find it frustrating that the people pushing this on us cannot argue the facts, cannot argue the law, they just want it done. We live in a country governed by laws, not the whim of those who make the most noise.
Take a Constitutional Law class. Hell, high school level should do it for you. You might learn something about the First Amendment.
Re:What do you mean my son can't look up peacefire (Score:1)
My wife's a teacher, and when she demonstrated to parents how blocking software worked, a lot of them were actually pissed off that it was so general. The public has the idea that the programs can actually identify porn and only porn.
Re: public displays (Score:2)
hmm
It appears that only those with "morals" would be intimidated.
On the other hand, would the librarians be able to make a civil rights complaint and have the gentleman arrested, on the basis of a hostile work environment?
I don't know. IANAL
It almost seems that civil rights are extended so far that we end up protecting the abusers of those freedoms (for example, these guys creating the hostile work environment).
Do these guys have a fundamental *right* to Internet pr0n?
Damn it, another can of worms to open.
Re:Do *this* and see what happens (Score:1)
Could we treat this like an interview? (Score:2)
Now, with more Fun(tm)! (Score:1)
1) I don't support censorship or any other public expression or public gathering limits by governments (and their agents). They almost universally don't make any sense. (e.g. public nudity laws... can these be justified in any rational manner?)
2) Censorship laws which target by age group simply add insult (age discrimination) to injury (see 1).
I live in a country where age discrimination is constitutionally prohibited.
Oh, but wait: -- "subject only to such reasonable limits prescribed by law as can be demonstrably justified in a free and democratic society" -- a limitation big enough to drive a tank through.
I foresee that the "demonstrably" adds a lot of complexity. For example, I have a feeling that the pending supreme court decision about the constitutionality of the child pornography possession law will hinge on that one word and its meaning in context (maybe the law has an indirect effect on child pornography production, but if so, can this be demonstrated?). Although, even then, I have no way to know what the court considers justifiable in a "free and democratic" society. =( My free-and-democratic-society-justified-law-finder box is on the fritz and will be in the shop until the end of next week. =)
CIPA - that sounds familiar to me! (Score:3)
Re:Getting involved... (Score:1)
I am guilty of this kind of laziness. It seems that with all the things that are happening in the US, now is the time to change. Now is when we should all be taking an active part in our government.
If this country becomes so restrictive that we no longer have the basic freedoms that many have fought and died for, it is our own fault for letting it happen.
I commend you for your effort to make your town a better place. You may not receive many thanks now, but your effort will be well worth what you put into it.
--
Re:Filtering should happen (Score:3)
EXCUSE ME?!?! There's a slight difference between a spell-checker not having a specific word in it, and a 'filtering' program that blocks thousands of legitimate sites.
Personally, I have no problem with filtering software provided it is implemented properly and openly, so that you could see what agenda was being pushed. In particular, I would like to see that the librarians have the capability of adding or removing any sites.
There are some filtering programs that work by blocking specific sites. Pretty bloody useless, too, as the main class they're trying to block is pr0n sites, which change their names on a daily basis. So, unless you've got some company scanning the web 24/365 and sending updates as they come in to every library/school/whatever running their software, it isn't 100% - in fact, it's closer to 10%. The second class, which is being pushed far more, blocks based on 'key' words (or on other words containing those words). So forget going to a site on breast cancer, any site on sexual diseases... the list is huge.
Perhaps banning porn is a good goal, but the primary software being pushed tramples all over the constitution, if implemented in public places.
I use libraries for both research and recreation; I use the web the same way. If what I do is legal, then why the f$ck is it being prohibited in a library? My local library doesn't carry Penthouse, nor does it carry pr0n videos (although it does have some 'soft' literature in the fiction section). This presents no problem - the library has limited resources, and chose not to get them. However, they also have net access. It costs them no more for me to browse Penthouse.com online than it does for me to browse Bookfinder.com. If they block access to a legal product (Penthouse), then that is active censorship, unconstitutional, and illegal. I did run into a situation that was laughable... research into Goodyear kept getting me the big red X at the library - probably because of the use of the word 'rubber'. Had to continue that work at home. Tell me again how that's implemented 'properly and openly'.
There are laws in the US about what constitutes illegal pr0n. This mandatory filtering (of publicly funded institutions) goes way beyond that, attempting to legislate software into existence that just doesn't exist. And, while the site-specific software doesn't raise my hackles on the 1st Amendment issue as badly as keyword-specific software, if it's legal in the real world, then it should be legal online. The sole problem here is that pr0n is legal if you're over 18, illegal if under. Libraries don't normally have the personnel to monitor who's browsing what. This could be resolved with 'Over 18 only' machines. Personally, I reached the point long ago of getting thoroughly P.O.d over having legitimate research blocked, and the first thing I do with any library computer is disable the filter (along with the monitoring and anti-virus, but hey! ;)
BTW, gotta love the new image filtering stuff. The best tests I've seen show that a program randomly blocking 20% of all requests would do better.
Re:Age Check Readily (Score:1)
There are many ways to use the same ideas across the entire Adult area (which is the true purpose of filtering) to allow access only to those who are permitted to use it. Dennis Zaphod
Re:Filtering should happen (Score:1)
Law refers to obscenity (Score:1)
Re:Filtering should happen (Score:1)
Excuse me? This law was signed by President Clinton, not President Bush. That being said, your anger needs to be re-directed towards the courts now, as this has been law for a while and only they can change that.
Re:Filtering should happen (Score:1)
Re:Filtering Doesn't work (but does the community) (Score:2)
Let's remove the computer from the picture. First you have to find the book on the shelf, so any passerby can see what section of the library you are in. Then you get the book and take it to the reading area. Again, any passerby can see what book(s) you are carrying. Then you sit down and read for a while, with the book on display for anyone who cares to look. Finally, you decide to check it out, again in full view.
You probably have more "privacy" with a computer in a common area, since there is a narrower field of view for others to see what you are reading; they certainly would need to be closer to read the URL than to see the title on most book covers, and you aren't carrying a URL from the shelves at one end of the library to the reading room or check-out desk at the other end. Even if that weren't the case, there aren't any "privacy rights" being violated by placing the computer in a common area that exist in the absence of the computer, and the presence of the computer doesn't confer additional "privacy rights".
Re:Australia should take note. (Score:1)
Re:Filtering should happen (Score:1)
'Secondly, considering the fact that the library is a "privilege" and not a "right"'
Um, no, libraries are a government SERVICE, not a "privilege" that can be granted or denied on a whim. Hence the requirement for handicapped access, lack of fees for access, etc. Sure the local government can tear down all the libraries any time it wants. But if it provides them, and funds them with our tax dollars, then they must abide by the law. And like it or not, the US Constitution is the law of the land. (this land anyway)
Re:Before you submit comments to the FCC... (Score:1)
If the FCC got 5,000,000 letters saying this law is WRONG, I'd think that would have an impact on the enforcement of the law. Wouldn't you agree?
pan
So how about THIS! (Score:3)
We can't change whether the law is implemented, but we can change how it is implemented. So write to the FCC to demand they institute a policy whereby:
If there is going to be a private industry censoring what Americans can read in their libraries, then there'd better be checks and balances. We can take advantage of this situation to force filtering software companies to be more ethical and responsible.
Re:Before you submit comments to the FCC... (Score:2)
No.
The FCC has no choice but to enforce the law. Period.
I wonder about the wisdom of posting the way to make a comment here on Slashdot.
The library just has to say "I comply" (Score:2)
Beyond that "self-certification", the FCC says nothing about filtering itself. Clearly they are not enthusiastic.
And don't blame Clinton: McCain pushed this one through as a Rider to a big appropriations bill. Vetoing that bill would have been very hard. Riders are a trick for passing bad law without going through the constitutional approval process.
No, you got it all wrong (Score:2)
Re:Filtering should happen (Score:2)
I agree that current filtering programs are laughable, but that does not imply that a reasonable filtering cannot be created. The first chess-playing programs played pretty bad chess, but now they play with the best.
There are some filtering programs that work on blocking specific sites. Pretty bloody useless, too, as the main class they're trying to block is pr0n, which change their names on a daily basis.
I find that hard to believe. Changing your site name on a daily basis doesn't seem like a good way to attract and keep any paying customers. In any case, I find it hard to have much fear in sites that change their name daily. A list of those porn sites with stable names would be a reasonable start on a filter in my opinion.
The second class, which is being pushed far more, blocks based on 'key' words (or if other words contain these words). So forget going to a site on breast cancer, any site on sexual diseases, ... the list is huge.
Yes, simple keyword filtering is laughable. However, I'll wager that any decent machine learning algorithm could achieve 90% or better. Does anyone have a nicely differentiated list of sites?
It costs them no more for me to browse Penthouse.com online than it does for me to browse Bookfinder.com.
Yes, it could cost them. Taxpayers will not want to fund libraries if people are using library computers for browsing penthouse.com and the like.
If they block access to a legal product (Penthouse), then that is active censorship, unconstitutional, and illegal.
Finally, we get to the central argument. I am queasy about this kind of censorship, too, but I don't want my taxes funding porn-viewing.
I wonder if you can get issues of Penthouse via interlibrary loan. If libraries allow this, then to be consistent, they have to allow penthouse.com. However, if libraries are allowed to deny interlibrary loan of porn, they should be allowed to deny Internet access to porn.
Re:Filtering should happen (Score:3)
I'm not a lawyer (just a lowly librarian), so this may not fly, but:
My guess is that the implication of the above is that copyright holders must be consulted before derivative works can be published.
Now, some problems creep into this online world, like this:
Filtering software does not alter the original work, but only the presentation of the original work. In the offline world, this is a big distinction; in the online world, original work/copy/modified copy are indistinguishable. Bits is bits, that is.
The CIPA has been worded so that it actually refers to "visual depictions of" the various categories of material to be filtered/denied for viewing. Does this mean text and images (still or motion?), or just images? On screen, pixels is pixels.
If there are any real live copyright attorneys out there, I'd dearly love to hear what you have to say about these matters.
Scott
Assistant Director, North Bend Public Library
North Bend, OR 97459
Take them to Haag (Score:2)
If you as a parent are denied this right, drag them to Haag.
While I'm not that opposed to the general idea of Censorware, I'm very opposed to current implementations, and I would reserve the right to teach my kids (I don't have any yet) to crack the filters, or to put up a proxy for them so that they can surf unhindered anywhere.
Also, it is worth pointing out that any rule-based filter is snake oil.
Re:Filtering should happen (Score:2)
Come on, I know this is a hot-button /. issue, but can we maintain some level of intellectual honesty? There's a pretty big difference between someone coming up and deliberately reading over your shoulder close enough and long enough to grow "offended" at your choice of reading matter, and displaying a 12- to 20-inch example of coprophilia (sp?) to all and sundry passersby.
Also, the word "offended" is getting a little thick around here. In the time of "anti-PC," offending people has become a sort of positive value, and offense is stereotyped as merely uptight anger at someone else's choices. Why not talk about "upsetting" someone? How about "hurting" someone? Is hurting out of line because there is no physical contact? I don't think so. If someone is arrested for exposing themselves to people, it isn't for "offending" them. It's a form of sexual assault. If the library is a public resource, why should people using it have to put up with you flashing them by proxy?
Not that I think the "person walking by" argument is the only one in favor of public terminal filtering, but it's nowhere near as dismissible as you're trying to make it out. If anti-filter types can't deal with opposing arguments honestly, it's no wonder you're losing all the time. Stick to the First Amendment. It's about the only argument you have that works. And if you wait long enough, it will work.
-Kahuna Burger
A good idea, but will anyone listen? (Score:2)
Good luck. *sigh* The few times I have suggested this, I've gotten either ignored or flamed. I think I saw one similar suggestion with similar results. I'm growing convinced that the entire "unreliable blocking software" argument is more of an excuse to oppose filtering than an insurmountable reason, because anyone who tries to talk about overcoming this "problem" is attacked. (though I'm seeing more of this suggestion on this thread, so maybe something good will eventually emerge from the noise)
Another good response, since this is only being applied to users of a government discount program would be to form local coalitions to offset the loss of these programs for libraries that are willing to opt out.
The bad response, which seems to predominate, is to scream and rant and rage, stereotype and insult your opposition, and ignore any chance that there could be real concern in there.
-Kahuna Burger
Re:Okay, here's a thought . . . (Score:2)
I think this is a good solution in theory; the problem is that unrated sites would have to be blocked by default. So this would work only if a lot of people and organizations got behind it. And since it is a compromise position, that will be hard. For instance, is /. participating in this program, or have they refused "on principle"? How does the creator of a new web site find out about this program and use it?
In theory, though, self rating is one of the best solutions.
Kahuna Burger
Re:Filtering should happen (Score:2)
Actually, there are many reasons why a reasonable filtering program cannot be created. First and foremost, I think everyone has heard of the laughable excesses of keyword filters; that path is out. No amount of technical expertise can overcome a problem that cannot be resolved in plain English. Second, there are serious problems - as in, it can't be done - distinguishing between the Miss December centerfold and much of the stuff hanging in the Louvre. Calling the first 'porn' and the latter 'art' just doesn't translate into rules one can put into a program.
Sorry for not being more explicit, but many of the really bad sites (use your imagination) use numeric addresses and redirects. If you haven't seen them, then you're obviously not looking hard enough ;)
It costs them no more for me to browse Penthouse.com online than it does for me to browse Bookfinder.com. Yes, it could cost them. Taxpayers will not want to fund libraries if people are using library computers for browsing penthouse.com and the like.
That is, quite frankly, a BS argument. There are a huge number of libraries that already have unrestricted web access. Viewing pr0n at them hasn't been a problem. As more of society gets wired, I expect it to become even less of a problem than its current non-problem state.
Finally, we get to the central argument. I am queasy about this kind of censorship, too, but I don't want my taxes funding porn-viewing.
I think it's a misuse of library resources, too. But the answer does not lie down the path that paves over the Constitution. A more realistic approach would be to limit web access to half hour slots if anyone is waiting.
To take an extreme example, what if I wanted to do a research project on how changing media images have contributed to the growth in anorexia/bulimia rates in the US? Part of that research involves taking pictures of female models from various mainstream magazines and calculating their weights. That's legitimate. A half-hour limit is fine, but blocking my access to something is censorship. That said, Tom the Trucker two terminals down could be looking at the same images, kleenex in hand, so to speak. Can you make a suggestion as to how a program can distinguish between the two of us? Until you can, I'd much rather stay in the spirit of 'Freedom', the Constitution, and, strangely enough, the Law, and not implement something that reeks of Big Brother, tramples the Constitution, and is against the law.
Re:Take them to Haag (Score:2)
Actually, the USA has been known to sign treaties on other things. Whether they sign is less of an issue than whether they obey.
Re:Before you submit comments to the FCC... (Score:2)
Wasn't this exactly one of the things the people who wrote the US constitution would NOT want to happen?
Re:Filtering Doesn't work (but does the community) (Score:2)
Even if one could be created, the person or people who created it would be too interested in getting a Nobel prize to worry about such things as filtering porn sites - with the program itself probably far more interested in establishing its own citizenship rights than in something it would undoubtedly see as irrelevant.
The solution being recommended would filter out a great deal of legitimate research content - and not just content that has something to do with objectionable body parts. It's amazing some of the random sites that get blocked by filtering software.
It's rather less amazing when you realise that there is very little which is unique to "porn" - e.g., the majority of words which may be used on porn sites are also used in completely unrelated areas. Or should sports sites replace "ball" with "spherical playing object" and woodworking sites replace "screw" with "spiral fixing"?
Re:Questions: Filtering Software and Library Polic (Score:2)
What does the last category mean? Someone could argue that disney.com, microsoft.com, etc. are "harmful to minors"...
Re:IMHO (Score:2)
No, this is a difficult task, even for humans. It is a very difficult task for computers.
For text recognition, how does software distinguish between acceptable and unacceptable? By using a vocabulary blacklist? Does that mean that online dictionaries will not be accessible from libraries because they contain words like "nazi", "devil", or "murder"? Will the words be taken out of context?
They will only be taken in context once someone comes up with a program which can understand natural language in the first place...
Re:Filtering should happen (Score:2)
Indeed this is software which may never exist. Since in order to actually do the job you'd be wanting an AI, probably one smart enough to resent being a "slave".
Re:Filtering should happen (Score:2)
But not everyone has the computer science background to understand why it cannot work.
Second, there's serious problems - as in, it can't be done - distinguishing between the Miss December centerfold and much of the stuff hanging in the Louvre. Calling the first 'porn' and the latter 'art' just doesn't translate into rules one can put into a program.
Humans can't even agree on what the "rules" would be. To even attempt to "explain" them to a computer, you'd need the kind of AI which just doesn't exist outside of science fiction. (It's more likely we will be sending manned missions to Jupiter or sending prisoners to the moon than creating such a program anytime soon.) Even assuming someone pulls it off, unless the AI was built into an anatomically correct android, it probably wouldn't have a clue as to why there should even be a problem in the first place.
Re:Filtering should happen (Score:2)
And even fewer have the machine learning background to understand why it can work most of the time. There are a substantial number of papers that demonstrate that programs can learn to categorize text, including web pages, based on "combinations" of keywords. The programs are not perfect, but 90% accuracy and better is very achievable. If you want some filtering with false positives at around 1%, then you can do that, too.
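The "combinations of keywords" approach this comment refers to can be illustrated with a toy Naive Bayes text classifier. Everything below - the training snippets, the labels - is invented for illustration; a real filter would need thousands of labeled pages and would still misfire at the edges:

```python
import math
from collections import Counter

def train(docs):
    """docs: list of (label, text). Count words per label (multinomial model)."""
    counts = {"block": Counter(), "allow": Counter()}
    labels = Counter()
    for label, text in docs:
        labels[label] += 1
        counts[label].update(text.lower().split())
    return counts, labels

def classify(counts, labels, text):
    """Pick the label with the highest log-probability, with add-one smoothing."""
    vocab = len({w for c in counts.values() for w in c})
    words = text.lower().split()
    best_label, best_score = None, -math.inf
    for label, word_counts in counts.items():
        total = sum(word_counts.values())
        score = math.log(labels[label])          # log prior
        for w in words:                          # log likelihood per word
            score += math.log((word_counts[w] + 1) / (total + vocab))
        if score > best_score:
            best_label, best_score = label, score
    return best_label

# Tiny, made-up training set:
docs = [
    ("block", "hot xxx pics live cams"),
    ("block", "free xxx adult pics"),
    ("allow", "breast cancer research and treatment"),
    ("allow", "library research resources and journals"),
]
counts, labels = train(docs)
print(classify(counts, labels, "breast cancer screening research"))  # allow
print(classify(counts, labels, "free xxx cams"))                     # block
```

Because the score combines evidence from every word, "breast" alone no longer condemns a page - the surrounding vocabulary ("cancer", "research") outweighs it, which is exactly the improvement over single-keyword filters being claimed here.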
Re:Before you submit comments to the FCC... (Score:2)
You're heading down a slippery slope here: if the FCC had the authority to ignore this bad law, what good laws would it also ignore?