Censorship / Your Rights Online

FCC Seeks Comment on Internet Filtering Rules 154

Liza writes: "The FCC issued a Notice of Proposed Rule-Making today, implementing the mandatory filtering requirements for schools and libraries. People should know that this applies to all schools and libraries that receive "E-Rate" discounts for Internet access, Internet service, or internal connections." More of Ms. Kessler's comments below; but you'll want to see the Notice of Proposed Rulemaking first. The ALA's guide to the law has some good information. For nearly every U.S. reader: this law will affect your local library. Go talk to them. Get involved.

"The law in question (CIPA) requires filters to be used whenever *anyone* is using the computer, although adults with "bona fide" research needs may request that they be disabled. Minors may not use unfiltered computers with Internet access, even if a parent gives permission or a teacher deems the site to be wrongly filtered.

"The FCC is seeking comment on how to go about implementing this requirement, and it is on the FAST TRACK. Comments have to be made in the next 15 days! For anyone interested in participating, the FCC makes it easy.

"How to Submit a Comment to the FCC: The FCC provides a number of ways for members of the public to comment on proceedings. The easiest is their web-based form.

"The Internet filtering proceeding docket number, which you must include in any comment to the FCC, is: 96-45."


FCC Plans Internet Filtering Rules: Speak Now

  • by Stephen Samuel ( 106962 ) <samuel@NOsPaM.bcgreen.com> on Wednesday January 24, 2001 @06:24AM (#484349) Homepage Journal
    Either the URL was originally typed in wrong (extraneous space) or they've decided to move it. In any case, the proposed rule-making is now here [ala.org] (pdf)
    `ø,,ø!
  • FCC Seeks Comment on Internet Filtering Rules

    http://www.ala.org/cipa/FCCRulemaking.pdf [ala.org]
    It's a PDF.

    We've always got filtering stuff at LISNews.com [lisnews.com] too. :-)

  • For nearly every U.S. reader: this law will affect your local library. Go talk to them. Get involved.

    Talk to them, why? Except to say, "Good job," that is.

    I don't understand this knee-jerk /. reaction against filtering at public libraries. Normally, public libraries have extremely limited computing resources (like one or two computers) that should be used as a tool for research and education.

    The last thing I want my tax dollars going to is for some dude to sit around all day surfing for porn. Screw that, I pay for my own porn just like everyone else should.
  • by Anonymous Coward on Wednesday January 24, 2001 @06:32AM (#484352)
    Simple. Don't let minors use the computers... They'll only be using them for schoolwork, aiming to reach a higher standard of education than you or, particularly, I. Give 'em another few years and they'll be better qualified and after YOUR job!

    And if they are viewing porn, which, knowing the number of 13-year-old boys that I do, I find very unlikely, just hardwire their seats into the mains and give them a 'little' jolt for each picture viewed.

    You'll find that this aversion therapy was highly successful in curing homosexuals in the 50s, which is why it's now damn near impossible to buy chaps in downtown San Francisco...
  • by Alien54 ( 180860 ) on Wednesday January 24, 2001 @06:33AM (#484353) Journal
    They have to know that Filtering does as much harm as it does good.

    A better solution is to have the computer out in the open where everyone can see it.

    For kids at home, this would be in the family room. [the rule of thumb at home is that, if you can buy your own, then you can have it in your room. If mom and dad buy it, it is in the family room.]

    For libraries, this would be not in the side room, but in the main hall where every passerby can see. Since this is a public space, maybe the illusion of "privacy" there should be dispensed with. This thought has problems both ways, though.

    But this would certainly enforce "community standards."

    Now there is an issue as far as making sure that there is no one hacking the system. But that is a sysadmin security issue, not a filtering issue.

  • by still cynical ( 17020 ) on Wednesday January 24, 2001 @06:33AM (#484354) Homepage
    [sigh] Ok, I'll type this slowly...

    Because filters don't work, and block "legitimate" information along with what you consider "porn".

    Because filters block sites based on arbitrary criteria chosen by people with agendas.

    Because we all pay for public libraries, not just those of you who want to decide what we all see and learn.

    And oh yeah, because it's unconstitutional. Deny it all you like, it's censorship based on content.
  • The recent court cases (see http://www.aclu.org/news/2000/n062200b.html) very clearly show no tolerance for this sort of thing. Is this just more grandstanding for the voters?
  • ALA moved it. The filename originally had a space in it, and their link had a space (not a %20) as well, so it would only work on IE and not Netscape. I linked to the URL with %20 .... but obviously they got complaints and changed the filename. Anyway, I've fixed the link in the story.
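    For anyone wondering why the space mattered: a literal space is not legal in a URL and has to be percent-encoded as %20, which is why the unescaped link broke in some browsers. A quick Python illustration; the filename with the space is only a guess at what the ALA's original looked like:

    from urllib.parse import quote

    # Hypothetical filename containing a literal space, as the ALA's original did.
    filename = "FCC Rulemaking.pdf"

    # A literal space is not valid in a URL; it must be escaped as %20.
    print(quote(filename))   # -> FCC%20Rulemaking.pdf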
  • by SquadBoy ( 167263 ) on Wednesday January 24, 2001 @06:40AM (#484357) Homepage Journal
    Because it is not about blocking porn. And in any case, define porn: some of the pictures on a breast cancer site would be considered porn by someone like my grandmother. Again, it is not about blocking porn, it is about blocking any information that the government considers to be bad for us. It would be my thought that the number of people surfing porn in the library is pretty low (kind of hard to do the one-handed typing thing there). This is not about porn, it is about anything W. thinks should be banned, and that is *a lot* of stuff.
  • If this thing holds up, there may be a real value in open-source blocking software. As unpleasant as it may seem to support censorship, it could, at least, offer the following:

    • You can actually find out what is being blocked, and why.
    • You can choose and modify your list (especially if you disagree with it).
    • You could, possibly, define your own criteria for automated blocking.
    • There would be no NDA to deal with.
    • Kids could have realistic input into what is considered 'harmful' to them.
    Self-censorship -- Isn't it wonderful?
    `ø,,ø!
  • by pongo000 ( 97357 ) on Wednesday January 24, 2001 @06:46AM (#484359)
    One day, someone on the North Texas Linux Users' Group m/l posted a request for filtering software that would run on Linux so they could filter their home service. Instead of recommending software that is next to useless [peacefire.org], I helpfully suggested that he simply put the computer in a public place and communicate with his kids about the Internet. Set boundaries and limits, and if they're youngsters, be proactive and monitor their activities.

    I was mercilessly flamed for daring to suggest that parents become involved in their children's activities on the 'net. I was seen as accusing someone of bad parenting because I had the gall to suggest practical solutions over commercial snake-oil.

    Given this attitude, I would imagine this rule will fairly sail through the rule-making process. We certainly can't have parents taking on the responsibility of monitoring their kids on the web, or even instilling in them the moral values necessary to differentiate between what parents consider to be acceptable and unacceptable content.

  • by rw2 ( 17419 ) on Wednesday January 24, 2001 @06:47AM (#484360) Homepage
    For Libraries, this would be not in the side room, but in the main hall where every passer by can see. This being a public space, maybe the illusion of "privacy" in this very public space should be dispensed with. This thought has problems both ways tho.


    Yes it certainly does. For starters, I wouldn't want my borrowing history to be broadcast to everyone. Why? Wrong question. It is my business what I read, and the idea that I must justify that is wrongheaded. The question is whether it is in the best interests of the nation to know what I read. The answer: it isn't. Therefore I demand that my privacy rights continue to be respected.

    How is putting the Net box in a public place any different? One example: what if I'm too poor, too old, too backwards to own a computer (maybe I just like to read paperbacks when I'm at home), but my wife comes down with breast cancer? So, being the caring guy that I am, I trek down to the library and want to do some research so we can ask good questions of the oncologist during the course of treatment. Again, I see no reason why I should have to justify my right to privacy, but this is a good example of a time when I really don't want the entire neighborhood looking over my shoulder enforcing 'community standards'. I just want to do my research, print some stuff out, and go home. It's my computer, my taxes paid for it, and I resent 'the man' telling me where and how I can use it just as much as I would resent him telling me what books are fit to be published.

    --

  • Minors may not use unfiltered computers with Internet access, even if a parent gives permission or a teacher deems the site to be wrongly filtered.

    You're kidding me.

    You're not kidding me?

    Wonderful way to protect censorware companies from having to actually update their lists. Also a rather interesting removal of individual power from parents and teachers. I guess a lot of bleating about "giving educational choice back to the parents" only extends as far as where parents can spend their money, not what their children can look at, even when the filtering software is clearly wrong in the eyes of the adults involved.

    "My son needs to look up 2600.com for a project on hackers." "My daughter is doing a research project on breast cancer." "Why can't I see any sites about Wicca? I'm doing a project for religion class." Legitimate queries, all of which under law can only be met with, "I'm sorry, but I can't shut down the filter."

    Consider that many filters not only try to block sexual pornography, but often sites about other religions, left- and sometimes right-wing political philosophies, gay and lesbian sites, peacefire.org [peacefire.org], and anything else that doesn't fit the political agenda of the software company's target market. Fortunately, some (most?) filters allow the degree of blocking to be tailored, but how often will the administrator in a library or school be knowledgeable enough not to just block everything out of worry or paranoia?

    Yeah, it'll scare off the porn jockeys, but they can just claim to be looking up something else and have the filters turned off anyway. Not to mention that intelligent, knowledgeable students (the existence of which I'm sure will be considered dangerous to the public good, since they tend to know realfacts, not goodfacts) who already know how to get around filters will do so, and probably get accused of "hacking" by teachers who should know better.

    If I were an American resident, I'd already have sent off some rather long letters to representatives about this moronic law, and I'd send a comment to the FCC now telling them that software that serves to restrict political and religious freedom should not be implemented in taxpayer-funded facilities. Find a filter that blocks only pornography, or recommend that filters be set only to block sex and extreme violence (and hope the latter doesn't block sites about the horrors of the Holocaust). Of course, I'd love to say "don't follow the law at all, it's badly conceived", but I somehow doubt it would be taken seriously.

    DMCA, UCITA, CIPA - government of the people, for the people?

  • Putting the computer out in the open would be a bad idea. There are some people who might be doing legitimate research on subjects which are inappropriate for children to see. If the computer were in the main hall, there would be no way to prevent a passerby from viewing the web page.

    A better way to handle the problem is to use monitoring software which watches for certain sites and alerts a supervisor if those sites are hit. The supervisor could then take action accordingly.
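    Something along these lines, purely as a sketch: the log format, the watch-list entries, and the flagged_visits helper are all invented for illustration, not any real product.

    # Illustrative only: scan a proxy/access log for visits to watched sites
    # and flag them for a supervisor instead of blocking them outright.
    WATCH_LIST = {"badsite.example.com", "worse.example.org"}   # hypothetical entries

    def flagged_visits(log_lines):
        """Yield (timestamp, terminal, host) for each visit to a watched site."""
        for line in log_lines:
            # Assumed log format: "<timestamp> <terminal-id> <host> <path>"
            timestamp, terminal, host, _path = line.split(maxsplit=3)
            if host.lower() in WATCH_LIST:
                yield timestamp, terminal, host

    sample_log = ["2001-01-24T10:15 term03 badsite.example.com /index.html"]
    for visit in flagged_visits(sample_log):
        print("Supervisor alert:", visit)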
  • It should be noted that this ruling specifically states that all internet activity by minors should be monitored, but NOT in libraries. So at least they had some sense that privacy should be respected for adults in a public setting. I don't think that this is a bad thing at all. When sending your child to school, you should not have to worry that they would have access to content on the internet that you would otherwise restrict. I doubt if any teenager would have much of a problem circumventing the restrictive measures that would have to be implemented given this ruling.
  • As an Australian, I am impressed that the FCC is interested in listening to what the American public think of their proposed laws... If only our equivalent body would do the same, with a number of issues -- censorship, telco deregulation, etc etc etc... Of course, it remains to be seen what the _general public_ will actually say -- there are louder voices than us nerds -- and it remains to be seen if the FCC will actually care what they hear... rr
  • I can't agree with filtering in public libraries, on the basis that we do pay for them and that filters have too much of a tendency to block out very useful sites. In the schools, and this only applies to my personal belief, I don't mind the filters. Schools are a place set for our children to learn, and as a parent, I don't want my child to have easy access to certain things. True, with a filter there is still easy access if you know where to go and what to do, but at least there is some effort. I don't desire to take away first amendment rights, but don't take away my right to be a parent and have my children receive an education in an environment that I desire. Of course, that's just my opinion, I could be wrong. (copyright Dennis Miller)
  • Does anyone know what filtering solution(s) they're planning to use? I didn't see anything in the PDF file, but maybe I'm not looking hard enough. More generally, do they plan to use web filters that block based on a list of "unsuitable" URLs or addresses chosen by some organization, or will they just filter for keywords? While I'm fishing, does anyone know anything about libraries' existing policies for determining what is and isn't suitable for display in a public institution?
  • I'm the son of a librarian, and I know firsthand that they are trying to stop this in any way they can... but should that not work, we're screwed, because I can research virtually nothing for my anatomy course... unfortunately, they get their internet connection through the local school, who has some kinda filtering software up... it's really screwed up, you can't go to half the 3rd party candidates' websites, 2600 [2600.com], or even the Onion [theonion.com]. It's really annoying, because I got one of the librarians hooked on red meat [redmeat.com], and now we can't access it!
  • ??? This was signed by Clinton. ??? Not that I'm saying W. is anti-filtering or anything, but it's stupid to spin things onto the new president that he didn't even create.
  • by jpm242 ( 202316 )
    Unfortunately, I can't submit a comment to the FCC since I'm not an American citizen, but I would still like to express my opinion.

    I don't believe that widespread filtering is a solution to anything. What would that filter actually filter out? Has image recognition software evolved so much that it can differentiate between porn and art?

    For text recognition, how does software tell the difference between acceptable and unacceptable? By using a vocabulary blacklist? Does that mean that online dictionaries will not be accessible from libraries because they contain words like "nazi", "devil", or "murder"? Will the words be taken out of context? If they're taken in context, how does the software draw the line between a site promoting heinous crimes and another simply relating to them in a journalistic fashion?

    Will they filter out IP addresses? If so, who decides what's on the black list? Will Slashdot be on that blacklist??? (There was an article about virtual kiddie porn yesterday that contained some "bad" words).

    What is scary about all this is that if the filtering is done by machine, there will be mistakes, and "good" sites won't be available while "bad" sites will be accessible. What is even scarier is that if the filtering is done using a human-generated blacklist of websites, the censorship will actually be tainted and subjective, reflective of the censoring party's ideological, moral, religious and political beliefs, which is not democratic at all. And since America just "elected" a right-wing president, this can just go sour, IMHO.

    I believe in a free Internet, where information flows freely, uncensored and unfiltered. The Internet is neither a babysitter nor an educator.

    That will be all.
  • by swordgeek ( 112599 ) on Wednesday January 24, 2001 @07:05AM (#484370) Journal
    ...but mandatory legislated filtering is!

    I've said before that librarians have generally done a fine job of selecting what they do and don't want in their libraries over the centuries. Why don't we work at developing generic tools which can be set as appropriate by the librarians, if they so desire? I'm not normally an open source zealot, but this is one case where it's absolutely essential that the code be open for review, and you can bet that FCC censorware (along with its stupid rules) won't be.

    Disclaimer: I'm not a citizen or resident of the USA.

  • The Federal government adds a tax to my telephone service, to fund fat pipes to libraries and schools. This is cool, because my community gets to use the schools, and I get to use the library! But wait! Now that the government is so generously giving my money to schools and libraries, the government says "Ah Ah Ah! You have to follow our rules to get this money!"

    I get it. The government takes my money in the name of the common good and uses it to micromanage the institutions that are out there doing the good. Where have I seen this before? Oh yeah, Federal Financial aid for education (Title IV). Either way, we pay for education, but now the money filters through the Feds' hands and arrives with strings attached.

  • It is true that it was signed by Clinton, and if he were still in office my comment would have said Clinton. The simple fact is that, with the exception of Harry Browne, every person running last year would have been in favor of this and would use it to further their agenda. Clinton signed it hoping Al would get to use it, but W. will use it. That is why I said what I said. The *only* reason he will not create it is because it was already created for him.
  • The problem is the size of the blacklist. Think of how many items have to be on it. Hundreds of thousands. Maybe millions. That's a major database project, and no existing censorware company has been able to run such a thing even nearly successfully.
  • The Minneapolis Public Library main branch is well connected and terminals are in the main hallways off the foyer. Gentlemen with poor hygiene and sub-standard housing can be witnessed at these terminals surfing pr0n for much of the day.

    It got so bad that women on the otherwise largely liberal library staff complained to the board that it was creating a hostile work environment.

    I'm not advocating anything here, just an observation about the effectiveness of an open floor plan as a deterrent.
  • How about a little signal to all this noise on this topic?

    After reading about Holland, MI, USA on this site last year (or so), I decided that it would be in my best interest for this 25 yr. old h*cker to 'get involved'.

    We recently moved (2000-12-01) to a new town and I read in the paper that the Mayor was taking applications to fill the city boards, one of them being the Library Board. So, even though I had zero experience and really didn't know anyone in town, I applied.

    On the application I mentioned my tech background, spoke about the issues of filtering and mentioned that the city could use a few people who 'understood the technology and the issues at hand'.

    After a couple of interviews with the mayor, I was appointed last week! You're looking at the newest Library Board member (3 year term) for the city of Northfield, MN.

    Filtering proponents beware! You have no friend here!

  • I hate to reply to my own post, but I just realized something... the filtering is done at the school, and piped to the library... so, even if an adult comes forward and asks to work without the filter, they can't! How is this going to affect things?
  • by Jeremy Erwin ( 2054 ) on Wednesday January 24, 2001 @07:12AM (#484377) Journal
    The regulation is designed to enforce the "Children's Internet Protection Act", which was signed by President Clinton on 21 Dec 2000. Although the FCC could have declined to issue regulations implementing this rather draconian legislation, they have not. (They could have been sued by Congress for failing to do so.) Besides, I really don't think that Shrub is all that much more receptive to free speech and privacy claims.

    The CHIP act requires that Internet connections be filtered for both adults and children, in both schools and libraries that receive discounts on Internet connections & equipment. (Mind you, I'm not falling for the "we paid for it, so we get to regulate it" fallacy. After all, I paid for some of it, and I support uncensored, full access at public libraries.)

    Additionally, it requires that Internet connections at schools (but not libraries) be monitored. And it requires both institutions to certify the existence of an "Internet safety" policy. In the spirit of France's "Committee of Public Safety", this apparently means that librarians are required to eavesdrop on electronic mail.

  • If I set up an ISP, I'm going to supply my local library with Internet access. This is ridiculous. What happened to free speech? Wasn't this country founded on the ideals of freedom of speech and tolerance? A lot of people don't want this; unfortunately, the people who do are in the positions of power to make it happen... when did elected officials stop being the servants of the people, and start representing only one interest, their own? These representatives are supposed to work with the people, to help them get their voices heard, and represent them and their interests in government. If they can't do that, get rid of them, and have a true democracy... if something doesn't work, why keep it? (I know that there are those who DO represent the people, but I'd guess that they're in the minority, considering what I've been seeing.) Back to the main point: filtering in a public forum, where people who can't afford their own access to the Internet go, will just create more of this "Digital Divide", this rift between those who have access to information and those who don't. That's exactly what this country DOESN'T need. We need to help those people who can't afford it get it, because the Internet is the portal to the world.
  • What if I'm too poor, too old, too backwards to own a computer (maybe I just like to read paperbacks when I'm at home), but my wife comes down with breast cancer? So, being the caring guy that I am, I trek down to the library and want to do some research so we can ask good questions of the oncologist during the course of treatment.

    But why can't you do such research the "old-fashioned" way - with BOOKS? Yes, the library is there to provide services to the public. But those services don't include allowing minors to view pornographic material. I don't know about your library, but my library doesn't carry porn magazines. When you go through the magazines in my public library, you don't see "Hustler" up there with "Home and Garden" or "Pussy" on the same shelf as "Scientific American". There has already been some filtering in the sense that those types of material are simply not in the library's collection to begin with. They have been filtered out.

    And you don't need the web to look up articles on breast cancer or find a book that will let you read up on appropriate questions. If that material happens to be filtered out on the web, it is merely a means to justify the ends. And many libraries (mine included) charge you the same amount for printing as they do for the copy machine. So you're just the same financially, whether you photocopy info from a book or print it off the web.

    I can see how the web is a helpful research tool, and why it is becoming more and more important to students. But the idea of having computers in the library is not for personal entertainment. The computers in libraries are for those who want to honestly do some research, and for those who feel they cannot go through the traditional route of research through books, magazines, film, etc. The filters are there to catch some of the nastier things out there on the net. Yes, some things not really considered "porn" will be filtered out as well. But if you have a legitimate topic (breast cancer research), you can still find legitimate, relevant, current information in books and magazines. The web is not the end-all-be-all of resources in the library. And it's not the only form of filtering that goes on at a public library.
  • by Speare ( 84249 ) on Wednesday January 24, 2001 @07:19AM (#484380) Homepage Journal

    FCC Request for Comments
    Docket 96-45, Proposed Rule Making re: CHIP Act.

    I am pleased to find that the FCC is requesting public comments on docket 96-45, a rulemaking proposal related to the CHIP Act. The following are my comments on mandatory computer-based censorship, and why I feel that it would be both dangerous and unethical for our government to filter the information available to any of its citizens.

    Firstly, computer-based censorship has been proven entirely unfit for the task of accurately blocking or sanctioning either text or photographic content. Computers are not able to make subtle distinctions between HIV research, breast cancer studies, images of people in clothing, or discussions of firearm safety.

    A Supreme Court justice once said of pornography, "I know it when I see it." That's just about the closest we as a society have come to specific standards on what is obscene, and what is not obscene. To some people, various images are purely artistic. To some, the same images are an abomination to their religious or moral fiber. With that dichotomy in place, and that inability to form a sensible and exact definition of obscenity, is it any wonder that computer software cannot be made to comply?

    Put in simple terms, computers cannot be offended: it's not the censorship computer program that is doing the filtering of offensive material, it is the human creators of the censorship program who decide what other people may or may not access.

    That brings us to the question of who watches the watchers. If it's not the censorware that is doing the censorship, then it is the proponents and designers of the censorware who choose what to hide from you. What political slant or prejudices are these people harboring as they design their filters? What bias or prejudice are you entrusting with the filter?

    It has been demonstrated often that many software filters available on the market today block any mention of competitors, and also block any news story or website of the opponents of such content filtering. It may be the government's intent to block pictures of bestiality. It is in the best interests of those proponents to also block any dissent with their own opinions or objectives. If the same software blocks legitimate discourse about whether censorship is ethical, how can anyone learn more on the issues of freedom and democracy and the rights of citizens?

    Government-mandated filtering via a commercial product means that a private company becomes a de facto government bureaucracy. Many software companies have only a couple dozen or a few hundred employees. Think of the required complexity of ensuring that several million, if not billions, of websites are blocked or allowed according to government-mandated standards. Think of the added complexity of oversight, reviews, and policy changes, as different local and national standards are introduced. The world wide web (just http: pages) grows and changes by hundreds of thousands, perhaps tens of millions, of pages every day. Can the government keep up with those changes?

    If a government sets the standards for what to filter, then the government opens itself to be responsible and accountable for every blocked website. There will be millions of lawsuits where website creators feel they are being censored unfairly. The infrastructure to provide accurate and daily updates and access to the operations of the filter will be required, and inaccuracies will have to be addressable.

    Public libraries have a mandate to provide open and unfettered access to materials to all its patrons. Public libraries do not have a mandate to babysit our children while we run off to the market. If a parent cannot trust their child's choices in conduct, then the parent should supervise that child. It is not appropriate for the government to play nanny to these children.

    In closing, I'd like to quote from a favorite portion of a famous document. It's not taken out of context; the author's points were very clear and concise.

    • The Constitution of the United States

    • Amendment I
      Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the government for a redress of grievances.

    It is important for this government to conduct itself as the founders intended. A government of the people, for the people, by the people. Thank you for your attention in this matter.

    Signed,
    Ed Halley

  • by robbway ( 200983 ) on Wednesday January 24, 2001 @07:19AM (#484381) Journal
    Most, and I'd guess all, schools and libraries have the ages of their students and borrowers on file. It's kind of required to get a children's or adult's card, or to attend school.

    Why not implement a log-in procedure and tie it in with the card numbers and student IDs? Anyone over the age of 18 should not be filtered at all. Under-18s could be filtered by default, but I'm a great proponent of having only the parents control the filtering option. Oh yeah, that would require parental attention.
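    A minimal sketch of that log-in idea, assuming the library can look a card number up in its patron records; the records, card numbers, the 18-year cutoff, and the parental_override flag are all hypothetical, not any real library system:

    from datetime import date

    # Hypothetical patron database keyed by library card number.
    PATRONS = {
        "1234567": {"name": "A. Adult", "birthdate": date(1970, 5, 1)},
        "7654321": {"name": "K. Kid",   "birthdate": date(1990, 9, 15)},
    }

    def filter_profile(card_number, parental_override=False, today=None):
        """Return the filtering profile to apply when this card logs in."""
        today = today or date.today()
        birthdate = PATRONS[card_number]["birthdate"]
        age = (today - birthdate).days // 365          # rough age, fine for a sketch
        if age >= 18:
            return "unfiltered"                        # adults: no filter at all
        # Minors: filtered by default, unless a parent has turned filtering off.
        return "unfiltered" if parental_override else "filtered"

    print(filter_profile("1234567", today=date(2001, 1, 24)))   # unfiltered
    print(filter_profile("7654321", today=date(2001, 1, 24)))   # filtered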

    I'm not sure why my library has to use filtering software at all, since all of the supposed types of no-no sites are already against library rules to view anyway. Penalty: no computer privileges.

    ----------------------

  • It would appear that a .pdf of the enacted text of the Children's Internet Protection Act can be found here [gpo.gov] (check in the middle of the page, under "McCain Amendment No. 3610").

    My question is this: would the ICRA voluntary content rating system [icra.org] qualify as a "technology" under the bill sufficient to comply with the statute and the regulation? If so, the thing to do is to head to your local library and get them to implement ICRA, a self-rating and filtering tool based on voluntary disclosure by the page author of web site content. A quick and dirty summary of ICRA can be found here [law.com]. I'm interested to know what /.ers think about this possible compromise.

    --J
  • So, what does this requirement mandate: that the filter block all content that falls in the requested categories (impossible), or only some content that falls in the requested categories (e.g. a minimum of one item)?

    Also, isn't material only legally obscene when it has been deemed so by a court? Or am I confusing US law with Canadian law again? =)

  • Talk to them, why? Except to say, "Good job," that is.

    Absolutely! Because the majority of librarians oppose this law.

  • ...to submit my comments about what a great idea this is. I'm glad that my tax dollars will no longer go to subsidize people looking at pr0n in public libraries.
  • Call me crazy, but I could have sworn that I already had one of the best filtering programs available to mankind: PARENTAL RESPONSIBILITY! Being a parent, it is my job to teach values, morals, and common decency to my kids. It is not up to their teacher, librarian, or government to raise and control my kids or myself. That is a growing problem in the world today. Too many people either don't know how or choose not to raise their own kids, instead allowing others to do it for them, either directly or indirectly, as in this case through censorship. I surf porn on a daily basis; my kids surf the Internet almost daily as well. The difference is that they are responsible enough to know what is appropriate for them and what is not (according to ME, the parent). Yes, from time to time they run across a site that is questionable, and for the most part they either close the window or ask either myself or my wife about the page. I think that before we start mandating filtering programs, we should mandate responsibility among parents. Just my 2 cents with Deflation.. Zaphod
  • In a society, lots of things happen that you disagree with. That difference of opinion does not give you the right to legislate my freedoms. Moreover, when did the purpose of libraries become solely "research and education"?

    One small example of how well censorware works: at a public computer the other day, it blocked the course listings website of the University of Toronto's Department of Religion. I am not sure why--perhaps the page had mentions of sexuality. Check it out [utoronto.ca], and tell me if this is valid "research and education".

    Of course, censorware also blocks Slashdot discussions frequently, as well.

    heinzkeinz
  • But why can't you do such research the "old-fashioned" way - with BOOKS?

    Because I don't live near a medical library, just a community one. Cancer treatments change far more rapidly than my local library's book collection.

    Even if I did live near a teaching hospital with a good library, the web often precedes paper publications.

    When you go through the magazines in my public library, you don't see "Hustler" up there with "Home and Garden" or "Pussy" on the same shelf as "Scientific American". There has already been some filtering in the sense that those types of material are simply not in the library's collection to begin with. They have been filtered out.

    You are confusing two issues here. It is one thing to actively spend money on things the community most values (Scientific American v Hustler) and quite another to actively spend money on censorship (filters aren't free).

    I can see how the web is a helpful research tool, and why it is becoming more and more important to students. But the idea of having computers in the library is not for personal entertainment.

    1) I see nothing entertaining about breast cancer

    2) You are completely wrong. The library is there for personal entertainment as much as any other reason. That's why they have such a large fiction section.

    --

  • So, you don't mind that web sites for certain environmental groups, religious groups, political parties, etc. that are blocked by every major filtering package will no longer be accessible?
  • by Sodium Attack ( 194559 ) on Wednesday January 24, 2001 @07:32AM (#484390)
    ...please, PLEASE, PLEASE read the Notice of Proposed Rulemaking, particularly the "Issues for Comment" section.

    The FCC is seeking comment on fairly specific issues. The FCC does not have the authority to choose not to enforce this law at all. Their hands are tied; they cannot overturn an act of Congress.

    So comments like "this law is stupid and unconstitutional," while true, will have no effect on the FCC's rules. Even if you convince people at the FCC that it's unconstitutional, they must still enforce it. (Legally, a law is presumed constitutional unless and until declared unconstitutional by a court.) The place to challenge the law as a whole (rather than just the particular way in which it is implemented) is in the courts, which is already being done.

  • With any comments you send to the FCC, I would include a CC: to Sen. John McCain, since he is the one who put the language into the appropriations bill in December. You can email him at john_mccain@mccain.senate.gov [mailto].
  • I agree that any filtering will not work perfectly, but that's not a very good reason. Hardly any software works perfectly either, but a lot of software is good enough. Similarly, no computer security is perfect, but we still try to do the best we can.

    Personally, I have no problem with filtering software provided it is implemented properly and openly, so that you could see what agenda was being pushed. In particular, I would like to see that the librarians have the capability of adding or removing any sites.

    As for using a public resource any way you want, why don't you try that in a public park and find out how amused the police are?

  • The FCC seems to have gone out of their way to ignore all of the DMCA comments which didn't come from big business. Any reason to think they'd take us seriously this time?

    This is meant as a serious question, although I can see how it might be interpreted as trolling...
    --

  • I only skimmed the articles mentioned, but IMHO there wasn't much being said on the details of the filtering. A lot depends very much on the nature of the filter to be used. There wasn't much info on exactly what guidelines they plan to filter on, and which filter they wanted to use. Something the government's going to cook up? Something the librarians will determine? Something compromised in-between? I think the distinction between what is pornographic and what isn't is different for different groups. And the library has for years dealt with issues of censorship. I mean, libraries already filter out porn magazines and there are certainly no porn films in the AV section (although I only have my public library to base these observations on). In a sense, there has been some slight filtering in the libraries already. The question is how to make that leap from porn magazines to porn sites, and what might also be unfairly censored along the way. I think it's pretty much agreed that your 6-year-old kid shouldn't be looking at porn. And parents have the right to expect that a city-funded public place will not expose their child to porn. The question is, how exactly does one create a methodology to tell a hardcore porn site from a medical anatomy chart on the web?

    I'm not saying I know the right answer - I know how I would go about dealing with that distinction, but clearly it's a different process for everyone. I find I can agree with a filter only if that filter takes into account all these different views on what is porn and what is valid research material. And only if that filter is based not on a single group's input, but the input of many organizations and libraries. Though that's probably too much to hope for.
  • by unclei ( 55647 ) on Wednesday January 24, 2001 @07:45AM (#484395)
    But why can't you do such research the "old-fashioned" way - with BOOKS? Yes, the library is there to provide

    That's absurd. Just because some hyper-conservative parents out there might be terrified of the idea that little Johnny might see *gasp* a bare breast, the entire adult population should happily accept censorship of an entire medium? Sorry, but no.

    Say all books with *wicked naked people* (including Cancer and You and Living With Breast Cancer) are banned from public libraries. That's no big deal! Hey, you can read up on that stuff in magazines, right? What are you complaining about?

    The point, which has already been made in earlier posts, is that this is
    1) Unacceptable censorship of material adults (and in many cases even children) have the right to access.
    2) A promotion of an ineffectual technology to solve an exaggerated problem which can really only be solved by parents getting up off their sofas and becoming involved in their children's lives.

    But nobody ever got (re-)elected by telling the parents of America that their children are running wild because they're bad parents. So the rest of us have to live with these ridiculous nostrums and snake oil cures. *sigh*

  • Does anyone have a standard letter or comments we can cut and paste in there? I'm sure I'm not the only one too lazy/inarticulate to get my point across.
  • by kenf ( 75431 ) on Wednesday January 24, 2001 @07:50AM (#484397)
    If we are going to be stuck with this crappy law, let's see if we can get the FCC to insist that any filtering software used in public libraries or schools publish what is being filtered.

    That way we can know what is going on when a breast cancer awareness site, or the ACLU site gets filtered out.
  • I agree. When Clinton signed the Communications Decency Act (yes, it DID pass before we got it booted), the new law automatically blocked the White House's children's site (because it said Bill and Hillary were a COUPLE [bad word there; if you are using filtering software, you might not see it]).

    Besides -- I don't know how long it has been since any of you were in school -- but a lot of classes require research into things that would be blocked. First aid pages could be blocked. Health pages [including STD information] could be blocked. Hacking/Unix pages might end up getting blocked within the year.

    Or is everyone forgetting that the schools and libraries THEMSELVES are against it?

    And besides -- what right does the FCC have to say that the school has to put in a Filter if I (as an ISP) give them a discount for being a school? "No sir, we can't give you an educational discount because we don't believe in Censorship." That's BS.
  • Actually, it's *not* unconstitutional. The library or school itself has the right to decide what can and cannot be allowed on its computers, based on "community standards" and local ordinances. The Federal government cannot legally outlaw adult-oriented websites, but localities certainly do not have to permit obscene material in public places. Read the First Amendment. Congress is not prohibiting the freedom of expression here. They're simply regulating the forum. A recent Supreme Court decision upheld a local ordinance requiring exotic dancers to wear pasties and g-strings. I don't see too much difference here.
  • Because filters don't work,

    To say that filters "don't work" just isn't true. They do work. They aren't perfect, but they work well enough to stop circumvention by most casual users.

    and block "legitimate" information along with what you consider "porn".

    Fix this problem where possible, but running away screaming from filtering just isn't going to advance the technology at all - which is probably what you want.

    Because filters block sites based on arbitrary criteria chosen by people with agendas.

    I'll admit that I don't always agree with the decisions made, but to call them "arbitrary" suggests that you have an agenda as well, by engaging in belittling and hyperbolic rhetoric against the ideas that you oppose.

    Because we all pay for public libraries, not just those of you who want to decide what we all see and learn.

    If you want the local library to get Pay Per View so you can watch Wrestlemania XXXXIV, you can go to hell. Pay for your own god damned casual entertainment. Public libraries are great institutions as repositories of knowledge available to those without the means to gain that knowledge on their own. If we subvert the legitimacy of the Public Library system by catering to every entertainment whim out there, we shall destroy society's need and support for the institutions, and we shall eventually destroy the institutions themselves.

    And oh yeah, because it's unconstitutional. Deny it all you like, it's censorship based on content.

    Last I looked, you can't get Hustler, Penthouse, or any other "adult" magazine at the local library. Why should available content via the Internet be any different?
  • Clearly this rule should require the libraries to buy only filters which have been certified by their vendors to be accurate and to NOT violate the 1st Amendment. Which means no filters -- since such a certification would expose any of the lame filters in existence to being sued into bankruptcy....
  • My hat is off to you. While most of us (myself included) sit here and moan and groan about the stupidity of this measure, you have actually put yourself in a position to help shape public policy. I applaud your initiative.
  • E-Rate is a subsidy paid for with a tax on your long distance phone bill. Schools and libraries must submit piles of paperwork to apply for grants. It is not a discount!

    When schools and libraries apply for E-Rate grants, they already agree to many restrictions on how they can use the provided Internet access. Filtering is just one more rule.

    Under the current rules you cannot use E-Rate Internet access for religious education. Why is religious freedom less important than porn?
  • Typical... /. glorifies removal of all censorship, yet the moderators moderate me down so that others won't see what I have to say. So typical.

    Way to go, /. moderators.
  • Yes, it IS unconstitutional. We are not talking about obscenity, we are applying overly broad and SECRET standards based on content to what adults can read and see in a publicly funded library.

    Content blocked on political views by federally-mandated tools is about as blatant as you can get.

    Oh, and when you start talking about publicly funded exotic dancers being prohibited from discussing their political views in a government building, then you'll be comparing apples with apples.
  • I'd argue that it's because censorware companies generally produce software that tries to do too much. Their default policies seem to be "deny access to this list unconditionally; deny access to other sites/pages that match these rules; allow access to anything not denied."

    For libraries and the like, what you really want is filtering software that instead blocks anything that is not explicitly allowed. That way, you have to involve a human being in the decision process. (A rough sketch follows at the end of this post.) Done correctly, you could reach the point where:

    • Teenager is researching "alternative religions" for a class at school [1]
    • Finds that http://www.wiccan.org is blocked
    • Goes to librarian and explains what they're doing
    • Librarian looks up site, sees that it's not an obvious porn site, and adds it to the temporary allow list
    • Teenager gets to finish his/her research
    • At the end of the day/week/month, the head librarian or board gets to review the temporary allow list and decide if they want to add them to the default list

    Yeah, it's still filtering. The point would be to put the control in the hands of the library itself, rather than in the hands of the commercial vendors. Default library policy could be "If we have a book or a periodical that references this, then it's allowed. Otherwise, use your best judgement, or get the parent's approval."

    [1] Just as an aside, I went to a Catholic high school, and this sort of thing was a common type of assignment in junior/senior level religion classes. Kind of a "know your enemy" sort of thing, I guess.
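    A minimal sketch of that default-deny flow: apart from the wiccan.org example above, the site names, the librarian_unblock helper, and the review-queue idea are only illustrative, not a description of any real product.

    # Illustrative default-deny filter: nothing is reachable unless it is on an
    # allow list; a librarian can add sites temporarily, and the board reviews them.
    ALLOWED = {"www.loc.gov", "www.ala.org"}     # permanent allow list (examples)
    TEMP_ALLOWED = set()                         # librarian's temporary additions
    REVIEW_QUEUE = []                            # reviewed at the end of the day/week/month

    def is_allowed(host):
        return host in ALLOWED or host in TEMP_ALLOWED

    def librarian_unblock(host, reason):
        """Librarian has looked at the site and allows it for now; queue it for review."""
        TEMP_ALLOWED.add(host)
        REVIEW_QUEUE.append((host, reason))

    print(is_allowed("www.wiccan.org"))          # False -> student asks the librarian
    librarian_unblock("www.wiccan.org", "class project on alternative religions")
    print(is_allowed("www.wiccan.org"))          # True  -> research can continue
    print(REVIEW_QUEUE)                          # later: promote to ALLOWED, or drop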

  • OK, on two points:

    > "...librarians have generally done a fine job
    > of selecting what they do and don't want in
    > their libraries over the centuries".

    1. I have a public library down the block, with unrestricted Internet access. I once looked at the history out of curiosity: pr0n, pr0n, free email, pr0n. If you expect our 3 overworked librarians, who wouldn't know what a URL is, to be of any help in restricting pr0n usage, you're very mistaken.

    2. I completely agree that the most correct response from the community should not be bitching and moaning, but writing competent open source blocking software.

    -DVK

  • Which means the Canadian Internet Privacy Act applies to me, especially since I'm a dual citizen (USA/Canada).

    You can take away my web links - but you can't take away my freedom!

  • The problem with filtering is that all current solutions are controlled solely by private entities, often with unstated agendas.

    In a situation like that, how is "local control" possible? If a provider decides to block political views it disagrees with as "obscene," how can the local library control that?

    The problem isn't the technology, it's the people using it. Is "running away screaming from filtering" (speaking of belittling and hyperbolic rhetoric) the solution? Let me put it another way: Does it make sense to endorse a flawed solution on the hope that it will improve later?

    Unless you can propose specific ways to solve filtering's flaws, the argument that we should just jump on board because it's the best game in town is specious at best.

  • I sent this in. Here's my comments, for your viewing pleasure. Warning: This is LONG.

    In regards to public filters, my suggestions:

    1. Allow libraries to choose their own filter products.
    2. Suggest or require that libraries use filtering products whose parent company discloses what criteria are used when blocking or allowing a site (many don't), since many sites may be blocked for poor reasons, sometimes even on a whim, or because they go against an individual's beliefs or agenda (even if the sites are not in any way profane or could be considered dangerous). Here are a few articles of interest:
    http://slashdot.org/yro/00/12/08/0238239.shtml about a decrypted filtering list used by SmartFilter, which found a number of sites obviously mis-rated. As with everything else on Slashdot, there are a number of interesting comments appended to this article.
    http://dfn.org/focus/censor/contest.htm A contest, apparently to find the most ridiculous examples of filtering - such as a high school student that couldn't access his own high school's web page from its own library since the filter automatically blocked the word "high" from domain names. The runner up, Hillary Anne, tried to register her email address at Hotmail as hillaryanne@hotmail.com but the filtering software spotted the word "aryan" hidden in her name... A number of other examples exist here, too. Richard "Dick" Armey's Web site is blocked by Netnanny, Surfwatch, Cybersitter, N2H2, and Wisechoice, the very same filters he promotes - because his site contains the word "dick." And, this excerpt from the article:
    The conservative group Focus on the Family intends its anti-porn site Pure Intimacy to be a "resource for those struggling with sexual temptations" and the "psychological bondage" that is "a major reason why individuals go online." Jim K. observed that Cybersitter blocked this site for violating the following categories: porno, hardcoreporno, sexual, nudity, and, of course, bondage.
    I could go on and on, but just visit that contest site - it does.

    http://slashdot.org/yro/00/03/09/133243.shtml about Symantec's I-Gear software and a hack that revealed its true nature - reporting user information back to Symantec and blocking - unnecessarily - a number of .edu sites for no real reason.

    About I-Gear: the high school I graduated from - Celebration - used it, but only for monitoring, since they believed it to be too unreliable and preferred to go after students who continued to visit sites that the staff themselves could deem inappropriate. They could check for sites that were inappropriate, some of which the filtering software would miss, and could ignore sites that weren't really inappropriate at all but were otherwise blocked by the software anyway. I only know of one site that was blocked, which was a message board for people complaining about the town of Celebration, which has a policy that forbids anyone living there from exercising their First Amendment rights involving negative comments about the town (they sign them away). Whether the site was blocked because of over-use or if it was blocked for more political reasons is unknown to me. I also do not know if they continue this more involved practice of checking the records themselves.

    Having read the article on the contest, however, I really can't say I blame them!

    4. Take steps to protect privacy. Notably, Mattel's filter software (Cybernanny, I believe, but I'm not sure) has been accused of spying on the user. A good idea would be banning any software that reports back to the filtering software company. This information is often sold for profit, for uses like targeted advertising or for "spamming."

    5. Allow institutions to selectively add or remove individual sites. Given the above information, I feel that this is more than reasonable. Researching Fibonacci's sequence tripped up one filter, for example - so much for a student's math homework.

    6. Please endorse free and open source solutions. An open product would allow anyone to suggest that a site be added or removed because of its content, and allows for a debate over sites that otherwise could be blocked or allowed on a whim. In the case of open software, a library could instead be required to go through the developer for a change to be made (improving the integrity of the filters). In addition, allowing for different "levels" of filtering might be desired - allowing for different levels of appropriateness similar to movie ratings. For example, while no minor should access pornography, "adult language" isn't really a problem for 16-year-old students (anyone who's ridden on the back of a school bus can vouch that most elementary school students don't have much trouble with it either, but that's another story). As an added benefit, they're free, and libraries, schools, and the like could use the money. Unfortunately, however, I know of no open-source filters at this time (much of the tech community is anti-filter anyway, since they disagree with censorship of all but the most objectionable content).

    7. Allowing users to gain higher levels of access, in order to be able to find more diverse material, upon proving age is probably a good idea. In middle school, my required reading occasionally had some amount of profanity, which nearly all filters would immediately modify or block. Filters would also block a number of perfectly legitimate phrases. Saying "I'll just be in and out of the store really quick" would be blocked by any filter that watches for the phrase "in and out" (and I'm sure that there's a filter out there that does, or will). The upside to this is that a number of areas of interest could be blocked to younger users, such as a study on homosexuality that, while perfectly fine for an older teenager or adult to read, may not be read by children if their parents would prefer they didn't. In addition, allowing parents to selectively add and remove restrictions on various areas of interest that could be considered "gray areas" would be good. While it's fairly obvious that pornography, bomb-making recipes, marijuana growing tips, and the like should be blocked, a number of areas such as human anatomy, homosexuality, religion, and discussions or even readings of banned books (Catcher In The Rye, for example, which many schools and libraries still censor but some embrace as a literary masterpiece) and research reports on the aforementioned topics could be considered "gray areas" that some ages could be allowed or denied access to, and that parents could control more. Allowing a filter software provider full control over what users can see, and deeming everything to be either bad or good (as is done now), is obviously less than desirable. Personally, I don't see what a political campaign and hardcore bestiality have in common, but many programs will block both at once and don't discriminate. Since there is no clear-cut means of deciding what to block, etc., allowing for more flexibility is ideal.

    Personally, I think a standard should be defined for filtering software:

    1. NO reporting back to ANYONE other than local administrators on what users have been doing. Privacy should be protected, and only necessary exceptions should be allowed. Corporations have no business knowing what a child does online. However, the local administrator and certain individuals (parents, perhaps teachers, etc.) should be able to review logs of what a user does (and all usage should be logged! This offers recourse in the event that a site that is inappropriate, but not yet blocked, is visited). No one without a legitimate need to know should be able to review an individual's use of computer resources.
    2. Material should be broken down into "categories" by topic and by how objectionable it is, and a separate standard should be set for which categories are blocked for all users, which for older teens, which for younger teens, and so on. Unless unlocked by an authorized operator (e.g., a librarian), the software should assume that the user at a given station is a toddler, i.e. a member of the most restricted group.
    3. Aside from mandatory blockings (such as porn) that are simply blocked, period, parents and local administrators should be able to define restrictions for the aforementioned "gray areas," topics that may or may not be objectionable to some. While administrators would be able to specify general restrictions by user groups, parents would have the option to selectively modify their individual child's restrictions (aside, of course, from the mandatory blocks such as bona fide pornography.) A standard should be set for what gray areas will and won't be blocked, that should be enabled by default on all software, but that is modifiable by the local administrator.
    4. Administrators should be able to selectively alter site lists. As the previous examples demonstrate, a local administrator is no less trustworthy a judge of what should be blocked and allowed than the software developers, so as long as site visits are logged, there should be no problem.
    5. Word and phrase filters should be avoided, with only the most extreme examples used. For example: "sex" should not be blocked, as the word is often used for gender and appears accidentally within a number of other words; "cum" appears, among other places, in the word "cucumber"; and blocking "breast" would prevent discussion of breast cancer (America Online once inadvertently did this). A rough sketch of how points 2 and 5 might be implemented follows this list.
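    To make points 2 and 5 concrete, here is a rough sketch in Python. The category names, age groups, and placeholder term list are invented for illustration only; this is not a real policy or product.

        import re

        # Hypothetical policy: for each category, the age groups it is blocked for.
        # Category names and groupings here are made up for the example.
        POLICY = {
            "pornography":    {"child", "young_teen", "older_teen"},  # blocked for all minors
            "adult_language": {"child", "young_teen"},                # allowed for older teens
            "gray_area":      {"child"},                              # locally configurable
        }

        # Point 5: only whole-word matches, and only for the most extreme terms,
        # so "cucumber" and "breast cancer" don't trip a substring rule.
        EXTREME_TERMS = [r"\bsome_extreme_term\b"]  # placeholder patterns
        EXTREME_RE = re.compile("|".join(EXTREME_TERMS), re.IGNORECASE)

        def blocked(page_categories, page_text, age_group="child"):
            """True if the page should be blocked for this age group.
            Point 2: an unlocked station defaults to the most restricted group."""
            if any(age_group in POLICY.get(cat, set()) for cat in page_categories):
                return True
            return bool(EXTREME_RE.search(page_text))

        # A recipe page mentioning "cucumber" passes for an older teen; a page
        # categorized as pornography is blocked for every minor group.
        print(blocked({"recipes"}, "slice the cucumber thinly", "older_teen"))  # False
        print(blocked({"pornography"}, "", "older_teen"))                       # True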

    Government-produced, multi-platform software would likely be a plus here. Since this will cost each institution a great deal of money to set up, avoiding for-pay services would be a tremendous benefit.

    I would suggest finding and endorsing an open-source effort to create filtering software and pushing to see the above features implemented, or having the government spearhead such an effort itself. A number of programmers would become involved for the opportunity to protect their children, nieces, nephews, and neighbors while still minimizing censorship that is simply unnecessary. An open-source model, under government supervision and control, would enable tremendous security and quality, minimize the potential for abuse, and allow newly appearing sites to be covered remarkably quickly.

    At any rate, I would appreciate a reply on this. This hits close to home for nearly all Internet users, especially the more technologically inclined like myself. The simple fact is that currently available solutions are not ready for public use and are very much unrefined, so a fair implementation is impossible today.

  • You can put up your strawman arguments about adult magazines all you want, but that has nothing to do with the issue. We're not talking about restricting children's access to adult material, that is a legally accepted practice. We are talking about arbitrary and secret standards for censoring CONSTITUTIONALLY PROTECTED SPEECH!

    I'm sorry if the word "arbitrary" hurts your feelings, but that's what it is. Look it up if you like. Censorware does not use standard, narrow, and specific criteria for blocking sites. Whatever the software makers want to block, they block. They won't tell you what or why.

    "belittling and hyperbolic"? Do you know what those words mean? Can you refute one thing I've stated? I find it frustrating that the people pushing this on us cannot argue the facts, cannot argue the law, they just want it done. We live in a country governed by laws, not the whim of those who make the most noise.

    Take a Constitutional Law class. Hell, high school level should do it for you. You might learn something about the First Amendment.
  • The best part is that you can't look up health information from a lot of hospital and health organization web sites with blocking software in place.

    My wife's a teacher, and when she demonstrated to parents how blocking software worked, a lot of them were actually pissed off that it was so general. The public has the idea that the programs can actually identify porn and only porn.

  • Gentlemen with poor hygiene and sub-standard housing can be witnessed at these terminals surfing pr0n for much of the day. It got so bad that women on the otherwise largely liberal library staff complained to the board that it was creating a hostile work environment.

    hmm

    It appears that only those with "morals" would be intimidated.

    On the other hand, would the librarians be able to make a civil-rights complaint and have the gentleman arrested, on the basis of a hostile work environment?

    I don't know. IANAL

    It almost seems that civil rights are extended so far that we end up protecting the abusers of those freedoms (for example, the guys creating the hostile work environment).

    Do these guys have a fundamental *right* to Internet pr0n?

    Damn it, another can of worms to open.

  • Study your history. Follow the money.
  • Why don't we gather all the highest-moderated posts (excluding jokes, since this is serious), and submit them together as the representative opinions of the /. community? It might carry more weight to submit as an organization.
  • So, I've posted the purely technical comment. Now, it's time for the rant. . . =)

    1) I don't support censorship or any other public expression or public gathering limits by governments (and their agents). They almost universally don't make any sense. (e.g. public nudity laws... can these be justified in any rational manner?)

    2) Censorship laws which target by age group simply add insult (age discrimination) to injury (see 1).

    I live in a country where age discrimination is constitutionally prohibited.

    Oh, but wait: -- "subject only to such reasonable limits prescribed by law as can be demonstrably justified in a free and democratic society" -- a limitation big enough to drive a tank through.

    I foresee that the "demonstrably" adds a lot of complexity. For example, I have a feeling that the pending supreme court decision about the constitutionality of the child pornography possession law will hinge on that one word and its meaning in context (maybe the law has an indirect effect on child pornography production, but if so, can this be demonstrated?). Although, even then, I have no way to know what the court considers justifiable in a "free and democratic" society. =( My free-and-democratic-society justified-law finder box is on the fritz and will be in the shop for repairs until the end of next week. =)
  • by the_olo ( 160789 ) on Wednesday January 24, 2001 @08:51AM (#484417) Homepage
    Heh, that one amused me. This law's name (CIPA) in Polish means "pussy"...
  • This is what more people should do. Far too many people spend their time bemoaning the "sad state of affairs" they are faced with when what they should be doing is taking action.

    I am guilty of this kind of laziness. It seems that with all the things that are happening in the US, now is the time to change. Now is when we should all be taking an active part in our government.

    If this country becomes so restrictive that we no longer have the basic freedoms that many have fought and died for, it is our own fault for letting it happen.

    I commend you for your effort to make your town a better place. You may not receive many thanks now, but your effort will be well worth what you put into it.

    --

  • by lrichardson ( 220639 ) on Wednesday January 24, 2001 @08:52AM (#484419) Homepage
    I agree that any filtering will not work perfectly, but that's not a very good reason.

    EXCUSE ME?!?! There's a slight difference between a spell-checker not having a specific word in it, and a 'filtering' program that blocks thousands of legitimate sites.

    Personally, I have no problem with filtering software provided it is implemented properly and openly, so that you could see what agenda was being pushed. In particular, I would like to see that the librarians have the capability of adding or removing any sites.

    There are some filtering programs that work by blocking specific sites. Pretty bloody useless, too, as the main class they're trying to block is pr0n sites, which change their names on a daily basis. So, unless you've got some company scanning the web 24/365 and sending updates as they come in to every library/school/whatever running their software, it isn't 100% effective - in fact, it's closer to 10%. The second class, which is being pushed far more, blocks based on 'key' words (or on other words that contain them). So forget going to a site on breast cancer, or any site on sexual diseases... the list is huge.

    Perhaps banning porn is a good goal, but the primary software being pushed tramples all over the constitution, if implemented in public places.

    I use libraries for both research and recreation, and I use the web the same way. If what I do is legal, then why the f$ck is it being prohibited in a library? My local library doesn't carry Penthouse, nor does it carry pr0n videos (although it does have some 'soft' literature in the fiction section). This presents no problem - the library has limited resources and chose not to buy them. However, they also have net access. It costs them no more for me to browse Penthouse.com online than it does for me to browse Bookfinder.com. If they block access to a legal product (Penthouse), then that is active censorship, unconstitutional, and illegal. I did run into a situation that was laughable ... research into Goodyear kept getting me the big red X at the library - probably because of the use of the word 'rubber'. Had to continue that work at home. Tell me again how that's implemented 'properly and openly'.

    There are laws in the US about what constitutes illegal pr0n. This mandatory filtering (on publicly funded institutions) goes way beyond that, attempting to legislate software into existence that just doesn't exist. And, while site-specific software doesn't raise my hackles on the 1st Amendment issue as badly as keyword-specific software does, if it's legal in the real world, then it should be legal online. The sole problem here is that pr0n is legal if you're over 18 and illegal if you're under. Libraries don't normally have the personnel to monitor who's browsing what. This could be resolved with 'over 18 only' machines. Personally, I reached the point long ago of getting thoroughly P.O.'d over having legitimate research blocked, and the first thing I do with any library computer is disable the filter (along with the monitoring and anti-virus, but hey! ;)

    BTW, gotta love the new image filtering stuff. The best tests I've seen show that a program randomly blocking 20% of all requests would do better.

  • An even better method would not only restrict access to adult material at schools and libraries - much as adult magazines are not available there - but would also provide better and more secure access overall. This method would be to implement a smart card, embedded with your name, age, and login information, and to equip the computers with a very inexpensive reader for the cards. In order to use the system, one would have to log in using their access card; then, depending on the rights given to that user, access would be granted. No filter, no censorship. If you are of legal age to view it, you can. If not, you don't.
    There are many ways to use the same idea across the entire adult area (which is the true purpose of filtering) to allow access only to those who are permitted to use it. Dennis Zaphod
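    Just to make the idea concrete, here is a minimal sketch in Python, assuming the card simply supplies a verified birth date at login. The field names and the 18-year cutoff are assumptions for the example, not a description of any real card system.

        from datetime import date

        def years_old(birth_date, today=None):
            # Whole years elapsed, accounting for whether the birthday has passed yet.
            today = today or date.today()
            before_birthday = (today.month, today.day) < (birth_date.month, birth_date.day)
            return today.year - birth_date.year - before_birthday

        def may_view_adult_material(card_record):
            """card_record is whatever the (hypothetical) reader returns at login.
            No content filter involved: access turns purely on verified age."""
            return years_old(card_record["birth_date"]) >= 18

        # e.g. may_view_adult_material({"name": "J. Patron", "birth_date": date(1980, 5, 1)})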
  • It's shocking to see so many people submitting to and/or championing this effort at censorship. Do we really want to give the government the power to determine the content of our libraries, and by extension the type of information available to the general public? It seems like a dangerous step to me. Also: why have we become so willing to have the government take over enforcement of matters that should be personal? Is the First Amendment situational now? Do we realize what we're giving up? We're trading a basic right to free and unfettered access to information for the comfort of knowing our kids aren't seeing boobies at the library: does this make sense?
  • This law appears to me to have two potentially conflicting standards. It mentions obscenity, which is a legal term defined at the local level, and which is different from place to place. A library in New York, New York would have to apply a different standard for obscenity than a library in Backwater, Nebraska. It also mentions content 'harmful to minors' which would quite possibly conflict with the obscenity standard in place. In short, I don't think there's a filter out there that can conform to the standards the law requires. It would have to be completely configurable by the library and also allow multiple levels of filtering depending on who was at the keyboard.
  • This is not about porn it is about anything W. thinks should be banned and that is *alot* (sic) of stuff.

    Excuse me? This law was signed by President Clinton, not President Bush. That being said, your anger needs to be re-directed towards the courts now, as this has been law for a while and only they can change that.
  • We all have opinions on this, and it's great to read them all. To say that it is "unconstitutional" to censor content in the public library system is to presume that the library system is actually an entitlement. This is not true. Nowhere in our Constitution does it guarantee that our government will provide library services to its citizens. This is a pure privilege, to be appreciated as such.
    Also, I agree that many of the filters out today block some valid content. But let's get back to the primary issue: the intention is to block pornography! Just because there may not be any filters currently that effectively block pornography WITHOUT blocking valid content does not mean that it is not possible to develop such a system. So let's not rule filters out altogether; let's work on building a more effective filter.
    Secondly, considering that the library is a "privilege" and not a "right," we as a society need to decide what's more important. Can we accept blocking 98% of access to porn for our children at the expense of blocking 5% of valid content to the rest of us? What numbers do we decide are acceptable? No system is perfect.
    I live in Fairfax County, Virginia. We have one of the most advanced library systems in the country. I have been in the library 8 times in the past 12 months, and on at least 3 or 4 occasions I have seen boys between the ages of 13 and 18 viewing pornography on the computers. The computers are in the center of the main floor and are viewable to anyone walking by. It is not unreasonable for a parent in Fairfax County to demand better from its library system... otherwise they might as well start carrying Playboy in the magazine area. Just my opinion... don't have a heart attack.
  • For libraries, this would be not in a side room but in the main hall, where every passer-by can see. This being a public space, maybe the illusion of "privacy" in this very public space should be dispensed with. This thought has problems both ways, though.

    Yes it certainly does. For starters, I wouldn't want my borrowing history to be broadcast to everyone.
    Your "borrowing history" wouldn't be broadcast to everyone, just what you are reading in the library while you are reading it (and probably after it, since I doubt that many libraries set their systems up to flush the disk cache and clear the browser history lists after each patron). It's no different than reading a book in the library rather than checking it out - other people can see what book you are reading. And even the act of checking out a book isn't private - you set a stack of books on the check-out desk, and anyone close enough can see what you are checking out.
    One example: what if I'm too poor, too old, too backwards to own a computer (maybe I just like to read paperbacks when I'm at home), but my wife comes down with breast cancer? So, being the caring guy that I am, I trek down to the library and want to do some research so we can ask good questions of the oncologist during the course of treatment.
    Let's remove the computer from the picture. First you have to find the book on the shelf, so any passerby can see what section of the library you are in. Then you get the book and take it to the reading area. Again, any passerby can see what book(s) you are carrying. Then you sit down and read for a while, with the book on display for anyone who cares to look. Finally, you decide to check it out, again in full view.

    You probably have more "privacy" with a computer in a common area, since there is a narrower field of view for others to see what you are reading; they would certainly need to be closer to read the URL than to see the title on most book covers, and you aren't carrying a URL from the shelves at one end of the library to the reading room or check-out desk at the other end. Even if that weren't the case, there aren't any "privacy rights" being violated by placing the computer in a common area that exist in the absence of the computer, and the presence of the computer doesn't confer additional "privacy rights."
  • As an Australian, I am impressed that the FCC is interested in listening to what the American public think of their proposed laws.
    I think you may have misunderstood. First, this isn't a proposed law, but the regulations to implement an already passed law. Second, they don't really care to listen (shown by the 15 day "fast track" approach), they're just required to solicit and accept public comments. That doesn't mean that they actually have to take them into account.
  • The intent does not matter, it is an issue of LAW, why can't you understand that? You cannot nuke all the publishers because you "intend" to block pornography! You cannot broadly censor legally-protected speech because you "intend" to do good.

    'Secondly, considering the fact that the library is a "privilege" and not a "right"'

    Um, no, libraries are a government SERVICE, not a "privilege" that can be granted or denied on a whim. Hence the requirement for handicapped access, lack of fees for access, etc. Sure the local government can tear down all the libraries any time it wants. But if it provides them, and funds them with our tax dollars, then they must abide by the law. And like it or not, the US Constitution is the law of the land. (this land anyway)

  • Don't take this the wrong way, but even though you have a good point, please don't tell people how to voice their opinion...

    If the FCC got 5,000,000 letters saying this law is WRONG, I'd think that would have an impact on the enforcement of the law. Wouldn't you agree?

    pan
  • by goliard ( 46585 ) on Wednesday January 24, 2001 @09:30AM (#484430)


    We can't change whether the law is implemented, but we can change how it is implemented. So write to the FCC to demand they institute a policy whereby:

    • Any filtering software used to comply with the law MUST release its list of censored sites for public review. "Public" MUST mean posted to the web (and not self-censored!) and available for free upon demand in hard copy.
    • Any filtering software company MUST have and conform to an appeal process so that wrongfully censored sites have a process of appeal and redress. This appeal process MUST presume the filtering software is in error and the burden of proof MUST be upon the filtering software company.
    • Any filtering software company which is found in a court of law to have censored web sites unethically, such as censoring the pages of its competitors or of political opinions its owners disagree with, MUST BE SUBJECT TO UNLIMITED PUNITIVE DAMAGES. There must be restraints on the almost unlimited power being given to these corporations.

    If there is going to be a private industry censoring what Americans can read in their libraries, then there'd better be checks and balances. We can take advantage of this situation to force filtering software companies to be more ethical and responsible.

  • If the FCC got 5,000,000 letters saying this law is WRONG, I'd think that would have an impact on the enforcement of the law. Wouldn't you agree?

    No.

    The FCC has no choice but to enforce the law. Period.

    I wonder about the wisdom of posting the way to make a comment here on /. If the typical comment on the website is any indication, the FCC is going to get (and ignore) a bunch of whiny comments that don't address the issues they've outlined in the Notice for Rulemaking.
  • The FCC apparently does not want to get too involved in enforcing this Bad Law, even though the law specifies that they have to enforce it. So (if you read the NPRM, it's clear) what the FCC wants schools and libraries to do, when filling out the form acknowledging receipt of the federal money, is say "we comply" or "we do not need to comply."

    Beyond that "self-certification", the FCC says nothing about filtering itself. Clearly they are not enthusiastic.

    And don't blame Clinton: McCain pushed this one through as a Rider to a big appropriations bill. Vetoing that bill would have been very hard. Riders are a trick for passing bad law without going through the constitutional approval process.
  • This country was founded by Puritans, who were fleeing religious persecution in Europe because their views were a little too fucked up for anyone over there to deal with. Their legacy includes the Salem witch trials, Prohibition, and the McCarthy communism inquisition, among others. Any time the minority is vocal and the majority is apathetic, the minority can seize power.
  • There's a slight difference between a spell-checker not having a specific word in it, and a 'filtering' program that blocks thousands of legitimate sites.

    I agree that current filtering programs are laughable, but that does not imply that a reasonable filtering cannot be created. The first chess-playing programs played pretty bad chess, but now they play with the best.

    There are some filtering programs that work on blocking specific sites. Pretty bloody useless, too, as the main class they're trying to block is pr0n, which change their names on a daily basis.

    I find that hard to believe. Changing your site name on a daily basis doesn't seem like a good way to attract and keep any paying customers. In any case, I find it hard to have much fear in sites that change their name daily. A list of those porn sites with stable names would be a reasonable start on a filter in my opinion.

    The second class, which is being pushed far more, blocks based on 'key' words (or if other words contain these words). So forget going to a site on breast cancer, any site on sexual diseases, ... the list is huge.

    Yes, simple keyword filtering is laughable. However, I'll wager that any decent machine-learning algorithm could achieve 90% accuracy or better. Does anyone have a nicely differentiated list of sites?
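    For what it's worth, here is a toy version of the "combinations of keywords" idea: a tiny naive Bayes text categorizer in Python. The training snippets and labels below are invented placeholders, and real accuracy would depend entirely on real training data.

        import math
        from collections import Counter, defaultdict

        # Invented training examples; a real system would need thousands of labeled pages.
        TRAIN = [
            ("blocked", "explicit adult xxx pictures"),
            ("blocked", "hardcore adult xxx content"),
            ("allowed", "breast cancer screening and treatment information"),
            ("allowed", "library homework help math homework"),
        ]

        def train(examples):
            word_counts, class_counts = defaultdict(Counter), Counter()
            for label, text in examples:
                class_counts[label] += 1
                word_counts[label].update(text.lower().split())
            return word_counts, class_counts

        def classify(text, word_counts, class_counts):
            vocab = {w for counts in word_counts.values() for w in counts}
            best_label, best_score = None, float("-inf")
            for label in class_counts:
                # log prior plus log likelihoods with add-one smoothing
                score = math.log(class_counts[label] / sum(class_counts.values()))
                total = sum(word_counts[label].values())
                for w in text.lower().split():
                    score += math.log((word_counts[label][w] + 1) / (total + len(vocab)))
                if score > best_score:
                    best_label, best_score = label, score
            return best_label

        wc, cc = train(TRAIN)
        print(classify("breast cancer support group", wc, cc))  # "allowed" on this toy data
        print(classify("xxx adult pictures", wc, cc))           # "blocked" on this toy data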

    It costs them no more for me to browse Penthouse.com online than it does for me to browse Bookfinder.com.

    Yes, it could cost them. Taxpayers will not want to fund libraries if people are using library computers for browsing penthouse.com and the like.

    If they block access to a legal product (Penthouse), then that is active censorship, unconstitutional, and illegal.

    Finally, we get to the central argument. I am queasy about this kind of censorship, too, but I don't want my taxes funding porn-viewing.

    I wonder if you can get issues of Penthouse via interlibrary loan. If libraries allow this, then to be consistent, they have to allow penthouse.com. However, if libraries are allowed to deny interlibrary loan of porn, they should be allowed to deny Internet access to porn.

  • by sagallagherstarr ( 256196 ) on Wednesday January 24, 2001 @11:28AM (#484448)
    I've been working to uncover another possible objection to filtering/blocking that has to do with copyright.

    I'm not a lawyer (just a lowly librarian), so this may not fly, but:

    • filters work by changing the text of an online document.
    • Current copyright law reserves copyright in "derivative works" to the holder of copyright for the original work.

    My guess is that the implication of the above is that copyright holders must be consulted before derivative works can be published.

    Now, some problems creep into this online world, like this:

    Filtering software does not alter the original work, but only the presentation of the original work. In the offline world, this is a big distinction; in the online world, original work/copy/modified copy are indistinguishable. Bits is bits, that is.

    The CIPA has been worded so that it actually refers to "visual depictions of" the various categories of material to be filtered/denied for viewing. Does this mean text and images (still or motion?), or just images? On screen, pixels is pixels.

    If there are any real live copyright attorneys out there, I'd dearly love to hear what you have to say about these matters.

    Scott
    Assistant Director, North Bend Public Library
    North Bend, OR 97459

  • If this is correct, it is a blatant violation of the Universal Declaration of Human Rights [un.org], Article 26:

    (3) Parents have a prior right to choose the kind of education that shall be given to their children.

    If you as a parent are denied this right, drag them to The Hague.

    While I'm not that opposed to general idea of Censorware, I'm very opposed to current implementations, and I would reserve the right to teach my kids (I don't have any yet) to crack the filters, or put up a proxy for them so that they can surf unhindered anywhere.

    Also, it is worth pointing out that any rule-based filter is snake oil.

  • The only answer I keep hearing is that "other people" walking by might be offended if they looked at the monitor. So, what then, I guess if I want to read something, say, D.H. Lawrence or Henry Miller, I'd better make sure that no one can read over my shoulder, because, on the off chance that it might offend them, I can't read anything in public?

    Come on, I know this is a hot-button /. issue, but can we maintain some level of intellectual honesty? There's a pretty big difference between someone coming up and deliberately reading over your shoulder, close enough and long enough to grow "offended" with your choice of reading matter, and displaying a 12- to 20-inch example of coprophilia to all and sundry passersby.

    Also, the word "offended" is getting a little thick around here. In the age of "anti-PC," offending people has become a sort of positive value, and offense is stereotyped as merely uptight anger at someone else's choices. Why not talk about "upsetting" someone? How about "hurting" someone? Is hurting out of line because there is no physical contact? I don't think so. If someone is arrested for exposing themselves to people, it isn't for "offending" them; it's a form of sexual assault. If the library is a public resource, why should people using it have to put up with you flashing them by proxy?

    Not that I think the "person walking by" argument is the only one in favor of filtering public terminals, but it's nowhere near as dismissible as you're trying to make it out to be. If anti-filter types can't deal with opposing arguments honestly, it's no wonder you're losing all the time. Stick to the First Amendment. It's about the only argument you have that works. And if you wait long enough, it will work.

    -Kahuna Burger

  • I completely agree that the most correct response from the community should not be bitching and moaning but writing a competent open source blocking software.

    Good luck. *sigh* The few times I have suggested this, I've either been ignored or flamed. I think I saw one similar suggestion with similar results. I'm growing convinced that the entire "unreliable blocking software" argument is more of an excuse to oppose filtering than an insurmountable reason, because anyone who tries to talk about overcoming this "problem" is attacked. (Though I'm seeing more of this suggestion in this thread, so maybe something good will eventually emerge from the noise.)

    Another good response, since this is only being applied to users of a government discount program would be to form local coalitions to offset the loss of these programs for libraries that are willing to opt out.

    The bad response, which seems to predominate, is to scream and rant and rage, stereotype and insult your opposition, and ignore any chance that there could be real concern in there.

    -Kahuna Burger

  • My question is this: would the ICRA voluntary content-rating system qualify as a "technology" under the bill, sufficient to comply with the statute and the regulation? If so, the thing to do is to head to your local library and get them to implement ICRA, a self-rating and filtering tool based on voluntary disclosure by the page author of a web site's content.

    I think this is a good solution in theory; the problem is that unrated sites would have to be blocked by default. So this would work only if a lot of people and organizations got behind it, and since it is a compromise position, that will be hard. For instance, is /. participating in this program, or have they refused "on principle"? How does the creator of a new web site find out about this program and use it?

    In theory, though, self rating is one of the best solutions.
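    To make the default-deny point concrete, here is a rough sketch in Python, assuming pages carry a self-rating in a simplified meta tag. The tag name and label values below are stand-ins invented for the example, not the real ICRA/PICS label format.

        from html.parser import HTMLParser

        class RatingFinder(HTMLParser):
            """Pulls a self-rating out of a simplified, hypothetical meta tag."""
            def __init__(self):
                super().__init__()
                self.rating = None

            def handle_starttag(self, tag, attrs):
                attrs = dict(attrs)
                if tag == "meta" and attrs.get("name") == "content-rating":
                    self.rating = attrs.get("content")

        def allowed_for_minors(html):
            finder = RatingFinder()
            finder.feed(html)
            # Unrated pages are blocked by default - exactly the weakness noted above.
            return finder.rating == "general"

        print(allowed_for_minors('<meta name="content-rating" content="general">'))  # True
        print(allowed_for_minors("<p>no rating tag at all</p>"))                      # False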

    Kahuna Burger

  • I agree that current filtering programs are laughable, but that does not imply that a reasonable filtering cannot be created.

    Actually, there are many reasons why a reasonable filtering program cannot be created. First and foremost, I think everyone has heard of the laughable excesses of keyword filters. That path is out. No amount of technical expertise can overcome a problem that cannot be resolved in plain English. Second, there are serious problems - as in, it can't be done - with distinguishing between the Miss December centerfold and much of the stuff hanging in the Louvre. Calling the first 'porn' and the latter 'art' just doesn't translate into rules one can put into a program.

    Sorry for not being more explicit, but many of the really bad sites (use your imagination) use numeric addresses and redirects. If you haven't seen them, then you're obviously not looking hard enough ;)

    It costs them no more for me to browse Penthouse.com online than it does for me to browse Bookfinder.com. Yes, it could cost them. Taxpayers will not want to fund libraries if people are using library computers for browsing penthouse.com and the like.

    That is, quite frankly, a BS argument. There are a huge number of libraries that already have unrestricted web access. Viewing pr0n at them hasn't been a problem. As more of society gets wired, I expect it to become even less of a problem than its current non-problem state.

    Finally, we get to the central argument. I am queasy about this kind of censorship, too, but I don't want my taxes funding porn-viewing.

    I think it's a misuse of library resources, too. But the answer does not lie down the path that paves over the Constitution. A more realistic approach would be to limit web access to half hour slots if anyone is waiting.

    To take an extreme example, what if I wanted to do a research project on how changing media images have contributed to the growth in anorexia/bulimia rates in the US? Part of that research involves taking pictures of female models from various mainstream magazines and calculating their weights. That's legitimate. A half-hour limit is fine. But blocking my access to something is censorship. That said, Tom the Trucker two terminals down could be looking at the same images, kleenex in hand, so to speak. Can you make a suggestion as to how a program can distinguish between the two of us? Until you can, I'd much rather stay in the spirit of 'Freedom,' the Constitution, and, strangely enough, the Law, and not implement something that reeks of Big Brother, tramples the Constitution, and is against the law.

  • I'm pretty sure that the U.S. is not a signatory to this document, though. We only sign international trade and arms agreements, nothing humanitarian.

    Actually, the USA has been known to sign treaties on other things. Whether they sign is less of an issue than whether they obey.
  • The FCC is seeking comment on fairly specific issues. The FCC does not have the authority to choose to not enforce this law at all. Their hands are tied; they cannot overturn an act of congress.

    Wasn't this exactly one of the things the people who wrote the US constitution would NOT want to happen?
  • I would have no objection to a filter which could filter out ONLY porn, and nothing else. Unfortunately, there is no such filter in existence, and I find it doubtful that one can be created.

    Even if one could be created, the people who created it would be too interested in getting a Nobel prize to worry about such things as filtering porn sites - with the program itself probably far more interested in establishing its own citizenship rights than in something it would undoubtedly see as irrelevant.

    The solution which is being reccommended would filter out a great deal of legitimate research content - and not just content that has something to do with objectionable body parts - it's amazing some of the random sites that get blocked by filtering software.

    It's rather less amazing when you realise that there is very little which is unique to "porn": the majority of words used on porn sites are also used in completely unrelated areas. Or should sports sites replace "ball" with "spherical playing object" and woodworking sites replace "screw" with "spiral fixing"?
  • However, the library or school must filter, when minors use the computer, material that is "obscene, child pornography, and harmful to minors."

    What does the last category mean? Someone could argue that disney.com, microsoft.com, etc. are "harmful to minors"...
  • by mpe ( 36238 )
    Has the image-recognition software evolved so much that it can differentiate between porn and art?

    No. This is a difficult task even for humans, and a very difficult task for computers.

    For text recognition, how does the software distinguish between acceptable and unacceptable? By using a vocabulary blacklist? Does that mean that online dictionaries will not be accessible from libraries because they contain words like "nazi," "devil," or "murder"? Will the words be taken out of context?

    They will only be taken in context once someone comes up with a program which can understand natural language in the first place...
  • This mandatory filtering (on publicly funded institutions) goes way beyond that, attempting to legislate software into existence that just doesn't exist.

    Indeed this is software which may never exist. Since in order to actually do the job you'd be wanting an AI, probably one smart enough to resent being a "slave".
  • Actually, there are many reasons why a reasonable filtering program cannot be created. First and foremost, I think everyone has heard of the laughable excesses of key-word filters.

    But not everyone has the computer science background to understand why it cannot work.

    Second, there's serious problems - as in, it can't be done - distinguishing between the Miss December centerfold and much of the stuff hanging in the Louvre. Calling the first 'porn' and the latter 'art' just doesn't translate into rules one can put into a program.

    Humans can't even agree on what the "rules" would be. To even attempt to "explain" them to a computer, you'd need the kind of AI that just doesn't exist outside of science fiction. (It's more likely we will be sending manned missions to Jupiter or prisoners to the moon than creating such a program anytime soon.) Even assuming someone pulls it off, unless the AI was built into an anatomically correct android, it probably wouldn't have a clue as to why there should even be a problem in the first place.
  • But not everyone has the computer science background to understand why [it] cannot work.

    And even fewer have the machine learning background to understand why it can work most of the time. There are a substantial number of papers that demonstrate that programs can learn to categorize text, including web pages, based on "combinations" of keywords. The programs are not perfect, but 90% accuracy and better is very achievable. If you want some filtering with false positives at around 1%, then you can do that, too.

  • You mean they'd want a federal agency to be able to ignore an act of congress? Seems pretty unlikely to me.

    You're heading down a slippery slope here: if the FCC had the authority to ignore this bad law, what good laws would it also ignore?
