
Pakistan Looking For Homegrown URL Blocking System 97

chicksdaddy writes "Tech-enabled filtering and blocking of Web sites and Internet addresses deemed hostile to repressive regimes has been a major political and human rights issue in the last year, as popular protests erupted in Egypt, Tunisia, Libya and Syria. Now it looks as if Pakistan's government is looking for a way to strengthen its hand against online content it considers undesirable. According to a request for proposals from the National ICT (Information and Communication Technologies) R&D Fund, the Pakistani government is struggling to keep a lid on growing Internet and Web use and is looking for a way to filter out undesirable Web sites. The 'indigenous' filtering system would be 'deployed at IP backbones in major cities, i.e., Karachi, Lahore and Islamabad,' the RFP reads (PDF). It would be 'centrally managed by a small and efficient team stationed at POPs of backbone providers,' and must be capable of supporting 100Gbps interfaces and filtering Web traffic against a block list of up to 50 million URLs without adding more than 1 millisecond of latency."
  • 50 million URLs (Score:4, Interesting)

    by DigiShaman ( 671371 ) on Friday February 24, 2012 @07:14PM (#39154071) Homepage

    Why not just block everything and only allow what's whitelisted? Examples to include in the whitelist: corporations, universities, and government sites. All others, seen as non-Islamic, get blocked outright. If you're not on the list, you simply fill out a form for internal review and hopefully get added to the whitelist.

    If they're going for total control, do it right. Better yet, just create an entirely new Pakistani network without any outside peering. A Pakistani version of Wikipedia could be translated and updated via an air-gapped network scheme.

    And no, I'm not the first person to think of this. I'm not that much smarter than everyone else :-P
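
    The whitelist idea boils down to a default-deny membership test. A minimal sketch (the whitelist entries below are hypothetical placeholders, not anything from the RFP):

```cpp
#include <string>
#include <unordered_set>

// Default-deny policy: a domain gets through only if it is on the
// whitelist; everything else is blocked outright.
bool allowed(const std::string& domain,
             const std::unordered_set<std::string>& whitelist) {
    return whitelist.count(domain) > 0;  // not listed => denied
}
```

    The "fill out a form" step would just be whatever process adds a domain to that set; the check itself never changes.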

  • by K. S. Kyosuke ( 729550 ) on Friday February 24, 2012 @07:18PM (#39154107)
    They remind me of our former communist politicians - thanks to their position, they got to visit the West, and they still didn't see. They went back home and continued vilifying things they had seen but hadn't grasped. This is such a similar situation that it's not even funny. Islam is the new communism is the new fascism.
  • by DarkOx ( 621550 ) on Friday February 24, 2012 @07:21PM (#39154151) Journal

    Maybe, but URL filtering in under 1 ms with any sizeable list of URLs is going to be pretty darn impossible. It's pretty tough to do much of anything to traffic that requires any sort of lookup that fast. I mean, a DRAM fetch is 5+ ns.

    Even if you can search your lookup table fast enough, keep in mind you are not just comparing values at fixed offsets the way NAT, IP access lists, and the like do. You first have to figure out: is this traffic HTTP at all? Then locate the Host header and read until the newline. None of that is especially time consuming, but it's still going to eat a chunk of that already tight millisecond.
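
    The parsing step being described, pulling the path out of the request line and the host out of the Host header before any lookup can happen, can be sketched roughly like this (plaintext HTTP/1.1 assumed, and the block-list contents are hypothetical):

```cpp
#include <string>
#include <unordered_set>

// Extract "host + path" from a raw HTTP request: the path comes from the
// request line ("GET /path HTTP/1.1"), the host from the "Host:" header.
// Returns an empty string if the request doesn't parse.
std::string extract_url(const std::string& req) {
    std::size_t sp1 = req.find(' ');
    if (sp1 == std::string::npos) return "";
    std::size_t sp2 = req.find(' ', sp1 + 1);
    if (sp2 == std::string::npos) return "";
    std::string path = req.substr(sp1 + 1, sp2 - sp1 - 1);

    std::size_t h = req.find("Host: ");
    if (h == std::string::npos) return "";
    std::size_t eol = req.find("\r\n", h);
    if (eol == std::string::npos) return "";
    std::string host = req.substr(h + 6, eol - h - 6);
    return host + path;
}

// Check the parsed URL against an in-memory block list.
bool is_blocked(const std::string& req,
                const std::unordered_set<std::string>& blocklist) {
    std::string url = extract_url(req);
    return !url.empty() && blocklist.count(url) > 0;
}
```

    Both the string scan and the hash lookup are cheap individually; the point above is that at 100 Gbps you pay them on every request, so the per-request budget is far tighter than the headline 1 ms suggests.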

  • by Zan Lynx ( 87672 ) on Friday February 24, 2012 @07:38PM (#39154321) Homepage

    1 millisecond is 1,000 microseconds or 1,000,000 nanoseconds. A 2 GHz CPU runs at least one instruction every nanosecond and usually more like 6-12 instructions. As you say, the DRAM fetch is significant, but a well-designed B-tree database already loaded in RAM reduces the impact because of good algorithm design.

    It's like an eternity in CPU time.

    Of course, you can't write the code in Python, Perl or Ruby. You have to use C++.
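
    The "eternity" claim is easy to sanity-check with a timed lookup against an in-memory table. A rough sketch (the synthetic URLs are placeholders, and the table is sized far below the RFP's 50 million so it runs anywhere; real-world numbers will vary with cache behavior):

```cpp
#include <chrono>
#include <string>
#include <unordered_set>

// Build a synthetic block list of n placeholder URLs.
std::unordered_set<std::string> build_blocklist(int n) {
    std::unordered_set<std::string> s;
    s.reserve(n);
    for (int i = 0; i < n; ++i)
        s.insert("blocked" + std::to_string(i) + ".example/page");
    return s;
}

// Time one hash-table lookup; reports the result via 'found' and
// returns the elapsed wall-clock time in nanoseconds.
long long timed_lookup_ns(const std::unordered_set<std::string>& s,
                          const std::string& url, bool& found) {
    auto t0 = std::chrono::steady_clock::now();
    found = s.count(url) > 0;
    auto t1 = std::chrono::steady_clock::now();
    return std::chrono::duration_cast<std::chrono::nanoseconds>(t1 - t0)
        .count();
}
```

    On commodity hardware a single lookup like this typically lands in the tens to hundreds of nanoseconds, well inside a 1 ms budget; the hard part at 50 million entries is the cache misses and doing it at line rate, not the lookup algorithm itself.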

  • by kawabago ( 551139 ) on Friday February 24, 2012 @08:38PM (#39154795)
    It's a shame to waste resources trying to stop the few who can read from doing so.

"Turn on, tune up, rock out." -- Billy Gibbons