Australia Security Your Rights Online

Campaign To Kill CAPTCHA Kicks Off

Bismillah writes "CAPTCHA may be popular with webmasters and others running websites, but it's a source of annoyance to blind and partially sighted people, as well as dyslexic and older users, who often end up locked out of important websites because they can't read wonky, obfuscated letters any more than spambots can. A campaign in Australia has started to rid sites of CAPTCHA to improve accessibility for everyone."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by mrjb ( 547783 ) on Monday August 05, 2013 @04:53PM (#44480641)
    CAPTCHA fulfills a need: as the name implies, it's a Completely Automated Public Turing test to tell Computers and Humans Apart. It's necessary to keep spambots from registering accounts and spamming the hell out of us. Granted, the "type this wobbly word" approach may not be the most practical (nor the safest) solution. It's easy enough to come up with alternatives; perhaps show four photographs and ask the user to click on the one that doesn't belong (say, the kitten among pictures of adult cats). Coming up with good ideas? Much harder. Complain about it all you like, but come back when you have a better alternative.
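[Editor's note: a minimal sketch of the "odd one out" photo idea above. The image filenames and category pools are made up for illustration; a real deployment would use large, rotating image sets served without telltale names.]

```python
import random

# Hypothetical image pools; a real deployment would use large, rotating sets.
CATEGORIES = {
    "cats": ["cat1.jpg", "cat2.jpg", "cat3.jpg", "cat4.jpg"],
    "kittens": ["kitten1.jpg", "kitten2.jpg", "kitten3.jpg"],
}

def make_odd_one_out():
    """Pick three images from one category and one from another,
    shuffle them, and return (images, index_of_the_odd_one)."""
    majority, odd = random.sample(list(CATEGORIES), 2)
    images = random.sample(CATEGORIES[majority], 3) + [random.choice(CATEGORIES[odd])]
    random.shuffle(images)
    # The server stores `answer` in the session and checks the user's click.
    answer = images.index(next(img for img in images if img in CATEGORIES[odd]))
    return images, answer

images, answer = make_odd_one_out()
```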
  • by Anonymous Coward on Monday August 05, 2013 @04:56PM (#44480679)

    What if I want my users to be able to post the form more than 50 times per day?
    Cooldowns and caching just won't do it. The only real alternative I see is to hide the form behind a login, which in the end is more inconvenient for the end user than a user-friendly CAPTCHA.

    There are simple ones out there that are easy on the eye (like Slashdot's), and you can make your own quite easily as well. There is one widely used one, reCAPTCHA I think, that is just awful and should be avoided.
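[Editor's note: for context, the "cooldown" the comment dismisses is typically a per-client rate limit. A minimal sliding-window sketch, with a made-up class name and an example IP, shows why it caps volume but can't distinguish a human from a bot within the limit:]

```python
import time
from collections import defaultdict, deque

class SlidingWindowLimiter:
    """Allow at most `limit` submissions per `window` seconds per client key
    (e.g. an IP address). A sketch only; a real site needs shared storage."""
    def __init__(self, limit=50, window=86400):
        self.limit = limit
        self.window = window
        self.hits = defaultdict(deque)

    def allow(self, key, now=None):
        now = time.time() if now is None else now
        q = self.hits[key]
        # Drop timestamps that have aged out of the window.
        while q and now - q[0] > self.window:
            q.popleft()
        if len(q) >= self.limit:
            return False
        q.append(now)
        return True

limiter = SlidingWindowLimiter(limit=3, window=60)
results = [limiter.allow("203.0.113.5", now=t) for t in (0, 1, 2, 3, 70)]
# First three allowed, fourth blocked, fifth allowed after the window slides.
```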

  • by Anonymous Coward on Monday August 05, 2013 @05:00PM (#44480737)

    from automated submissions?

    If you don't know any alternatives, you shouldn't be administering them.

    Yeah, I guess the folks at Google, Yahoo, Microsoft, Amazon, etc. don't know what they are doing either. CAPTCHA is used because there is no real alternative if you want anonymous form submissions on your site. There are certain measures we can put in place in certain contexts, but no catch-all, one-size-fits-all solution.

  • by Zmobie ( 2478450 ) on Monday August 05, 2013 @05:04PM (#44480775)

    I think you're missing what kind of logic puzzles they mean. Simple things like image processing (someone in the comments below brought up the example of showing company logos and having you type the name, or matching pizza toppings to the correct pizza) or natural language processing could be used to WRECK a bot. Imagine this: I pose the question as a human verification, "What color was George Washington's favorite white horse?" A human (with half a brain) easily sees how stupidly simple it is to find the answer, which is white, but a bot would have hell with that type of question because it requires language processing to determine the appropriate response. That is a pretty simplified example, but you can find these all over the place and they are fairly easy to create.

    Some of these could be defeated easily with something like a call to Wolfram Alpha, but you could quite easily create challenges whose logic is not simple to automate yet is completely trivial for a human to process, even a stupid one. Language and image processing are RIDICULOUSLY difficult to automate efficiently, which would defeat the purpose of the bots, while making things a lot easier on the people who have to deal with this sort of thing. I personally hate the current generation of CAPTCHAs (hell, I can't read some of the more difficult ones, and I write some of the software that USES them), but I do recognize the need for them. No reason they can't be improved upon, though.
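[Editor's note: the question-bank idea above can be sketched in a few lines. The questions and the normalization rule are illustrative assumptions, not a vetted design; a real bank would need many questions and frequent rotation, since a fixed list can simply be scraped and memorized by spammers.]

```python
import random
import re

# Hypothetical question bank: trivial for humans, awkward for naive bots.
QUESTIONS = [
    ("What color was George Washington's favorite white horse?", "white"),
    ("How many letters are in the word 'cat'?", "3"),
    ("Is fire hot or cold?", "hot"),
]

def new_challenge():
    return random.choice(QUESTIONS)

def check_answer(expected, given):
    # Normalize case, whitespace, and punctuation before comparing.
    norm = lambda s: re.sub(r"[^a-z0-9]", "", s.lower())
    return norm(given) == norm(expected)

question, expected = new_challenge()
```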

  • Re:stupid (Score:4, Interesting)

    by pla ( 258480 ) on Monday August 05, 2013 @05:09PM (#44480805) Journal
    Yes it is stupid. I understand that spam is a problem, but if you run a website, it's *YOUR* problem. CAPTCHAs make it *MY* problem and that's just stupid.

    You assume the website needs you more than you need it. For the standard commercial "wall of ads with some random content between" site, sure, what you say holds true.

    For a lot of smaller interest-group-themed sites, usually run by a handful of non-IT-gurus, put bluntly: you need them more than they need you, and they don't have a full-time body around to read through all new posts to purge the spam.

    Now, personally, I prefer the "math word problem" style of CAPTCHA, because not only does it not discriminate against the blind or the old, it effectively keeps out the spam and the stupid. Win-win!
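[Editor's note: a minimal sketch of the "math word problem" style mentioned above. The template, number words, and ranges are invented for illustration; spelling the numbers out as words is what makes the question text-readable for screen readers while still requiring a small parsing step from a bot.]

```python
import random

NUM_WORDS = ["zero", "one", "two", "three", "four", "five"]

def make_word_problem():
    """Generate a small arithmetic word problem and its expected answer
    as a digit string. A single hypothetical template; a real bank
    would vary the wording heavily."""
    a = random.randint(2, 5)
    b = random.randint(1, a - 1)
    question = (f"If you have {NUM_WORDS[a]} apples and give away "
                f"{NUM_WORDS[b]}, how many do you have left?")
    return question, str(a - b)

question, answer = make_word_problem()
```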
  • by Qzukk ( 229616 ) on Monday August 05, 2013 @05:12PM (#44480851) Journal

    Wolfram Alpha had no idea about the color of Washington's favorite white horse (it looked up the distance between some town named George, WA and White Horse, NJ), but if you put the question into Google, you discover that Washington had no white horses; the closest was a gray named Blueskin.

  • Re:stupid (Score:5, Interesting)

    by Thry ( 962012 ) on Monday August 05, 2013 @05:13PM (#44480857)
    I was about to tell you to take advantage of the audio alternative offered by many services, then I went and tried a reCAPTCHA audio test to make sure I knew what I was talking about.

    I apologise for even considering telling you to use those.
  • Re:stupid (Score:5, Interesting)

    by icebike ( 68054 ) on Monday August 05, 2013 @05:13PM (#44480863)

    If taking a couple seconds to answer a CAPTCHA is too much effort, I probably don't really care what you have to say in the comment section.

    Or a couple of minutes, considering most CAPTCHAs are illegible.

    This!

    More and more, CAPTCHAs take two or three attempts.
    (Disclaimer: IMHO, I'm not senile, dyslexic, a horrible typist, or blind. Your opinion may vary.)

    I suspect some sites intentionally force a fail once or twice, at least occasionally, especially when you enter the word quickly. Bots probably give up after two failures, and they probably answer quickly.

    So implementers make it more and more restrictive and throw in bogus failures.

  • Re:stupid (Score:4, Interesting)

    by IamTheRealMike ( 537420 ) on Monday August 05, 2013 @05:44PM (#44481135)

    The NSA and its friends already track who logs into your website (or at least the IPs that do) so I wouldn't worry about that one too much.

    One technical measure that has been floated recently is the idea of using Bitcoin. What you do is provably sacrifice some bitcoins to miner fees, thus creating a kind of anonymous passport. That proof of sacrifice has public keys embedded in it to which you own the private keys, and it was provably expensive to create. So the idea is that you sign up with your passport and then if you misbehave, it can get added to a blacklist kind of like how Spamhaus blacklists IP addresses. Now you can set the cost of abuse to a precise degree. Good users only have to pay once and can use the same passport for years. Abusers find their business models are unprofitable.

    Unfortunately the software and protocols for that aren't implemented yet.

  • by spinozaq ( 409589 ) on Monday August 05, 2013 @06:11PM (#44481345)

    I recently started getting hundreds of spam signups a day on my site, so I installed a CAPTCHA to prevent that. I set up a standard image CAPTCHA with a plugin for the CMS. More than 80% of the spam signups walked right through it. Then I changed to an ASCII art CAPTCHA, and I haven't had a spam signup since. The ASCII art CAPTCHA is also much easier to read than weird image CAPTCHAs.
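[Editor's note: a toy sketch of the ASCII art approach. The three-glyph font below is hand-drawn for illustration; a real ASCII-art CAPTCHA would cover the full alphabet and add random distortion, and its effectiveness likely comes from obscurity rather than any deep hardness.]

```python
import random

# A tiny hand-drawn 5-row font for a few glyphs (illustrative only).
FONT = {
    "A": ["  #  ", " # # ", "#####", "#   #", "#   #"],
    "B": ["#### ", "#   #", "#### ", "#   #", "#### "],
    "C": [" ####", "#    ", "#    ", "#    ", " ####"],
}

def render(word):
    """Render a word as one multi-line ASCII-art string."""
    rows = ["  ".join(FONT[ch][r] for ch in word) for r in range(5)]
    return "\n".join(rows)

word = "".join(random.choice(list(FONT)) for _ in range(4))
art = render(word)
print(art)  # the user reads the art and types back `word`
```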

  • Re:stupid (Score:4, Interesting)

    by plover ( 150551 ) on Monday August 05, 2013 @06:36PM (#44481519) Homepage Journal

    Adding rel="nofollow" to any links provided by your untrusted commenters is a good start. It's a promise that Google and other search engines won't do any indexing or page ranking based on the href in the same tag.

    Spammers have a pretty common M.O. They sign up with an account and use their spam link as their "home page". They then pollute the blog. The obvious spam is repeated variations on the same topic, and looks like "brand name products, products brand name, brand products name, ..."

    Lately, link spam is done with a flattering but generic message that looks like it came from a non-native speaker: "I thanking you for your keen insight, have you other similar articles online? I would like to know more how you come to know this." An unwary site operator will often mistake the flattery for a conversation, and allow the spammer to remain a user. (The flattery is script-generated, by the way.) Their "home page" is often a dummy "news portal", which is just replaying whatever feeds they can get. The trick is this news portal has lots of links to the sites the SEO is trying to push.

    While rel="nofollow" will render their efforts to associate their spam with a legitimate blog completely wasted, there are two negatives. First, unless the spammer knows it's there, they're going to spam you anyway. Second, it takes away your contribution of "linkiness" for your legitimate users' links to Google's pagerank algorithm. You can fix this with extra work like "probationary" and "full" users, but then you're taking on the task of rating your readers, which may be Sisyphean on a site the size of Slashdot.
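[Editor's note: the nofollow rewrite described above can be sketched with a regex pass over untrusted comment HTML. This is a deliberately naive illustration; regexes are a poor fit for HTML in general, and a production site should run user content through a proper HTML sanitizer instead.]

```python
import re

def add_nofollow(html):
    """Add rel="nofollow" to <a> tags that don't already carry a rel
    attribute. A regex sketch for illustration only."""
    def fix(match):
        tag = match.group(0)
        # Leave tags alone if the author already set a rel attribute.
        if re.search(r'\brel\s*=', tag, re.IGNORECASE):
            return tag
        return tag[:-1] + ' rel="nofollow">'
    return re.sub(r"<a\b[^>]*>", fix, html, flags=re.IGNORECASE)

print(add_nofollow('<a href="http://spam.example/">great post</a>'))
```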
