Human Rights Group Deploys An 'Empathy Test' Captcha System To Help Sites Fend Off Trolls
from the because-i-love-being-talked-down-to-by-a-dialog-box dept
Fact: if you have a site with any amount of traffic and open comment threads, you're going to draw trolls. No method has been proven to rid your site of trolls completely, though not for lack of trying. (This one is particularly mischievous.) Various sites have tried everything from aggressive moderation to requiring Facebook logins... all to no avail. (Although the latter method has proven that certain people are more than willing to troll without the protection of anonymity.)

A new anti-troll tool from a rather unlikely source has just been unveiled, one that hopes to combine the "fun" of solving captchas with something akin to a "blush response:"
A human rights group is introducing a new take on CAPTCHAs, those little boxes that make you type in a word to prove you are human before you can comment or register for a site. Their version doesn’t just present a scrambled word to be deciphered, but instead forces a person to choose the right word to unscramble based on the proper emotional response to a human rights violation.

That's right. It's Voight-Kampff for comment threads. Instead of trying to parse a set of badly scanned words and Street View cam house numbers, Civil Rights CAPTCHA instead asks you how you feel about certain horrific situations, hoping that you'll make the "right" decision before spewing your vitriol and ignorance into the now-unlocked comment box.
Civil Rights Defenders, the Swedish-based group that developed the tool, hopes the Civil Rights Captcha will help sites block spiders and bots, while letting humans in — and hopefully educating the humans at the same time...
But perhaps forcing a troll to repeatedly choose an empathetic response will, over time, soothe the ravages of comment sections around the net. Okay, that might also be asking too much, but at the very least spreading information about human rights abuses certainly can’t hurt, even if the jerks of the internet (see, for example, YouTube comments) remain beyond help.
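For the curious, the mechanics are simple enough to sketch. The Python snippet below is a hypothetical illustration of the general idea, not Civil Rights Defenders' actual implementation (which ships as a hosted widget): the site presents a short human rights scenario, offers a handful of emotion words, and accepts only the one "correct" empathetic answer. The sample statements, word choices and function names here are invented for illustration.

```python
import random

# Hypothetical challenge pool: each entry pairs a human rights scenario with
# the single "proper" emotional response and some plausible wrong answers.
CHALLENGES = [
    {
        "statement": "A journalist is jailed for reporting on government corruption.",
        "correct": "appalled",
        "decoys": ["delighted", "indifferent"],
    },
    {
        "statement": "A court strikes down a law criminalizing peaceful protest.",
        "correct": "relieved",
        "decoys": ["furious", "bored"],
    },
]

def issue_challenge():
    """Pick a challenge and return it along with shuffled answer options."""
    challenge = random.choice(CHALLENGES)
    options = [challenge["correct"], *challenge["decoys"]]
    random.shuffle(options)
    return challenge, options

def verify_response(challenge, submitted):
    """A submission passes only if it matches the one 'right' answer."""
    return submitted.strip().lower() == challenge["correct"]

if __name__ == "__main__":
    challenge, options = issue_challenge()
    print(challenge["statement"])
    print("How does this make you feel?", ", ".join(options))
    # In the real service the chosen word is also visually distorted, so the
    # user still has to transcribe it like a conventional CAPTCHA.
    print("Accepted:", verify_response(challenge, challenge["correct"]))
```

Which is precisely the problem discussed below: once the answer set is this small and this predictable, the "test" filters out bots and the easily offended, not determined trolls.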
While its heart is certainly in the right place, the implementation still requires captchas, something most users would rather not encounter every time they make a comment. (Yes, I know. But sometimes, decent, non-trolling humans don't want to "create an account" or "enter an email address" in order to participate.) On top of that, each captcha has only one "right" answer, making the system more than a little heavy-handed in its moralizing. This assumes that your regular, non-troll commenters are going to be fine with being preached at while jumping through hoops. It also assumes that all dedicated trolls are morons incapable of deducing the (obviously) "right" reaction to each situation presented.
This particular captcha service might prove useful in limited situations, such as when it's pre-loaded with questions related to a particular cause or event being discussed or promoted at the site deploying it. It also might prove popular with the sort of people who are willing to annoy a certain percentage of their community in order to "raise awareness." It will become a form of penance for those involved, much like forwarding "concerned" emails and switching Facebook statuses to show support. You know, the sort of thing that will morph into "I solved Captchas for world peace. What will YOU do?" t-shirts popping up on Cafepress.
I can't see this solving the troll problem, but I can see it annoying most of a user base, leaving the site deploying it with a smaller audience consisting of people who like being moralized at frequently. Like any other captcha, the spambots and trolls will find a way around it, with the only ones affected being decent human beings, which would seem to be the sort of "demographic" you'd want to annoy less. Pushing them through a "think our way or hit the road" filtering system doesn't make trolls any less prevalent or make non-decent human beings any more "decent."