Good To See: Wikipedia Moves Forward With Color Coding Less Trustworthy Text
from the teaching-people-to-be-skeptical dept
More than two years ago, we talked about a great idea to deal with the (somewhat misleading) question of the trustworthiness of Wikipedia: color code new edits from untrustworthy editors. Not only would this alert people to double-check that particular info, it would also remind people that Wikipedia is a constantly changing site. To be honest, I was a bit disappointed that I hadn't heard much about this idea since that summer of 2007. Apparently, though, it's been gaining in popularity, and now Wikipedia is set to start using it across the site. Here's how it works:

Based on a person's past contributions, WikiTrust computes a reputation score between zero and nine. When someone makes an edit, the background behind the new text is shaded orange according to that reputation: the brighter the orange, the less "trust" the text has. When another author later edits the page, they essentially vote on the new text: if they like the edit, they'll keep it; if not, they'll revert it. Text that persists becomes less orange over time, as more editors give their votes of approval.

While there are some concerns about how well this will work (and how much processing power it will take), it seems like a worthwhile experiment.
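The scheme described above is simple enough to sketch in a few lines. The zero-to-nine score and the "brighter orange means less trust" rule come straight from the description, but the linear blend from orange toward white below is purely an assumption for illustration, not WikiTrust's actual formula:

```python
def orange_shade(reputation: int) -> str:
    """Map a 0-9 reputation score to a background color.

    Assumption: a linear blend from bright orange (low reputation)
    to plain white (high reputation). WikiTrust's real shading
    formula is more involved.
    """
    if not 0 <= reputation <= 9:
        raise ValueError("reputation must be between 0 and 9")
    t = reputation / 9.0
    # Blend from bright orange (255, 165, 0) toward white (255, 255, 255).
    g = round(165 + (255 - 165) * t)
    b = round(255 * t)
    return f"#ff{g:02x}{b:02x}"

print(orange_shade(0))  # brand-new, untrusted editor: bright orange
print(orange_shade(9))  # long-established editor: no highlight at all
```

The "votes of approval" part would then just nudge a passage's effective reputation upward each time an edit by another author leaves it intact.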
Thank you for reading this Techdirt post. With so many things competing for everyone’s attention these days, we really appreciate you giving us your time. We work hard every day to put quality content out there for our community.
Techdirt is one of the few remaining truly independent media outlets. We do not have a giant corporation behind us, and we rely heavily on our community to support us, in an age when advertisers are increasingly uninterested in sponsoring small, independent sites — especially a site like ours that is unwilling to pull punches in its reporting and analysis.
While other websites have resorted to paywalls, registration requirements, and increasingly annoying/intrusive advertising, we have always kept Techdirt open and available to anyone. But in order to continue doing so, we need your support. We offer a variety of ways for our readers to support us, from direct donations to special subscriptions and cool merchandise — and every little bit helps. Thank you.
–The Techdirt Team
Filed Under: color-coded, trust, wikipedia
Reader Comments
Re: Color Coding
Even if it's not, don't be the kind of person who removes a useful feature from the majority; instead, code up a Greasemonkey script that reads the colors and turns them into a pattern.
Re: Color Coding
http://colorschemedesigner.com/
Standard colour-scheme-builder sort of thing, but it has a drop-down menu that lets you choose from 8 different vision disorders (and shows you the percentage of the population with each) so you can see what the colours would look like to each group.
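Tools like that one typically simulate a vision disorder by running each color through a 3x3 matrix. A rough Python sketch of the idea, using the commonly cited Viénot et al. protanopia approximation (applying it directly to sRGB values without gamma correction is a simplification, so treat the numbers as illustrative):

```python
# Viénot et al. approximation for protanopia (red-blindness).
PROTANOPIA = [
    (0.567, 0.433, 0.000),
    (0.558, 0.442, 0.000),
    (0.000, 0.242, 0.758),
]

def simulate_protanopia(rgb):
    """Return the approximate appearance of an (r, g, b) color
    to a viewer with protanopia."""
    return tuple(
        min(255, round(row[0] * rgb[0] + row[1] * rgb[1] + row[2] * rgb[2]))
        for row in PROTANOPIA
    )

# WikiTrust's bright orange collapses toward a muddy yellow,
# which is why intensity (or a pattern) matters more than hue:
print(simulate_protanopia((255, 165, 0)))
```

Since the orange coding here varies mainly in brightness rather than hue, it may actually fare better under these simulations than a typical red/green scheme would.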
Still a restricted experiment at this stage
Still, at least it gives them a chance to see how it goes with the full data set and real-time updates as entries change.
I wouldn't be surprised if this then became a new editorial option in the future - instead of locking a page, editors may be given the ability to flag a page as one which should display trust info to all users, which would also enable an info box at the top of the page explaining what all the orange colouring was about.
Hopefully there's some sort of system to avoid this - besides being extra special nice.
I was thinking the exact same thing. Wikipedia (and really most wiki sites) has a problem where a few select editors (who make a lot of edits) consider themselves the most trusted. Those editors frequently revert changes by new users, even when those edits are easily verifiable fact. I suspect this new system will be used as a tool to exert control over the site.
I hope I am wrong, but human nature is quite predictable in this regard.
Re:
You forgot that after they color the unpopular (but verifiable) opinion orange, they may/will ultimately remove it.
I look at Wikipedia in the same light as I look at 'scientific' documentaries on TV. In order to reach a broader audience, TV docs will leave out much of the debate around a subject (such as the Big Bang theory) and espouse a single theory as being 'as good as fact'.
The producers of a TV show get the last say on which theory will be supported in a documentary. Same goes for Wikipedia in the end, despite the ideal.
I'm not completely trashing Wiki here, I will still use it for what it is often good for (a jumping off point).
Re: Re:
The fact is, no process is going to produce perfect results, which is why transparency in the process itself is so important.
Still, I agree it's not perfect. But as a jumping-off point (indeed its best role) it is a *godsend*.
i hate when i can't find sources...
wikipedia doesn't need color-coded pages; it needs to do away with the rank system entirely. it was useful in getting people interested and contributing when wikipedia needed contributors, but reverting honest changes doesn't add any value to the project, and established editors' voices shouldn't take priority over the democratic whole.
[as long as i'm on the subject, and you can ignore this off-topic rant, i don't see why any entries should be rejected for being too obscure. honestly, how much server space is needed for short entries on, for instance, phds working on important research projects or minor characters from the star wars universe?]
long story short, wikipedia is mounted on a high horse, and that is exactly how dysfunctional oligarchies get started.
Re: i hate when i can't find sources...
I agree! The only criterion for notability should be if someone is interested enough to write a Wikipedia article about it. If it meets all the other criteria (NPOV, references, etc) leave it up, for crap's sake!
Based on a person's past contributions, WikiTrust computes a reputation score between zero and nine.
This new system creates a public flag based on a ranking produced by the WikiTrust system. I spent hours updating and redoing a page, only to have someone revert it in its entirety. Rather than edit my new content, the lazy editor, who may see this page as his personal hangout, did the one thing guaranteed to make sure the layman sees only his information.
What does color coding do to prevent this?
I think color coding will only make it harder to become trusted, because a new editor will now be consistently flagged while an older editor receives no such penalty, regardless of his personal stake in a page or the accuracy of his information.
This new feature is nothing more than a foolish attempt to make Wikipedia seem more trustworthy to the general public. It does nothing to address the deeper issues plaguing Wikipedia as a whole.
Re: random
People should learn to be skeptical of all sources. Color coding entries only encourages the user to trust wikipedia when he should be questioning the entire site and not just some orange box.
depends on how you look at it
The question is, is there some way of automating ratings for editors? Someone (as mentioned in a post above) who routinely reverts edits instead of judging them carefully might be suspect - unless the page is one prone to vandalism.
It's a difficult question.
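One naive way such an automated rating could work, purely as a sketch (the starting score, the revert penalty, and the vandalism-rate discount below are all assumptions for illustration, not anything WikiTrust actually does):

```python
def editor_rating(edits: int, reverts_made: int, page_vandalism_rate: float) -> float:
    """Naive 0-9 editor rating that discounts habitual reverting.

    Assumptions (not WikiTrust's actual formula):
    - everyone starts at the top score of 9;
    - the fraction of an editor's actions that are reverts is penalized;
    - the penalty is softened on pages with a high vandalism rate,
      where reverting is legitimate maintenance rather than gatekeeping.
    """
    if edits <= 0:
        return 0.0
    revert_ratio = min(1.0, reverts_made / edits)
    # On a heavily vandalized page (rate near 1), reverts are expected.
    penalty = revert_ratio * (1.0 - page_vandalism_rate)
    return round(9.0 * (1.0 - penalty), 2)

# A careful editor who rarely reverts keeps a high score...
print(editor_rating(edits=100, reverts_made=5, page_vandalism_rate=0.1))
# ...while a habitual reverter on a quiet page does not.
print(editor_rating(edits=100, reverts_made=80, page_vandalism_rate=0.1))
```

The hard part, of course, is the vandalism-rate input itself: if the same entrenched editors decide what counts as vandalism, the metric just launders the existing problem.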