Good To See: Wikipedia Moves Forward With Color Coding Less Trustworthy Text

from the teaching-people-to-be-skeptical dept

More than two years ago, we talked about a great idea for dealing with the (somewhat misleading) question of Wikipedia's trustworthiness: color code new edits from untrustworthy editors. Not only would this alert people to at least double-check that particular info, but it would also remind them that Wikipedia is a constantly changing site. To be honest, I was a bit disappointed that I hadn't heard much about this idea since that summer of 2007. Apparently, though, it's been gaining in popularity, and now Wikipedia is set to start using it across the site. Here's how it works:
Based on a person's past contributions, WikiTrust computes a reputation score between zero and nine. When someone makes an edit, the background behind the new text gets shaded orange depending on their reputation: the brighter the orange, the less "trust" the text has. Then when another author edits the page, they essentially vote on the new text. If they like the edit, they'll keep it, and if not, they'll revert it. Text that persists will become less orange over time, as more editors give their votes of approval.
While there are some concerns about how well this will work (and how much processing power it will take), it seems like a worthwhile experiment.
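To make that concrete, here's a rough sketch of the mechanics in TypeScript. The zero-to-nine reputation scale is from the description above, but the shade mapping and the rule for how trust rises with each implicit "vote" are assumptions for illustration, not WikiTrust's actual code:

    // Illustrative sketch of the WikiTrust idea described above (not its real code).
    interface TextSpan {
      text: string;
      trust: number; // 0 (untrusted) .. 9 (fully trusted)
    }

    // New text inherits its author's reputation as a starting trust value.
    function addEdit(text: string, authorReputation: number): TextSpan {
      return { text, trust: Math.max(0, Math.min(9, authorReputation)) };
    }

    // Each later editor who leaves the span in place is an implicit vote:
    // nudge its trust toward (but never past) that editor's reputation.
    function implicitVote(span: TextSpan, reviewerReputation: number): TextSpan {
      const target = Math.min(9, reviewerReputation);
      return { ...span, trust: span.trust + 0.5 * Math.max(0, target - span.trust) };
    }

    // Lower trust means a brighter orange background.
    function backgroundFor(span: TextSpan): string {
      const alpha = (9 - span.trust) / 9; // 0 = no shading, 1 = full orange
      return `rgba(255, 165, 0, ${alpha.toFixed(2)})`;
    }

    // A low-reputation edit starts bright orange and fades as higher-reputation
    // editors leave it alone.
    let span = addEdit("Newly added claim", 2);
    console.log(backgroundFor(span)); // rgba(255, 165, 0, 0.78) -- strong orange
    span = implicitVote(span, 8);
    span = implicitVote(span, 9);
    console.log(backgroundFor(span)); // noticeably fainter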


Filed Under: color-coded, trust, wikipedia


Reader Comments



  • Anonymous Coward, 31 Aug 2009 @ 4:49am

    A single page of Wikipedia has the potential to look like a freaking Christmas tree now. Sounds like a plan.

  • NullOp, 31 Aug 2009 @ 5:14am

    Color Coding

    Color coding is the first thing people jump to when they want to 'flag' something or bring notice to it. It's a bad idea. Color coding can confuse and frustrate the millions of people who are 'color blind', like me, for instance. It would be much better to use a pattern, symbol or number to denote trust in a document.

    • Anonymous Coward, 31 Aug 2009 @ 5:20am

      Re: Color Coding

      It says shades of orange, not multiple colors. You'll be fine as far as color blindness goes; you just need to work on your reading comprehension skills and you're good to go.

    • Christopher (profile), 31 Aug 2009 @ 5:46am

      Re: Color Coding

      It's a great idea for most; it's just a bad idea *for you*. It could confuse and frustrate that unfortunate subset of color-blind individuals -- so don't use it. It's probably going to be a toggled setting.

      Even if it's not, don't be the kind of person who takes a useful function away from the majority; code up a Greasemonkey script that finds the colors and turns them into a pattern.
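
      For what it's worth, that script is only a few lines of DOM work. A minimal sketch follows (written as TypeScript, so it would need compiling to JavaScript and a userscript header before installing); the selector is a guess at how WikiTrust might mark up shaded spans, since the real markup isn't shown here:

      // Hypothetical userscript sketch: swap the colour cue for a shape-based one
      // so low-trust text stays visible to colour-blind readers.
      // NOTE: LOW_TRUST_SELECTOR is an assumption about how shaded spans are marked up.
      const LOW_TRUST_SELECTOR =
        "span[style*='background'][style*='orange'], .wikitrust-low";

      function patternizeTrustShading(): void {
        document.querySelectorAll<HTMLElement>(LOW_TRUST_SELECTOR).forEach((el) => {
          el.style.background = "none";                      // drop the colour cue
          el.style.borderBottom = "2px dashed currentColor"; // add a pattern cue instead
          el.title = "Low-trust text (WikiTrust)";           // tooltip with the reason
        });
      }

      patternizeTrustShading();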

      • Anonymous Coward, 31 Aug 2009 @ 6:00am

        Re: Re: Color Coding

        But this is America where the minorities are now given all the power.

    • Marcus Carab (profile), 31 Aug 2009 @ 6:31am

      Re: Color Coding

      That's why I love this site:
      http://colorschemedesigner.com/

      Standard colour-scheme-builder sort of thing, but it has a drop-down menu that lets you choose from 8 different vision disorders (and shows you the percentage of the population with each) so you can see what the colours would look like to each group.

  • Jeff, 31 Aug 2009 @ 6:13am

    Well, time for the holidays

    Let's make a wikipage of a Christmas tree, all in colored text.

  • Nicholas Overstreet (profile), 31 Aug 2009 @ 6:28am

    Good idea

    I like this idea. There definitely needed to be some way to easily spot recent changes without digging through the page's revision history. Too bad it took them such a long time to implement something, but it's still good that they are actually going forward with it.

  • Nick Coghlan, 31 Aug 2009 @ 6:53am

    Still a restricted experiment at this stage

    Reading the Wired article, it looks like they're just implementing it for logged-in users initially, and even for them it will only be displayed if they click on a specific "Trust info" tab at the top of the screen.

    Still, at least it gives them a chance to see how it goes with the full data set and real-time updates as entries change.

    I wouldn't be surprised if this then became a new editorial option in the future - instead of locking a page, editors may be given the ability to flag a page as one which should display trust info to all users, which would also enable an info box at the top of the page explaining what all the orange colouring was about.

  • Anonymous Coward, 31 Aug 2009 @ 7:10am

    So what happens when one not-so-liked editor updates a page with something verifiable and accurate, but other editors who don't care much for the first coat his update in orange or revert to something they prefer instead?

    Hopefully there's some sort of system to avoid this - besides being extra special nice.

    • Free Capitalist (profile), 31 Aug 2009 @ 8:13am

      Re:

      So what happens when one not-so-liked editor updates a page with something verifiable and accurate, but other editors who don't care much for the first coat his update in orange or revert to something they prefer instead?


      You forgot that after they color the unpopular (but verifiable) opinion orange, they may/will ultimately remove it.

      I look at Wikipedia in the same light as I look at 'scientific' documentaries from TV. In order to reach a broader audience, TV docs will leave out much of the debate around a subject (such as the big bang theory) and espouse a single theory as being 'as good as fact'.

      The producers of a TV show get the last say on which theory will be supported in a documentary. Same goes for Wikipedia in the end, despite the ideal.


      I'm not completely trashing Wiki here; I will still use it for what it is often good for (a jumping-off point).

      • Marcus Carab (profile), 31 Aug 2009 @ 8:42am

        Re: Re:

        The nice thing about Wikipedia, though, is that you can go to the discussion page on a topic with more than one side to it and usually see every view presented, dissected and refuted/accepted, and then decide for yourself if you agree.

        The fact is, no process is going to produce perfect results, which is why transparency in the process itself is so important.

        Still, I agree it's not perfect. But as a jumping-off point (indeed its best role) it is a *godsend*.

  • MRK, 31 Aug 2009 @ 7:19am

    @11
    I was thinking the exact same thing. Wikipedia (and really most wiki sites) has a problem where a few select editors (who make a lot of edits) consider themselves the most trusted editors. Those editors frequently revert changes by new users even if those edits are easily verifiable fact. I suspect this new system will be used as a tool to exert control over the site.

    I hope I am wrong, but human nature is quite predictable in this regard.

  • edgebilliards (profile), 31 Aug 2009 @ 9:07am

    i hate when i can't find sources...

    there have been two studies that came out recently that have shown that the nature of wikipedia has changed drastically since its inception. (there's also a long, labored discussion on...slashdot, i think?) long story short, these "trusted editors" just sit on the recent changes page and revert entries by "non-trusted editors" to get their edit count higher (and thus move up the ranks.)

    wikipedia doesn't need color-coded pages, it needs to do away with the rank system entirely. it was useful in getting people interested and contributing when wikipedia needed contributors. but reverting honest changes doesn't add any value to the project and their voices shouldn't take priority over the democratic whole.

    [as long as i'm on the subject, and you can ignore this off-topic rant, i don't see why any entries should be rejected for being too obscure. honestly, how much server space is needed for short entries on, for instance, phds working on important research projects or minor characters from the star wars universe.]

    long story short, wikipedia is mounted on a high horse, and that is exactly how dysfunctional oligarchies get started.

    • Anonymous Coward, 31 Aug 2009 @ 9:50am

      Re: i hate when i can't find sources...

      Wikipedia is going down the same road as DMOZ. You know, "people do it better". Well, some people do cheating and scamming better. DMOZ died because the vast majority of editors were in it for their own self-interest and commercial uses, and not for the good of DMOZ.

    • nasch (profile), 31 Aug 2009 @ 1:09pm

      Re: i hate when i can't find sources...

      [as long as i'm on the subject, and you can ignore this off-topic rant, i don't see why any entries should be rejected for being too obscure. honestly, how much server space is needed for short entries on, for instance, phds working on important research projects or minor characters from the star wars universe.]

      I agree! The only criterion for notability should be whether someone is interested enough to write a Wikipedia article about it. If it meets all the other criteria (NPOV, references, etc.), leave it up, for crap's sake!

  • random, 31 Aug 2009 @ 8:28pm

    I don't think this is going to work at all.

    Based on a person's past contributions, WikiTrust computes a reputation score between zero and nine.

    This new system creates a public flag based on a ranking produced by the WikiTrust system. I spent hours updating and redoing a page only to have someone revert the page in its entirety. Rather than edit my new content, the lazy editor, who may see this page as his personal hangout, has done the best thing to make sure the layman sees only his information.

    What does color coding do to prevent this?

    I think color coding will only make it harder to become trusted, because now a new editor will be consistently flagged while an older editor receives no such penalty, regardless of his personal stake in a page or the accuracy of his information.

    This new feature is nothing more than some foolish attempt to make Wikipedia seem more trustworthy to the general public. It does nothing to address the deeper issues plaguing Wikipedia as a whole.

    • random, 31 Aug 2009 @ 8:31pm

      Re: random

      I apologize for the double post but I felt I had to add this.

      People should learn to be skeptical of all sources. Color coding entries only encourages the user to trust Wikipedia when he should be questioning the entire site and not just some orange box.

  • irv, 1 Sep 2009 @ 11:13am

    depends on how you look at it

    The theory of having some kind of automated trust measure is very good. True, in practice it can be subverted by an irresponsible editor. But everything about Wikipedia can be (and to some extent has been) subverted by editors.

    The question is, is there some way of automating ratings for editors? Someone (as mentioned in a post above) who routinely reverts edits instead of judging them carefully might be suspect - unless the page is one prone to vandalism.

    It's a difficult question.
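
    One naive way to get at that (purely a sketch, with made-up weights, and nothing to do with how Wikipedia or WikiTrust actually score editors) would be to track how much of an editor's activity consists of plain reverts, discounting reverts on pages known to attract vandalism:

    // Toy heuristic for the question above: how "revert-happy" is an editor?
    // All weights and the example numbers are invented for illustration.
    interface EditorStats {
      edits: number;                    // total edits made
      reverts: number;                  // edits that were straight reverts of others
      revertsOnVandalismProne: number;  // reverts on pages known to attract vandalism
    }

    // Reverts on vandalism-prone pages count much less against the editor.
    function revertSuspicion(s: EditorStats): number {
      if (s.edits === 0) return 0;
      const weightedReverts = s.reverts - 0.8 * s.revertsOnVandalismProne;
      return Math.max(0, weightedReverts) / s.edits; // 0 = never reverts, 1 = only reverts
    }

    // Mostly reverting ordinary pages looks suspect; the same volume of reverts
    // on vandalism-prone pages barely registers.
    console.log(revertSuspicion({ edits: 100, reverts: 70, revertsOnVandalismProne: 5 }));  // 0.66
    console.log(revertSuspicion({ edits: 100, reverts: 70, revertsOnVandalismProne: 65 })); // 0.18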


