Another Reason We Need Open Government Data: To Avoid Information Asymmetries

from the database-of-intentions dept

Can the future aggregate actions of people be predicted from relevant sets of data that describe them? That, of course, is what Isaac Asimov's invented mathematical discipline of psychohistory was supposed to do. Some Japanese researchers claim to have made some progress towards that goal:

These guys have used ideas from statistical mechanics to model the behaviour of humans influenced by word-of-mouth interactions and advertisements. In this paper, Ishii and co derive a bunch of equations that they use to model the number of people who'll turn up to see a movie or visit an art show.
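The post doesn't reproduce the paper's equations, but the general shape of such models is easy to convey. The following is a minimal Python sketch, a generic Bass-style diffusion toy with invented parameter names and values rather than anything taken from the Ishii paper, in which each day's attendance is driven by an advertising term plus a word-of-mouth term:

import numpy as np

def simulate_attendance(days=60, population=10_000, p_ad=0.02, q_wom=0.3):
    """Toy diffusion model: each day, new attendees come from an advertising
    term (p_ad, acting on everyone who hasn't gone yet) plus a word-of-mouth
    term (q_wom, scaled by the fraction who have already gone). Illustrative
    only; these are not the equations Ishii and co derive."""
    cumulative = np.zeros(days)
    for t in range(1, days):
        remaining = population - cumulative[t - 1]
        new_today = (p_ad + q_wom * cumulative[t - 1] / population) * remaining
        cumulative[t] = cumulative[t - 1] + new_today
    return cumulative

if __name__ == "__main__":
    curve = simulate_attendance()
    print("Cumulative attendance after days 7, 30, 60:",
          [int(curve[d]) for d in (6, 29, 59)])

The appeal of this kind of model is that once the advertising and word-of-mouth coefficients have been fitted to early data, the rest of the curve, and hence total turnout, can be projected forward.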
Inspired by this work, Nicklas Lundblad has written an interesting speculative piece about what the rise of predictability through the analysis of huge data sets might mean for society and openness. He notes that one of the "theorems" of psychohistory is that for it to be effective the data sets and the predictions derived from them must be kept secret from the populations involved – the idea being that if they were able to analyze that same data themselves, they might change their actions and thus nullify the predictions.

He points out that this creates a tension between predictability and openness:

There is an assumption here that is worth highlighting. And that is that for a democracy to remain open it cannot be predictable by only a few. That is a complex and perhaps provocative assumption that I think we should examine. I believe this to be true, but others will say that our democracy already is predictable, in some sections and instances, only to a few and that they build their power base on that information asymmetry, but that it is reasonably open still. Maybe. But I think that those asymmetries are not systematic to our democracy, but confined to those phenomena, like stock markets, where they are certain to be important, but where they also do not threaten the nature of democracy as such.

In summary, if we share the data and allow everyone to use it, then predictability goes down.
That, in turn, is an argument for openness. If data held by a government, say, is released freely, anyone can explore its implications and modify their actions accordingly, and thus escape the statistical predictability that remaining in ignorance would otherwise imply. As Lundblad writes:
If there is a conclusion here it seems to be to explore the amazing value of data under the imperative of openness to the full extent possible to ensure that our societies gain from this new, fantastic age of data innovation, discovery and exploration that we are entering into, but never compromise on that openness in the pursuit of macro-social predictability.
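The trade-off is easy to demonstrate in miniature. Here is a purely hypothetical Python sketch, with invented numbers and nothing taken from Lundblad's piece or the Ishii paper: a forecaster predicts each round's turnout from the previous round, and when the forecast is published, people react against it, which measurably degrades the forecast:

import random

def mean_forecast_error(rounds=1000, published=False, reaction=0.8, seed=42):
    """Toy feedback loop. A naive forecaster predicts that this round's turnout
    will equal last round's. If the forecast is published, people react against
    it (say, staying home when a big crowd is predicted), pushing actual turnout
    away from the forecast. An illustrative assumption, not a real model."""
    random.seed(seed)
    last_turnout = 0.5
    total_error = 0.0
    for _ in range(rounds):
        prediction = last_turnout                     # forecast from past data
        turnout = 0.5 + random.gauss(0, 0.02)         # baseline behaviour
        if published:
            turnout -= reaction * (prediction - 0.5)  # crowd-avoiding reaction
        total_error += abs(prediction - turnout)
        last_turnout = turnout
    return total_error / rounds

if __name__ == "__main__":
    print("mean error, forecast kept secret: %.4f" % mean_forecast_error())
    print("mean error, forecast published:   %.4f" % mean_forecast_error(published=True))

With these made-up numbers the published forecast comes out roughly twice as inaccurate as the secret one: openness traded against macro-social predictability, in miniature.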

Follow me @glynmoody on Twitter or identi.ca, and on Google+


Filed Under: information asymmetry, open government


Reader Comments



    Lawrence D'Oliveiro, 1 Feb 2012 @ 12:31am

    Raw Data Won't Be Enough

    Having the data is one thing; processing it in the right way is another. What if someone discovers a secret algorithm that gives them greater predictive power over society than anybody else? And what if they won't share that secret? Can they be forced to give up what some might see as an unfair advantage?

      btrussell (profile), 1 Feb 2012 @ 4:55am

      Re: Raw Data Won't Be Enough

      Nope. Math would then come under copyright.

      Then that person would share it with the world. See how copyright magically promotes progress?

        btrussell (profile), 1 Feb 2012 @ 4:58am

        Re: Re: Raw Data Won't Be Enough

        Keep in mind no one could afford to teach/learn math for another 150 years or so.

        Bengie, 1 Feb 2012 @ 5:44am

        Re: Re: Raw Data Won't Be Enough

        That's only if that person shares it. You don't have to share anything; you can keep it a secret and never copyright it.

        Luckily, almost anything one person discovers is eventually discovered by others. There are only a few exceptions to this.

        The other good thing to know is that most people who discover truly wonderful things also tend to have the kind of personality that finds recognition reward enough.

    rukidding (profile), 1 Feb 2012 @ 12:49am

    You mean like Google's search algorithm?

    PW (profile), 1 Feb 2012 @ 1:00am

    Wouldn't that lead to just a different set of eventually predictable behaviors (how people react to knowing certain info)? Separately, the idea of people reviewing the data presumes that they all know how to read it the same way and come to the same or similar conclusions. That intuitively feels unlikely, much the same way that stock market models lead different investors to different conclusions. The predictability concept also fails to inspire confidence, since it depends entirely on which data are being analyzed; while there may be times when the correlations hold up well, short of having perfect data sets for everything, it's unlikely that predictability will hold over the long term. Again, I'd point to hedge funds and other stock investors as demonstrations of this sort of systemic failure.

    MichaelG, 1 Feb 2012 @ 2:39am

    financial indicators already like this?

    I think I've read that finance people chase statistics like this. Once something becomes a reliable predictor of the market, everyone piles on and it loses its predictive power.

    I don't know enough about markets to give an example though. Can someone else?

    Tom Holroyd, 1 Feb 2012 @ 4:04am

    psychohistory

    Asimov is fine and all, but the "theorems of psychohistory" are fictional and written to create drama. (Same for the Three Laws of Robotics: not real.)
    /pet-peeve

      A Dan (profile), 1 Feb 2012 @ 7:12am

      Re: psychohistory

      Just because it doesn't currently exist does not mean it is impossible. That's the whole idea of science fiction.

      Anonymous Coward, 1 Feb 2012 @ 8:33am

      Re: psychohistory

      Funny, I have a PhD in one branch (Sociology) of what Asimov referred to as psychohistory... in fact, reading the Foundation novels was how I got interested in it to begin with.

    Anonymous Coward, 1 Feb 2012 @ 7:13am

    No Economic Benefit

    Information asymmetry is always bad for the economy. This is what the laws against insider trading are supposed to be about. When an insider makes a profit because he has learnt about some company action before the rest of the market, that profit comes at the expense of the other players in the market. Information asymmetry benefits a few and disadvantages the many. The many then find it unprofitable to invest, which lowers aggregate economic activity, which in turn causes unemployment and poverty.

    The global financial crisis was caused by information asymmetry. The misled buyers of CDOs thought they were buying AAA securities. Meanwhile, the sellers knew perfectly well that they were selling junk. When the information asymmetry went away, the price of the securities went down to zero. Vast losses were incurred by the many. The few walked away with vast profits. We are still living through the economic damage so caused.

      Lawrence D'Oliveiro, 1 Feb 2012 @ 2:40pm

      Re: The global financial crisis was caused by information asymmetry.

      We have a more traditional term for your "information asymmetry": how about "fraud".

    timmaguire42 (profile), 1 Feb 2012 @ 8:18am

    As presented, the theory is disproven

    If the theory is valid, it will be able to predict how people will change their behavior upon learning about the data set. If the theory cannot accommodate openness, then the theory is invalid.

    PRMan, 1 Feb 2012 @ 10:45am

    Funny...

    I'm reading a book right now about how the Allied forces did this to the Japanese military in WWII. It's part of the reason Japan started the war so strongly: the Allies were smug about their predictions of what the Japanese would do.

    jadamslsmo (profile), 2 Feb 2012 @ 9:56am

    Try this author's take on knowing the future

    Ted Chiang's short story "Story of Your Life" looks at knowing the future and its consequences. It is an excellent read. It can be found in his collection of short stories, "Stories of Your Life and Others".

    Toot Rue (profile), 14 Feb 2012 @ 4:48am

    Alan Kay said 'The best way to predict the future is to invent it'.

    I tend to believe that the more accurate our societal modelling becomes, the more it will be used to control societies, almost certainly to benefit those in control.

    Perhaps an equally relevant SF reference would be Leto's golden path...


