Was A French Court Correct In Blaming Google For Its Google Suggest Suggestions?

from the still-not-convinced dept

We recently wrote about yet another (the third one we know of) ruling in France that found Google liable for what "Google Suggest" suggested. Google Suggest, of course, is the autocomplete function that tries to guess what you're searching on, based on what other people searched for after typing the same letters. More recently, that's been expanded into Google Instant, where the engine actually shows full results as you type. We suggested that the problem here was that French courts did not understand the technology.
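For readers wondering what "generated solely from previous related searches" means in practice, here is a minimal sketch of the core idea: store past queries, and for any prefix, suggest the queries other users ran most often. Everything here (the class name, ranking purely by raw frequency) is a simplified assumption for illustration, not Google's actual implementation.

```python
from collections import Counter

class SuggestIndex:
    """Toy prefix-completion index: ranks past queries by frequency."""

    def __init__(self):
        self.counts = Counter()

    def record(self, query):
        # Each completed search bumps that query's popularity.
        self.counts[query.lower()] += 1

    def suggest(self, prefix, k=3):
        # Return up to k of the most-searched queries starting with the prefix.
        prefix = prefix.lower()
        matches = {q: n for q, n in self.counts.items() if q.startswith(prefix)}
        return [q for q, _ in Counter(matches).most_common(k)]

idx = SuggestIndex()
for q in ["hats", "hats for sale", "hats for sale", "hamster care"]:
    idx.record(q)

print(idx.suggest("ha"))  # most frequent match first
```

The point of the sketch is that nothing in the loop inspects *meaning*: if enough users type a defamatory phrase, that phrase becomes the top suggestion, with no human (or machine) judgment about whether it is libelous.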

Journalist Mitch Wagner, who I tend to agree with more often than not, claims that we got it wrong: the French courts understand the technology perfectly well, and they still decided to side against Google (and, separately, against Google's CEO, as if he had anything to do with the suggestions in question):
But actually the French court understands what's going on. Google raised just those issues in its defense, and the court disagreed. "The court ruling took issue with that line of argument, stating that 'algorithms or software begin in the human mind before they are implemented,' and noting that Google presented no actual evidence that its search suggestions were generated solely from previous related searches, without human intervention," according to Computerworld.
He goes on to suggest that it can (sort of) be compared to a product liability case, where if you make a product that does something "bad" (such as suggest libelous search results) that it should be your responsibility:
Is it appropriate for Google to build a search engine that automatically generates results with no intervention to be sure those results aren't libelous, defamatory, or otherwise harmful?

This is a problem that goes beyond people accused of crimes. Many companies are unhappy with the results that come up when you search on industry terms. If you make hats, and you're not on the first page of results for the word "hats," then you're dissatisfied with Google. Does that make Google wrong? Does it matter if your hats are, in fact, better and more popular than those of the companies ranked above you?
I'm sorry, but I don't buy it. I understand Wagner's point, but I think the French courts still don't really understand the issues. It's not a question of whether or not it's appropriate; it's a question of whether or not it's even possible. How does Google build a search engine that simply knows whether a suggestion might be considered by a court of law to be libelous? As for the different rankings, those are opinions, which should be protected speech (last we checked). If Google's results aren't good, that's an opening for another search engine. Blaming Google because you don't like how the algorithm works is still a mistake, and I don't think the French courts really recognize this at all, no matter what they say.


Filed Under: defamation, france, liability
Companies: google


Reader Comments



  • identicon
    Anonymous Coward, 30 Sep 2010 @ 2:38am

    protected speech

    "As for the different rankings, those are opinions, which should be protected speech (last we checked)"

Maybe in the US, but not in France, where some opinions
can't legally be expressed; racist speech, for instance.


  • identicon
    LZ7, 30 Sep 2010 @ 2:59am

    France

I would say that I have a fairly deep understanding of Google's AI and the underlying philosophy that drives its evolution. To put it bluntly, these accusations are based on moronic assumptions. I happen to know for a fact that the algo has a mind of its own, and it gets smarter with every iteration. It's the world's largest neural network, after all, and if Google can be held liable for what it suggests, then so can every other smart application.

I love how software is purely an idea... until it's patent time, then it's an "Invention".


    • identicon
      Anonymous Coward, 30 Sep 2010 @ 5:58am

      Re: France

No-one is to blame for what Skynet does?


      • icon
        Berenerd (profile), 30 Sep 2010 @ 7:04am

        Re: Re: France

I blame John Connor. If not for him, the bots wouldn't have come back in time at all, giving those people the ability to reverse engineer the arm and chip.


  • icon
    Richard (profile), 30 Sep 2010 @ 3:02am

Maybe the French are just still annoyed over the whole "search for 'french military victories' and Google suggests 'did you mean french military defeats?'" thing that 4chan did ages ago.


  • icon
    Richard (profile), 30 Sep 2010 @ 4:01am

    Product liability

    He goes on to suggest that it can (sort of) be compared to a product liability case, where if you make a product that does something "bad" (such as suggest libelous search results) that it should be your responsibility:

    Such a liability, if taken seriously, would shut down the whole of computing. All software has bugs and therefore can produce undesired results. Most software vendors have a pretty all-embracing liability disclaimer in their license agreements - and for good reason. Only a small subset of safety-critical software is tested to a high enough standard to allow liability to be accepted - and even then there are occasional problems (the RAF NI security Chinook crash for example).

    This strand of computing could not survive on its own. Do we really want to go back to 1939?


  • identicon
    Anonymous Coward, 30 Sep 2010 @ 4:22am

Even if the algorithm "begin(s) in the human mind before they are implemented", the result of said algorithm most definitely doesn't come from a human mind. I don't think they're discussing the algorithm itself, just the results... so I don't see how that sentence makes sense. As for Google not proving there's no human intervention, the court hasn't proved that Google doesn't have aliens making the algorithms, which would negate the whole "human mind" thing. I think they should.


  • identicon
    Anonymous Coward, 30 Sep 2010 @ 5:24am

    The only way Google knows how to remedy that is to actually censor.

http://edition.cnn.com/2010/TECH/web/09/29/google.instant.blacklist.mashable/index.html?hpt=Sbin

There is no humanly possible way to know what is libelous or not at the moment without human intervention, and even then it is not possible to do it with 100% certainty. The French just don't want the feature, apparently. There is no viable solution to this problem that doesn't involve labor-intensive, costly and inevitably unreliable measures, so the only sensible thing to do is to remove the feature from the French version.


  • identicon
    Anonymous Coward, 30 Sep 2010 @ 5:36am

    There is a vast difference in law between the French and the Americans.

The ancestor of current French law is the Napoleonic Code.
In the Napoleonic Code there was a de facto presumption of guilt.
    Psychologically if the law does not explicitly permit something it is forbidden.

    US law is based on English Law.
In English law there was a de facto presumption of innocence.
    Philosophically, if something is not forbidden in law it is permitted.


  • identicon
    abc gum, 30 Sep 2010 @ 5:37am

    Customers who bought a dictionary also bought a thesaurus.

    What ?!
    The nerve of those people - I'll sue them for that !


  • identicon
    NNN, 30 Sep 2010 @ 5:57am

    "The accessory of current French Alar is the Napoleonic Code.
    In the Napoleonic Code there was a de facto presumptin of guilt."

    This is totally false !
    In France there is also the "présomption d'innocence"
    http://en.wikipedia.org/wiki/Presumption_of_innocence

    "Psychologically if the law does not explicitly permit something it is forbidden."

    hahaha can you really imagine that ?


  • identicon
    si, 30 Sep 2010 @ 6:13am

    This is totally false !
    In France there is also the "présomption d'innocence"
    http://en.wikipedia.org/wiki/Presumption_of_innocence


False or not, are you seriously citing Wikipedia as a credible source of information?!


    • identicon
      Josef, 30 Sep 2010 @ 6:43am

      Re: Off topic

      "false or not are you seriously citing wikipedia as a credible source of information?!"

Ummmm. I hear a lot of people run down Wikipedia as not credible. Usually those people have done no research and have nothing to dispute the information they are presented from Wikipedia.

I'm just bringing this up because I generally use Wikipedia and then cross-reference with other sources, depending on the topic. I've found it to be very accurate. So I'm wondering why so many people with no alternatives feel otherwise.


    • icon
      Marcus Carab (profile), 30 Sep 2010 @ 7:18am

      Re:

      are you seriously citing wikipedia as a credible source of information?!

      Are you seriously that out of touch?


  • identicon
    Anonymous Coward, 30 Sep 2010 @ 6:20am

    No search, no find. France disappears into the cloud! Voila! No burqa, no freedom. No more revolution, just EU. No travel to Europe. Europe sucks.


  • identicon
    A.H., 30 Sep 2010 @ 6:40am

    French?

    It's painfully obvious to anyone with even the smallest amount of understanding of computers and programming that the French courts are:

    A) Completely Ignorant of Computer Matters
B) Completely Clueless in General
C) Hold stock in the companies bringing the lawsuits
    D) Drunk
    E) Have a croissant up their butts
    F) All of the Above


    • identicon
      Anonymous Coward, 30 Sep 2010 @ 7:27am

      Re: French?

      You forgot about:

      G) Enraged at being called, "surrender monkeys"
      H) Secretly ashamed at really being surrender monkeys
      I) Jealous at the influence of the USA and the comparative insignificance of France
      J) Mad as hell that English is more important than French
      K) Frustrated that they are too stupid to invent their own search engine that is anywhere near as good as Google
      L) Generally vindictive and spiteful
      M) Full of themselves
      N) Suffering from numerous other personality defects


  • icon
    taoareyou (profile), 30 Sep 2010 @ 6:41am

    Standards

    It's virtually impossible for Google to adhere to the various legal standards of every single country in the world. What is not legal in France is legal in the U.S. If your country deems Google's search to be illegal, take actions to prevent your citizens from accessing it. You cannot enforce your rules on every other country.


  • icon
    Sean T Henry (profile), 30 Sep 2010 @ 7:01am

    ?

So Google should just disable the suggestion feature for google.fr and place at the top of searches and the homepage: "Missing features? Find out why." Or just set the default to no suggestions and have the user decide to turn it on, with a notification on the page that it is the user's decision to turn it on and Google is not liable for the suggestions...


    • icon
      btr1701 (profile), 30 Sep 2010 @ 11:25am

      Re: ?

      Or just shut down their physical offices in France and leave the country altogether. Their site would still be accessible to the people of France (assuming the government didn't block it), but they wouldn't have to deal with crap like this. They could just ignore these lawsuits, let the French courts issue default judgments against them and then laugh when the plaintiffs come trying to collect.


  • icon
    crade (profile), 30 Sep 2010 @ 7:30am

If your search suggests "Bill Clinton is a moron", Google is making the claim that this could be what you want to search for; it is not making the statement that Bill Clinton is a moron at all.

How could their claim that "this could be what you want to search for" be untrue?


  • identicon
    Anonymous Coward, 30 Sep 2010 @ 10:35am

    The "Automated" Defense

    So, if I create an automated device or process that does bad things, should I not then be responsible for what it does?

    So, for example, that I rig up a shotgun with a tripwire on my property to keep "bad guys" out. If it then winds up killing neighborhood children who get on my lawn, should I then be able to just say "Hey, it's not my fault. It's an automated device! There's no way I can make it actually know who's a bad guy and who isn't!" Or should I be held responsible anyway on the grounds that I shouldn't have implemented such a device in that case? But that might discourage my "innovation".

    So, should "automation" be a defense, as Mike contends, or not?


    • icon
      crade (profile), 30 Sep 2010 @ 11:08am

      Re: The "Automated" Defense

Your example isn't really automated; it's triggered by a tripwire. :)
The Google example isn't really automated either; it's triggered by people searching for text.

      The question is really if the tool maker should be held responsible for the actions of the tool's users. The search tool doesn't do anything automatically.


    • icon
      Rikuo (profile), 30 Sep 2010 @ 11:16am

      Re: The "Automated" Defense

      I'm gonna start off with the obvious: a search on Google is nothing like rigging up a shotgun on your front lawn. Potential libel and wholesale slaughter are two different things.
Even if you want to equate the two, you are actively setting up a system that can do physical harm. What's happening with Google is that users (not Google itself) are searching for "XX is YY" (where XX is a name and YY is a negative adjective). The algorithm then notes that many thousands or millions of people have searched for "XX is YY", and if I start typing "XX is" it will add in "YY", because more than likely that's what I'm searching for.
Here's an example that I thought up. Say a library hosts books and has a fancy robotic mechanism that picks up and deposits books in front of me based on what I search for. Say I type into the computer "The Holocaust Didn't Happen" or "Politician X is a Hypocrite", and it dumps books that are about what I searched for, and it gets more accurate based on users telling the computer "Yes, this book is pertinent to the topic". Is the library at fault? They didn't write the books; they merely have them on a shelf. The computer doesn't know it's libelous. Should the books be consigned to obscurity because it's against the law to search for something libelous?


      • identicon
        Anonymous Coward, 2 Oct 2010 @ 2:55pm

        Re: Re: The "Automated" Defense

        I'm gonna start off with the obvious: a search on Google is nothing like rigging up a shotgun on your front lawn. Potential libel and wholesale slaughter are two different things.

        Straw man alert: nobody was saying it was.

        Even if you want to equate the two...

        Umm, no, that was *your* strawman.


    • icon
      Mike Masnick (profile), 30 Sep 2010 @ 11:24am

      Re: The "Automated" Defense

      So, for example, that I rig up a shotgun with a tripwire on my property to keep "bad guys" out. If it then winds up killing neighborhood children who get on my lawn, should I then be able to just say "Hey, it's not my fault. It's an automated device! There's no way I can make it actually know who's a bad guy and who isn't!" Or should I be held responsible anyway on the grounds that I shouldn't have implemented such a device in that case? But that might discourage my "innovation".

      You didn't really mean to make that argument, did you?

      We're not saying it's okay because it's "automated," but because it's a function of what the overall users do. Users searched on those terms, it's accurate.

      Besides, setting up a gun to shoot people is to set up a system specifically designed to perform an illegal act. Reporting what people are searching for is not.


      • identicon
        Anonymous Coward, 2 Oct 2010 @ 2:48pm

        Re: Re: The "Automated" Defense

        You didn't really mean to make that argument, did you?

        It's a question, not an argument. Please learn the difference. In fact, it's actually questioning *your* argument. Sorry.

        We're not saying it's okay because it's "automated,"...

        "How does Google build a search engine that simply knows whether a suggestion might be considered by a court of law to be libelous?" seems to be asking that question. Likewise, how does one rig up a shotgun booby-trap that "simply knows" when it is firing in a way that might be considered by a court of law to be a reasonable level of force in a particular situation? Could it be that if one can't then maybe, just maybe, they shouldn't be rigging up such a thing?

        Besides, setting up a gun to shoot people is to set up a system specifically designed to perform an illegal act.

So you're telling me that it is illegal to shoot people, in any situation, out there in California? I'm not familiar with California law so I'll just have to take your word on that, but I would ask you: if it is illegal to shoot people in any situation in California, then why is it that every California cop I've seen has a gun? Decoration? Interesting.

Now, where I live it is not illegal to shoot people in certain situations, so as far as I'm concerned your argument to the contrary fails on factual grounds. But the courts here have ruled that setting up shotgun booby-traps is illegal because they may fire even when they shouldn't. In other words, automation is no excuse around here for doing something that would otherwise be illegal.

        Reporting what people are searching for is not.

        Apparently a French court disagrees with you. Somehow, I have a feeling that they're not letting you dictate otherwise to them, either.


    • icon
      btr1701 (profile), 30 Sep 2010 @ 11:27am

      Re: The "Automated" Defense

      > So, if I create an automated device or process
      > that does bad things, should I not then be
      > responsible for what it does?

      Your argument assumes that factually showing what people around the world are searching for is a "bad thing".


      • identicon
        Anonymous Coward, 2 Oct 2010 @ 2:52pm

        Re: Re: The "Automated" Defense

        Your argument assumes that factually showing what people around the world are searching for is a "bad thing".

No, it doesn't. However, that does seem to be the determination that was made by the court. If you have a problem with that, then perhaps you should address your concerns to the court.


        • icon
          btr1701 (profile), 7 Oct 2010 @ 8:26pm

          Re: Re: Re: The "Automated" Defense

          > > Your argument assumes that factually showing what people around
          > > the world are searching for is a "bad thing".

> No, it doesn't. However, that does seem to be the determination that
> was made by the court.

          And such a ruling is logically and philosophically incompatible with a free society, so I guess the French judiciary has tacitly admitted that they're no longer living in one.

> If you have a problem with that, then perhaps you should address your
          > concerns to the court.

          Or I could just address them here, like I did. How's that?


          • identicon
            Anonymous Coward, 8 Oct 2010 @ 6:00pm

            Re: Re: Re: Re: The "Automated" Defense

            Or I could just address them here, like I did. How's that?

            I think the word is "ineffectual".


  • icon
    btr1701 (profile), 30 Sep 2010 @ 11:15am

    Searches

    Since Google's auto-complete function only shows what people are factually searching for, it seems like the French courts (and those who agree with them) are saying that it's illegal to create a service that reports factual information about the search habits of internet users.


  • icon
    Rikuo (profile), 30 Sep 2010 @ 12:26pm

    Here's a scenario the French court didn't think about

    What if I'm a historian on a certain subject, and I search on Google deliberately for a libelous statement.
For example, take the David Beckham case du jour, where a prostitute has claimed he paid her to sleep with him. What if, in twenty years, I write a biography of David Beckham and want to cover this episode in his life? So I type into Google "David Beckham Prostitute", and it spits out links to her blog or something, which will be kept in some archive.


  • icon
    Andrew (profile), 30 Sep 2010 @ 12:30pm

    "The court ruling took issue with that line of argument, stating that 'algorithms or software begin in the human mind before they are implemented,' and noting that Google presented no actual evidence that its search suggestions were generated solely from previous related searches, without human intervention."

    Doesn't this come a little too close to questioning safe harbour provisions? I can't address the second part of this statement (though it would surprise me greatly if more than a handful of possible suggestions had been subject to human intervention), but this blog's comment system, for example, was conceived in the human mind too. Yet if I were to write something libellous here, Techdirt would rightly not be held liable despite republishing my comments to the world.


  • identicon
    Mitch Wagner, 30 Sep 2010 @ 5:53pm

    Thanks for the follow-up!

Google already screens Google Instant results for offensive content, so screening results is not ridiculous.

    I'm not saying I agree with the French courts here. I'm concerned that we're creating a future where public perception trumps reality, and if most of the Google-using population believes a thing to be true, Google will spit it back, even if that thing is actually false.


