Publishers Of Certain Belgian Newspapers Continue Effort To Not Be Found Online

from the why-not-stop-publishing-online dept

Here's an idea: why don't the various French- and German-language Belgian newspapers stop publishing their newspapers online? After their ridiculous win against Google for sending traffic their way without first paying them, the group of newspaper publishers is now going after Yahoo for the same thing. There really is an easy solution. Yahoo, Google and everyone else should simply refrain from linking to these newspapers. If they really want to be left alone, to lose all that traffic and to lose all that relevance, that's their own decision. In the meantime, how long will it be before someone else comes along and figures this is a cash cow and starts suing? At this point, perhaps everyone should just sue Google, Yahoo, Microsoft, Ask and others for daring to link to them.



Reader Comments



  • bendodge, 19 Jan 2007 @ 9:32pm

    yeah

    if they aren't in the indexes, they might as well not exist

  • Mousky, 19 Jan 2007 @ 9:57pm

    Have the French- and German-language Belgian newspapers also sued libraries for making copies of their newspapers available? How about cafes that have newspapers lying around? Or how about those workplaces that buy one subscription for an office with 20 or so workers? When will companies learn to embrace, use and exploit technology to their benefit and not their detriment?

    • Stephane, 20 Jan 2007 @ 4:21am

      Re:


      Allowing Google to steal your content works to the company's detriment and to Google's benefit, not the opposite, as you might think.

      A lot of people are getting tired of those little Google ads everywhere, or of search results showing eBay at the top... the PageRank algorithm, yeah, sure, but with some "adjustments" of course ;-)

  • Josh, 19 Jan 2007 @ 10:13pm

    it could be worse!!!

    Think of it this way: anyone with a website could sue any of the search sites they want....
    Why, you ask...

    'cause they store cached pages ("copies") of the site.

    It's technically illegal.

    Think about that!

    • Dosquatch, 20 Jan 2007 @ 4:17am

      Re: it could be worse!!!

      'cause they store cached pages ("copies") of the site. It's technically illegal.

      Actually, Google has been sued for exactly this, and the cached copies were found not to violate the law.

    • Mark, 20 Jan 2007 @ 9:46am

      Re: it could be worse!!!

      Not in Japan

  • Josh, 19 Jan 2007 @ 10:25pm

    in fact

    if you want to get technical, you're breaking the law just by viewing a website, because a copy is stored on your computer.

  • ipanema, 20 Jan 2007 @ 5:46am

    I think that's the most sensible thing to do. NOT TO PUBLISH ONLINE! Too touchy. Are they carrying the best news anyway? Suing for linking has become a habit. Sad.

  • Starky, 20 Jan 2007 @ 9:55am

    robots.txt

    Is Robots.txt not good enough for people anymore???

    • Jess, 21 Jan 2007 @ 6:02am

      Re: robots.txt

      I agree. If a company doesn't want its site, or select pages of its site, indexed by a search engine, it's as simple as creating a file (robots.txt) to exclude themselves. A lot of BS lawsuits flying around right now. There should be a law to protect companies and people against frivolous lawsuits!

    • Dam, 21 Jan 2007 @ 2:43pm

      Re: robots.txt

      Problem is simple - the French and the Germans.

      Need I say more?

  • Howard Bowen, 21 Jan 2007 @ 12:36am

    At this time in history, when lascivious promiscuity is a basis around which to fabricate entertainment formats and truthful fact is waning in journalism, perhaps the courts should clog their dockets with a class action suit against George Bush as the villain who masterminded Hurricane Katrina.

  • Jeff, 21 Jan 2007 @ 10:39am

    .

    ROBOTS.txt

    Wtf

    • Anonymous Coward, 21 Jan 2007 @ 11:56pm

      Re: .

      The problem with robots.txt is that if it does not exist, bots assume they have the right to index and redistribute the copyrighted material.

      Instead, every site that wants to be indexed should have a robots.txt file that grants access to the bots, not the other way around.

      • Chris Maresca, 22 Jan 2007 @ 10:33am

        Re: Re: .

        Er, the web is a PUBLIC forum. By default, everything is accessible to everything. If you want to stop an indexer like Google or Yahoo, you only need to put one file, with TWO lines in it, in your web root:

        User-agent: *
        Disallow: /

        That's it. If you can't do that, perhaps you should reconsider whether publishing anything on the web is something you should really be engaging in. It's sort of like expecting the inventory in your shop to be safe when you don't put a door on the storefront....

        As for server load, those two lines will stop ANY indexer from looking at ANY of your files.

        Chris
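        Those two lines are easy to sanity-check locally. A minimal sketch, using Python's standard-library urllib.robotparser (example.be is a hypothetical site used purely for illustration), showing that a compliant crawler reading them will refuse every URL:

```python
from urllib.robotparser import RobotFileParser

# The two-line robots.txt from the comment above, fed to the
# parser directly (parse() accepts an iterable of lines, so no
# web server is needed for this check).
rules = ["User-agent: *", "Disallow: /"]

rp = RobotFileParser()
rp.modified()   # mark the rules as freshly loaded
rp.parse(rules)

# Any compliant bot asking for any page gets a "no".
print(rp.can_fetch("Googlebot", "http://example.be/"))             # False
print(rp.can_fetch("Slurp", "http://example.be/news/today.html"))  # False
```

        Note that robots.txt only restrains well-behaved crawlers that honor the convention; it is a request, not an access control.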

  • shimon, 21 Jan 2007 @ 5:52pm

    problem in belgium is simple

    well, Belgium wants to give you good beer and great chocolates, and just wants to be left to do things its own way :)

    well, if I make a website and I choose for it not to be indexed, because I want it free of the 35-60% of server traffic that search engines generate, and keep it free for personal access, what's Google's problem?

    do you guys have any idea how much of your server traffic the damn search robots eat up?

    it seems the great service from the search engines is not free, 'cause someone is paying for the traffic the search bots generate

    so have a beer and sit back relaxed; if it's good, it must be from Belgium :)

  • Jurgonaut, 22 Jan 2007 @ 5:38am

    German? Or Dutch?

    Mike, shouldn't it be "French- and Dutch-language"?

    I think you have maybe 10 native Belgians with German as their mother tongue...

  • Chris Maresca, 22 Jan 2007 @ 10:34am

    Replying to individual comments broken

    BTW, in the last Techdirt update, not only did something change so that articles are only 200 px wide, but replying to individual comments also appears to be broken.

    Chris.

    • dennis, 22 Jan 2007 @ 1:59pm

      Re: Replying to individual comments broken

      Chris,

      You can set Techdirt to view in wide mode through the Preferences page -- from there, you can also set comments to display in threaded mode.

      Cheers,
      Dennis.
