Publishers Of Certain Belgian Newspapers Continue Effort To Not Be Found Online
from the why-not-stop-publishing-online dept
Here's an idea: why don't the various French- and German-language Belgian newspapers stop publishing their newspapers online? After their ridiculous win against Google for sending traffic their way without first paying them, the group of newspaper publishers is now going after Yahoo for the same thing. There really is an easy solution. Yahoo, Google and everyone else should simply refrain from linking to these newspapers. If they really want to be left alone, to lose all that traffic and to lose all that relevance, that's their own decision. In the meantime, how long will it be before someone else comes along and figures this is a cash cow and starts suing? At this point, perhaps everyone should just sue Google, Yahoo, Microsoft, Ask and others for daring to link to them.
Reader Comments
yeah
it could be worse!!!
Why, you ask...
'cause they store cached pages ("copies") of the site.
It's technically illegal.
Think about that!
in fact
Re: it could be worse!!!
'cause they store cached pages "copies" of the site. It's technically illegal.
Actually, Google has been sued for exactly this, and the cached copies were found not to violate the law.
Re:
Allowing Google to steal your content works to the company's detriment and to Google's benefit.
Not the opposite, as you might think.
A lot of people are getting bored of those little Google ads everywhere, or of search results showing eBay at the top... PageRank algorithm, yeah, sure, but with some "adjustments" of course ;-)
Re: it could be worse!!!
robots.txt
Re: robots.txt
.
Wtf
Re: robots.txt
Need I say more?
problem in belgium is simple
Well, if I make a website and I choose for it not to be indexed, because I want it free of the 35-60% of server traffic that search engines generate, and I want it available for personal access, what's Google's problem?
Do you guys have any idea how much effort is needed to keep the damn search robots off your server traffic?
It seems the search engines' great service is not free, because someone is paying for the traffic the search bots generate.
So have a beer and sit back relaxed; if it's good, it must be from Belgium :)
Re: .
Instead, every site that wants to be indexed should have a robots.txt file that grants access to the bots, not the other way around.
German? Or Dutch?
I think there are +- 10 native Belgians who have German as their mother tongue...
Re: Re:
Chris.
Re: Re: .
Er, the web is a PUBLIC forum. By default, everything is accessible to everyone. If you want to stop an indexer like Google or Yahoo, you only need to put one file, with TWO lines in it, in your web root:
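The two lines themselves appear to have been lost from this copy of the comment; the standard disallow-everything robots.txt being described would be:

```
User-agent: *
Disallow: /
```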
That's it. If you can't do that, perhaps you should reconsider whether publishing anything on the web is something you should really be engaging in. It's sort of like expecting the inventory in your shop to be safe when you don't put a door on the storefront...
As for server load, those two lines will stop ANY indexer from looking at ANY of your files.
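To the commenter's point, a blanket disallow really does turn away every compliant crawler. A quick sketch (added here for illustration, not part of the original comment) using Python's standard-library robots.txt parser:

```python
# Check a disallow-everything robots.txt against Python's standard parser.
from urllib import robotparser

# The two-line "keep all bots out" policy described in the comment.
ROBOTS_TXT = ["User-agent: *", "Disallow: /"]

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT)

# Every well-behaved bot is refused, regardless of which URL it asks about.
print(rp.can_fetch("Googlebot", "http://example.com/article/123"))  # False
print(rp.can_fetch("Slurp", "http://example.com/"))                 # False
```

Of course, robots.txt is only honored voluntarily; it stops polite indexers like Googlebot and Yahoo's Slurp, not misbehaving scrapers.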
Chris
Replying to individual comments broken
Chris.
Re: Replying to individual comments broken
You can set Techdirt to display in wide mode through the Preferences page -- from there, you can also set comments to display in threaded mode.
Cheers,
Dennis.