AFP Gets Confused As To How The Internet Works
from the did-someone-just-wake-up-from-a-decade-of-sleep? dept
It is 2005, right? Sometimes you have to wonder when you read stories about a news agency like Agence France Presse (AFP) deciding to sue Google News for linking to its stories. Oh, the horror. How dare Google send them more traffic? Did someone in AFP's legal department sleep through the past ten years of the internet? AFP is complaining about the reuse of its headlines, its story leads, and its images, all of which are clearly fair use. Much more importantly, all three drive traffic to AFP's stories. It's hard to believe there are still companies out there that don't get this simple fact. Billions of dollars are being spent by people trying to get better placement in Google, and here's one company suing Google for millions for daring to link to it.
Reader Comments
Robots.txt?
Re: Robots.txt?
If you tell Google to disallow certain areas, it will not cache those areas (allegedly), but it will still go into them. In order to truly disallow Google, one needs to add special META tags to all pages -- tags that only Google honors. Shame on Google for not adhering to the established standard on this front.
Not that this changes anything with AFP.
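To make the distinction in that comment concrete, here is a minimal sketch using Python's standard urllib.robotparser, which models how a compliant crawler reads a robots.txt rule. The rule, paths, and domain are made-up assumptions for illustration; the code only shows what a robots.txt-respecting crawler is supposed to do, not what Google actually did in 2005.

# Sketch: how a robots.txt Disallow rule reads to a compliant crawler.
# The rule and URLs below are illustrative assumptions, not AFP's real setup.
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: Googlebot
Disallow: /wires/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# A compliant Googlebot should not crawl anything under /wires/ at all.
print(parser.can_fetch("Googlebot", "http://example.com/wires/afp-story.html"))  # False
print(parser.can_fetch("Googlebot", "http://example.com/index.html"))            # True

# The per-page alternative the comment mentions is a META tag such as
# <meta name="googlebot" content="noindex, noarchive">, which addresses
# indexing and the cached copy rather than the crawl itself.

Either way, these are voluntary conventions, which is roughly why the comment notes that none of it changes anything about AFP's suit.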
Re: Robots.txt?
No Subject Given
Re: No Subject Given
Seems to me that, at least from a legal point of view, AFP's licensing agreement should simply oblige its partners to include an appropriate robots.txt with the licensed stories. Then, if Google doesn't honour it, AFP might have a case.
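As a rough sketch of how such a contractual clause could even be checked mechanically, the snippet below asks whether a partner's live robots.txt would keep a compliant Googlebot away from a licensed story. The partner domain, the path, and the idea of an automated check are all hypothetical illustrations, not anything from AFP's actual agreements.

# Hypothetical compliance check: does a partner's robots.txt exclude
# Googlebot from the area where licensed stories are published?
from urllib.robotparser import RobotFileParser

def licensed_story_blocked(robots_url: str, sample_story_url: str) -> bool:
    """True if a robots.txt-respecting Googlebot would not fetch the story."""
    parser = RobotFileParser()
    parser.set_url(robots_url)
    parser.read()  # fetch and parse the partner's live robots.txt
    return not parser.can_fetch("Googlebot", sample_story_url)

# Example call (made-up partner site):
# licensed_story_blocked("http://partner.example/robots.txt",
#                        "http://partner.example/afp/some-story.html")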