A boatload of European publishers claim search engines are “building [their business] on the back of kleptomania.”
From the FT: The group of publishers, which includes the International Publishers’ Association, the European Federation of Magazine Publishers and Agence France Presse, is seeking meetings with Charlie McCreevy, the European Union’s internal market commissioner, and Viviane Reding, the commissioner responsible for media. It would not rule out legal action to enforce copyright or “collective action.”…
…Mr O’Reilly likened the initiative to the conflict between the music industry and illegal file-sharing websites and said it was not a sign that publishers had failed to create a competitive online business model of their own.
“I think newspapers have developed very compelling web portals and news channels but the fact here is that we’re dealing with basic theft,” he said.
Oh Please! Stop driving in the rear view mirror, folks.
22 thoughts on “So…This Mean Quaero Is a No Go?”
Scoff all you want, but the European publishers’ point is completely valid. All media businesses, whether print, t.v., radio or Internet, are eventually based on getting paid (one way or another) for proprietary content. Even bastion of open information Google won’t let you use their proprietary content (search results) without paying them (showing their ads). The analogy to music and file sharing is right on. If they’re smart, U.S. media companies will follow suit and all of a sudden the Googles and Yahoos of the world will lose a lot of power. Sure, the media companies will lose some traffic in the short term but in the long run they need to protect their content’s value and not be owned by search engine/portals, if it’s not already too late.
Indeed, we do have in Europe (like elsewhere) our bunch of nerdy conservatives, and please do not forget to include your source, the Financial Times, among them. But I really don’t see what “Quaero” has to do with this. I don’t think the FT is a “Quaero” supporter at all; the FT would rather join NAFTA!
Rumsfeld was right about Old Europe…
The analogy to music and file sharing is totally off. We’re not talking about third parties reproducing all their content online. In fact, a news provider can upload a robots.txt or block IPs to prevent the search engines from indexing or caching their site…if they dare. However, if they go that route, they won’t “lose some traffic in the short term” but will lose a lot of traffic in the short term and long term.
“Deutsche Telekom” will no longer participate in an active role in the Quaero project. They will only be an observer (whatever the role of an observer may be…).
I think Quaero will die a slow and quiet death.
This should be interesting to follow.
The AP recently offered up a similar complaint.
While the reaction in the blogosphere is predictable, the reality for news producers is that the value of their output is potentially diminished when someone wants to create a “news site” without paying for access to the content.
I’d be interested in seeing what the economic realities are and I don’t think it is as cut and dried as Ken or John make it out to be. There is some real risk here to Google.
As I note in my posting today, this is a bit like suing dentists and doctors for putting newspapers in their waiting rooms. Since when does a news organization sue someone for bringing them viewers?
The world of News 2.0 knows that Google drives traffic to their sites. If the old world of news doesn’t figure this out, the market will figure it out for them.
One phrase: Robots.txt.
Robots.txt is an all-or-nothing proposition, and, unfortunately, the only proposition the aggregators are offering to content providers. There’s no way to say “you can display the title and first line, but only index the rest”.
Since the main content of most news stories is in the first few sentences, it’s too easy to spoil the whole story in the “summary” and make clicking through unnecessary. I think what the news providers are saying is that there’s no agreed limit on how much the summarizers can display.
An interesting discussion, but what does this have to do with Quaero? Ahh right, the people behind Quaero happen to live on the same continent as the European publishers. By that logic I can assume US publishers speak for Google, right? Actually, not everyone in Europe is in agreement on everything (kinda like how the US is a big diverse place with differing opinions).
John, I’m a fan of the blog, but tighten up the logic please.
The view you are expressing about Quaero is one I shared in observation of the U.S. search community’s response in an interview with The Economist recently (for March). It will be interesting to see what they publish.
Ed, I hear you, but my sentiment was driven by the fact that Quaero is run by two huge European publishers, Thomson and Bertelsmann, and I don’t think it’s going out on a limb to say they are somewhat sympathetic to the WPA cause.
Putting aside Quaero, of course the real issue as noted by previous comments is the perceived threat search engines pose to traditional media in general.
I actually think what is happening with music is indeed instructive here. Basically traditional media makes its money by packaging content — and generally packaged for one particular way of consuming the content (e.g., reading a newspaper from front to back, or consuming an entire music album). Search comes at the content from a non-traditional direction such that the traditional packaging cannot always effectively capture value for the media company packager. This “threat” to the traditional media package is no different than the minor uproar over the perceived threat of “deep linking” to web sites of a few years ago.
The obvious solution is that the content must be made as much as possible into self-contained “objects”. In other words, the packaging must be more distributed into, and tightly bound with, the individual content — whether that means subscription by content object, or advertising that is tightly bound to the content object. It also means the object should include enough meta-information, such as a compelling summary, author information, etc., that it outcompetes snippets generated by search engines. That’s certainly the way my business has organized its content.
Long term, I don’t see any other viable approach with regard to any type of content and media businesses. And as in the case of the music industry, the faster people get that, the less the pain and suffering . . .
John I’m surprised you dismiss the EEC publishers so quickly given your own sympathy with the idea that quality publishing deserves better treatment than it gets now.
All of us feeding at the adsense trough should wonder if we simply concede too much of the money to Google.
Robots.txt is not an all-or-nothing proposition. Just look at afp.com:
Disallow the directories you don’t want Google to scan. Dump the content you want indexed into your other directories. In reality, I suspect that the problem these publishers are experiencing are more off-line than on-line. If not for a declining number of paid print subscribers, we wouldn’t even be having this discussion.
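To illustrate the point about selective blocking: the robots.txt format lets a publisher exclude specific directories while leaving the rest crawlable. A minimal sketch (the directory names here are hypothetical, not taken from any real site):

```
# Hypothetical robots.txt for a news site:
# block crawlers from full-article directories
# while leaving summary/headline pages crawlable.
User-agent: *
Disallow: /fulltext/
Disallow: /archive/
```

Any path not matched by a `Disallow` line remains open to crawlers, so a publisher can steer indexing toward the pages it wants surfaced.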
Jojo> The fact that “Deutsche Telekom” decided to be only an observer for the time being is rather salutary for the “Quaero” project itself, and my comment applies equally to “France Telecom”, “Thomson” and “Bertelsmann”.
Actually, 98% of “Quaero” inventiveness will come from the myriad of great startups and university labs selected for this project.
But remember Google w/o Adsense would be only a nice algorithmic performance.
It’s hard to have it both ways: “Government, please leave us alone to do as we like,” and “Government, please set some guidelines for us since we cannot do it ourselves.” They seem to be saying that as long as they can do what they are doing in China (and others are as well), they will do it. The only option they see is some kind of government intervention. It may be that the government needs to do something, but I would rather see Google live by their “do no evil,” regardless of our government’s policies (or the lack thereof).
Oops, the above post was meant to go in another thread. Not sure what happened. Sorry.
Of course you can stop Google’s spider, but I think that’s missing the Europeans’ point here. Google’s business would be minimal and the Europeans’ and other content providers’ businesses would be much stronger if Google had to get consent to use proprietary content in the first place – I don’t think the big, branded content providers would have given it to a start-up with no traffic but an apparently nice search alogorithm. While I’m sure most people reading this blog think having to opt in to allow spidering is crazy, this is the battle Google is now fighting with book publishers (and there is a reason you see a copyright at the bottom of every web page). Now that Google is so big and powerful, it’s difficult for content providers to opt out of having their content taken. So was it ok for Google to take billions of dollars of value from content providers without permission before those providers understood a new technology and its implications? Should companies have to opt out of having their proprietary information used for profit by a new technology they might not even know about or should they have to opt in? I think these are pretty interesting questions that people assume are answered but may not be. And yes, I am the same person who wrote the first comment to this post but no, I am not and never have been in old media (I’ve been working in the Internet space for 7 years).
It’s the game of chicken. Each publisher can independently disallow Google, but unless they all do it together it won’t impact Google (or the other aggregators). Hence you see them banding together.
Ah, the game of chicken. The problem that will arise if the publishers follow that path is that the cost/benefit will not be the same for each publisher. Some publishers will be more highly dependent on traffic from Google than others. Some publishers will have monetized their traffic more than others. If you’re not getting much traffic from Google, or you haven’t monetized your traffic stream, then throwing up a robots.txt file is a painless sacrifice. But for another publisher, one drawing substantial revenue from its web site, banning Google and seeing its traffic drop by 60% will be an unpalatable sacrifice. And it only takes a handful of publishers to bolt for the whole boycott to come crashing down.
The AP built Yahoo. If Yahoo didn’t get access to the AP (and Reuters) news… there wouldn’t be much to Yahoo today.
So since the newspapers – which run the AP – are scared of Yahoo and Google News… they have no one to blame but themselves.
Deutsche Telekom is apparently doing its own search thing. They are launching a portal of their own (www.suchen.de). In addition, German yellow pages publishers who are connected with DT are building a search index: