Mathew Ingram at The Nieman Journalism Lab wrote a post today entitled “Google helps newspapers — period”, intended at least in part to rebut two of my recent posts on the subject: “Is Differentiated Content Enough To Save Newspapers?” and my earlier “Yes, Virginia, Google Does Devalue Everything It Touches”.
He cites a proposal I made regarding the use of robots.txt. I’ll excerpt that proposal fully here:
I’m curious what would happen if a critical mass of publishers used robots.txt to stop being crawled–and publicly announced that they were doing so. In the short term, they’d lose a significant amount of traffic–and that short-term hit in the current economic climate might amount to fiscal suicide. But in the long term it may be the only way for publishers to prove their own brand value, something they may have to do in order to bring Google and their other bêtes noires to the negotiating table.
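For concreteness, the opt-out mechanism here is the standard robots exclusion protocol. A publisher that wanted to stop Google from crawling its site entirely would serve a robots.txt file at its domain root along these lines (Googlebot is Google’s crawler; this two-line form is standard syntax):

    User-agent: Googlebot
    Disallow: /

A less drastic variant could disallow only particular sections of the site, but a publicly announced full block is what the proposal above contemplates.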
Ingram responds:
If there were a finite market for news and information, then the search engine could be accused of devaluing it — but that’s not how information works. In fact, oceans of interchangeable news make certain kinds of content even more valuable, not less.
I have to admit that I don’t follow this argument. First, I’m not even sure what it means for the market for news and information not to be finite. Attention is certainly finite. I’ll assume that what Ingram meant is that the market isn’t fully tapped, and hence that it is possible to add to the total value, rather than simply to redistribute it.
He then says:
if a newspaper or media outlet finds its business model severely impacted by the fact that Google excerpts a single paragraph of a news story, then it deserves to fail
But I’m not arguing that Google is doing harm to the news sites through the use of excerpts–perhaps this argument is directed at someone else. Indeed, if newspapers wanted out, they could get out by using robots.txt. Instead, they not only allow Google in, but invest in SEO to get excerpted as often as possible. The current business model for online newspapers depends heavily on Google as a source of traffic.
But that’s also the problem. Here I disagree with Ingram and echo what Nick Carr has said: Google has become a powerful middleman for online content, much as Wal-Mart is for physical goods. That’s great if you’re a consumer who likes a year’s supply of pickles for less than $3; not so great if you’re the premium pickle vendor caught in a catch-22: sell on Wal-Mart’s terms, or forgo the nation’s leading grocery seller as a distribution channel. You can read about the Wal-Mart / Vlasic story here. Is the physical goods market a finite market in a way that the market for news and information is not?
Ingram continues:
if you are adding more value through context and analysis, then there are many more ways to monetize that than by slapping simple banner or text ads on it — which seems to be the only thing that Daniel and others can imagine newspapers doing.
Actually, my imagination is hardly limited to ad-supported models. As regular readers here know, I’d like to live in a world where people pay for digital content just as they pay for other goods and services they value. But I live in the real world, in which I don’t see any viable alternatives to the ad-supported model showing up soon.
Ingram concludes:
But if you are actually adding value, wouldn’t you like as many people to find out about it as possible? Cutting yourself off from the world’s largest search engine is like cutting off your nose to spite your face.
That argument strikes me as equivalent to arguing that, if you’re trying to sell a house, you should price it at a dollar to attract as many buyers as possible. Or that suppliers should do whatever it takes for Wal-Mart to sell their goods at high volume. Not all supplier relationships lead to sustainable business models.
The problem, as I see it, is that when readers “find out about” a news article through Google, they read it in a hit-and-run fashion that doesn’t give the newspaper a chance to build a relationship with them. From the reader’s perspective, the article may as well have been published by Google. That’s great for Google’s brand equity, but not so great for the newspaper’s. I’m realistic: no newspaper can afford to cut off Google, or search engines generally, on its own. It’s a prisoner’s dilemma, as the payoff sketch below illustrates. But I am curious to see what would happen if a critical mass of publishers did so in concert. Yes, that would cost both them and Google money. But it’s not irrational as a negotiating tactic if it leads Google to consider a distribution of rents that is still worth Google’s while but more favorable to publishers than the present one.
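To make the prisoner’s-dilemma framing concrete, here is a toy payoff matrix for a single publisher deciding whether to block Google, given what its peers do. The outcomes are illustrative assumptions on my part, not estimates:

                          Peers stay crawled           Peers block Google
    You stay crawled      Status quo: traffic, but     You capture peers’ search
                          thin Google-dependent        traffic (individually best)
                          rents
    You block Google      You lose search traffic      Critical mass: collective
                          alone (individually worst)   leverage to renegotiate

Blocking alone is dominated, so no rational publisher defects unilaterally; that is exactly why the move only makes sense in concert.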
Ingram works in the newspaper industry, and he’s clearly put a lot of thought into this topic. Nonetheless, I remain unconvinced by his arguments. Perhaps we can find our way to common ground. Whoever is right will hopefully convince the other that he is wrong!