Is Differentiated Content Enough To Save Newspapers?

The Guardian headline sums it up: “Big newspaper sites ‘erode value of news’, says Sly Bailey”. Sly Bailey is the chief executive of Trinity Mirror, one of the UK’s largest newspaper publishers. Here’s what she has to say:

A consumer is now as likely to discover newspaper content on Google, visit our sites, then flit away before even discovering that it was the Daily Mirror or the Telegraph that created the content in the first place.

Or worse, they may visit an aggregator like Google News, browse a digital deli of expensive-to-produce news from around the world, and then click on an ad served up to them by Google. For which we get no return. By the absurd relentless chasing of unique user figures we are flag-waving our way out of business.

So far, so good: she’s making the devaluation argument we’ve hopefully all seen by now, and one I agree with.

It’s where Bailey goes next that intrigues me:

She called for a change to the accepted norms, arguing that publishers could “reverse the erosion of value in news content” by rejecting a relentless quest for high user numbers, in favour of a move away from “generalised packages of news” to instead concentrate on content with “unique and intrinsic value”.

On one hand, of course it’s necessary for publishers to offer unique value, regardless of how they can monetize it, or else they commoditize themselves by default. On the other hand, it may not be sufficient. Without effective monetization, publishers create value but don’t capture it. That’s fine if you are Wikipedia (which certainly offers content with “unique and intrinsic value”) and manage to get by on donations. But it doesn’t work so well if you are an online newspaper whose efforts serve more to line Google’s pockets than your own.

Let me make this last point more concrete. Say that you’re an online newspaper, and you invest in developing unique content. Google will happily index your content (assuming you allow it to), and thus you create value on the web. You can even monetize some of that value, by delivering ads to people who visit your site. But Google delivers at least as many ads to those same people, with much less effort. Moreover, as long as Google is the gateway to your content (which is the status quo), you’re unlikely to change that distribution of rents, or to build reader loyalty.

What you really want as an online publisher is for people to seek out your content, not just to stumble into it through search engines and aggregators. I’m curious what would happen if a critical mass of publishers used robots.txt to stop being crawled, and publicly announced that they were doing so. In the short term, they’d lose a significant amount of traffic, and in the current economic climate that short-term hit might amount to fiscal suicide. But in the long term it may be the only way for publishers to prove their own brand value, something they may have to do in order to bring Google and their other bêtes noires to the negotiating table.
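For what it’s worth, the mechanics of opting out are trivial; the collective will is the hard part. A minimal sketch of the robots.txt a publisher would serve from its site root, assuming Google’s documented Googlebot crawler token (other aggregators’ crawlers would need analogous records), looks like this:

# Block Google's crawler from the entire site.
User-agent: Googlebot
Disallow: /

Because a well-behaved crawler only falls back to a generic “User-agent: *” record when no record names it specifically, this keeps Googlebot out while leaving every other crawler unrestricted, so a publisher could boycott Google alone without going dark on the rest of the web.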

There are alternative strategies, such as requiring registration or putting up paywalls. But those have the disadvantage of breaking the broader link economy (though I may be using the phrase slightly differently from Jeff Jarvis), a relationship that is, on the whole, quite different from the one between publishers and search engines or aggregators. I at least believe that The Guardian obtains more brand credit from someone clicking through this post than from someone seeing it in a sea of search results or aggregated news articles. I recognize that the distinction isn’t always black and white; for example, aggregators like Techmeme concentrate heavily on a small set of sites that readers recognize over time. In general, however, there is a difference between following a deliberate citation and clicking through a link produced without any human intentionality.

I realize I may come across like a romantic, emphasizing the human element, but the distinction I’m after isn’t sentimental. Rather, it’s the idea that the long-term value of a publisher depends on readers knowing and caring who the publisher is. Publishers need to break through the commodified experience of search engines, which by design dilutes the differentiation among brands. In any case, the current path for many publishers looks like tragedy without the romance. Those that aim for long-term survival will have to take some chances to buck this inertia.

