Mathew Ingram: Google Helps Newspapers

Daniel Tunkelang

Mathew Ingram at The Nieman Journalism Lab wrote a post today entitled “Google helps newspapers — period”, intended at least in part to rebut two of my recent posts on the subject: “Is Differentiated Content Enough To Save Newspapers?” and my earlier “Yes, Virginia, Google Does Devalue Everything It Touches”.

He cites a proposal I made regarding the use of robots.txt. I’ll excerpt that proposal fully here:

I’m curious what would happen if a critical mass of publishers used robots.txt to stop being crawled–and publicly announced that they were doing so. In the short term, they’d lose a significant amount of traffic–and that short-term hit in the current economic climate might amount to fiscal suicide. But in the long term it may be the only way for publishers to prove their own brand value, something they may have to do in order to bring Google and their other bêtes noires to the negotiating table.
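
For concreteness, here is a minimal sketch of what that opt-out looks like mechanically, using Python’s standard urllib.robotparser. The rules and the publisher domain below are hypothetical, not something any particular paper has published; the point is simply that excluding one crawler is a one-file change that well-behaved bots honor.

from urllib import robotparser

# Hypothetical robots.txt a publisher might serve to shut out Google's crawler
# while leaving the door open to everyone else.
OPT_OUT_RULES = """\
User-agent: Googlebot
Disallow: /

User-agent: *
Allow: /
"""

parser = robotparser.RobotFileParser()
parser.parse(OPT_OUT_RULES.splitlines())

article = "https://example-newspaper.com/2009/some-story"
print(parser.can_fetch("Googlebot", article))         # False: Google is excluded
print(parser.can_fetch("SomeOtherCrawler", article))  # True: everyone else may crawl

Of course, robots.txt is purely advisory: it keeps out well-behaved crawlers only because they choose to honor it, which is part of why the move makes sense as a publicly announced, collective signal rather than as a technical barrier.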

Ingram responds:

If there were a finite market for news and information, then the search engine could be accused of devaluing it — but that’s not how information works. In fact, oceans of interchangeable news make certain kinds of content even more valuable, not less.

I have to admit that I don’t follow this argument. First, I’m not even sure what it means for the market for news and information not to be finite. Attention is certainly finite. I’ll assume that what Ingram meant is that the market isn’t fully tapped, and hence that it is possible to add to the total value, rather than simply to redistribute it.

He then says:

if a newspaper or media outlet finds its business model severely impacted by the fact that Google excerpts a single paragraph of a news story, then it deserves to fail

But I’m not arguing that Google is doing harm to the news sites through the use of excerpts–perhaps this argument is directed at someone else. Indeed, if newspapers wanted out, they could get out by using robots.txt. Instead, they not only allow Google in, but invest in SEO to get excerpted as often as possible. The current business model for online newspapers depends heavily on Google as a source of traffic.

But that’s also the problem. Here I disagree with Ingram and echo what Nick Carr has said: Google has become a powerful middleman for online content, much like Wal-Mart for physical goods. That’s great if you’re a consumer who likes a year’s supply of pickles for less than $3; not so great if you’re the premium pickle vendor caught in a catch-22: sell on Wal-Mart’s terms or forgo the nation’s leading grocery seller as a distribution channel. You can read about the Wal-Mart / Vlasic story here. Is the physical goods market a finite market in a way that the market for news and information is not?

Ingram continues:

if you are adding more value through context and analysis, then there are many more ways to monetize that than by slapping simple banner or text ads on it — which seems to be the only thing that Daniel and others can imagine newspapers doing.

Actually, my imagination is hardly limited to ad-supported models. As regular readers here know, I’d like to live in a world where people pay for digital content just as they pay for other goods and services they value. But I live in the real world, in which I don’t see any viable alternatives to the ad-supported model showing up soon.

Ingram concludes:

But if you are actually adding value, wouldn’t you like as many people to find out about it as possible? Cutting yourself off from the world’s largest search engine is like cutting off your nose to spite your face.

That argument strikes me as equivalent to saying that, if you’re trying to sell a house, you should price it at a dollar to attract as many buyers as possible. Or that suppliers should do whatever is necessary for Wal-Mart to sell their goods at high volume. Not all supplier relationships lead to sustainable business models.

The problem, as I see it, is that when readers “find out about” a news article through Google, they read it in a hit-and-run fashion that doesn’t give the newspaper a chance to build a relationship with them. From the reader’s perspective, the article may as well be published by Google. That’s great for Google’s brand equity, but not so great for the newspaper’s. I’m realistic that no newspaper can afford to individually cut off Google or search engines in general. It’s a prisoner’s dilemma. But I am curious to see what would happen if a critical mass of publishers did so in concert. Yes, that would cost them and Google money. But it’s not irrational as a negotiating tactic if it leads Google to consider a distribution of rents that is still worth Google’s while but more favorable to publishers than the present one.

Ingram works in the newspaper industry, and he’s clearly put a lot of thought into this topic. Nonetheless, I remain unconvinced by his arguments. Perhaps we can find our way to a common ground. Whoever is right will hopefully convince the other that he is wrong!

Link to original post
