Big Data: Google’s #1 Weapon against Web Spam


Larry Page and Sergey Brin started Google in 1998. They were pioneers in big data before it was even a thing. Google’s algorithms have changed over the years, but big data has always played a key role. Today, the search engine giant is using big data to combat the growing threat of web spam.

Evolution of Google’s Application of Big Data

In the late 1990s, most search engines determined rankings by keyword density and paid endorsements, and the quality of their results was very poor. Brin and Page invented the PageRank algorithm in 1998 to provide higher quality search results for their users. SEO by the Sea has a very insightful article on the role big data plays in Google's search algorithms; the post is now a few years old, but its observations still apply.

The PageRank algorithm evaluated chains of hyperlinks between websites to determine how each page should rank. Computing those scores meant crawling indexable pages across the entire Internet, which required some of the most sophisticated big data tools of the time.
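To make the idea concrete, here is a minimal sketch of PageRank's power-iteration form in Python. The toy link graph, damping factor, and iteration count are illustrative assumptions, not Google's production values.

```python
# Minimal PageRank sketch (power iteration). Illustrative only; the
# damping factor and toy graph below are assumptions, not Google's values.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links out to."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    rank = {page: 1.0 / len(pages) for page in pages}

    for _ in range(iterations):
        # Every page keeps a small base score...
        new_rank = {page: (1.0 - damping) / len(pages) for page in pages}
        # ...and passes the rest of its current score along its outlinks.
        for page, targets in links.items():
            if targets:  # skip dangling pages with no outlinks
                share = damping * rank[page] / len(targets)
                for target in targets:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Toy web: c.com earns the highest score because well-linked pages link to it.
web = {
    "a.com": ["b.com", "c.com"],
    "b.com": ["c.com"],
    "c.com": ["a.com"],
    "d.com": ["c.com"],
}
print(pagerank(web))
```

A page's score grows when the pages linking to it are themselves well linked, which is why chains of endorsements matter more than raw link counts.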

The new model was a drastic improvement over search engines that relied solely on keyword density, and it helped make Google the most popular search engine in the world. However, it also created a new problem: link spam.

A specific group of spammers known as “black hat SEOs” caught on to the change. With earlier search engines, their strategy for boosting rankings was to fill their webpages with the keywords they wanted to target. To rank in Google, they instead generated spammy links from many different web pages across the Internet. Google, for its part, uses Hadoop-based tools to extract data on hundreds of millions of websites and assign website rankings in near real-time.
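The scale of that link analysis is what makes it a big data problem. The sketch below simulates, in plain Python, the map and reduce shape such a job takes; the crawl records are invented for illustration, and a real Hadoop deployment would run the two phases as separate distributed steps over the full crawl.

```python
from collections import defaultdict

# Invented crawl records: (source page, pages it links out to).
crawl = [
    ("a.com/post", ["b.com", "c.com"]),
    ("b.com/home", ["c.com"]),
    ("spam1.net",  ["c.com"]),
    ("spam2.net",  ["c.com"]),
]

# Map phase: emit one (target, source) pair for every hyperlink found.
pairs = [(target, source) for source, targets in crawl for target in targets]

# Shuffle + reduce phase: group the pairs by target page and count the
# distinct sources linking to it.
inbound = defaultdict(set)
for target, source in pairs:
    inbound[target].add(source)

for target, sources in sorted(inbound.items()):
    print(target, "<-", len(sources), "linking pages")
```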

Matt Cutts, the head of the web spam team at Google, has pledged to find new ways to fight spam. He stated that the company would manually penalize sites caught selling links or otherwise violating its content guidelines. Google also updated its algorithms to identify websites with large numbers of unnatural links, and it regularly penalizes those sites to discourage spam.
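Google has never published those classifiers, but one signal often discussed in the SEO community is anchor-text uniformity: links earned organically tend to use varied anchor text, while purchased links repeat the same commercial phrase. The function below is a hypothetical illustration of that idea in Python; the threshold and minimum sample size are invented for the sketch.

```python
from collections import Counter

def looks_unnatural(anchors, max_share=0.6, min_samples=10):
    """Flag a link profile where one exact anchor phrase dominates.

    anchors: anchor-text strings from pages linking to one site.
    """
    if len(anchors) < min_samples:
        return False  # too few links to judge either way
    top_count = Counter(anchors).most_common(1)[0][1]
    return top_count / len(anchors) > max_share

# A purchased-looking profile: one commercial phrase repeated verbatim.
bought = ["buy cheap widgets online"] * 9 + ["widgets"]
# An organic-looking profile: varied brand and navigational anchors.
earned = ["Acme", "acme.com", "this post", "Acme's blog", "source",
          "a good writeup", "Acme Inc", "here", "widget guide", "Acme"]

print(looks_unnatural(bought))  # True
print(looks_unnatural(earned))  # False
```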

Moz provides a detailed overview of the algorithms Google uses to fight spam.

“[Google engineers] laud their ability to quickly roll out new spam data as they receive it so that within minutes of new spam being created, they can identify it. What does this say about what we do? Think about the recent hit on link networks. Google can quickly discover and identify spam, and as of the Penguin update, they can roll it out globally in a hurry. And in case you didn’t think I was on to something here, the next spam fighting method says it all.”

Most big data systems require servers to assimilate data from numerous sources, and Google is no exception. The company aggregates web spam data from its own search bots and from user reports. Matt Cutts has said that much of Google's data on web spam comes from user reports, and he urges more people to file them.

“There are several ways that we intend to use the data. Our current algorithm detected the paid links above just fine, but these outside reports are a great way to measure (and then improve) the precision and recall of our existing algorithms on independent data. Next, the reports help build datasets for future algorithms. So the data help us build the next generation of algorithms to improve quality. It also lets us work on new tools and techniques to improve how we detect paid links. Finally, we can investigate and take direct action on many reports that we receive,” Cutts writes.
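Cutts's point about precision and recall is easy to make concrete. Treating independently reported URLs as ground truth (a simplification), precision is the share of algorithm-flagged URLs that really were spam, and recall is the share of reported spam the algorithm caught. The URL sets below are invented for illustration.

```python
# Invented example: score an existing detector against independent reports.
algorithm_flagged = {"spam1.net", "spam2.net", "honest.org"}
user_reported    = {"spam1.net", "spam2.net", "spam3.biz"}

true_positives = algorithm_flagged & user_reported

precision = len(true_positives) / len(algorithm_flagged)  # flagged and real
recall    = len(true_positives) / len(user_reported)      # real and caught

print(f"precision={precision:.2f}  recall={recall:.2f}")  # 0.67 / 0.67
```

Low precision would mean innocent sites get penalized; low recall would mean spam slips through, and, as Cutts notes, the reports then become datasets for building the next generation of algorithms.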

SEO Reseller states that this is changing the future of search engine optimization. Brands must understand that it will only get harder to game Google's algorithms, which means they need to invest in white hat optimization strategies.

New Big Data Solutions Will Help Google Eliminate More Web Spam

Targeting spam links is one of the biggest challenges Matt Cutts and his colleagues face at Google. Despite the growing prevalence of web spam, they have made significant progress in recent years because they use some of the most sophisticated big data tools in the world.

They will have an easier time as new Hadoop-based algorithms help them identify web spam and penalize sites accordingly.
