Google recently made a major change in its algorithms in order to improve the quality of search results. It affects about 12% of searches, making it much more significant than its frequent smaller tweaks.
It’s a timely development for those who claim that Google is losing the war against spam.
Google’s Amit Singhal and Matt Cutts announced that the update is intended to penalize “sites which are low-value add for users, copy content from other websites or sites that are just not very useful.” Conversely, it aims to reward sites with “information such as research, in-depth reports, thoughtful analysis and so on.”
The general consensus is that the move penalizes content farms, large websites that churn out huge volumes of content, often of low quality and written to attract maximum search engine hits and therefore revenue.
Many consider Demand Media, owner of eHow, to be the classic content farm, although the company tells a different story. Interestingly, Demand Media’s properties actually seem to have benefited from the update, while article-marketing sites such as Ezinearticles have seen major losses in the rankings, prompting the latter’s CEO to make some big changes to editorial standards.
But How Can a Computer Assess “Quality”?
The most interesting aspect of the algorithm change is that a computer is trying to evaluate a very subjective question: how “good” is a particular piece of writing?
Stopping sites that “copy content from other websites” is simple enough: Google goes after scrapers and autoblogs and makes sure they don’t come up for search queries.
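Detecting copied content is, at its simplest, a text-similarity problem. Here is a minimal sketch of one standard technique, comparing documents by their overlapping word n-grams (“shingles”) with Jaccard similarity. The shingle size and the example strings are my own illustrative choices, not anything Google has disclosed:

```python
def shingles(text, n=3):
    """Return the set of overlapping n-word shingles in a text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a, b):
    """Jaccard similarity: shared shingles / total distinct shingles."""
    return len(a & b) / len(a | b) if a | b else 0.0

original = "Google updated its ranking algorithm to reward original reporting"
scraped = "Google updated its ranking algorithm to reward original reporting today"
unrelated = "The quick brown fox jumps over the lazy dog near the river"

# A near-copy scores close to 1; unrelated text scores near 0.
print(jaccard(shingles(original), shingles(scraped)))
print(jaccard(shingles(original), shingles(unrelated)))
```

A scraper that appends a few words to stolen text still shares almost all of its shingles with the source, which is what makes this family of methods effective against autoblogs.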
But assessing the quality of the content and the thoughtfulness of the analysis sure seems like something that would need to be done by a human.
Although Google tightly guards the secrets of its algorithms, it’s clear they can do some amazing things. Overall, the algorithms rely heavily on inbound links to measure authority, but this update is more concerned with on-page factors.
Google can look at keyword density to see if a site is overusing keywords in an attempt to manipulate the rankings. It also might take into account the time users spend on the page and the bounce rate (the percentage of visitors who leave after viewing only one page). The algorithm can also evaluate outgoing links to other sites to see whether the writer appears to be backing up his or her writing with authoritative citations.
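Two of those signals have simple textbook definitions, sketched below. To be clear, these are the generic formulas, not Google’s actual (secret) measurements, and the example numbers are invented:

```python
def keyword_density(text, keyword):
    """Occurrences of a keyword as a fraction of all words on the page."""
    words = text.lower().split()
    return words.count(keyword.lower()) / len(words) if words else 0.0

def bounce_rate(single_page_visits, total_visits):
    """Share of visits that left after viewing just one page."""
    return single_page_visits / total_visits if total_visits else 0.0

stuffed = "cheap shoes buy cheap shoes online cheap shoes deals cheap shoes"
print(f"{keyword_density(stuffed, 'cheap'):.2f}")  # 0.36 -- suspiciously high
print(f"{bounce_rate(850, 1000):.0%}")             # 85%
```

A density anywhere near 36% is a strong hint of keyword stuffing; well-written prose rarely repeats a single term that often.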
While none of these factors by itself can assess a website’s quality, just the right combination of thousands of them might be able to get the job done.
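One way to picture that combination is a weighted model that squashes many weak signals into a single score, so that no single factor decides the outcome. Everything below is a toy illustration: the features, weights, and logistic form are my assumptions, and the real ranking system is vastly larger and undisclosed:

```python
import math

def quality_score(features, weights, bias=0.0):
    """Logistic combination of weighted signals into a 0..1 score."""
    z = bias + sum(weights[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))

# Invented weights: stuffing and quick exits hurt, citations help.
weights = {
    "keyword_density": -6.0,
    "bounce_rate": -2.0,
    "outbound_citations": 0.5,
}

thin_page = {"keyword_density": 0.35, "bounce_rate": 0.9, "outbound_citations": 0}
deep_report = {"keyword_density": 0.02, "bounce_rate": 0.3, "outbound_citations": 6}

print(round(quality_score(thin_page, weights, bias=1.0), 2))    # 0.05
print(round(quality_score(deep_report, weights, bias=1.0), 2))  # 0.96
```

The point of the sketch is the shape, not the numbers: each feature nudges the score, and only the aggregate separates the thin page from the in-depth report.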
But ironically, Google actually seems to recognize the need for human intervention, which seems to go against its mission. First, the company has revealed that employees change ratings in the index in order to improve search quality.
Also, as I mentioned in my post about the problem with online tutorials, Google just unveiled a new Chrome extension that lets surfers block sites they deem low-quality from their own search results. The extension sends that information to Google, which can make changes based on user behavior. (This data was not used in the recent update, however.)
What do you think about all of this?