There has been a lot of press lately about Google and its ongoing effort to “improve” the quality of search results. A few weeks ago the NY Times published an article exposing some blatant black hat techniques JC Penney used to get top rankings (read the article here). What made a big impression on me was that Google did not discover this with its “sophisticated algorithms”; it was discovered by a journalist. Granted, once this was brought to Google’s attention, JC Penney was knocked back several pages in the search results.
In other news, Google announced a major change to its algorithm designed to remove “low value sites”; specifically, it appears to be aimed at sites that automatically scrape content to generate thousands of text-rich pages that are all either duplicated or gibberish. These sites tended to get high rankings in the past. The situation is still unfolding, but unfortunately, it appears that some “innocent sites” got punished.
Both cases demonstrate that Google is not perfect. Google strongly prefers refining or changing its ranking algorithms to catch the perpetrators rather than manually reviewing sites. This is understandable: there are billions of web pages out there, and Google could not afford to hire hundreds of thousands of people to do that police work. Google also knows that an algorithm change is not going to be perfect. There will be cheating sites that it still does not catch, and there will be honest sites that get unfairly punished. I am not sure what numbers are acceptable to Google, but I don’t think either one will ever get to zero.
Bottom line: some innocent bystanders are going to suffer, some cheaters are still going to get away with cheating, but for most of us the changes will be fair. Just hope you aren’t one of the innocent bystanders.