Part of our philosophy behind optimizing websites for the search engines is based on the fact that higher search rankings generally lead to higher revenue, provided your other site elements are done right.
Good SEO most certainly leads to higher rankings. How high depends on the quality of your content, your site's usability, and the industry and keywords you're optimizing for.
Much of the craft of ranking high in the search engines stems from a close watch of what’s going on with Google and the algorithm it uses to crawl and rank sites.
As you may know, Google reportedly weighs over 200 ranking factors.
One of those is content, and as I can tell you from personal experience, content is a central pillar of successfully using the Internet to drive leads and revenue to your business.
But some ‘low-quality’ sites are seeing a significant decline in their traffic.

The main reason, you ask?
Some interviews with Google spam chief Matt Cutts and Google researcher Amit Singhal offer insights into Google’s Panda update. Continue reading for a quick summary.
In late 2009, Google’s Caffeine update improved the search engine’s indexing process. This provided Google with lots of content – some good, some not so good.
Cutts comments in a Wired Magazine interview that many sites were simply producing content from the perspective of “What’s the bare minimum that I can do that’s not spam?”
Without a consistent definition of what makes a site low-quality, many webmasters and SEOs believe Google is relying more on human reviewers. Cutts and Singhal say they’re working to develop an automated system to screen for this.
Of course, Google doesn’t let the entire cat out of the bag regarding its algorithm, and understandably so.
But from these interviews, we can discern what Google is looking for and penalizing sites for. In particular, Google is asking human reviewers a series of questions to determine whether it ‘trusts’ a site. And that seems to be the big key: does Google trust your site?
Large domains like CNN.com or Walmart.com generally are trusted.
But smaller sites are looked at more carefully; those whose content seems questionable or untrustworthy to a third-party reviewer are seeing declines in rankings.
The takeaway here?
Be sure you take steps to make your site trustworthy. When it comes to content, publish original material online; don’t take other people’s work and rehash it. Many of the sites that saw drops in rankings and traffic (one site that lost ground saw 10% of its revenue disappear and subsequently had to reduce staff) were engaging in a practice called ‘content scrubbing’.
In essence, content scrubbing is a situation where a site’s content is too similar to (or not unique enough from) content on another site.
It’s not that anyone did anything technically wrong but rather a case where some sites have seen a sudden drop off in rankings and traffic.
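Google hasn’t published how it detects near-duplicate content, but to get an intuition for what “too similar” might mean, here’s a simple, hypothetical sketch in Python using Jaccard similarity over word shingles (overlapping n-word sequences) — one common textbook approach, not Google’s actual method:

```python
# Hypothetical sketch: estimating how similar two texts are.
# This is NOT Google's algorithm, just a common near-duplicate
# detection technique (Jaccard similarity over word shingles).

def shingles(text: str, n: int = 3) -> set:
    """Return the set of overlapping n-word shingles in a text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard_similarity(a: str, b: str, n: int = 3) -> float:
    """Jaccard similarity of two texts' shingle sets.

    0.0 means no shingles in common; 1.0 means identical shingle sets.
    """
    sa, sb = shingles(a, n), shingles(b, n)
    if not sa and not sb:
        return 1.0
    return len(sa & sb) / len(sa | sb)

original = "Google's Panda update targets sites with thin or duplicated content"
rehash   = "Google's Panda update targets sites with copied or duplicated content"

# A lightly reworded "rehash" still shares most of its shingles
# with the original, so the similarity score stays high.
print(round(jaccard_similarity(original, rehash), 2))
```

A score near 1.0 would flag the second text as a rehash of the first; genuinely original writing on the same topic scores much lower, because it shares few multi-word phrases.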
Did your website experience any sudden drop like this recently?
If so, drop us a quick comment and tell us about it. If you’re gathering content from various sources and not repurposing it enough, it’s possible this is the reason.