For starters, I want to point out that the so-called over-optimization penalty was mentioned only half-heartedly and off the cuff by Matt Cutts - the mention can be heard clearly here; you tell me what you think.
Now, I'm calling on all the logical, mathematical SEOs out there - or just the logical, programming-minded SEO web developer types - because something doesn't sit well with me about this so-called over-optimization penalty. Can anyone else see how strange it would be for Google to demote rankings based on:
- keyword density?
- similar anchor text?
- unnatural link building?
The news has been spreading for YEARS that Google is the tip-top of search engines, but now, suddenly, all those GREAT results prior to this "announcement" were poor, and Google needs to "re-evaluate" the content it's been giving us?
Like here, there is too much speculation:
- Keyword/phrase overuse
- Too many redirects (?)
- Same/similar anchor text in backlinks
- Same Niche same server (?)
- Doorway/thin affiliate (?)
- Link schemes/cheap backlink packages
- Anchor texts that are too similar (?)
- Thin or near-duplicated content (?)
- Inorganic and/or unnatural /paid inbound links
- High keyword densities
(My question marks)
Really? Now too many redirects, which help users and Google alike, are going to be added to the over-optimization penalty list? Really... Duplicate content? Really? Isn't that a problem for Panda?
I don't believe there is one. I believe the "Semantic Web" (as covered by the WSJ) may be something for us to look forward to more and more, but if you see your website suddenly drop in the search results, I'd attribute the drop to any of the same reasons websites drop in the SERPs all the time.
Lastly, I'll suppose that if there actually is an over-optimization penalty, it's near the bottom of the list in importance and probably affects less than 0.01% of search results - I'll bet'cha!