
How To Avoid an On-Page SEO Over-Optimization Penalty [Post-Penguin Era]


Were you hit by any of Google's Panda or Penguin updates? Are you struggling to recover your rankings, or have you made it through so far but are still worried about the next update? Whether you consider yourself an SEO expert or just a weekend warrior, it'd be wise to re-evaluate your sites now and make sure nothing on them could be read by the search engines as on-page over-optimization, given the trend toward stricter filters and penalties.

The first step is to not let personal bias get in the way of evaluating your site ("But hey, those are good pages! It took me two days of PHP coding to get them to work!") or professional bias ("We've had that section on our site like that forever; the CEO might not be happy if we changed it."). You'll probably have a gut feeling about what is good and what isn't, so go with it. Lay out the basics and figure out what needs to change first; you can worry about the specifics and the implementation later.

After you've gone through my tips, and perhaps some others on avoiding and recovering from over-optimization penalties, create a prioritized list of items ranging from "must change" to "would be nice" from an SEO standpoint. You'll then be in a position to convene with the decision makers and stakeholders of the company to go over what is feasible and what the trade-offs of correcting any possible over-optimization would be.

Technicalities and implementation questions will also surface at this stage, which is why I recommend not worrying about the details too much in the beginning. Like all SEO work, this may take some negotiation on your part, and you may have to sell your ideas to management, but that should be easy if you provide a few examples of similar companies or sites that lost or won big in the recent Panda or Penguin updates. A few resources for examples are available here and here. Gains or losses of 30-90% of search traffic are nothing to sneeze at and should grab their interest.

So let's jump in. Here are the most important tips to ensure you don't fall victim to an on-page over-optimization penalty, now or in the future:

Make every page useful (And kill all duplicate & redundant pages)

Each page on your site needs to serve a purpose for your visitors: either you want them to find it through your site's navigation, or you want them to land on it via the search engines. The days of creating lots of similar pages that exist solely for the search engines to rank, with no tangible value for visitors except to trick them into coming to your site (and possibly clicking an ad), are coming to an end. You can still do it, but these "thin pages" and doorway pages are exactly what Google has been cracking down on lately.

The value you provide for your visitors can be anything, but it needs to be something. Catch my drift? If a page doesn't provide any stand-alone value, or is nearly identical to other pages on your site (or elsewhere on the web), either beef it up with new content or 301 redirect or canonical it to a similar page on your site that does provide value.
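The canonical option is a single line in the <head> of the weaker page, pointing Google at the page you want to keep. A minimal sketch (the URL here is made up for illustration):

    <!-- In the <head> of the near-duplicate page -->
    <link rel="canonical" href="https://www.example.com/car-insurance/">

The canonical tag keeps the page live for visitors while telling Google which version to rank; a 301 redirect consolidates things more forcefully by moving visitors and link equity along with it.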

If you must keep a page for visitors but don't want Google to see it, you can add a noindex tag or block it via robots.txt. All landing pages created for AdWords or other paid search should be noindexed or blocked as well, even if you don't link to them from the site.
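Both are one-liners. A quick sketch, with placeholder paths:

    <!-- In the <head> of a page visitors may see but Google shouldn't index -->
    <meta name="robots" content="noindex">

    # Or in robots.txt, to keep crawlers out of a paid-search landing page directory
    User-agent: *
    Disallow: /landing-pages/

One caveat: if robots.txt blocks a page, crawlers never fetch it and therefore never see its noindex tag, so in practice you pick one mechanism per page rather than stacking both.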

As for redundant pages: Google treats them as duplicate content, and that is what can get you penalized. There shouldn't be a bunch of pages on your site optimized for very similar keyword variations--"very similar" being the key.

For example, if you had a section on your site dealing with car insurance, you wouldn't want separate pages targeting Cheap Car Insurance in Orange County, Cheap Auto Insurance in Orange County and Affordable Car Insurance in Orange County. There is no tangible difference here for your visitors (cars and autos are different, really? How?), and you can easily have one page that targets all of these keywords, e.g., Cheap and Affordable Car/Auto Insurance in Orange County. Done deal, and it even looks more realistic and less like spam.
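If the separate pages already exist, a couple of 301 redirects fold them into the surviving combined page. A sketch for an Apache .htaccess file, with made-up URL slugs:

    # Consolidate near-duplicate keyword pages into the one surviving page
    Redirect 301 /cheap-auto-insurance-orange-county https://www.example.com/cheap-affordable-car-insurance-orange-county
    Redirect 301 /affordable-car-insurance-orange-county https://www.example.com/cheap-affordable-car-insurance-orange-county

Visitors, search engines and any link equity the old pages earned all land on the one page that now targets the whole keyword cluster.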

Visitors don't like to see multiple pages on a website for nearly identical things--they get confused and think you're playing games with them, and so does Google. Don't worry if each of your pages isn't laser-targeting a specific keyword phrase, either. Google is smart enough about synonyms to see all those keywords on your page and rank you for all of them.

Note: There are still advantages to ignoring this advice and making multiple spammy pages for the same topic (you can sometimes get better rankings for the exact keywords), but the risks now outweigh the benefits, and sooner or later every site that engages in this behavior will be penalized or filtered to some degree. If you're not convinced, just be glad you've gotten away with it until now, and start thinking about how to change your ways.

Avoid boilerplate templates

This is related to the previous topic of duplicate content, but it's a separate issue. Have you ever been on a site, visited a few pages, seen the following type of content and wondered, hmmm, did someone really write this stuff?

"The 2012 Honda Accord is powered by a 4-cylinder or 6-cylinder gasoline engine and gets between 18-32 MPG. It seats up to 5 adults and is priced from $21,800 to $30,700."

"The 2012 Chevrolet Silverado is powered by a 6-cylinder or 8-cylinder gasoline or diesel engine and gets between 12-23 MPG. It seats up to 6 adults and is priced from $22,000 to $43,200."

Obviously a database is being queried here and a sentence is being generated on the fly from the values that are available. No one actually wrote the content. Many large sites have pages with content created like this, and that alone is not a bad thing (after all, you did learn something new and valuable about the two cars mentioned above). However, if this is the extent of the content on a page, and the method is duplicated across most of your pages, Google will see through it and probably devalue you.
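Under the hood, it's usually nothing more than a fill-in-the-blanks sentence fed from a database row, along these lines (a hypothetical PHP sketch; the field names are invented):

    <?php
    // One template produces the "content" for every car in the database
    $blurb = sprintf(
        "The %d %s %s is powered by a %s engine and gets between %s MPG. " .
        "It seats up to %d adults and is priced from $%s to $%s.",
        $car['year'], $car['make'], $car['model'], $car['engine'],
        $car['mpg_range'], $car['seats'],
        number_format($car['price_min']), number_format($car['price_max'])
    );
    ?>

Five seconds of processing per page, thousands of pages, zero writing.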

On the other hand, if this type of boilerplate text only makes up a small portion of the content on your pages, and each page also has a real car review, for example, or other unique information, Google will realize you're not trying to spam and will recognize whatever value this text provides users.

The moral of the story: if it's too good/simple/cheap/fast to be true, then it probably is.

This is unfortunate for many people, but it's now a reality. It all began with the Google Panda update, and the issue is further compounded when pages with boilerplate text also carry lots of advertising and are highly optimized for competitive keywords.

If the pages are redundant as well, you're almost certainly looking at a penalty if you don't already have one. Google and other search engines see stuff like this and say to themselves, "Hey, this site is really trying to rank for all of these competitive keywords, but there is nothing new or useful here, and each page probably took five seconds to create. People looking for these things would probably prefer the other site that took more effort to build and has unique content, so we should rank it higher in the SERPs."

Don't be a keyword stuffer

I'm not sure why, but many people still think the path to SEO success is to stuff keywords into everything: meta titles, descriptions, URLs and page copy. Not true... In fact, it hasn't been true for about, what, nine years? (Think Florida update.)

Keywords are still important, of course (how else would Google and other search engines know what your page is about? They're not that good yet at determining intent), but that doesn't mean you have to repeat yourself and jam your keywords and related words everywhere you possibly can.

When I do SEO for a site, my rule of thumb is to make sure my keywords are included, but I don't go out of my way to repeat anything or add synonyms for SEO reasons. I identify the primary and secondary keywords I want to target for each page and include them once each in the meta title, the meta description and the page title (H1), while making sure the text still reads naturally and is compelling. (There's no point ranking well in Google but getting a low click-through rate due to uncompelling copy. People are more savvy now and won't always click the first result if it looks spammy or keyword-stuffed. Think about that.)
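To make that concrete, here's what the placement might look like for a hypothetical page targeting "car insurance quotes" (primary) and "compare rates" (secondary); all names and copy are invented for illustration:

    <head>
      <!-- Primary and secondary keywords appear once each, in copy written for humans -->
      <title>Car Insurance Quotes | Compare Rates in Minutes | ExampleCo</title>
      <meta name="description" content="Get free car insurance quotes and compare rates from top insurers in minutes. No phone calls required.">
    </head>
    <body>
      <h1>Free Car Insurance Quotes: Compare Rates Side by Side</h1>
      ...
    </body>

Notice there's no repetition for its own sake: each keyword appears once per element, and the copy would read the same way if SEO didn't exist.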

How many times the keywords should appear on the actual page, beyond the meta tags, is debatable, but they should be there at least a few times.

Naturally occurring synonyms and related words will take care of the rest of the on-page SEO for the keywords you're targeting. In any case, I don't pay attention to keyword percentages anymore, and neither should you. As for URLs, I tend not to make them an exact match and like to throw in some variation, but I usually include the primary and secondary keywords. This is not scientific, just my personal experience and what I've read over the years. Also, URLs with one to five words tend to do well, in my experience.

Check all your links

Make sure each link on your site has a legitimate purpose. Check each page and all of your site-wide footers, sidebars and other templates, and make sure none of your internal and external links are spammy or over-optimized.

Don't go crazy with ads

Have you ever been on a web page where the only things you see at first are a bunch of ads, and you have to scroll down to find what you were looking for, and even then everything seems to be wrapped in advertisements? Kind of annoying, and it makes you trust the site a lot less, no?

This is called ad over-optimization, and Google has really been cracking down on it lately. In fact, a large site I work on was hit by Panda 1.0 in February 2011 for this very reason, along with some redundant-page and boilerplate-text issues. When all of these were removed and the site cleaned up, we quickly recovered and have been penalty-free ever since. (We removed approximately 30% of the total number of pages on the site, but our traffic actually increased by around 15% after we got out from under the penalty! The site has continued to grow and hit record traffic, which just proves that over-optimized and duplicate pages don't help in today's SERPs. We're also making more money from fewer ads, so go figure. They are better ads, though.)

The point is that Googlebot is now smarter than ever, and knows what is an advertisement and what is the content portion of your site. It doesn't like it when a page has no real substance, especially if this lack of substance is above the fold (the portion visible without scrolling).

This ad de-optimization is a relatively simple fix, but you'll probably lose some revenue when you change or reduce your ads, and the implementation and relocation of ads may turn out to be a bit of a challenge. It's worth it, though, especially if you have a branded website that can't afford to lose traffic or face. Plus, too many ads may be hurting conversions anyway if they annoy users.

Anything Else?

Nope, this is pretty much it. If you get a good handle on these anti-over-optimization tips, you'll be well on your way toward successful and sustainable SEO (which should be your goal anyway in today's marketing and SEO climate). If you have any specific questions that go beyond the scope of this post (e.g., when is cloaking okay and how do you do it, what really counts as a paid link, enterprise-level SEO, etc.), feel free to leave a comment and I'll try my best to respond quickly. There are also plenty of great resources out there if you want to learn more about on-page and other over-optimization.

