Search Engine People Blog

How To Test & Optimize PPC Ad Campaigns (Part One)


You planned, researched, built and launched your perfect Pay-Per-Click (PPC) campaign. It's out in the world bringing customers to your store, whether physical or digital. Now you can just sit back and start counting the money, right?

NO!

PPC should never be treated with a "set it and forget it" mindset. Yes, you put a lot of work into the planning and research so that you picked great keywords. You wrote compelling ad copy. You got all the settings right during your build. That's a lot of work, but it's static. You need to monitor performance, test and optimize.

Making Decisions With Data

Before we jump into the "how to" section, we need to understand how we're going to make decisions. For some of the steps below, like an irrelevant search query, you'll be able to act based on a single click. However, for most of the steps below you'll need to accrue a certain amount of data upon which to base your decisions. I won't get into statistical significance here, but I recommend a simple calculator like this one from KissMetrics for quick A/B decisions.
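If you'd rather not rely on an online calculator, the same A/B check can be sketched in a few lines of Python. This is a standard two-proportion z-test, not the calculator's exact method, and the click and conversion numbers below are purely hypothetical:

```python
# A quick two-proportion z-test for A/B decisions (hypothetical numbers).
from math import sqrt, erf

def ab_significance(clicks_a, conv_a, clicks_b, conv_b):
    """Return the two-sided p-value comparing two conversion rates."""
    p_a = conv_a / clicks_a
    p_b = conv_b / clicks_b
    # Pooled rate under the null hypothesis that both ads convert equally
    p_pool = (conv_a + conv_b) / (clicks_a + clicks_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / clicks_a + 1 / clicks_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# Hypothetical test: 500 clicks / 40 conversions vs. 500 clicks / 25 conversions
p_value = ab_significance(500, 40, 500, 25)
print(f"p-value: {p_value:.4f}")
```

A p-value above your threshold (0.05 is the usual cutoff) means you can't confidently declare a winner yet, so keep collecting data.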

Now, on to the good stuff. Here is a roadmap to get you started:

Step 1: Check Your Search Queries

With a new campaign, this will be the first area to check and start making some optimizations. In AdWords, navigate to the Keywords tab of the interface and then select the "Search terms" box, like this:

For Bing, it's pretty similar, but you need to click on the "Details" drop-down box and click on All or Selected from the Search Terms section, like so:

This report will show you the exact searches that people performed. Some of them will be really long. Many will have misspellings. Every click here cost you money. So what do you do? When you spot an irrelevant query, add it as a negative keyword.

Adding a negative keyword will prevent your ad from showing on that exact search again. Notice the radio button for applying it at either the ad group or campaign level. If you select campaign, it will prevent the ad from showing in any ad group of the campaign. When choosing, ask yourself, "Would I be okay if this query triggered an ad in another ad group?" While the system defaults to ad group, I recommend going with campaign unless you have a compelling reason not to.

Step 2: Review Keyword Performance

When judging keywords, make sure you've got enough data to make decisions. If you're basing your decision on Click-Through Rate (CTR) then you'll likely be able to move more quickly, but judging on conversion rate may make more sense. A general rule of thumb from statistics is to wait until you have 30 results. For CTR, that means you want to look at a minimum of 30 clicks in your chosen time frame. For conversion rate, look for 30 conversions.
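The "30 results" rule of thumb also tells you roughly how many clicks you need before judging conversion rate. A minimal sketch, assuming a hypothetical 2% conversion rate:

```python
# Rough estimate of clicks needed before judging a keyword on conversion
# rate, using the "30 results" rule of thumb from the article.
MIN_CONVERSIONS = 30
assumed_conv_rate = 0.02  # hypothetical 2% conversion rate

clicks_needed = MIN_CONVERSIONS / assumed_conv_rate
print(int(clicks_needed))  # 1500 clicks before the rule of thumb is met
```

In other words, at a 2% conversion rate you'd need around 1,500 clicks before conversion-rate judgments carry much weight, which is why CTR decisions usually come faster.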

The AdWords interface can be daunting. I just counted the number of possible columns in an AdWords account and it was 69. That's far too much to take in at once, so here are a few of my favorites to look at:

The same rules can be applied to cost/conversion columns as well, but will likely take longer to get the necessary data points. As expected, look for areas with desirable cost/conversion and optimize to get more of them.

For illustration, here are 5 keywords and their performance statistics:

Notice that our 5th keyword here is "Below first page bid," and that is manifesting as a far-lower-than-average CTR and a poor average position. However, the keyword is responsible for 2 sales totaling $164. Pretty good on $11.52 in spend, so this is a great candidate for a bid boost!
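The math behind that call is just return on ad spend (ROAS). Using the revenue and spend figures above:

```python
# Return on ad spend (ROAS) for the 5th keyword discussed above.
revenue = 164.00   # two sales, from the example
spend = 11.52      # total cost of the keyword's clicks

roas = revenue / spend
print(f"ROAS: {roas:.1f}x")  # roughly 14.2x return - well worth a bid boost
```

Any keyword returning $14 for every $1 spent deserves more visibility, even if its CTR looks ugly at first glance.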

Step 3: Review Ad Performance

Now let's look at our ad performance. Following best practices, your campaign build should have included at least 2 ad variations in each ad group. Let's skip right to an example:

Here you'll see two ads that have accrued a large number of clicks and a healthy number of conversions. This is an ecommerce client, so a conversion represents a sale and the conversion value column shows revenue. The main question here would be: "Is one ad significantly better than the other?"

Starting with CTR, we can see there is about a half-percent difference, which represents roughly a 2% relative increase. However, if we plug this into the statistical significance calculator mentioned earlier, we see that the difference is not statistically significant. We therefore cannot say confidently that one ad is better than the other based on CTR.

But what if we looked at the conversion rate? Yet again, there is a 2% difference, but again our calculator tells us it's not a significant difference. These two ads are virtually equal unless we look at the value columns. There we see that Ad #1 generates an average of $4.06 in revenue for every click while Ad #2 only generates $2.65. Therefore, if I were to pause an ad to begin a new test, Ad #2 would be the "loser" here.
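The value-per-click tiebreaker above is easy to reproduce. The click and revenue totals below are hypothetical, chosen so the per-click figures match the $4.06 and $2.65 from the example:

```python
# Value-per-click comparison. The totals are hypothetical; only the
# resulting per-click revenue figures come from the example above.
def value_per_click(conversion_value, clicks):
    """Average revenue generated per ad click."""
    return conversion_value / clicks

ad_1 = value_per_click(2030.00, 500)  # hypothetical totals -> $4.06/click
ad_2 = value_per_click(1325.00, 500)  # hypothetical totals -> $2.65/click

loser = "Ad #2" if ad_2 < ad_1 else "Ad #1"
print(f"Pause {loser} and launch a new variation against the winner")
```

When CTR and conversion rate are a statistical tie, revenue per click is a sensible tiebreaker, since it captures the value of the sales each ad drives, not just their count.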

To Be Continued...

As you might imagine, there are many ways to test and optimize. We've covered 3 areas here today, but stay tuned for Part 2 where we'll look at additional optimization areas such as device modifiers, geographic modifiers and ad extensions.
