If you're like me, you created a Google Analytics account, added the basic code snippet to your site, checked your stats a few times, and never really figured out how to use it to make decisions about what to do on your website.

If that's your story too, you've been through Google Analytics 101. You know how to find out which pages on your site get the most traffic, where that traffic comes from, each page's bounce rate, the paths visitors take through your site, etc. But you may not know what to do with that data.

That's partly because you need a little more data than the basic tracking snippet gives you. So now it's time for your next class -- Google Analytics 102. In this class, you'll learn how to add a little more tracking code that will tell you what's working on your site, and what's not.

Prerequisites

In case you haven't completed Google Analytics 101, here are a few things to do to get started.

  1. Create your Google Analytics account (free).
  2. Add the tracking snippet to your webpages (if you did this a long time ago, you may need to update to their latest code snippet).

If you're done with that, it's time to take the next step.

We're going to do two things today to help you discover which links between your webpages are moving traffic. For example, if you have banner ads for your own products on your site, which ones are getting clicked? You can use this information to split test different banners, calls to action, etc.

Tracking Ad and Content Views

Before you can tell what the click-through rate for a particular banner or link is, you need to count how many times it's displayed. If you only have that link on one page, you can figure that out by looking at the number of times that page gets loaded.

But otherwise, you need to add a little piece of JavaScript code to count impressions. This is what it looks like:

<script>ga('send', 'event', 'banner', 'view', 'bannerName', {nonInteraction: true});</script>

I'd suggest putting this code right next to whatever it is you're tracking. Just be sure that it appears after the tracking code snippet on your page. That will happen automatically if you put the tracking code snippet in the <head> section of your webpage.

The first two arguments, "send" and "event", specify that you want to send an event to Google Analytics. Pretty straightforward.

The next two, "banner" and "view" can be changed as appropriate. For example, if you're tracking something other than a banner, you might change "banner" to "link", "call-to-action", or whatever it is you're tracking. You can make up whatever name you want for it. "view" can be changed to anything you want, too. You might call it "impression", if you like that better.

The next argument should identify whatever it is that you're tracking impressions of. For example, you might come up with a unique name for each of your banners, and put the name of the one that's getting displayed there.

The purpose of {nonInteraction:true} is to tell Google Analytics that this event wasn't triggered by user interaction. For example, we're not using it to track a click on something on the page. This affects how Analytics calculates the bounce rate for the page. If this were an "interaction" event, Analytics would report your bounce rate as 0%, because Analytics would think that every page view had led to an interaction, even if the visitor didn't click through to another page on your site.
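
Putting those pieces together, here's a sketch of a small helper for firing impression events. The function name is my own invention, not part of Analytics; it just builds the argument list described above, and the typeof guard keeps the page from throwing an error if the ga() snippet hasn't loaded yet:

```javascript
// Hypothetical helper (not part of Google Analytics): builds the
// argument list for a non-interaction impression event.
function impressionEvent(category, action, label) {
  return ['send', 'event', category, action, label, {nonInteraction: true}];
}

// Only fire the event if the Analytics snippet has defined ga().
if (typeof ga === 'function') {
  ga.apply(null, impressionEvent('banner', 'view', 'banner-a'));
}
```

With a helper like this, each banner only needs its own name passed in, and the nonInteraction flag can't be accidentally left off.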

With that code added to your site, you'll start to see data in the "Behavior/Events" section of your Google Analytics reports, telling you how often each banner, or whatever you're tracking, is getting displayed.

Next, let's find out how many clicks each is getting.

Tracking Clicks Within Your Site

It is possible to count clicks on the page where the click happens, by adding a simple onClick handler to the link you're tracking. But that approach is unreliable: without making things a lot more complicated than we want to today, there's no way to be sure the click will get counted before the browser leaves the page to follow the link.

What we're going to do instead is add some parameters to the links themselves, so that the clicks get counted once the visitor arrives on the page they clicked through to. If the links are between pages on the same site, this isn't an ideal approach, because URL parameters are designed to track traffic coming from other sites. But it will get the job done.

First you'll need to be sure you have the Google Analytics tracking snippet on the page they land on after they click. Once that's done, a little change to the link URL takes care of the rest.

For example, if you have a banner that was linking to your sales page at http://example.com/product/landing-page.html, you might change that link to http://example.com/product/landing-page.html?utm_source=example.com&utm_medium=banner&utm_campaign=mybanners&utm_content=banner-a

You've probably seen stuff like that in links you've clicked before. Now you get to find out what it means.

You can put pretty much anything you want in all of the link arguments listed above. But each is intended to have a particular meaning.

  • utm_source identifies a website, search engine, newsletter name, or other source of traffic.
  • utm_medium is for the method used to drive the traffic, for example, "email", "cpc" (cost per click advertising), "video", or, in our case, "banner".
  • I'm not sure why utm_campaign is required, but it is. It identifies a particular promotional campaign. Put whatever you want there.
  • Finally, use utm_content to identify the specific banner, content, link, call to action, or whatever it is that you're tracking. To make analysis easier, use the same thing that you used for "bannerName" in the previous section.
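
If you're tagging more than a link or two, hand-building those URLs gets tedious and error-prone. Here's a sketch of a helper that assembles one; the function name is made up for this example:

```javascript
// Hypothetical helper: appends campaign parameters to a base URL,
// adding "?" or "&" depending on whether a query string already exists.
function tagLink(baseUrl, params) {
  var query = Object.keys(params)
    .map(function (key) {
      return encodeURIComponent(key) + '=' + encodeURIComponent(params[key]);
    })
    .join('&');
  return baseUrl + (baseUrl.indexOf('?') >= 0 ? '&' : '?') + query;
}

var url = tagLink('http://example.com/product/landing-page.html', {
  utm_source: 'example.com',
  utm_medium: 'banner',
  utm_campaign: 'mybanners',
  utm_content: 'banner-a'
});
```

Using encodeURIComponent means a campaign name with spaces or special characters won't break the link.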

With your links updated, the tracking data will start rolling in, and you'll be able to figure out what percentage of people who see your banners, links, etc., are clicking.
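
Once you have both numbers, that percentage is just clicks divided by impressions:

```javascript
// Click-through rate, as a percentage, from the two numbers
// you'll read out of your Analytics reports.
function clickThroughRate(clicks, impressions) {
  return (clicks / impressions) * 100;
}
```

For example, 51 clicks on 1,000 impressions is a 5.1% click-through rate.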

Which brings us to...

Split Testing

Having click-through data is nice. But having click-through data for multiple alternatives is better, because it enables you to test them against each other and pick the one that works the best.

I'll show you here how to do some simple split testing. For more powerful options (for example, to keep showing the same ad to a person on every page they view, to avoid having to edit the code on every page where you want to test, or to use events rather than URL parameters, as mentioned above), you might use a tool like TriggerNote.

On a PHP webpage, you can run a simple split test using code like this (some of this code will look familiar from above):

<?php
switch (rand(1, 2)) {
    case 1:
        ?>
        <a href="http://example.com/product/landing-page.html?utm_source=example.com&utm_medium=banner&utm_campaign=mybanners&utm_content=banner-a"><img src="/img/banner-a.jpeg"/></a>
        <script>ga('send', 'event', 'banner', 'view', 'banner-a', {nonInteraction: true});</script>
        <?php
        break;
    case 2:
        ?>
        <a href="http://example.com/product/landing-page.html?utm_source=example.com&utm_medium=banner&utm_campaign=mybanners&utm_content=banner-b"><img src="/img/banner-b.jpeg"/></a>
        <script>ga('send', 'event', 'banner', 'view', 'banner-b', {nonInteraction: true});</script>
        <?php
        break;
}
?>

That code uses the function call "rand(1,2)" to randomly pick either 1 or 2 (you could add more cases and change the "2" to the number of your last option). That number selects which of the banners to display, along with the JavaScript code needed to count views of that banner.

The only difference between the two alternatives is that, in 3 places, one says "banner-a", while the other says "banner-b" (in the link arguments, where the image is loaded, and in the view tracking code).

So, now we're displaying two alternative banners on the site, and Google is counting how many times each is displayed and clicked. The final step is analyzing the results to decide when we've tested enough and are ready to declare one of the alternatives the winner.

Statistical Analysis

If you Google split testing, you'll find all sorts of recommendations for how many impressions and clicks you need before you can end your test. But the truth is that there's no specific number.

That's because the number you need depends on the click-through percentages of the alternatives, and how different they are. If one is getting a 5.1% click-through rate and the other 5.2%, you're going to need to run the test for a long time before you know whether one is really better, or whether the difference is just random. If one is getting 5.1% and the other 8.5%, you won't need to test nearly as long.

The only way to know for sure when it's time to end the test is to do a statistical analysis of the numbers. Fortunately, you don't need to know how to do the math. Instead, you can simply plug the numbers into our free Split Test Analyzer, and it will crunch them for you.
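
If you're curious what that analysis looks like under the hood, here's a sketch of the standard approach, a two-proportion z-test. This is my own illustration, not the analyzer's actual code; the function names are made up for this example:

```javascript
// Approximation of the error function (Abramowitz & Stegun 7.1.26),
// used to get a normal-distribution probability without a stats library.
function erf(x) {
  var t = 1 / (1 + 0.3275911 * Math.abs(x));
  var y = 1 - (((((1.061405429 * t - 1.453152027) * t + 1.421413741) * t
      - 0.284496736) * t + 0.254829592) * t) * Math.exp(-x * x);
  return x >= 0 ? y : -y;
}

// Two-proportion z-test: returns the two-sided confidence (as a
// percentage) that the difference in click-through rates is real.
function splitTestConfidence(clicksA, viewsA, clicksB, viewsB) {
  var pA = clicksA / viewsA;
  var pB = clicksB / viewsB;
  var pooled = (clicksA + clicksB) / (viewsA + viewsB);
  var se = Math.sqrt(pooled * (1 - pooled) * (1 / viewsA + 1 / viewsB));
  if (se === 0) return 0;
  var z = Math.abs(pA - pB) / se;
  return erf(z / Math.SQRT2) * 100;
}
```

For example, 85 clicks from 1,000 views versus 51 clicks from 1,000 views comes out well above 95% confidence, while two identical results come out near zero.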

First, you need to find the number of impressions and clicks for each alternative in Google Analytics. The number of impressions (and clicks, if you ended up using events instead of URL parameters) can be found in Behavior/Events. The number of clicks (if you used URL parameters) is a little harder to find.

The number you're after can be found by going to Acquisition/Campaigns/All Campaigns. Just below the graph, you'll see several options labeled "Primary Dimension". Under "Other", select "Acquisition/Ad Content".

Also, if you're tracking clicks between sites and are using URL parameters to count clicks, remember to look on the referring site(s) for the numbers of impressions, and on the site you're linking to for the number of clicks.

Enter the numbers into the Split Test Analyzer, and it will give you the click-through percentage for each, plus what's called a "confidence level". For example, if the confidence level is 70, then there's a 70% chance that your test results are valid, and a 30% chance that they're just the result of random chance. If the confidence level is 95, then there's a 95% chance that your results are valid, and only a 5% chance that they're just random.

I strongly recommend running your tests until you have a confidence level of at least 90, if not 95. This is what statisticians recommend, and for good reason. I've seen plenty of tests that got above 80% confidence, but later went the other way, or ended up showing no real difference between the alternatives.

If you're still reading, you have all the information you need to pass Google Analytics 102, and discover what's really working on your website.