Archive for Web Analytics


Bear with me for a minute, basketball fans.

If you work in digital analytics, you are familiar with the concept of the Key Performance Indicator (KPI). A KPI is a piece of data, shown over time, that gives you immediate insight into how your business is performing against your goals. Sometimes they are very general (such as Orders per Visit, a.k.a. Conversion Rate) and sometimes they are more specific (for example, Bounce Rate for visitors coming from search). These metrics are the lifeblood of the business goals you've set. A business typically has several KPIs that it monitors every day. And not every metric is a KPI; a common rule is that a metric isn't a KPI unless a drop below acceptable norms would cause your business to take immediate corrective action.

If you don’t work in digital analytics, but you are an NBA fan, we can finally explain KPIs to you using NBA statistics, a language you probably already speak. Here it is:

How do basketball teams win games? While searching for an answer to that question, Dean Oliver identified what he called the “Four Factors of Basketball Success”:

Shooting (40%)
Turnovers (25%)
Rebounding (20%)
Free Throws (15%)

The number in parentheses is the approximate weight Mr. Oliver assigned each factor. Shooting is the most important factor, followed by turnovers, rebounding, and free throws.

The article goes on to explain that each of those four factors is expressed in a rate: Effective Field Goal Percentage, Turnover Rate, Rebounding Rate, and Free Throw Rate. These have all the markings of good KPIs. I want to be as good as I can be in each of those four areas, and if I succeed, I’m almost definitely going to win basketball games.
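For the curious, the four rates above have standard box-score definitions. Here is a minimal sketch using the commonly cited formulas (the game numbers are made up, and the 0.44 coefficient in Turnover Rate is the usual approximation for free-throw possessions):

```python
# Dean Oliver's Four Factors, computed from basic box-score stats.
# Formulas follow the commonly used definitions; the inputs are invented.

def four_factors(fgm, fga, fg3m, tov, fta, ftm, orb, opp_drb):
    """Return (eFG%, Turnover Rate, Offensive Rebounding Rate, FT Rate)."""
    efg = (fgm + 0.5 * fg3m) / fga              # threes count 1.5x
    tov_rate = tov / (fga + 0.44 * fta + tov)   # turnovers per possession
    orb_rate = orb / (orb + opp_drb)            # share of available off. boards
    ft_rate = ftm / fga                         # free throws made per FG attempt
    return efg, tov_rate, orb_rate, ft_rate

# A hypothetical single-game box score:
efg, tov_rate, orb_rate, ft_rate = four_factors(
    fgm=38, fga=85, fg3m=8, tov=14, fta=25, ftm=19, orb=12, opp_drb=30)
print(f"eFG%: {efg:.3f}  TOV%: {tov_rate:.3f}  "
      f"ORB%: {orb_rate:.3f}  FT rate: {ft_rate:.3f}")
```

Just as with business KPIs, each rate is a ratio rather than a raw count, which is what makes it comparable across games, seasons, and teams.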

If I were compiling a basketball team, or coaching a basketball team (or advising a basketball team on how to begin analyzing itself), those would be my first KPIs. Those are the metrics that I would use to gauge success. And while basketball, like business, has one metric that trumps all others (for basketball, it’s wins; for business, it’s profit), these are strong leading indicators of a team’s ability to win.

So, basketball fan, think of your digital analytics friends as something like basketball coaches who are looking at effective field goal percentage and benching that wing player who won’t stop taking threes early in the shot clock, or a GM who sees that his team is weak in rebounding and therefore targets an athletic big man in the NBA Draft. It’s clear to an NBA fan, looking at how his team is performing in each of the Four Factors, how a coach or GM might address a deficiency in these areas, just as analysts are great at coming up with recommendations when a KPI is struggling and needs to improve.

In fact, that’s the great thing about KPIs: they provide a really nice, simple jumping-off point for analysis. Why were the Jazz so bad at eFG% this past season? We can begin to answer that question for management with some very specific advice, especially when we add in analysis of shot location and lineups/rotations. Why are my web site visitors who arrive after performing a Google search leaving so quickly? We can look at that user segment to see what they’re doing and where they’re running into roadblocks, or analyze our landing pages for effectiveness. Same thing.

So now you’ve got something to talk about with your digital analyst friends. And digital analysts, you can ask your NBA friends how their team’s turnover rate has been trending lately. Your next cocktail party is sure to be a smashing success!

In 2011, Good Changes Take Time

This is not an Omniture-versus-Google Analytics post. This is not a Google Analytics-bashing post.

This is a post in which I defend a decision that I helped (in some tiny way) to make when I was a Product Manager on the SiteCatalyst team at Adobe.

In 2011, businesses rely heavily on their web analytics data. Analytics may not be where we’d like to see it yet, but it certainly has come a long way over the past decade. And the more critical this data becomes, the more resistant customers will be to uncomfortable change.

SiteCatalyst 15 introduced major changes to the Omniture platform. This brought some great features with it: segmentation, breakdowns on all conversion variables, Processing Rules, etc. But it also introduced change. Specifically, it affected the way that SiteCatalyst determined the number of visits to customers’ sites/apps. In most implementations, it was a minor change, but in some cases it was noticeable in client data. (You can read all about these changes here.)

Because this platform change potentially affected things like year-over-year comparisons and conversion rate, some of our customers weren’t comfortable making the change to the new platform right away. They told us that they wanted some time to understand the new platform and its effects on their business.

As I explained on Eric Peterson’s epic (and awesome) Google+ thread last week:

The feedback that we got on this was that it was significantly painful for many users to have that change occur, but that they acknowledged the improvement. So, if you’re Adobe, do you make the change and alienate some users, or do you hold off and alienate other users?

Alyson Murphy echoed this thought:

Can you really expect Omniture to implement a major paradigm shift without alienating a ton of customers? People love their comparison data. Look at how difficult it is for some companies to shift to SiteCatalyst 15. If a relatively small change compared to what you are suggesting causes that much pain, a huge paradigm shift isn’t going to go over too well with many of their clients.

The day Alyson and I made these comments, Google announced in a blog post that it had also changed the way it calculates Visits. Now, any new traffic source encountered by a user (paid search, natural search, affiliate, social media, etc.) would instantiate a new visit/session. The reaction from customers in the blog comments has been. . . interesting:

I am see weird stuff bounce rate up 50% time on site down 75% this happened from 11th August. on most visits it count each page viewed as a new visit.

Good Grief, less then a 1% change!
“Based on our research” I would love to know how you conducted this research.

I am seeing 20% increase in visits, I thought I had finally broken free of Panda!

How I am supposed to evaluate these new metrics on steroids vs my previous metrics?

My average time on site has fallen from 7+ minutes to 12 seconds. Each visitor seems to visit the same page 6 times causing my bounce rate to be ridiculous.

I think that this update is an example of someone fixing something that wasn’t broken. Now analytics is useless.

The update makes my data virtually useless. It makes no sense.

Over the weekend, I feel that around. half of my visits are returning visitors, and the same guest may have seen the same item up to 10 times. In return, my bouncerate sky-high.

It is a vital part to have a website to have a reliable analysis program, but GA is certainly not very reliable right now, and in my case the data produced now are completely useless.

I think Google Analytics is a great tool. I use it from time to time on this blog and others, and I like it. This isn’t a complaint about Google Analytics. It’s a statement about the way an upgrade that may actually be a very good thing (in terms of helping customers understand their customers and improve conversion) was handled in two different cases. I’m sure someone could explain why that last poster’s average time spent dropped precipitously, and why the new data is more accurate or more actionable.

But that’s not the point.

Conclusion? In 2011, you CANNOT just slap platform changes into your analytics platform, call it good, and expect businesses to adjust on the fly.

When I joined the Product Management team at Adobe in May 2010, we were in the midst of having this conversation with users. On one hand, it was disappointing to hear that so many felt that their users and their businesses needed time—in some cases, at companies with hundreds of users, a lot of time—to prepare for platform changes that everyone agreed were exciting. On the other, it was great to know that SiteCatalyst was that critical to various business processes even outside of the analytics team. But it’s really hard to explain to an executive why conversion rate suddenly dropped by 5% because your analytics tool changed. That’s what required time.

I’m proud that we listened to these customers and that we both a.) released a product with significant platform improvements and b.) created a system that allowed users to prepare before having these changes dropped on their plates. Is the SiteCatalyst 15 upgrade process perfect? Certainly not. But, as I mentioned above, there are considerations beyond simply the need to prepare for a change in visit calculations, and I know for a fact that Adobe continues to adapt and optimize the upgrade process.

(Also, in all fairness, Omniture has been fairly accused of making changes on the fly in the past. For example, in 2006, a SiteCatalyst point release corrected the way that search engines were identified, updating the platform to use both the referring domain and the referrer query string for improved accuracy. Like I said, this isn’t a tool-versus-tool argument. It’s an observation about the importance of data.)

One more thing: Anyone who tells you that only the analyst matters is fooling you. Anyone who tells you that your analytics tool only needs to serve the analyst is living in a dream world. That may have been true in 2005, but that is not how the real world works in 2011. People all over the business need data. Yeah, sometimes it’s just a perfunctory year-over-year visits comparison. Does it improve on-site conversion? Maybe not. But it matters somewhere else, to someone. Probably to someone who can influence the success of analytics in the business. Analysts had few problems with the platform changes that SiteCatalyst 15 brought, but they knew that, in order for them to succeed and to be trusted to help guide the business, their users needed to know what was going on in SiteCatalyst and not to have metrics changing unexpectedly.

So, when people say, “How come Omniture hasn’t delivered the kitchen sink yet?” remember that this isn’t a fantasy world where wholesale changes can slide painlessly into businesses. Google’s platform change proved that, as did the feedback we got from customers at Adobe.

Yo dawg! I heard you like SiteCatalyst. . .

I’ve been asked a few times recently how a company can report on SiteCatalyst dashboard usage by its own employees. If you’re running an analytics practice, there are a few reasons why you might want to do this—for example, knowing which dashboards are most (and least) popular helps you manage your list of dashboards and optimize your dashboards by learning which features internal consumers prefer. (I know Adam Greco will like this! See his Idea Exchange submission here.)

You can set this up in just a few simple steps.

First, create a new report suite to capture dashboard usage. My RSID for this report suite is “gainescorpdashboards.”

Second, on each of your dashboards, click “Open Editor” (in v15, this is the “Layout” button). Under “User Content,” choose to add an image reportlet to your dashboard. Put it someplace inconspicuous, such as at the very bottom of the dashboard. The image URL should be a hard-coded beacon whose query string looks like this: [AQB]pageName=GainesCorp%20Main%20Dashboard&c1=Ben%20Gaines

(Set the protocol to https, since SiteCatalyst itself is served over https and that’s where you’ll be viewing the dashboard.)

Add a hard-coded beacon to a reportlet

As with many old-school mobile implementations, you’ll need to hard-code the values that you want to capture. In the example above, I’m keeping it really simple: page name is the name of the dashboard, and prop1 (c1) is the name of the dashboard author/owner. Another possibility is to use a list prop (or, in v15, a multi-value variable) to capture data about the various reportlets in each dashboard. Remember to URL encode!
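If you’re generating these beacon URLs for more than a couple of dashboards, a small script can handle the encoding for you. This is a sketch only: the tracking-server hostname is a placeholder, and the [AQB]/[AQE] delimiters mirror the truncated example above, so verify the full URL against a real image request from your own implementation.

```python
# Sketch: build a hard-coded image-beacon URL for a dashboard reportlet.
# The hostname below is a PLACEHOLDER; substitute the data-collection
# domain from your own implementation.
from urllib.parse import quote

def dashboard_beacon(rsid, dashboard_name, owner):
    """Return a beacon URL with URL-encoded pageName and c1 (prop1) values."""
    base = f"https://example.112.2o7.net/b/ss/{rsid}/1"  # placeholder host
    params = f"pageName={quote(dashboard_name)}&c1={quote(owner)}"
    # Delimiters follow the truncated example in the post; double-check
    # against a beacon captured from your own pages.
    return f"{base}?[AQB]{params}&[AQE]"

url = dashboard_beacon("gainescorpdashboards",
                       "GainesCorp Main Dashboard", "Ben Gaines")
print(url)
```

The main thing the script buys you is consistent URL encoding: forget it, and a space in a dashboard name will silently mangle the beacon.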

Third, save that reportlet and save the dashboard changes.

Whenever a user loads that dashboard, a simple hard-coded beacon will be passed into the new report suite that you created. You’ll have pathing enabled on page names, so you can see how users move from dashboard to dashboard. The depth of the data isn’t anything to write home about, but (as I said above) an implementation like this can yield insights that help you better manage your use of dashboards and better serve your users.

For example, take a look at this Next Dashboard report, which would have been possible based on the example described above:

Next Dashboard Report

People are going to leave dashboards. That doesn’t necessarily concern me. However, it’s very interesting that 37% of my users are switching to a different dashboard at some point later on in their SiteCatalyst session. I can begin to ask what they’re getting out of “Fall 2011 Campaign Dashboard” and to explore whether something needs to be added to the main dashboard to better serve users. This is an extremely rudimentary example, but hopefully you can see the point.
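For readers who want to see the mechanics, the “next dashboard” idea boils down to counting adjacent pairs in each session’s ordered list of dashboard views. A toy sketch with hypothetical data (SiteCatalyst’s pathing reports do this for you):

```python
# Sketch: tally which dashboard users open immediately after a given one.
# Each session is an ordered list of dashboard page names; data is invented.
from collections import Counter

def next_dashboard(sessions, dashboard):
    """Count the dashboards viewed immediately after `dashboard`."""
    counts = Counter()
    for views in sessions:
        for current, nxt in zip(views, views[1:]):  # adjacent pairs
            if current == dashboard:
                counts[nxt] += 1
    return counts

sessions = [
    ["Main", "Fall 2011 Campaign", "Main"],
    ["Main", "Fall 2011 Campaign"],
    ["Main"],
]
print(next_dashboard(sessions, "Main"))
```

Divide each count by the total number of views of the starting dashboard and you get the percentages shown in a Next Dashboard report.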

There are a few limitations that I should mention.

  • We can only capture static data about the dashboard. This isn’t a JavaScript implementation, and the image reportlet doesn’t accept JavaScript. This means we can’t capture the username of the person viewing the dashboard.
  • This reportlet needs to be added to every dashboard individually. You can copy a dashboard, but you’ll still need to edit the reportlet to reflect the new dashboard. Some effort required.
  • This doesn’t work for distributed reports. A beacon won’t be fired when someone opens the dashboard in PDF form. Depending on your situation, that may be a non-starter or it may be completely irrelevant.
  • People may ask about that weird blank reportlet. Yep.

So there you have it. A (relatively) easy way to capture some rudimentary (but valuable) data about your company’s dashboard usage in SiteCatalyst. Don’t just assume people are using the dashboards you’re building! Hopefully doing some basic analysis on dashboard usage will prevent you from wasting time on pointless dashboards and allow you to optimize your dashboards to better serve your company’s needs.

The move

(I thought this post would be easier to write.)

I don’t want to bury the lead, so I’ll just say it: At the end of this week, I will be leaving my post as Product Manager on the SiteCatalyst team at Adobe and taking a position as Manager of Research Analytics for ESPN. I’m tremendously excited, although I will miss many people, places, and things that my family and I have come to love during our time here in Utah, and specifically at Omniture/Adobe.

(Fortunately, the world is a lot smaller than it used to be. I’m still going to pester you, Jeff Jordan. I’m keeping your number in my phone. You’ve been warned. Oh, and Ambria? I’ll be giving out your e-mail address to everyone who wants to participate in one of your beta tests.)

The past five years at Omniture (now Adobe) have been an honor. I feel it’s important to mention that there is probably only one company on the planet that could wrestle me away from Adobe at this time, and that is ESPN. In case my blog hasn’t made it clear already, I’m a sports nut who loves analytics and grew up in New England. ESPN combines all three of those things. (Plus, Bristol is only two hours from Fenway Park.) The point is that I am not making this move out of frustration, disenchantment, or fear about the future.

I don’t want anyone to think otherwise for even half a second.

My greatest concern is that people in the #omniture community that I helped build on Twitter will jump to foolhardy conclusions. That’s the downside of having been one of the public faces of a brand on social media—when you leave, it never looks good. I know this because I’ve been in the “rush-to-judgment” camp before. For example, I wondered about Comcast when Frank Eliason left last year. How could Comcast have lost Frank? Things must be really bad for him over there.

How could Omniture have lost Ben? They didn’t. ESPN won me. There’s a huge difference.

I’m looking forward to writing often here as I begin to explore life as a daily practitioner of analytics. I performed analysis frequently as a Product Manager (and previously) at Adobe—as I hope you’d assume, we do use SiteCatalyst heavily to analyze and optimize SiteCatalyst—but I also spent a lot of time on other things. Fortunately, I’ve been taught well by mentors too numerous to name, and I hope to do them proud.

You can expect plenty of continued involvement in the analytics community, as well. In fact, I hope that I can participate in new and exciting ways, now that I won’t be a “second-class citizen” (as described—correctly, I think—by Jennifer Day on Emer Kirrane’s blog). On this site, I’m hoping to continue to write posts similar to those I’ve been publishing on the Omniture blog since 2009, discussing implementation, analysis, and more, as well as whatever else I decide is worth writing.

So, there you have it. If you’re ever in the Bristol, CT area, please drop me a line.