In 2011, Good Changes Take Time

This is not an Omniture-versus-Google Analytics post. This is not a Google Analytics-bashing post.

This is a post in which I defend a decision that I helped (in some tiny way) to make when I was a Product Manager on the SiteCatalyst team at Adobe.

In 2011, businesses rely heavily on their web analytics data. Analytics may not be where we’d like to see it yet, but it certainly has come a long way over the past decade. And the more critical this data becomes, the more resistant customers will be to uncomfortable change.

SiteCatalyst 15 introduced major changes to the Omniture platform. This brought some great features with it: segmentation, breakdowns on all conversion variables, Processing Rules, etc. But it also introduced change. Specifically, it affected the way that SiteCatalyst determined the number of visits to customers’ sites/apps. In most implementations, it was a minor change, but in some cases it was noticeable in client data. (You can read all about these changes here.)

Because this platform change potentially affected things like year-over-year comparisons and conversion rate, some of our customers weren’t comfortable making the change to the new platform right away. They told us that they wanted some time to understand the new platform and its effects on their business.

As I explained on Eric Peterson’s epic (and awesome) Google+ thread last week:

The feedback that we got on this was that it was significantly painful for many users to have that change occur, but that they acknowledged the improvement. So, if you’re Adobe, do you make the change and alienate some users, or do you hold off and alienate other users?

Alyson Murphy echoed this thought:

Can you really expect Omniture to implement a major paradigm shift without alienating a ton of customers? People love their comparison data. Look at how difficult it is for some companies to shift to SiteCatalyst 15. If a relatively small change compared to what you are suggesting causes that much pain, a huge paradigm shift isn’t going to go over too well with many of their clients.

The day Alyson and I made these comments, Google announced in a blog post that it had also changed the way it calculates Visits. Now, any new traffic source encountered by a user (paid search, natural search, affiliate, social media, etc.) would instantiate a new visit/session. The reaction from customers in the blog comments has been… interesting:

I am see weird stuff bounce rate up 50% time on site down 75% this happened from 11th August. on most visits it count each page viewed as a new visit.

Good Grief, less then a 1% change!
“Based on our research” I would love to know how you conducted this research.

I am seeing 20% increase in visits, I thought I had finally broken free of Panda!

How I am supposed to evaluate these new metrics on steroids vs my previous metrics?

My average time on site has fallen from 7+ minutes to 12 seconds. Each visitor seems to visit the same page 6 times causing my bounce rate to be ridiculous.

I think that this update is an example of someone fixing something that wasn’t broken. Now analytics is useless.

The update makes my data virtually useless. It makes no sense.

Over the weekend, I feel that around. half of my visits are returning visitors, and the same guest may have seen the same item up to 10 times. In return, my bouncerate sky-high.

It is a vital part to have a website to have a reliable analysis program, but GA is certainly not very reliable right now, and in my case the data produced now are completely useless.

I think Google Analytics is a great tool. I use it from time to time on this blog and others, and I like it. This isn’t a complaint about Google Analytics. It is a statement about the way an upgrade which may actually be a very good thing (in terms of helping customers understand their customers and improve conversion) was handled in two different cases. I’m sure someone could explain why that last poster’s average time spent dropped precipitously, and why the new data is more accurate or more actionable.
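To make that concrete, here is a rough sketch of what splitting a visit on every new traffic source could look like. This is my own illustration of the rule described in Google’s announcement, not their actual implementation, and the hit data is made up:

```python
# Rough sketch (not Google's actual implementation): start a new visit whenever
# a visitor arrives from a new traffic source, per the rule described above.
from dataclasses import dataclass

@dataclass
class Hit:
    timestamp: int   # seconds since the first hit
    page: str
    source: str      # e.g. "paid-search", "email", "organic"

def count_visits(hits):
    """Group one visitor's hits into visits, splitting on every source change."""
    visits, current, last_source = [], [], None
    for hit in sorted(hits, key=lambda h: h.timestamp):
        if current and hit.source != last_source:
            visits.append(current)   # traffic source changed: close the visit
            current = []
        current.append(hit)
        last_source = hit.source
    if current:
        visits.append(current)
    return visits

# A single browsing session that touches three sources now counts as three
# shorter visits, so average time on site drops and bounce rate climbs.
hits = [
    Hit(0,   "/landing",  "paid-search"),
    Hit(60,  "/product",  "paid-search"),
    Hit(300, "/product",  "email"),      # clicks back in from an email link
    Hit(420, "/checkout", "organic"),    # then from a search result
]
print(len(count_visits(hits)))  # 3 visits instead of 1
```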

But that’s not the point.

Conclusion? In 2011, you CANNOT just slap changes onto your analytics platform, call it good, and expect businesses to adjust on the fly.

When I joined the Product Management team at Adobe in May 2010, we were in the midst of having this conversation with users. On one hand, it was disappointing to hear that so many felt that their users and their businesses needed time—in some cases, at companies with hundreds of users, a lot of time—to prepare for platform changes that everyone agreed were exciting. On the other, it was great to know that SiteCatalyst was that critical to various business processes even outside of the analytics team. But it’s really hard to explain to an executive why conversion rate suddenly dropped by 5% because your analytics tool changed. That’s what required time.

I’m proud that we listened to these customers and that we both a.) released a product with significant platform improvements and b.) created a system that allowed users to prepare before having these changes dropped on their plates. Is the SiteCatalyst 15 upgrade process perfect? Certainly not. But, as I mentioned above, there are considerations beyond simply the need to prepare for a change in visit calculations, and I know for a fact that Adobe continues to adapt and optimize the upgrade process.

(Also, in all fairness, Omniture has fairly been accused of making changes on the fly in the past. For example, in 2006, a SiteCatalyst point release corrected the way that search engines were identified, updating the platform to use both the referring domain and the referrer query string for improved accuracy. Like I said, this isn’t a tool-versus-tool argument. It’s an observation about the importance of data.)
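As a rough illustration of that kind of detection (the engine list and query parameters below are examples I chose for the sketch, not Omniture’s actual rules):

```python
# Illustrative sketch only: classifying a referrer as a search engine using
# both the referring domain and the search-term query parameter.
from urllib.parse import urlparse, parse_qs

# Hypothetical lookup table; not Omniture's actual engine list.
SEARCH_ENGINES = {
    "www.google.com": "q",
    "search.yahoo.com": "p",
    "www.bing.com": "q",
}

def classify_referrer(referrer_url):
    """Return ('search engine', keyword) when the referring domain is a known
    engine and the query string carries a search term; otherwise ('other', None)."""
    parsed = urlparse(referrer_url)
    param = SEARCH_ENGINES.get(parsed.netloc)
    if param:
        keywords = parse_qs(parsed.query).get(param)
        if keywords:
            return ("search engine", keywords[0])
    return ("other", None)

print(classify_referrer("http://www.google.com/search?q=web+analytics"))
# ('search engine', 'web analytics')
```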

One more thing: Anyone who tells you that only the analyst matters is fooling you. Anyone who tells you that your analytics tool only needs to serve the analyst is living in a dream world. That may have been true in 2005, but that is not how the real world works in 2011. People all over the business need data. Yeah, sometimes it’s just a perfunctory year-over-year visits comparison. Does it improve on-site conversion? Maybe not. But it matters somewhere else, to someone. Probably to someone who can influence the success of analytics in the business. Analysts had few problems with the platform changes that SiteCatalyst 15 brought, but they knew that, in order to succeed and be trusted to help guide the business, their users needed to know what was going on in SiteCatalyst and not to have metrics change unexpectedly.

So, when people say, “How come Omniture hasn’t delivered the kitchen sink yet?” remember that this isn’t a fantasy world where wholesale changes can slide right into businesses painlessly. Google’s platform change proved that, as did the feedback we got from customers at Adobe.

11 comments

  1. Thanks, Ben. You really hit it out of the park on why “Good Changes Take Time.”

    I see it every day in Adobe Insight installations as well. Insight allows customers to completely re-configure and re-cast their data six ways to Sunday, but that doesn’t make it “easy” (organizationally) to do so. My clients face the same battles around expectations that everything stays “the same” at the same time that their peers expect big changes that open up new possibilities for analysis.

    It takes a constant eye toward what “could be”, while slowly pulling people along and keeping the other eye on what needs to remain to provide the baseline and the comfort that the business has come to rely upon.

    • Ben says:

      See, what I’ve done here is explain a problem without even attempting to discuss a solution. I am NOT saying that the SiteCatalyst 15 upgrade plan is the gold standard for platform changes. You have correctly grasped my point, which is that vendors need to account for the fact that organizations have latched onto analytics products and will not be thrilled by platform changes which throw them for a loop and cause their businesses to doubt the trustworthiness of analytics.

      It sounds like that has happened to Insight clients, although perhaps not for the same reasons. It’d be great to hear from some Insight users to help the rest of us understand how they fought (and won) that battle.

  2. Ashley says:

    Good points, Ben.

    I have been lucky in the past to work with executives who were happy provided that we, the analysts, knew what the differences were when comparing data.

    IMO, an analyst really earns their bonus when they look past differences in measurement and focus on what the data is actually telling them. Differences in the way things are measured are merely an inconvenience, another thing to remember when calculating.

    @techalytics

    • Ben says:

      Lucky, indeed! That sounds like the perfect place for an analyst to be. Unfortunately (or fortunately, depending on your perspective), there is a very real problem with the trust of analytics data at many organizations, and seeing unexpected changes in conversion rate or similar metrics only hurts the analytics practice. I’m glad to hear that you haven’t been held back by such problems!

  3. Ben, really excellent post. As digital measurement and analysis matures, the tools used need to match that maturity with stability, so that, as processes are built to make decisions with the data, organizations are not interrupted on their path to incorporating the data into decision making and operations.

    Releases still need to happen, but as Adobe has modeled with the release of SC15, they need to happen responsibly. Those who use the tools to create value will be extremely grateful.

    • Ben says:

      Thanks, Michael. I’m glad you enjoyed it. While neither the SC 15 release nor the GA release is perfect, I think the reaction to both is a great thing for the industry, as each underscores the degree to which organizations have come to rely on data.

  4. Tim Wilson says:

    Great post, Ben. The one slightly different angle that I’ll add — it’s not so much a counterpoint as another perspective on the same challenge — is that changes (which, in the absence of the hassles of the change itself, are improvements) scare people. Having gone through several “tool replacements” at different companies over the years, I’ve lived through “our comparisons will be whacked” hysteria, and I’ve had to mitigate those concerns (typically, by running two tools in parallel for several months to get a handle on the differences under different scenarios). In many cases — SC 15 is a good example — allowing “time to prepare for the change” really equates to more time to wring hands and delay the inevitable (and ultimately positive) switch. There is something to be said for taking the lumps and moving on so that the transition becomes “the not relevant past” rather than the future as quickly as possible.

    With GA, I suspect that we’re seeing major sample bias in the comments. It’s quite possible that a very small number of sites had very odd underlying visitor behavior, and the redefinition of the session brought this to light. If the new data is closer to a “true” representation of reality…it’s a fundamentally good thing! But, it’s a lot easier to simply blame the tool than to step back, read the change that was made, and dig in to try to understand what is going on and why the change had such a dramatic effect on your site.

    Having said that, it seems like Google could have: 1) done a better job of timing the rollout (and communicating the *upcoming* change through the GA interface), and 2) sniffed out sites that would see a dramatic shift and then done some *analysis* to get to the root cause so that their documentation could explain what was happening more effectively.

    • Ben says:

      Great comment, Tim. First, you’re absolutely right that in some cases it’s better to dive in head first and make the change, but I think that with the widely varying businesses using analytics (and their widely varying levels of maturity within analytics), that decision is better made from client to client rather than at the product level, wherever possible. I understand that Google’s model for GA doesn’t make it easy to do that sort of thing. One SiteCatalyst customer rolled out v15 to his company with zero advance warning and trained on the fly, which (I think) is what you’re suggesting in your first paragraph. He had no major problems, despite having (I think) dozens or hundreds of users. Other companies told us in no uncertain terms that a forced upgrade to SiteCatalyst 15, without time and help from Omniture to prepare their organization, would wreak havoc on the trust they’d built up throughout their user base.

      And of course you’re right that we could certainly be seeing some sample bias in the comments. I was surprised, however, not to see anyone (either a Google rep or an analytics practitioner) responding there to defend the changes. The first few comments on the post are positive, but it quickly turns into a solid block of complaints. But, again, this really isn’t about Omniture and Google; as I pointed out, Omniture has certainly done this sort of thing in the past (fortunately, not within the past five years), and it led to complaints back in 2006. The point is that it’s dangerous to roll out major changes to the way data is processed to an entire customer base without giving those users time to consider how it will impact them, their analysis, and their business.

      My own complaint with this post is that I didn’t lay out a roadmap for what practitioners should do in the face of major analytics platform changes. You’re much closer to it in the first paragraph of your comment than I was in my long-winded post. Getting through the transition as quickly as possible is definitely an option, and I bet we could name the characteristics of an organization that would be prime for this kind of a move.

    • Tim, that’s an excellent insight, but ultimately the issue is that Google made this change mid-month with zero warning, was stone silent to the complaints on Twitter and their blog, then (based on comments from the community) appeared to make a change that “fixed” the huge data changes. Based on the lack of comments from certified partners such as ROI Revolution, I assume they were caught off guard also. I’m wondering if this dust-up will cost GA some enterprise-level customers.

    • Tim Wilson says:

      The “no response from Google” is the most frustrating part. Even an acknowledgment of, “Er…we didn’t expect this to happen; we’re looking into it and will let you know as we figure out what is going on,” would have been in order. That silence, in my experience, is SOP for Google. I’ll file that away on my list of things to try to get a better understanding/rationale on at the partner summit next month (having never been to one, I have no idea if I’ll get satisfactory answers there).

      As this has played out, it’s looking like the issue was a QA one — Google rolled out the change…with a bug. The bug got fixed pretty quickly (again, with no communication from Google that they had even recognized it as a bug and were working to fix it). I wonder how this would have played out if it had been a smoother rollout. There still would have been some grumbling, but we’re seeing no discernible change in the macro metrics across our clients. How many people would have even noticed a change and/or been sufficiently tuned in to read the announcement?

      I know, Ben, I’ve dragged it back to being “about Google.” :-) Your fundamental point that changing a platform takes planning, communication, and careful timing is an unassailable truth.

      • Coincidentally, Google made a change to Gmail that stopped the arrow keys from working at about the same time. And, coincidentally, they finally noticed and responded to complaints yesterday. Perhaps everyone rushed out changes late last week before bowling?
