Archive for Analytics

Good GM, Bad GM: Late Bloomers and Draft Prowess

I don’t expect to receive an answer to this question since I know there are only three of you out there reading my blog, but I’m going to ask it anyway.

Let’s say you’re evaluating an NBA GM’s drafting/scouting ability. Should he get credit for picks who ultimately turned into solid players, but did so only after leaving the team that drafted them? Take, for example, Kris Humphries. I know you think he’s overpaid, but don’t forget that after his two seasons with the Jazz, everyone believed he was a total bust. He notched just 0.1 total Win Shares during his first two seasons in the NBA. But fast forward a few seasons and Humphries has tallied a totally respectable 10.7 Win Shares while averaging a double-double over his past two seasons with the Nets. So should Kevin O’Connor get credit for drafting Humphries, a serviceable NBA starter, even though the Utah Jazz never benefited directly from that pick?

This is vaguely similar to questions digital marketers face around multi-touch attribution. If a user arrives at your site by clicking a paid search link in Google but does not purchase, and then a month later arrives at your site by typing your address into his browser and this time he does purchase, should that original paid search click-through get credit? If so, how much? It’s a little different because most NBA players would have been drafted eventually anyway; if Kevin O’Connor hadn’t picked Humphries, someone else would have, and we’d be wondering whether that person deserves credit.
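For the non-marketers: here is a toy sketch of two common attribution models. The channel names and the simple list-of-touches format are just illustrative assumptions; real attribution platforms are far more sophisticated than this.

```python
# A toy sketch of two common attribution models (illustrative only;
# real attribution systems are far more sophisticated).

def last_touch(touches):
    """All credit goes to the final touchpoint before conversion."""
    return {touches[-1]: 1.0}

def linear(touches):
    """Credit is split evenly across every touchpoint."""
    share = 1.0 / len(touches)
    credit = {}
    for channel in touches:
        credit[channel] = credit.get(channel, 0.0) + share
    return credit

# The Humphries-style two-touch journey: paid search first, direct later.
journey = ["paid_search", "direct"]
print(last_touch(journey))  # {'direct': 1.0}
print(linear(journey))      # {'paid_search': 0.5, 'direct': 0.5}
```

Under a last-touch model, Kevin O’Connor (the paid search click) gets nothing; under a linear model, he gets half credit for the eventual “conversion.”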

I can see arguments both ways. A GM who picks a player who only pans out later in his career might have correctly read the player’s potential, and we should reward that GM for his vision. But a GM’s job is to deliver concrete wins to his team via the draft, and a late bloomer does not help his cause. In case anyone is out there reading, leave me a comment: what do you think?

Sloan Sports Analytics Conference: Day Two

As I have been tweeting, blogging, and updating Facebook about the Sloan Sports Analytics Conference, I keep hearing comments like: “I wish I could go, but I will never be able to” and “this is definitely on my bucket list now.” A few things about this. First, at $500 for non-student admission, it’s a bargain; even with hotel and airfare I’ll bet most of you could do this conference for well under $1,500, and it will be a truly memorable experience. So start saving a few dollars a week in a cookie jar today. Second—and I don’t think people realize this—anyone can attend. You don’t need to work for ESPN or an NBA franchise. My sister has a friend who does Account Management for Google, but he loves sports and analytics so he pays his own way to fly out every year from the Bay Area. All are welcome.

The other thing which pleasantly surprised me (I touched on it yesterday) is that you do not need to be a Ph.D. candidate in advanced statistical modeling or econometrics to thoroughly enjoy the conference. There were some sessions that really stretched that area of my brain, and others that were accessible even to non-sports fans, let alone non-academics. So don’t let that scare you away.

Highlights and thoughts from day two:

  • There was a request on Facebook to hear more about Bill James. As the godfather of the Sloan Sports Analytics Conference, he was definitely one of the stars of the show. Unfortunately, I didn’t feel like he shared much new information about himself or his work. Simmons did a live BS Report with James at the end of day one, during which they rehashed his rise from unheralded part-time stats geek to patron saint of sports nerds, but it’s all stuff we read about in Moneyball. Day two featured a “Boxscore Rebooted” panel with James, John Thorn (MLB historian), and John Dewan (Baseball Info Solutions), but most of it was so widely accepted in these circles at this point (“Wins for a pitcher is too arbitrary! The Internet makes analytics easier!”) that I barely took notes. To be honest, it felt really strange to be so underwhelmed by this guy who quite literally invented advanced baseball analytics. In talking to a few other attendees, I got the sense that I’m not the only one. We all have tremendous respect for what Bill James has given the world, but he does not seem to be on the bleeding edge of sports analytics anymore.
  • The conference (at least this year) leaned heavily toward basketball, probably because the millennials who dominated the conference are in a demographic where the NBA is excelling, whereas I believe I heard that the average baseball fan is between 45 and 55 years old.
  • I reviewed three basketball-themed research papers in yesterday’s post, but the best was yet to come. The two best research papers by far (in my mind) dealt with “spatial analytics” in the NBA, meaning the study of where the 10 players and the basketball are placed on the floor (or in the air) when key events occur, as opposed to poring over isolated numbers to obtain insight. Jared Dubin of Hardwood Paroxysm did an excellent job reviewing these two presentations (he even has screen captures), so I won’t go into too much detail.
  • I will add that I thought the rebounding study was not fully mature—it showed a ton of potential to help teams understand how to position players in rebounding situations, but it wasn’t quite there yet. Key Insight: Teams’ offensive rebounding percentage decreases significantly the farther the shooter is from the basket, until you get to the three-point line. Behind the three-point line, offensive rebounds are more common than for mid-range jumpers. Especially considering that neither a mid-range jumper nor a three is likely to generate a lot of free throws, it stands to reason that mid-range jumpers are the least effective shot on the floor (which we’ve all kind of known for some time, but it’s nice to have data to back up the theory).
  • My absolute favorite research paper was Kirk Goldsberry’s creation of the “Range %” number, a statistic which tells us the percentage of spots on the floor where players are effective scorers (defining “effective scorer” as “one point per FGA”). The average NBA player is effective from 17.2% of the 1,284 spots on the floor that Goldsberry measured by breaking the floor into a grid. Even though Tyson Chandler leads the NBA in FG%, he is far below average in Range %, scoring effectively from just 4.3% of the floor (not that anybody thinks Chandler’s high FG% means he is a great shooter). Dubin recaps the top few players in Range %. I was mildly surprised that Steve Nash beat out Ray Allen for the top spot (using data from 2006-2011), but mostly the data confirmed what you would expect in terms of the most comfortable shooters and least comfortable shooters. Key Insight: I can’t express this in terms of a specific recommendation, but Goldsberry’s most immediately applicable contributions are “heat maps” which show exactly where players are effective scorers, as well as where they are less effective but still love to try. If I were a coach, I would buy Goldsberry’s technology (he did have a chance to share his methods with Mark Cuban at the end of the conference, so I assume the Mavs will be employing it shortly) and try to get my defense to force opposing scorers to the spots on the floor where they can be coaxed into shooting despite low effectiveness. Similarly, I would design plays that put my players in the best position to score from spots on the floor where they shoot well. It’s not rocket science, but I believe it would work. Isn’t this the kind of thing that Shane Battier has been doing for years? (And where has he been getting his data? Presumably just from video scouting. Goldsberry’s method is more complete.)
  • Best panel of the entire conference in my mind was Saturday’s “Fanalytics” featuring Bill Simmons, Jonathan Kraft, Tim Brosnan (EVP Business, MLB), Nathan Hubbard (CEO of Ticketmaster), and John Walsh (EVP, ESPN). It was supposed to have Mark Cuban on it as well, which would have been even better, but Cubes was running late. The whole thing was about improving the fan experience through technology. This one deserves sub-bullets:
    • The NFL is improving in every possible metric except for fan attendance. Today’s fan needs to be able to use the Internet on mobile devices (for Twitter, fantasy football, live video of other games, etc.) or they won’t come to the game. Mark Cuban doesn’t want people using their cell phones at NBA games, but the NFL recognizes that its whole fan experience model is different because of the pace of the game (frequent stops and starts) as well as the nature of the NFL (everything is happening all at once, on the same day of the week).
    • Kraft: “We’ve spent $2-3 million [to upgrade WiFi infrastructure at Gillette Stadium] in the last couple of years.” He continued, saying that to allow 70,000 people to stream video over WiFi at Gillette would cost “literally tens of millions of dollars.”
    • Simmons asked whether it would be feasible to charge different prices for tickets not just by section, but by individual seat. For example, one section at an NBA game might stretch from the baseline to almost mid-court, and why are those tickets priced the same? I had wondered this, since obviously the technology to price tickets on a seat-by-seat level exists. The answer is that if a fan sees that the guy next to him has a different face value on his ticket, he is likely to get resentful and angry. So they do it by section and live with the fact that this isn’t fully optimized pricing.
    • On the night of the AFC Championship Game, when Kraft wanted to relive Billy Cundiff’s missed FG, he went not to NFL.com, but to YouTube. This is surprising since the NFL maintains strict media rights, and the video was available on NFL.com. Why YouTube? Because a guy sitting in the end zone had the perfect angle to film the kick sailing wide left, and had uploaded the video. It was the best angle Kraft had seen. The lesson regarding NFL media rights and fan-shot video? “You can’t stop it, so you better start learning how to use it.”
    • Ticketmaster operates both a primary ticket vendor and a secondary market vendor (TicketsNow), so they can use Omniture (nice shout out for my friends) to analyze ticket re-selling and compare with original sales. According to Hubbard, “Technology is showing us that our tickets are worth more than what we’re selling them for.”
  • Weird recurring theme of the conference was presenters’ inability to pronounce player names. The professionals did not have this problem, but the student researchers did. The two most egregious (and there were others) were the old classic “Da-RON” Williams instead of “DARE-in” and the even-less-excusable Kevin “Dur-ONT” instead of “Dur-ANT.” I mean, Kevin Durant is a top-three player. How can you be presenting on the NBA at a conference of sports nerds and not pronounce his name correctly?
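Since I can’t resist: the core of the Range % idea is simple enough to sketch in a few lines of code. The per-spot data format and the minimum-attempts filter below are my own assumptions for illustration, not Goldsberry’s actual dataset or methodology.

```python
# A hypothetical sketch of the "Range %" idea: the share of floor spots
# from which a player scores at least one point per field goal attempt.
# The (points, attempts) per-spot format is an assumption for this sketch.

def range_pct(spot_stats, min_attempts=1):
    """spot_stats: list of (points, attempts) tuples, one per grid cell."""
    measured = [(p, a) for p, a in spot_stats if a >= min_attempts]
    if not measured:
        return 0.0
    # A spot counts as "effective" at one point per FGA or better.
    effective = sum(1 for p, a in measured if p / a >= 1.0)
    return effective / len(measured)

# Tiny fabricated example: four spots, effective from two of them.
spots = [(10, 8), (3, 6), (12, 12), (2, 5)]
print(f"{range_pct(spots):.1%}")  # 50.0%
```

Run that over all 1,284 grid cells for each player and you have a league-wide leaderboard, which is essentially what the paper presented.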

I could keep going, but I need to stop somewhere. Suffice it to say, SSAC was an absolute blast. I can already see myself looking at certain aspects of game action and the sports world at large a little differently, in a good way. As I said to people numerous times during the conference, I will definitely be coming back, even if I have to plunk down my own money to do it. Sports and data, together at last. I think it’s a beautiful thing.

Sloan Sports Analytics Conference: Day One

It’s time for a different sort of analytics conference. The eMetrics festivities may not start until Monday on the west coast, but 2,200 sports dorks gathered in Boston this weekend to talk about sports analytics in the annual MIT Sloan Sports Analytics Conference.

What does this mean? Well, the sports world is full of data. Every dribble in basketball, every pitch in baseball, every snap in football generates new data points that we can analyze to understand the games we love. Last season, the Dallas Mavericks used advanced analysis to determine that their best starting lineup included J.J. Barea. They made the change . . . and won the NBA title. Analytics isn’t just for business anymore. That is what this conference is all about.

I won’t give a travelogue. Instead, some general, brief highlights and observations, from the perspective of a sports fan and digital analyst.

  • I know I just said that the conference is about analyzing the game and the players, but I was surprised at the amount of a.) sports strategy discussion devoid of data, and b.) sports business analytics (e.g., StubHub discussing ticket sales analytics; ESPN, NBC, and others discussing the world of media rights). There really is something for just about everyone.
  • There is a LOT of crossover between digital analytics and sports analytics. Maybe the tools are different, but the principles and challenges are the same. The basketball analytics panel featured a bunch of quotes that could have occurred at eMetrics or Omniture Summit:
    • “There is ‘counting things’ and there is ‘analyzing the things you count.’” -Dean Oliver, ESPN Stats & Info
    • “Statistics do two things as a coach: they allow you to figure stuff out, and they allow you to communicate.” -Jeff Van Gundy, ESPN analyst and former NBA coach
    • “A lot of times in analytics, you don’t want to come out with a single number.” -Oliver
    • Oliver also talked about preparing insights for coaches, and said that he used “very few numbers” in these reports, instead translating everything into words that coaches (i.e., executives) could understand.
  • The people at this conference are crazy smart. 73 professional teams and something like 175 colleges are represented. I couldn’t even follow a lot of the math/statistics in the research papers. Unlike some conferences I’ve attended, I was mentally worn out by the end of the day. Great feeling.
  • This conference is a tremendous value. Admission was less than $500, in exchange for which you get to see the greatest minds in sports debate cutting edge strategy and analytics, and they are all accessible. If you ever wanted to ask ESPN’s John Hollinger a question about NBA analytics, this is the place to do it. People like Bill James wander the halls just like anybody else.
  • Jeff Van Gundy was a revelation today. Everything he touched was comedic gold. We’ve become familiar with his wit during the ESPN NBA broadcasts, but he was in fine form today, tossing out sardonic commentary at every opportunity. Everyone I’ve talked to has agreed that we all need more Jeff Van Gundy in our lives.
  • Just a few sports insights and possible recommendations, if you’ll indulge me. The great thing about sports analytics (for me) is that it’s REALLY easy for me to see the kinds of recommendations you might make based on the data.
    • One study of performance under pressure showed that the home team shoots free throws worse than usual in late-game, high-pressure situations, whereas the away team is unaffected. The reason, they hypothesized, is that the crowd tries to avoid distracting its own team in these situations by getting very quiet, which inadvertently allows the player to focus on the action of shooting, causing them to “overthink” the shooting motion. The away team has fans yelling and jumping during their free throws throughout the game, so there’s no real difference. Recommendation: Fans shouldn’t get silent during home team free throws late in the game.
    • Another study took the concept of plus-minus and broke it out by individual skills, making it possible to see how players impact their teams in very specific ways beyond top-level stats. They also demonstrated that some skills are synergistic, meaning that putting two players who excel in a certain area on the floor together make both players (and other team members) better in that area than they would be otherwise. The whole ends up greater than the sum of its parts. Recommendations: Find synergies and build rotations to maximize plus-minus in key areas. For example, put players who create turnovers on the floor together to get even more bonus turnovers.
    • Finally, a study attempted to show the relationship between experience and playoff success; do teams require experience in order to succeed in the postseason, as is often assumed? The answer was no. Experience does not matter among players. Young teams fare as well in the postseason as experienced teams. However, coaches who have coached in the postseason before perform better in subsequent playoffs. Recommendations: Depending on your team’s situation, consider not overvaluing veteran leadership. Also, look for head coaches who have coached in the postseason (even if they haven’t won titles).
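To make the synergy idea concrete, here’s a rough sketch of how you might flag candidate player pairings from stint data. Everything here—the stint format, the player names, the focus on forced turnovers—is fabricated for illustration; the actual paper used regression-based adjusted plus-minus, which this deliberately does not attempt.

```python
# Rough sketch of hunting for pairwise "synergy" in a skill-specific
# stat, using made-up stint data. Each stint is (players_on_floor,
# team_turnovers_forced). The real study used regression to control
# for teammates and opponents; this just compares raw per-stint rates.

from itertools import combinations
from collections import defaultdict

def pair_rates(stints):
    """Turnovers forced per stint, for each pair of players on the floor together."""
    totals = defaultdict(lambda: [0, 0])  # pair -> [turnovers, stints]
    for players, turnovers in stints:
        for pair in combinations(sorted(players), 2):
            totals[pair][0] += turnovers
            totals[pair][1] += 1
    return {pair: t / n for pair, (t, n) in totals.items()}

stints = [
    ({"A", "B", "C"}, 4),
    ({"A", "B", "D"}, 5),
    ({"A", "C", "D"}, 2),
    ({"B", "C", "D"}, 2),
]
rates = pair_rates(stints)
# The A+B pairing forces turnovers at a higher rate than any other pair,
# hinting at a possible synergy worth a closer, regression-based look.
print(max(rates, key=rates.get))  # ('A', 'B')
```

A pairing that consistently outperforms the sum of its parts in a specific skill is exactly the kind of rotation-building insight the researchers were after.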

It was a very full day, but tomorrow looks great as well. Time for bed so I can fill my brain with more sports analytics tomorrow.

Ten Things Your Vendor Wishes You Did Better

Before joining ESPN, I spent a little over five years at Adobe (formerly Omniture, prior to the 2009 acquisition) as an enterprise software vendor. While I had four different titles during those five years, the overall theme of my time there was “customer relations,” as I moved from technical support to community management to product management. I gave a lot of thought to the different kinds of customers I worked with. What made some companies really successful in their interactions with us (and likely with other vendors), while others—using the same products/solutions—struggled to get value out of the relationship?

Anyway, I was honored to be invited to present at Web Analytics Demystified’s first ACCELERATE conference in San Francisco last month. The WAD team thought it might be nice to have me present on ten ways to get more value out of vendor relationships, and I tweaked that slightly, giving it the title “Ten Things Your Vendor Wishes You Did Better.” It was intended to be slightly edgy—hopefully we all realize that there are no perfect software vendors, but I wanted to be clear there are also no perfect clients, and (despite what some conversation out there may suggest) vendors are not actually evil, two-headed, fire-breathing monsters. My goal was to give listeners some points that they could use as they work with their vendors so that both sides of that relationship can benefit and improve together as partners. (As I said when I got to the podium, I am the Dr. Phil of ACCELERATE.)

So here are the ten things that your vendor wishes you did better, with some articulation on each. These are in no particular order, and while they were written to apply to digital analytics (and that is how I will speak of them), they likely carry weight in other types of enterprise software as well.

1. Teach your internal users that not all problems are the vendor’s fault. When I got to ESPN, we had an intern who had just finished doing a survey and holding focus groups to help us understand some of the problems that our internal analytics customers were facing in working with our software solution of choice. One of the themes that emerged was that they were having a hard time finding the reports they needed.

It would have been easy for us to lay that on the vendor and complain about the non-intuitiveness of their interface. But my VP wisely pointed out that our vendor gave us the ability to customize the UI nearly five years ago; we simply hadn’t taken advantage of that feature. So is that on them? Maybe partially. But it’s also partially on us, and if we make that clear to our users, they will be able to maintain a bit more trust and confidence in the tools that we are providing to them.

Similarly, if your analytics implementation is three years old, don’t immediately complain that your vendor “doesn’t understand your new business initiatives.” They are there to support you, but ultimately your company is responsible for updating your implementation as your strategy changes.

Take some responsibility and hold yourself partially accountable for implementing, maintaining, and supporting the vendor solution. Everyone will look better in the long run if you do so.

2. Don’t be “the client who cried wolf.” A few years ago I worked with a customer for whom every issue was more than critical—it was (they claimed) cause for termination. We ended up in a cycle where they would call in and demand an immediate resolution “or else.” That “or else” also included the promise to publicly embarrass us using social media or any other means available to them.

It is neither wise nor responsible for your vendor to react to every product complaint or feature request in the same way. In other words, if everything is nuclear, nothing is nuclear. And if you keep insisting that everything is nuclear, it becomes difficult for the vendor to discern which issues really are mission-critical, and which are merely important. As a result—somewhat paradoxically, it might seem—you actually risk getting worse service. In the case I just mentioned, I believe we maintained a high level of support, but we certainly did not shift around resources every time we got a phone call from this group. If you act like a bully, expect to be treated like a bully. It’s the only way for vendors to maintain some semblance of order and progress.

There are certainly scenarios where a relationship has deteriorated to the point where it might be time to look elsewhere. Maybe you’re not seeing the value you expect over a long period of time. Just make sure that you don’t play that card unless you really mean it. And if that remains your ace in the hole, it will carry some real weight when you play it, and your vendors will do whatever they can do to see you through to resolution.

3. Share your business with them. I spoke with one Account Manager who said that this was the most important thing that her clients do to help her serve them. A few years ago she was working with a food delivery business. In the course of her conversations with that team, they had explained what they do, and why they are passionate about their brand and their product. They had even introduced her to their product catalog.

Your ideal account team on the vendor side—sales, account management, consulting, support, etc.—functions almost as an extension of your own team, right? They are more than just representatives—the people you call when something is broken or when you need to purchase a new product/feature. What better way is there to create an extension of your team than to share with them what you’re trying to do? Help them feel like a part of the team and they will do just that. So tell them what’s going on at your company. Tell them why what you’re doing is neat and how you’re changing the world.

When this Account Manager saw her client’s product catalog, she actually became a customer by purchasing some food for delivery from them. How is that for closing the loop? Now not only does she know what their business goals are, but she has been through their whole site experience, and she is uniquely qualified to offer strategic and tactical advice. That makes her happy, and it certainly helps her client get more out of the relationship.

4. Use the right resource for the question. What would happen if your vendor salesperson sent an important and time-sensitive question about your contract to your Network Operations team instead of to you? There are a few possible outcomes:

  1. Network Operations fumbles around trying to figure out who is the right person to answer the question. The e-mail finally makes it to you, but precious time is wasted.
  2. Network Operations wastes a bunch of their own time and resources to answer the question themselves, when you may have known the answer off the top of your head.
  3. Network Operations assumes they know the answer and fires off a quick (but incorrect) response, creating an awkward situation months later.

To some extent, the same outcomes are possible when you send one of your vendor’s product managers a question that is really ideal for their support team, or when you submit a request for detailed consultation to your account manager instead of to your consultant. It is likely that your account team has specialized roles that are uniquely suited to serve your varying needs. Talk to your Account Manager and find out what resources are available to you all across their company. You can always rely on your Account Manager as a point person, but take advantage of access to people whose roles are uniquely designed to match your differing needs.

There certainly is a time to call product management as opposed to support or sales, but if you can learn whom to contact in various scenarios, you will get faster, more expert care and you will find yourself much less frustrated with the process of getting the answers you so desperately seek.

5. Participate in the community. I know I get a lot of value out of participating in my vendor’s community, but how does it help the vendor serve me?

First, understand that by “community” in this case I am not referring solely to Twitter (although that may be a large part of it). This includes opportunities such as beta testing, customer advisory board participation, and more. When I was at Adobe we launched the Idea Exchange, which allowed customers to share their product enhancement ideas with one another. Not only did that give us, as a vendor, some phenomenal ideas (many of which we implemented), but customers frequently solved each other’s problems.

And that’s my point: even if all you do is sit on the sidelines and listen—you watch the tweets or the ideas or the forum posts—you will be exposed to some of the forward-thinking solutions that other customers are creating. You will be hip to all of the latest news coming out of your vendor’s headquarters. It’s also a great way to establish relationships outside of your account team. When I operated the @OmnitureCare account on Twitter, I was never part of anyone’s dedicated account team, but the relationships I established through the community allowed me to serve customers efficiently and, often, publicly—to the benefit of casual observers who may have had similar questions.

6. Don’t assume telepathy. My colleague at ESPN once told me, “People think that our vendor should be able to read our minds without us having to tell them what we expect and when we expect it.” What a recipe for disaster!

You hate when people do this to you:

  • “Hey Ben, can you tell me whether our campaign was successful?”
  • “How do our metrics look?”
  • “Did our change to the home page lead to more engagement?”

These are vague questions that are tremendously frustrating to serve. I know that a ton of analysts out there get these questions frequently from their colleagues, because I see the complaints on Twitter. Why do we think that our vendors are any better at reading our minds?

Be clear. You don’t just want an executive dashboard; you need one that has a specific set of KPIs and addresses specific business requirements, and perhaps it needs to look a certain way. Oh, and you need it by a specific date so that you can show it off at your next quarterly department meeting. Even this is too vague, but hopefully you get the point. Hopefully you have smart, talented vendor representatives working with you, but (thankfully?) they are not Miss Cleo. So be as clear with them as you wish people were with you, and they will amaze you a lot more frequently.

7. Hold that periodic call and make it count. This happened all the time: I would field an angry (sometimes borderline violent) complaint from a customer via Twitter. I would talk to the Account Manager to ask what was going on with this client, and he or she would (often with a wistful sigh) respond with some variation of:

  • “We have a weekly call scheduled, but he never shows up.”
  • “I want to meet with them, but they won’t accept my invites, e-mail, or phone calls.”

Frankly, as a vendor, I had a hard time mustering up sympathy for an unhappy client who refused to get on the phone with the account team to discuss the relationship every so often. I hope that your vendors are offering some sort of regularly scheduled “check up,” be it weekly, monthly, or even quarterly. More importantly, I hope you are actually engaging with your vendor to make sure that you are both on the same page. If you need to yell at them during the call, so be it. Just make sure you’re having it. Anything is far, far better than nothing.

In an ideal world, this phone call is more than just a discussion about your complaints, product bugs, feature demands, etc. It’s a great opportunity to get your account team involved. Hold them accountable to provide strategic guidance in your use of their products and solutions. Tell them three items of priority to your business, and ask them how this relationship is going to help you achieve those goals during the coming quarter.

I worry that many of the clients I mentioned above—the ones who wouldn’t bother with that scheduled phone call—got in the habit of skipping the call because there were, at some point, no major complaints to address. If there are no showstoppers, why spend the time, right? Wrong. This is your time to really move forward. In most cases, the act of resolving support tickets is not going to change your business as quickly as huddling on your strategy is going to change your business.

8. Understand the reality of resource limitations. This is a true story. Once I was at a basketball game and I tweeted a photo of the court from my seats. I got a response a few minutes later from a client who said—and I know he was mostly kidding because there was a smiley emoticon attached to the tweet—”How dare you attend a basketball game while my bug, #XXXXX, is not resolved!” He may have been mostly kidding, but he was also partially serious.

It would be great if vendors could build everything that we need built, and fix everything that we need fixed, and do it yesterday. Think 24/7 support, engineering, network, and product staffs spanning the globe and just building, building, building. But it isn’t so, and will never be so.

The reality is that vendors are tasked with looking out over their customer base and their market/industry and determining what needs to get done, and in what order it needs to happen. Sometimes this will be pleasing to you; sometimes it won’t. You will not love every product or feature that they release. That’s okay. Part of their job is to balance your voice with their own interpretation of the future.

It took me a full year in product management to realize that if vendors always build exactly what their customers are demanding, those customers will never be happy.* The vendor will always end up a step behind because things change so rapidly. We, as customers, ask for the things that will help us right now, not necessarily the things we are going to need in a year or two or three.

So do not be upset if a product manager goes to a basketball game. Understand that the world is more complex than the bug you reported, and that the vendor has its own vision and strategy that will hopefully impress you in the long run—even if there are sometimes point releases that make you yawn with boredom.

(* – Note that Cuban is going for high shock factor in the title of his blog post. In reality, listening to customers is critical, and is something that the best vendors do really well. My point is simply that there are also other sources of insight, vision, and innovation which help your vendor make the best decisions.)

9. Prioritize everything. And I do mean everything: support tickets, outstanding questions, bugs, feature requests, consulting projects, and more.

This is about expectations. Help your vendor understand what matters most to your business at any given time. This is critical because the list may change frequently. Something that mattered greatly two months ago may have been superseded by a new challenge. Don’t let your vendor go on thinking that the old assignment is still priority number one if it’s not.

This is actually a fantastic opportunity to use data. When I was preparing to move into product management, I asked a developer friend (not with my company) what successful product managers do to help developers. He said, “They bring data. If you can show that Feature X will generate $5 million in profit, or that 80% of customers are demanding Feature Y, it gives our work clear purpose.”

Applied to your interactions with vendors, consider how much easier it is for your account team to understand the priority of a bug fix if they know that you have $400,000 in revenue riding on it. That kind of data helps you prioritize your work, and it helps your vendor understand how much their services matter. That makes it much easier for a developer, support agent, or consultant to justify spending an extra few hours getting the issue resolved.

Where I have seen customers prioritize their needs, it has improved their level of service immensely. Account teams and clients are always on the same page, with both groups understanding what needs to be addressed first, and why.
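If you want to go beyond gut feel, the scoring itself can be as simple as a spreadsheet formula. Here is a minimal sketch in Python; the field names, weights, and issue IDs are entirely hypothetical, and any real model would plug in your own business’s numbers:

```python
# Hypothetical issue-tracker entries. Fields and weights are illustrative,
# not from any real vendor's system.
issues = [
    {"id": "BUG-101", "revenue_at_risk": 400_000, "users_affected": 12, "age_days": 30},
    {"id": "FR-7",    "revenue_at_risk": 50_000,  "users_affected": 80, "age_days": 90},
    {"id": "BUG-214", "revenue_at_risk": 0,       "users_affected": 3,  "age_days": 5},
]

def priority_score(issue):
    """Blend business impact, reach, and staleness into a single number."""
    return (issue["revenue_at_risk"] / 1000   # dollars carry the most weight
            + issue["users_affected"] * 2     # reach matters too
            + issue["age_days"])              # old items slowly float up

# Highest score first: this is the ordered list you hand your vendor.
ranked = sorted(issues, key=priority_score, reverse=True)
print([i["id"] for i in ranked])  # ['BUG-101', 'FR-7', 'BUG-214']
```

The exact weights matter far less than the act of writing them down: once the formula is explicit, you and your account team can argue about the inputs instead of talking past each other.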

10. Ask! This little list could easily be twice as long, but at 2,685 words and counting, I’ll stop on this point. Every company is different, every vendor is different, and every account team is different, so ask what you can do to help them serve you more effectively. I honestly do not know a single Account Manager who would not appreciate the opportunity to have this conversation, and to discuss how both vendor and customer can get more out of the relationship.

I have seen a number of customers who, either intentionally or serendipitously, have applied some of the suggestions that I have discussed here. Are there still hiccups? Certainly. I said up front that no vendor is perfect! But I believe that these customers also understand partnership, and the concept of increasing returns.

So give it a shot; help your vendor help you. Simple concept, worth its weight in ROI.

All photos © ShutterStock

Is Data Ruining Sports?

Who would you rather have: Tris Speaker or Ty Cobb?

Jason Whitlock says that this question cannot be discussed; it can only be answered, thanks to the popularity of the book-turned-movie Moneyball and sabermetrics, the advanced statistics that baseball fans and writers can now use as a lens through which to understand and contextualize the game. (Cobb had a better career OPS+, 168 to 157, so I guess he was better.)
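For readers unfamiliar with the stat, OPS+ is easy to sketch: on-base percentage and slugging percentage, each normalized to the league average, with 100 representing a league-average hitter. The numbers below are made up for illustration (they are not Cobb’s or Speaker’s actual lines), and the real statistic also adjusts for ballpark, which this sketch omits:

```python
def ops_plus(obp, slg, lg_obp, lg_slg):
    """Rough OPS+: player's OBP and SLG each divided by the league
    average, summed, minus 1, times 100. 100 = league-average hitter.
    Real OPS+ also includes a park adjustment, omitted here."""
    return round(100 * (obp / lg_obp + slg / lg_slg - 1))

# Illustrative numbers only: a hitter well above a .340/.400 league line.
print(ops_plus(0.433, 0.512, 0.340, 0.400))  # 155
```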

[someecards image: “Let’s see a movie about a baseball genius who leads his team to winning one playoff series in 14 years”]

Whitlock argues that data is sapping the fun out of sports. Little Timmy can’t enjoy the game of basketball anymore because nothing is left open to interpretation; there is a “right answer” to every question. Kobe versus MJ. Wilt versus Russell. Jason Whitlock believes these debates aren’t even worth having anymore; some pencil-necked geek will inevitably come up with an empirically correct answer.

The problem with Whitlock’s argument is that it absolutely cannot be proven without resorting to data. On what basis does he believe that sports are being ruined for fans? What led him to this conclusion, other than his own personal distaste for advanced statistical measures? Here is some data to suggest that Jason is wrong.

If fans can’t enjoy sports anymore, because of data, how come ESPN keeps seeing excellent ratings for football, baseball, and basketball? When the Yankees played the Red Sox on August 7, it was the most viewed baseball broadcast on ESPN since 2007. The Patriots-Dolphins game on Monday Night Football last week “delivered a 10.3 overnight rating, the second highest opening-game rating since ESPN started airing MNF in 2006.” Why are people watching instead of just watching the players’ statistics change in real time, since data has ruined sports?

If fans can’t enjoy sports anymore, because of data, how come attendance in the NBA has not slipped? It has stayed essentially level—around 21 million, near the cumulative total max capacity of all NBA arenas for 41 home games per team—since at least 2004, which is as far back as my NBA attendance data goes. Mr. Thompson and I did some rudimentary analysis of trends in NBA data. Fans aren’t staying home, and they’re not why the NBA is locked out. They’re having fun and enjoying the beauty and the drama of sports. (Data cannot tell you with certainty whether Kobe is going to hit that fadeaway at the buzzer to beat the Spurs; it can only tell you what the odds are.)

If fans can’t enjoy sports anymore, because of data, why is Major League Baseball reporting revenue increases year over year? As MLB reported after last season, the past seven seasons (2004-2010 inclusive) “are the seven best attended seasons in MLB history.” This coincides with the Moneyball era nicely, as the book was published in 2003. MLB revenue in 2010 approached $7 billion for the first time, putting it at around a 6% increase over 2009.

See, in order to prove that Moneyball and Sabermetrics have ruined sports, you’d need to show the world that they are having some sort of quantifiable negative effect. Jason absolutely cannot do that. His argument boils down to fear of needing to defend a position with more intelligence than “well, I just like Kobe Bryant better than Michael Jordan.” The reason people hate data, in sports just as in business, is that it raises the level of conversation and forces them to think more critically about the world.

Jason says we like data because we lack the ability to understand sports viscerally or strategically. I’m not sure what he means. (Oh no, I’m so buried in my data that I can’t tell what defense the Patriots are playing! Is it the 4-3 or the 3-4? Is that called a “blitz?” I can’t tell because, you know, I’m too nerdy to understand football.) This argument is ridiculous. It’s the same thing we hear in analytics from certain dyed-in-the-wool creatives who feel that data is an insufficient way to understand their “art.”

He says, “I saw Player X, and I know he was good, so therefore he’s good.” I’m afraid that’s unrealistic, Jason. See, you have biases. There are things you prefer in players, but that others might not. Errors or flaws you might not see, but that others do. You might see Brett Favre’s greatest game but miss his 20 game-ending interceptions because you were out getting coffee. This is even more likely to be true if your teams or your favorite players (or, if you’re in marketing or UXD, your favorite content/layout/design) are involved. You need data in order to look at the world on an even plane. You think that’s where the fun ends. I’d say that’s where the fun begins.

Let’s go back to Speaker and Cobb. Their advanced statistics are remarkably similar. You could legitimately make a case for either one. Sure, Jason; I suppose a nerd could come to you and say that Cobb had a higher OPS+, and that therefore there is no argument to be made for Speaker. Baseball fans don’t think that way. Speaker won four World Series with the Boston Red Sox; Cobb never won a World Series. Cobb was a terrible leader—perhaps the worst in sports history. His teammates utterly despised him. Yet, baseball fans are far more likely to know Ty Cobb. He was one of the first five inductees in the baseball Hall of Fame, and one could legitimately argue that he was the greatest natural hitter of all time. He is one of two players to accumulate more than 4,000 hits over his career. Despite all of this, there is a strong argument to be made that Speaker, even with his lower OPS+, would be a better player to build around. There is plenty about sports that cannot be quantified, Jason. Data just makes us think a little bit more about the nuances of the games we love.

Here’s another, more current example: Justin Verlander of the Detroit Tigers. A whole bunch of people believe that Verlander, by far the best statistical starting pitcher in baseball in 2011, should be the American League Most Valuable Player. Verlander has a Wins Above Replacement (WAR) of 8.5, which means that if you were to replace Verlander with a freely available replacement-level pitcher, the Tigers would have won roughly 8.5 fewer games. That is a massive number of wins to attribute to a single player. It’s the best in the league. If you define “value” in baseball as “wins,” you can definitely see how Verlander might be the MVP. But there are plenty of intelligent, knowledgeable sabermetricians (myself included) who would accept the argument that the MVP should be someone who is on the field every day, playing in nearly every game (whereas starting pitchers only see action every fifth game). It’s a topic of conversation and debate on sports radio regularly. Fans love discussing it, even fans like me who know how statistically dominant Verlander has been. Where would the Yankees be without Curtis Granderson this season? Or the Red Sox without Jacoby Ellsbury at the top of their lineup? You can make a legitimate case for any of these players, each of whom (surprise!) excels in various statistical categories. Could it be that there is more to the MVP race than pure statistics? But Jason, I thought you said that there were no discussions allowed anymore!
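For the curious, the arithmetic behind a pitcher’s WAR can be sketched in a few lines: runs saved versus a replacement-level pitcher over the same innings, converted to wins. This is strictly a back-of-the-envelope illustration with made-up run totals, not the official WAR formula, which also adjusts for park, league, and defense:

```python
RUNS_PER_WIN = 10  # common rule of thumb; real WAR derives this per season

def pitcher_war(innings, runs_allowed, replacement_ra9):
    """Back-of-the-envelope WAR: runs saved versus a replacement-level
    pitcher (given as runs allowed per 9 innings) over the same innings,
    divided by the runs-to-wins conversion. Not any official formula."""
    replacement_runs = replacement_ra9 * innings / 9
    return (replacement_runs - runs_allowed) / RUNS_PER_WIN

# Illustrative: 250 innings of 73 runs allowed versus a 5.7 RA/9
# replacement level works out to roughly 8.5 wins.
print(round(pitcher_war(250, 73, 5.7), 1))  # 8.5
```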

I think Jason Whitlock is scared. He is scared that Hall-of-Fame voting in professional sports will someday be reduced to plugging numbers into a computer and seeing who the best players were. (This would guarantee someone like Todd Helton a spot in Cooperstown.) I don’t think anyone, even the great Bill James, would advocate such a hard-line stance. Eric Peterson made this point earlier this month, and I think it was prescient of him to make the distinction, since we’re going to be hearing these anti-data arguments more often as data usage grows, in both sports and business: we like to be data-informed, not data-driven. It’s important for me to know that Ryan Howard’s numbers don’t justify his massive contract, but that doesn’t mean I wouldn’t want him clubbing home runs for my team. (Perhaps that’s the difference between me and the seemingly data-driven Billy Beane, who still hasn’t won the last game of the season.) When I have data to help me understand what I’m seeing, I can put things into context. I can “relationalize” teams, players, and individual plays in new and exciting ways. Yes, Jason, data helps me and many others enjoy sports more than if it were completely up to our eyes.

I think Jason is also scared that he can’t articulate why he loves John Elway other than “I like him.” I’m not sure why you’re scared, Jason. It seems easy to argue that, even though Peyton Manning has eight more points of career completion percentage, and Tom Brady has a better postseason record, and Dan Marino has more yards, your boy Elway was a winner. He was a better leader than most of those quarterbacks, numbers be damned. Leadership matters, Jason, and it’s not quantifiable, so you’ve got your argument. You’ve got your discussion. And that doesn’t even touch on the physical aspects of Elway’s game that made him special (such as his arm). I could counter by talking about Tom Brady’s decision making, which also isn’t a statistic. (It isn’t just completing the pass that matters; it’s completing the best pass to the best possible target. This will never show on the stat sheet.) At that level—the Montana/Young/Marino/Manning/Brady/Elway level—you’re splitting hairs anyway. We nerds can say, “objectively, so-and-so is the best of all time.” You’re welcome to make a point that isn’t accounted for in the numbers. I don’t see how that should impact your enjoyment of the game or of discussions about the game, other than to make you think.

Maybe you don’t want to be forced to think. If that’s the case. . . tell me, who is ruining the vibrant discussion of sports, you or me?

In 2011, Good Changes Take Time

This is not an Omniture-versus-Google Analytics post. This is not a Google Analytics-bashing post.

This is a post in which I defend a decision that I helped (in some tiny way) to make when I was a Product Manager on the SiteCatalyst team at Adobe.

In 2011, businesses rely heavily on their web analytics data. Analytics may not be where we’d like to see it yet, but it certainly has come a long way over the past decade. And the more critical this data becomes, the more resistant customers will be to uncomfortable change.

SiteCatalyst 15 introduced major changes to the Omniture platform. This brought some great features with it: segmentation, breakdowns on all conversion variables, Processing Rules, etc. But it also introduced change. Specifically, it affected the way that SiteCatalyst determined the number of visits to customers’ sites/apps. In most implementations, it was a minor change, but in some cases it was noticeable in client data. (You can read all about these changes here.)

Because this platform change potentially affected things like year-over-year comparisons and conversion rate, some of our customers weren’t comfortable making the change to the new platform right away. They told us that they wanted some time to understand the new platform and its effects on their business.

As I explained on Eric Peterson’s epic (and awesome) Google+ thread last week:

The feedback that we got on this was that it was significantly painful for many users to have that change occur, but that they acknowledged the improvement. So, if you’re Adobe, do you make the change and alienate some users, or do you hold off and alienate other users?

Alyson Murphy echoed this thought:

Can you really expect Omniture to implement a major paradigm shift without alienating a ton of customers? People love their comparison data. Look at how difficult it is for some companies to shift to SiteCatalyst 15. If a relatively small change compared to what you are suggesting causes that much pain, a huge paradigm shift isn’t going to go over too well with many of their clients.

The day Alyson and I made these comments, Google announced in a blog post that it had also changed the way it calculates Visits. Now, any new traffic source encountered by a user (paid search, natural search, affiliate, social media, etc.) would instantiate a new visit/session. The reaction from customers in the blog comments has been. . . interesting:

I am see weird stuff bounce rate up 50% time on site down 75% this happened from 11th August. on most visits it count each page viewed as a new visit.

Good Grief, less then a 1% change!
“Based on our research” I would love to know how you conducted this research.

I am seeing 20% increase in visits, I thought I had finally broken free of Panda!

How I am supposed to evaluate these new metrics on steroids vs my previous metrics?

My average time on site has fallen from 7+ minutes to 12 seconds. Each visitor seems to visit the same page 6 times causing my bounce rate to be ridiculous.

I think that this update is an example of someone fixing something that wasn’t broken. Now analytics is useless.

The update makes my data virtually useless. It makes no sense.

Over the weekend, I feel that around. half of my visits are returning visitors, and the same guest may have seen the same item up to 10 times. In return, my bouncerate sky-high.

It is a vital part to have a website to have a reliable analysis program, but GA is certainly not very reliable right now, and in my case the data produced now are completely useless.
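Mechanically, the new rule Google described is simple to sketch: a hit starts a new session after 30 minutes of inactivity, or whenever it arrives with a different traffic source than the current session began with. The code below is my own hypothetical illustration of that rule, not Google’s actual implementation (which, among other things, treats direct traffic specially):

```python
from datetime import datetime, timedelta

SESSION_TIMEOUT = timedelta(minutes=30)

def sessionize(hits):
    """Assign session numbers to time-ordered (timestamp, source) hits.
    A new session starts on >30 minutes of inactivity OR, per the new
    rule, whenever the traffic source changes mid-visit."""
    session_ids = []
    current_id = 0
    last_time = None
    current_source = None
    for ts, source in hits:
        if (last_time is None
                or ts - last_time > SESSION_TIMEOUT
                or source != current_source):
            current_id += 1
            current_source = source
        session_ids.append(current_id)
        last_time = ts
    return session_ids

t0 = datetime(2011, 8, 11, 12, 0)
hits = [
    (t0,                         "google/organic"),
    (t0 + timedelta(minutes=5),  "google/organic"),  # same session
    (t0 + timedelta(minutes=10), "google/cpc"),      # new source -> new session
    (t0 + timedelta(minutes=50), "google/cpc"),      # timeout -> new session
]
print(sessionize(hits))  # [1, 1, 2, 3]
```

Under the old behavior the same four hits would have been one or two sessions; under the new rule they are three, which is exactly why visit counts jumped and per-visit metrics like time on site and bounce rate shifted overnight.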

I think Google Analytics is a great tool. I use it. I use it from time to time on this blog and others, and I like it. This isn’t a complaint about Google Analytics. It is a statement about the way an upgrade which may actually be a very good thing (in terms of helping customers understand their customers and improve conversion) was handled in two different cases. I’m sure someone could explain why that last poster’s average time spent dropped precipitously, and why the new data is more accurate or more actionable.

But that’s not the point.

Conclusion? In 2011, you CANNOT just slap platform changes into your analytics platform, call it good, and expect businesses to adjust on the fly.

When I joined the Product Management team at Adobe in May 2010, we were in the midst of having this conversation with users. On one hand, it was disappointing to hear that so many felt that their users and their businesses needed time—in some cases, at companies with hundreds of users, a lot of time—to prepare for platform changes that everyone agreed were exciting. On the other, it was great to know that SiteCatalyst was that critical to various business processes even outside of the analytics team. But it’s really hard to explain to an executive why conversion rate suddenly dropped by 5% because your analytics tool changed. That’s what required time.

I’m proud that we listened to these customers and that we both a.) released a product with significant platform improvements and b.) created a system that allowed users to prepare before having these changes dropped on their plates. Is the SiteCatalyst 15 upgrade process perfect? Certainly not. But, as I mentioned above, there are considerations beyond simply the need to prepare for a change in visit calculations, and I know for a fact that Adobe continues to adapt and optimize the upgrade process.

(Also, in all fairness, Omniture has been fairly accused of making changes on the fly in the past. For example, in 2006, a SiteCatalyst point release corrected the way that search engines were identified, updating the platform to use both the referring domain and the referrer query string for improved accuracy. Like I said, this isn’t a tool-versus-tool argument. It’s an observation about the importance of data.)

One more thing: Anyone who tells you that only the analyst matters is fooling you. Anyone who tells you that your analytics tool only needs to serve the analyst is living in a dream world. That may have been true in 2005, but that is not how the real world works in 2011. People all over the business need data. Yeah, sometimes it’s just a perfunctory year-over-year visits comparison. Does it improve on-site conversion? Maybe not. But it matters somewhere else, to someone. Probably to someone who can influence the success of analytics in the business. Analysts had few problems with the platform changes that SiteCatalyst 15 brought, but they knew that, in order to succeed and to be trusted to help guide the business, their users needed to know what was going on in SiteCatalyst and not have metrics changing unexpectedly.

So, when people say, “How come Omniture hasn’t delivered the kitchen sink yet?” remember that this isn’t a fantasy world where wholesale changes can slide right into businesses painlessly. Google’s platform change proved that, as did the feedback we got from customers at Adobe.