I Hate Which Test Won?

Disappointment Sucks

I recently had the privilege of speaking at this year's Velocity Conference in Santa Clara. My presentation was on applying Conversion Rate Optimization to your entire organization.

One of the points was focusing on process, not results. (I’ll discuss that in a future post.) An example I used was a recent post by Which Test Won that shared how a site got a 972% increase in engagement by making a form longer. Reasonable hypothesis, IMO.

My point was simple: unrealistic expectations create disappointment. As William Shakespeare put it, “Expectation is the root of all heartache.”

Here’s that section of the talk in a nutshell:


I love Which Test Won. It’s a great place to get ideas and to see what others have tried, both the successes and the failures. I highly recommend it! But when someone posts that they got a 972% increase in engagement by increasing the length of a form, that is only going to produce disappointment.

Imagine your boss reading those results, and then later in the afternoon you report to her that you just completed a round of tests and got a very respectable 7.8% increase in conversions over a 30-day period with over 25,000 conversions tested. Her reaction is not going to be great. Your success will feel like a failure. You’re going to feel like you got crapped on, when the company should throw a raging house party for YOU, at YOUR NEW HOUSE they just bought you! “Surprise!” That won’t happen with unrealistic expectations. (OK, it won’t happen at all. But you get the idea.)

Velocity Conference Jon Correll Presentation

It’s a simple point. If you’re tasked to make changes and run tests, and then show those results to people with unrealistic expectations, you’re not going to be successful, as failure is part of the process. You need to embrace failure and love learning.
I’m not even going to go on a rant about these conversion rate “increases” that really speak more to statistical significance than to the true relative increase. If I did rant about that, I would have to start the post by explaining why I’m a hypocrite for discussing it, as I have done the same.

Back to the “I Hate Which Test Won?” story…

(the day after my presentation)

Random Dude: So, you hate Which Test Won?

Me: Ummm… I’m sorry? I hate Which Test Won?

RD: Yes. You hate Which Test Won… Your talk…

Me: No. No. No!! That’s not what I meant. Is that what you got out of it? Man, I’m worse at speaking than I thought.

RD: Well, it’s clear you did not like the example they shared, and that you don’t like people parading stats around without full disclosure.

Me: Ok. Yes, that’s true. But I started that rant with “I love Which Test Won and recommend it.” Didn’t you hear that? And it wasn’t just aimed at WTW.

RD: Yea, I just assumed they were a sponsor and you didn’t want to offend them.

I continued to explain in detail how those of us in the industry (myself included) damage the overall process of conversion rate optimization by sharing high-level success stories.

I’ll summarize the problems I see with sharing high-level stats alongside successful tests.

1. Top line numbers mean little, without detail

Sharing top-line numbers like relative conversion rate “increases” without the complete, detailed data tells very little of the story.

One example I’ve used in the past, which is STILL on a conversion rate services site (just checked), shows screenshots where a site got a 71.3% increase over the baseline. (Changed slightly so as not to “out” them.) Take a guess how many conversions this was based on… No, seriously, take a guess. 10,000? 5,837? Please tell me it’s at least 1,000 or so? Nope. A grand total of less than 100 conversions. That ain’t right.
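To make that concrete, here’s a rough sketch using made-up numbers in the same ballpark (not the actual test’s data): a ~71% relative lift built on fewer than 100 total conversions comes with confidence intervals so wide that the headline number is nearly meaningless.

```python
import math

def conversion_ci(conversions, visitors, z=1.96):
    """95% normal-approximation confidence interval for a conversion rate."""
    rate = conversions / visitors
    se = math.sqrt(rate * (1 - rate) / visitors)
    return rate - z * se, rate + z * se

# Hypothetical test: 28 vs. 48 conversions on 1,000 visitors each,
# which works out to a ~71% relative "lift" on only 76 total conversions.
baseline = conversion_ci(conversions=28, visitors=1000)
variant = conversion_ci(conversions=48, visitors=1000)

print(f"baseline: {baseline[0]:.1%} to {baseline[1]:.1%}")
print(f"variant:  {variant[0]:.1%} to {variant[1]:.1%}")
# The two intervals overlap, so the true lift could be far smaller
# (or larger) than the headline 71% suggests.
```

With counts this small, the honest summary is a wide range, not a single impressive percentage.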

2. Unrealistic Expectations Kill Motivation

I’ve lost count of the number of times I’ve spoken with someone who had stopped testing because they felt like it wasn’t getting anywhere.

“Why?” I would ask. “I keep reading about these 47%, 89%, 157% increases. We’re getting like 5% or 10%, so why try?”

I hear ya, brother! Take a travel site we brought in, where we got them a 15% increase in sales. Man, were they disappointed. (They were hoping for a 50% increase.) An extra $50M to $100M in revenue sucks?!?!? No, but unrealistic expectations sure do. (And I royally screwed up, not in the results, but in managing their expectations. As my kids would say, “Stoopid!!” Hard lesson learned.)

3. Process Trumps Genius

Following a process is better than being a genius. Follow the process – Learn – And the results will come!!

5% on a $1 billion site is incredible, right? $50,000,000 is not a bad result. Heck, even 5% on a million-dollar site is nice. $50,000, anyone? Sweet!!
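The math above is a one-liner (assuming, as a rough approximation, that revenue scales linearly with conversion rate):

```python
def lift_revenue(annual_revenue, relative_lift):
    """Extra annual revenue from a relative conversion lift,
    assuming revenue scales linearly with conversion rate."""
    return annual_revenue * relative_lift

print(lift_revenue(1_000_000_000, 0.05))  # $50,000,000 on a $1B site
print(lift_revenue(1_000_000, 0.05))      # $50,000 on a $1M site
```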

It’s hard to bring home the bacon on tests these days. REALLY HARD. Let’s understand the hard work, and focus on the PROCESS of testing and trying new ideas, NOT the results.

I Love Which Test Won

Final thoughts: Go sign up for WTW now!!

Ignore the results, and get ideas!! Seriously, WTW is a great source of ideas, which are the basis for any optimization. Give them a visit! (And no, I don’t get paid to say I love their service.)

 


4 thoughts on “I Hate Which Test Won?”

  1. Ha! As the founder of Which Test Won I can tell you I hate those case studies with extraordinary results as well. Generally they reveal more about how bad the original page (or losing version) was than how great the test was. I admire companies like Dell which test constantly, so their sites are so optimized that a 5% gain is an exciting win. All those incremental gains add up to big wins over time, of course.

    Continual testing equals real success. Our job is to help you come up with ideas for the hypotheses that drive your new tests.

    1. You’re absolutely right, Anne. The process of continual improvement is the key. We also need to continually improve not only the process but the people as well. 😉 It’s important to reach out and see what others are doing to help generate and challenge our own ideas (which are often wrong). Our quest should be to learn and grow; to become mind-blowing engines of creativity.

  2. Great post. I agree. I hear more and more people say not to “trust” results at WTW, which gives the feeling that the site should be ignored. But I also think that is the wrong approach; for all the lack of transparency, the site is still a very valuable input to a number of discussions regarding split-testing. So thank you for the post, and thank you for WTW.
    /Ole, Optuner.dk Denmark
