Today is the best time in history to be an evidence-led retailer. Data is available everywhere, tools are improving daily, and the business climate is data-friendly. You hold in your hand the keys to the kingdom. But, you must not squander the opportunity. It will be taken away unless you use it wisely. In this post, Chris Goward, optimization thought-leader, shares how leading e-commerce companies are using optimization processes to gain dramatic revenue lift and insights. You’ll walk away with practical strategies to drive real, tested business growth.

Here’s what you’ll learn:

    • The top ways to optimize your shoppers’ experience.
    • Frameworks and processes to improve your optimization program.
    • Tips for becoming a future-proof optimization champion.

This post is an edited version of the live presentation given by Chris Goward, Founder & CEO of Wider Funnel, at the first edition of International E-commerce Day. The host was our very own Valentin Radu, Omniconvert’s CEO.

Without further ado, let’s dive right in.


Valentin: Welcome to this first edition of International E-commerce Day! I have the honor of introducing Chris Goward, the CEO and Founder of Wider Funnel. When companies like Google, Electronic Arts, eBay, Magento and 1 800 Flowers want great marketing optimization results, they call Chris Goward. Chris founded Wider Funnel with the belief that marketing agencies should prove their value.

He wrote the book “You Should Test That” – and you should read it if you haven’t already. The book redefines conversion optimization and outlines how to create dramatic business improvements. He is the brain behind the LIFT Model and the Wider Funnel system, conversion optimization strategies that consistently lift results for leading companies.

Today, Chris is going to present the future to us: “The future of e-commerce belongs to the optimizers”. Hello Chris, the stage is yours!

Chris: Great! Thanks, Valentin. Hello everyone, and thanks for the invitation to speak to you today. I’m really looking forward to the conversation that happens during the day.

I’m going to talk today about the future of e-commerce: why it belongs to the optimizers. I want to start with a couple of stories, and I’m going to share some case studies about real life and what’s happening in business. You know how, in most organizations, there’s a conflict between the brand marketers and the response marketers, between marketing and sales, or analytics and design, or PPC and SEO? There are different mindsets within organizations, and decisions are often made without data and without knowing the purpose or the outcomes of those decisions.

Remember your latest website redesign? If any of you have been through a website redesign, you know how painful those things are. I want to talk to you today about the alternative: how it works, and why there’s so much potential for you, being here and listening to this message. Take this into your organizations, be a champion for change, and do something differently.

But first of all, let’s look at this example. Remember this? Have you seen this logo on your phone?

Any of you who are in areas that have Uber may have seen this icon appear a few weeks ago, when Uber decided to change their logo. This is a bizarre example: the new logo doesn’t represent the original brand, it’s very different from what they had, and it’s not a “U” anymore, so it doesn’t imply Uber. We’re not really sure what it even means; in fact, it turns out the design was inspired by the bathroom tiles at the employees’ office. I’m not kidding about that. So how did this happen?

Everyone remembers the beloved Uber icon on their phone. People woke up one morning and suddenly couldn’t find their icon anymore, because it had changed so drastically that they didn’t know what was happening.

Well, what happened was that the CEO decided he really wanted to drive the redesign himself. He didn’t want to hire a designer, and he’s not a designer, he’s an engineer. This was a terrible decision, and everyone is shaking their heads wondering how it happened. Well, this is what we call the HiPPO problem.

The HiPPO-driven organization, where the Highest Paid Person’s Opinion rules the day, happens in so many organizations, not just Uber, although this is the most recent example of silly decision-making.

Here’s another one. This is Walmart; I was on their website recently. I’m looking at these offers and I see some Star Wars, and, hang on, I’m about to click on the button and it disappears. Now I’m seeing L’Oreal, but there’s no button, and I can’t figure out how to get back to the previous offer I was interested in. Now there’s Better Homes and Gardens. I don’t know what’s going on here, but the offers are changing so quickly, the messages rapid-fire scrolling across the screen, that as a user there isn’t time to comprehend what’s happening, never mind take action.

This is the rotating banner homepage that I’ve talked about for years, and yet large organizations are still using this UX design that is clearly terrible from a user perspective. At Wider Funnel, virtually every time we’ve tested it, it loses, and yet people are still doing it because it solves an internal problem.

Walmart’s homepage rotating banners

And this is a problem in organizations where most decision-making is driven by internal forces. A designer needs more space to present more offers, because too many departments are demanding space on the homepage, so the only solution they can think of is a rotating homepage banner. And yet, from the user perspective, it sucks, and it hurts conversions and sales.

I’m going to show you an example of a test we ran. I have obscured most of the page, just to focus on the part that’s important. This is an e-commerce subscription website where the visitor signs up to get a monthly delivery of a makeup kit and different cosmetic accessories.

With this new client, we wanted to start with something very simple to make sure the testing works properly. We all know the “submit” button is a terrible button, every conversion optimizer knows that’s the worst button you could possibly use.

We started off with a simple test with different button options. While we were testing just a button, which is very simple, we could also test an emotional driver. One of the variations was a social signal that tied into the brand, “Become a maven”, which created a sense of social inclusion; the other variation was “I want in”, creating a sense of urgency: there’s a gate they want to get through to find the products they want, the color and beauty kit and the free welcome box.

Whenever I ask this question at conferences (I speak at dozens of conferences every year), invariably some people vote for the social signal and some vote for urgency, and there’s an arm wrestle over which one’s going to win. Everyone has an opinion, but what’s interesting is that when we actually tested this and looked at the real data, we found the “Become a maven” social signal actually reduced their signups and revenue by six and a half percent, and the urgency one reduced their revenue by half a percent.

What have we done here, what have we accomplished? What’s interesting is that the “submit” button, supposedly the worst practice that never works, worked.

All of these so-called best practices that optimizers hold sometimes have exceptions. Of course, we went on to continue optimizing after that and found much more important things to optimize than buttons, such as the layout of the page and the overall messaging. But the point is that you should be skeptical of all the so-called best practices you’re going to hear today that haven’t been tested.

One of the biggest so-called best practices that I continue to talk about, and that continues to happen, is the website redesign. Every company has to go through a web redesign every once in a while, right? That’s the commonly held belief, except that website redesigns are broken. The process of going through a redesign is broken; it fails.

This is an example from a client of ours; we had worked with them for about a year. This is Josh Bratton, who was working with us on the client side. Then another department came in after we had optimized the website, continued to work on it, and decided they needed to do a redesign.

They started with a new brand identity and created something really exciting. Everyone was excited about it; they launched the new page and wanted the leads to come pouring in. But in reality, if you’ve ever experienced this, you know what it feels like: that Monday morning with sweaty palms when you click the launch button on the new website, and it hurts sales. Sales tank.

In this case, it was a lead generation website, but of course we’ve seen this in e-commerce as well, every day: people relaunch sites, and then it kills conversion rates, or something’s broken, or something doesn’t work.

And in fact, only twenty-four percent of marketers are extremely happy with their latest website redesign.

You know how much they’ve spent on that, right?

And there are reasons for that, this is a well-known model that I’ve presented over the last little while, about why website redesigns are broken.

Redesigns are so painful, so delayed and so over budget that people want to forget about them. Meanwhile, the best of the web continues to improve; the expectations for websites, web applications and mobile keep rising. If you don’t pay constant attention to your website, if the design stays static, by default you’re getting worse compared to the rest of the web. This is what I call the wedge of suckiness, where you’re guaranteed to be getting worse all the time unless you have a strategy for improving in between those redesign gaps.

There’s another problem, which we’ve talked about many times. When you launch that new website, or a new e-commerce site, you’re changing thousands of things all at once, without any insight into which of those changes is actually improving your sales and which is hurting them.

Is the new headline improving things? Is the new product detail page layout helping or hurting? How about the checkout process, or the load speed? All of those different things may be helping or hurting, and without isolating them there’s no way of knowing.

And so the best of the web don’t do the website redesign anymore. They’re continuously iterating through a process that I call Evolutionary Site Redesign, which can also be dramatic, but it’s a tested redesign process. When you think about the best of the web, about the Amazons of the world or the Booking.coms, they don’t redesign their websites; they never do. When was the last time Amazon redesigned?

What they’re doing is constantly redesigning, by testing and iterating the design driving elements, the aesthetic elements, as well as the user experience in the product and the features, all of the things are constantly being tested every day.

I was in Amsterdam just a couple of months ago and visited Booking.com’s headquarters there. They have piles of data, and they’re dominating the travel booking market in Europe. They do that by having a floor dedicated to conversion optimization: hundreds of optimizers taking up a whole floor, testing thousands of things constantly.

BuildDirect came to us a while ago. They knew they needed to redesign their website, but they wanted to take a data-driven approach. The way we approached this was to look at the templates and their design-driving elements, create a new vision of where the design could go, and then break it down into a tested process to get there. By doing that, we could test variations along the way and answer questions about the user experience and even the information architecture; we actually tested different IA approaches.

By testing the left column as a starting point, they got a 16% lift in sales. And it didn’t change any of the content on the page, or on the whole website. The content was the same; all we did was change the design, which made it more appealing, approachable and accessible. People were finding the products they wanted easily through that left column navigation.

Then we isolated the right column design throughout the site, then the navigation header area, and then, of course, the content within the pages as well. All of these things go through multiple iterations; it’s a constant process. After just a few months, they ended up with a completely redesigned website, with no risk, because they had been testing every single element along the way. It looks beautiful, and it also happens to increase their sales.

In fact, after 12 months, they ended up with a million dollars in increased revenue on a monthly basis, directly attributable to statistically significant A/B testing, just from that process of iterative improvement.

This is why I say there’s so much opportunity. There are so many problems happening in marketing, in web design, in UX and UI, that the potential for optimizers to really take charge and do something great is huge.

There is a problem in organizations that also creates an opportunity for you. There are brand marketers who are so tied to the brand; they’re the advertisers, and they believe themselves to be creative geniuses. Hopefully you’re not the one driving this problem, but they believe their intuition is unique, their insights are powerful, and they understand the customer’s mindset.

On the other side of the table, there’s the data-driven marketer, the analyst, who shakes their head at the brand marketer and says: “You guys don’t have any of the data; how can you know what’s going on?” And the brand marketer looks at the data marketer and says: “Oh, you’re such a geek! You don’t understand the customer and the potential; you’re just into the minutiae.”

And they’re both right!

They’re both right, because if you take only one mindset, one approach, it’s limiting, you’re missing opportunities.

I want to show you a framework and some new elements for how you can capture this opportunity of both of these mindsets in your organization.

Optimization requires humility because you’re often wrong and you have to be able to see another perspective. So the opportunity for you and the opportunity for optimization champions is to become a Zen Marketing Master and to really understand the yin and the yang of marketing.

And I’ve talked about this before and now what we’ve done in the last few months is actually consolidate this thinking into a process and a framework for understanding how the optimization mindsets work and how they can work together.

A zen marketer understands that there’s an intuitive side: the brand side, the qualitative, inspired and fuzzy side of marketing, understanding its nuance and its persuasion elements. But there’s also the other side: the hard data side, the proven, actionable, quantitative, logical side of marketing.

And only by embracing the conflict between these, the inherent contrast, can the best marketers come up with the best ideas and validate and prove out which one actually works.

So how does this tie into you practically?

The best optimization process maximizes both the creative and the analytical side, the creative thinking side where inspiration happens and the analytical side where we prove out what ideas are best.

I’m going to show you what we’ve been developing. It’s a little bit different from most optimization companies’ approaches. There aren’t that many companies focused just on optimization like we are, but we’ve taken an even more distinct approach: we’ve chosen to work only with companies that have the traffic and discipline to allow us to run controlled experiments. That lets us really refine the process of optimization, build a huge database of knowledge to look for patterns, and come up with the best optimization program that any company can build.

I’m going to share with you what we’ve just released in the last couple months called the Infinity Optimization Process. And if you really understand it and can embrace it, it’ll totally change the way you think about your optimization. So here it is, and I’ll explain to you why it’s designed this way, because it’s very important.

There are two sides to optimization; I’ve talked about the mindsets. In the optimization process there’s an explore and a validate mindset, and they’re very different ways of thinking. The Infinity Optimization Process is shaped as an infinity loop, which implies that it’s a continuous, never-ending process. That’s important; it’s the key ingredient of an optimization strategy: the A/B testing is never done.

Now the explore and the validate have to happen separately.

Explore is a creative, expansive process of looking for opportunities: looking at the data from all of the past tests and the persuasion elements through a framework of thinking about the customer and how they’re thinking. From all of those opportunities come insights, and you’ll see in the middle of the infinity process that nucleus, the orange powerhouse, where all of the growth and insights happen. From the explore process, insights appear about the customer. But those insights are only potential until they’re validated.

Now the validate mindset is totally different, it’s a reductive mindset – it’s one of critiquing and analyzing and reducing options to come down to what really works.

I’m going to show you some examples of how this works, especially on the explore side, because a lot of people are not leveraging the real potential and opportunity of the explore phase. When we look at it, there are multiple components to explore, and the way this is designed is intentional.

There’s User Research, which is understanding the voice of the customer: gathering input from your customers and prospects about how they see your value proposition, the user experience, and the experiences they’re having with your company and product touchpoints.

Digital Analytics includes everything from your typical Google Analytics or Adobe SiteCatalyst to click paths, click heatmaps and scroll maps, which I’ll show you in a moment.

Persuasion Principles are well known; there’s so much research happening right now in the areas of behavioral economics, persuasion by design, and design for conversion. There are lots of things we can pull in to test and understand which triggers are important for your audience.

The Business Context is important, of course: understanding your competitive environment, price elasticity, and all of the different components your customers face. And then there’s the Testing Archive; at Wider Funnel, we’ve been focusing on building a test archive of thousands of tests to look for patterns.

And then the central component is a framework for thinking about your customers’ barriers. There are various frameworks out there; I use, for example, the LIFT Model, the one we’ve developed, which is the most popular conversion optimization framework out there. I’ll explain it in a moment for those of you who haven’t seen it (a lot of you probably have). What’s important is that all of these arrows go through the center.

The way it’s designed, in the explore, expansive mindset you’re reaching for data and information from various components in various areas, and then consolidating them into a single picture of the customer: their mindset, their barriers, and how they see the world.

Let’s look at some examples. You can see there’s a dot and a line happening here: this is an eye-tracking report, a recording of someone using an eye tracker that we have in-house.

1800 Flowers Example

This is 1 800 Flowers, one of our clients. To most marketers, this just looks like some dots and lines on the screen, but to a trained conversion analyst this is gold. What it shows is that the typical F pattern, which a lot of people still think of as a best practice, wasn’t happening here. The most important categories were down in the lower left, where they assumed people would be looking, but because of the eye flow and the imagery in the design, this user was totally missing those important category elements. That leads to a potential insight: an opportunity to test different layouts to draw people’s view to the categories the marketer wants them to see.

Another example, this is an eye tracking heatmap.

When we’ve got multiple people doing eye-tracking studies, we can get a heatmap of the main areas people are looking at. In this case, you could see people looking at the top and the left side. I don’t believe the F pattern works anymore; it’s an old legacy from when navigation used to be on the left side of pages, but people still hold on to this idea. Here the pattern looked like it worked, but the problem was that the revenue-driving tiles were on the wrong side, so people were missing the opportunity to convert. And that leads again to another hypothesis for how to improve.

Let’s look at another example. This is Telestream.

What they do is sell screencasting and video editing software online; people can subscribe, buy the software, download it and install it. We lifted their conversion rate by 26% over six months, with a two-million-dollar revenue increase from optimization. So, if you’d like that kind of result, let’s take a look at how it worked.

So what we wanted to test in one of the examples was the product page for ScreenFlow, which was their most important product. And so this was the page.

Now, we always take a framework-thinking approach, looking at the barriers to the customer’s experience. I’ll just highlight a couple of things. Normally we come up with dozens of problems, and there were lots of opportunities to improve this page, but one of them was distraction. There were three calls to action: “try”, “buy” and “watch the video”. Without directing people to what they should do, there’s competition; they have to think about which action to take. And there wasn’t any clarity on the price, which was actually a compelling component of the offer.

This is a scroll heatmap, which shows where the majority of people stop on the page. You can see what’s happening: in that red area, a lot of people are stopping and not getting down past the green. To the conversion strategist, this shows a potential insight. Most people look at scroll maps and think they’re just a waste of time, eye candy to show some senior person because it looks fancy, but what does it actually tell you?

Well, a trained strategist would look at this and say “oh, you know what’s happening here? Those four tiles are creating a false bottom and below those tiles are the three main value proposition points about the product and people aren’t getting past the false bottom to the value proposition, so that creates a distraction problem”.

So we wanted to test those elements, among other things. In a simple test, we isolated the calls to action and consolidated them into “download free trial” and “buy $99”, which increased their sales by 4.5%, just by changing the buttons. Then, even more powerfully, simply moving the value proposition, those features, above the tiles increased their sales by 11.5%. Wow, that’s great; that’s a happy customer. I didn’t even have to create new content in this case.

And then, going on from there, we found that there obviously was some elasticity with the content above the fold; even that small change to the buttons showed it. So we wanted to drill into it even further, based on a persuasion insight: we wanted to try adding more tangibility to the header image.

The control is this sort of aspirational world image; it’s a little bit unachievable for most people. They look at it and don’t really relate to how it connects with their video editing software.

So we added a person, and we actually tested quite a few different iterations of the person, the screenshot and the angle of the screenshot, and found this one worked the best. Adding the tangibility of someone using the product and enjoying themselves increased their sales by 26%.

That’s a big idea that’s driven by old marketing knowledge. Theodore Levitt, decades ago talked about how if you have an intangible product you want to create tangibility around it, whereas if you have a tangible product you want to create intangibility.

And that’s why you would have seen the late Steve Jobs, when presenting the new iPad, not talking about the technical specs of this tangible product; he talked about magic, right? “This is magic, what I’m holding in my hand.” It’s not magic, it’s just a piece of glass and metal, but he was creating intangibility. He was a master salesman, a master marketer. And that’s what we did in this case: testing that concept of adding tangibility to something intangible.

Okay, so let’s look at another UX example, I’m blasting through these here to give you as much meat as possible today, and then we’re going to look at some more frameworks in more detail.

This is another e-commerce site: Pine Cone Hill, an Annie Selke company.

They sell high-end bedding, sheets, pillows, pillowcases and all those kinds of house items. By looking into the data, the strategists found a pattern: people were interested in filtering and sorting on the category pages, but there were hot spots on “color” and “palette”, which seemed to imply that this was one of the most important criteria people were looking for; they were looking for specific colors. And yet there was a conflict between “color” and “palette”: they meant the same thing, but the usage was inconsistent.

So that raised the question: which one is more important, which is more relevant to visitors and their understanding, and which will decrease the cognitive load of translating what’s going on here? That’s a potential insight, right? It comes out of the explore phase, where insights pop up in the mind of the strategist; now we want to validate it.

So how do we do that?

We isolate this element and say: “Let’s create an experimental design that will do two things. One, lift revenue and sales; two, drive an insight, which in this case might be a user experience insight.” We actually spend most of our time on the experimental design, so we take that insight from the explore phase into the validate phase and run it through the validate cycle.

Now I’ll show you how this works. You’ll notice this one is designed differently. In explore, all the arrows point to the center, because it’s such a messy process of pulling in data; but validate is highly structured, and it has to be. The scientific process has no wiggle room for improvising, and it’s important to do it in the right order. So we start with the PIE framework, prioritizing which experiments to run first; that’s critical for managing resources in the optimization process. Then we create hypotheses that come from the data we gathered in explore, and then, critically, the design of experiments; most optimizers out there don’t understand how important the design of experiments is.
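The PIE framework mentioned above scores each candidate test on Potential, Importance and Ease and ranks by the average. As a minimal sketch of that idea (the page names and scores below are invented for illustration, not taken from the talk):

```python
# Hypothetical PIE prioritization: score each candidate test area on
# Potential (how much it can improve), Importance (how valuable its traffic
# is), and Ease (how cheap it is to test), each rated 1-10.
candidates = [
    # (page, potential, importance, ease)
    ("Homepage hero banner", 8, 9, 6),
    ("Checkout form",        9, 8, 3),
    ("Product detail page",  7, 7, 8),
]

def pie_score(potential, importance, ease):
    """The PIE score is the simple average of the three ratings."""
    return (potential + importance + ease) / 3

# Test the highest-scoring area first.
ranked = sorted(candidates, key=lambda c: pie_score(*c[1:]), reverse=True)
for page, p, i, e in ranked:
    print(f"{page}: {pie_score(p, i, e):.1f}")
```

With these made-up scores, the homepage banner ranks first (7.7) and the checkout form last (6.7): a high-potential test can still drop in priority when it is hard to build.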

We’ll spend a lot of time, in fact nearly half the effort of the whole validate phase, on the design of the experiments, really understanding what we’re going to get from the result of each experiment. Then it runs through UI/UX design to match the original or improve it, dev and QA of the experiment, live testing, and, critically, results analysis. Drilling into the data from the test results can reveal even more insights that lead to more testing, and back into the explore phase, looking for more research to validate.

In this case, this was the original category page, and one of the variations we did was simply a redesign, a user experience redesign: removing the brand filter (visitors should already know the brand), using checkboxes for size, which are much more intuitive than the hover links they had, reducing the options to the most popular ones, and condensing palette, pattern and material into dropdowns to get them all above the fold and easier to use.

In the next variation, we simply isolated palette versus color, and then created other variations where we added drop-down color palette hints, with the actual colors shown, to again reduce the cognitive load of reading the words and interpreting them into colors. Done intuitively like that, it takes much less effort.

And then another variation we created a scrolling persistency effect where that navigation, the filtering, and sorting stayed with the users as they scroll through the category page.

Think about these four variations and ask yourself which one you think won. I don’t have a poll set up here, but think about it for yourself. When I’ve asked this at conferences, most of the time either the fourth one or the third one gets the most votes, so I’ll give you that hint.

When we actually looked at the data, we could see that in the first variation, the redesign, there was a lot more interaction with the filtering and sorting; people found it more approachable and easier to get into than the long list of links they had originally.

Then, of course, everything is driven by A/B testing with statistical significance. So we look at the revenue: wow! Variation A, with just the redesign, got a 15.6% lift. Interestingly, we’re finding this to be about the range of lift available when a messy navigation on a category page is cleaned up into a better UX; we often see double-digit improvements in the low tens. Then, looking at the scroll feature, which included the redesign, the lift was 9.8%.

Now, if you look at this, was that a winner? Well, it lifted sales over the control. But if you look at the design of the experiments, this was an isolation against A: it included the improvements from the redesign of A plus the scroll feature. So, by deducting the redesign effect, the scrolling effect actually reduced their sales. That’s a potential insight.
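The isolation logic described above can be sketched numerically. Assuming, for illustration, that the two effects compose multiplicatively (the talk doesn't state the exact arithmetic used), the scroll feature's own effect can be backed out of the two reported lifts:

```python
# Variation A tested the redesign alone; the scroll variation combined the
# redesign with the persistent scroll feature. Lifts are vs. the control.
lift_redesign = 0.156        # +15.6% (redesign alone)
lift_redesign_scroll = 0.098 # +9.8% (redesign + persistent scroll)

# Assumption: (1 + combined) = (1 + redesign) * (1 + scroll_effect)
scroll_effect = (1 + lift_redesign_scroll) / (1 + lift_redesign) - 1
print(f"Implied scroll-feature effect: {scroll_effect:+.1%}")  # roughly -5%
```

So even though the combined variation beat the control, the deduction says the scroll feature itself likely cost about five percent of sales, which is exactly the kind of insight a well-isolated experiment design surfaces.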

And then variation B, with the color hints, actually increased their sales by 23.6%, a massive win for their sales sitewide.
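The talk stresses that these calls are driven by statistical significance but doesn't say how it was computed; a two-proportion z-test is one standard way to check whether a conversion-rate difference is real. The visitor and conversion counts below are hypothetical, purely to show the mechanics:

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical split test: 10,000 visitors per arm, 5.0% vs. 6.0% conversion
z, p = two_proportion_z_test(conv_a=500, n_a=10_000, conv_b=600, n_b=10_000)
print(f"z = {z:.2f}, p = {p:.4f}")  # significant at the usual 5% level
```

The same difference with a tenth of the traffic would not reach significance, which is why "everything is driven by A/B testing" presumes enough traffic to test, as mentioned earlier.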

OK, so we’ve looked at a few different aspects of optimization with the UX, there’s also potential to do much more than just these navigation redesigns and UX tweaks and certainly much more than buttons and copy.

I’m going to show you an example for a cell phone signal booster.

They sell hardware, and we ran through the full Infinity Optimization Process over several months, getting a 36% revenue lift after four months. That's a very large increase and not typical within four months, but a great win. I want to show you one example of how this works.

This is the control page, the homepage.

When you look at this you might have some ideas of ways it can be improved. It is a very long page, so you might have some thoughts, and perhaps some best practices in mind. We find that a lot of optimizers approach a page by looking for best practices first, but that's a limited approach. When you look for best practices, tips and tricks, you look for well-known optimization principles. But once you're through that list, where do you go? That's why framework thinking is so important.

We’ve talked about the Infinity Optimization Process and at the center of the explore phase is the LIFT model, so I’ll explain how that works and how it helps you to think about the mind of the visitor and come up with unique solutions to what your customers are actually facing on your website.

The LIFT model simply shows the six conversion factors that are impacting your sales right now. The core of it is the value proposition which I think of as an equation that goes on in your shoppers’ mind between the perceived cost of taking action and the perceived benefits.

If the benefits outweigh the costs they’ll have the motivation to act, if it’s the other way around they’ll bounce right away and all the other factors simply enhance or detract from that value proposition.

So there's the relevance of the presentation: relevance to their needs, and relevance to the source media. What did they just click on, and are they seeing it reflected on the page? Relevance includes things like the color palette. Is the wording relevant to the way they think?

Clarity of the presentation is clarity of the eye flow, the imagery, and the call to action. I think about the 1 800 Flowers example, where the layout was causing a clarity problem: visitors weren't immediately seeing the categories, which were almost invisible to them because of the way the design was structured. It's an eye-flow problem.

Anxiety. Is anything on the page, or missing from the page, creating uncertainty in the shopper's mind about taking action? It might be shipping costs: do they have to click all the way through to the cart to see your shipping costs? Maybe they want that on the product detail page, the PDP, to really understand the full cost of the product.

Distractions. Anything redirecting attention from the primary message or the primary call to action.

And then urgency. Why should they buy now? What's in it for them to act now? So when thinking about this home page, we could see there were some value proposition and some clarity issues.

Our thinking started from how long the page was. I didn't mention this before, but in the example, the bar on the right-hand side shows how long the page is.

Pages of that length were really popular a couple of years ago, especially when parallax was in fashion and people were trying to do fancy things and create single-page websites. Instead of having people spend so much time on the homepage, we wanted to get them looking at the products, right into selecting a product and making a decision, rather than reading through this long page with a lot of content, even if some of it was really good value proposition material.

We also know from well-known persuasion principles, if you've read Daniel Kahneman, the Nobel Prize winner in economics, that cognitive strain, what I call cognitive load, makes people wary of their actions, as he puts it.

We also know from the Nielsen Norman Group that if users have decreased confidence in their own abilities, it has a halo effect of decreasing their trust in the application or site they're on, which of course makes them less likely to take action, reducing conversion rates.

Also, the BJ Fogg Behavior Model shows that the only way to get action is if your users have both the motivation to take action and the ability. If you can remove conversion barriers, the transactional barriers, and give them motivation, that's where you're in the money zone, that's where people will actually convert.

You need to think about both the persuasional and the transactional side of conversion. It's not just about UX and it's not just about persuasion; you've got to find a mix of both.

In this example, we took the control page, which is very long, and created a new, shorter variation. You'll see that what we did is create a more focused homepage. Instead of the reversed-out purple with white on top, there's a much more readable white background, with segmentation so that people can get into the products they want, and it bubbled the best-selling products up right below that, so people didn't have to scroll very far to reach them.

You can see variation A, and then variation B was a more radical redesign that went a step further to really shorten the page. What we understood was that the most valuable content behind the tabs was the area explaining how the signal booster worked, so we wanted to get that description right on the page and then get people straight into the categories they want. It's a really simplified design, and the tabs, the black tabs, had a secondary effect of measuring interest in different elements. By clicking on a tab such as better cell coverage, more descriptive content would swap in below it, helping people understand the product itself and then get into the categories below.

A much more dramatic change than simple UX elements. These aren't just button tests; these are our strategists sitting down and thinking about how to solve the barriers to conversion for this visitor: what do they really need to understand, what questions are they asking in their minds, and what's the easiest, lowest-barrier way of communicating that?

This takes creativity; this is the art of the strategist: coming up with new UX elements, new wireframes, new interactive approaches that can solve the same problems.

So which one do you think won? Guess: was it variation A, shortening the page, or B, with a totally different approach?

Of course, we measured product sales when they ran the test and in this case found A lifted their sales by 15.8% and B by a dramatic 41.4% from the homepage visitors.

So that's, of course, the winning design, which we've continued to optimize since then. And you know, everyone's happy with this kind of result; this is what it looks like on the call when you show a forty-one percent sales lift.

There's a lot more data behind that, and I don't have time to get into all the details. If you want to learn more about that case study, you can email us at [email protected] or just go to our website, and we can show you all the detail behind this.

There's another category page test with an 18% lift, and a second category page follow-up with a 17% lift. We can get into all that detail later, and all the happiness that happens as a result, if you're interested in more of that case study.

Okay, because I want to get into another example here that shows you that the true power of website optimization is beyond UX, beyond just redesign, beyond even information architecture and complete redesigns through evolutionary site redesign.

You can actually use optimization, the power of A/B testing, to understand persuasional triggers, to understand your customers’ mindset that will impact the whole marketing organization.

This is where the power really gets going. I'm going to show you an example. This is a mobile landing page for one of our clients, and it's another subscription-model e-commerce business, which is interesting, but in this case it's a totally different target audience.

It’s not makeup, it’s a monthly geek and gamer box of gear like stickers and t-shirts and stress balls and all kinds of stuff that appeal to the gamer community. And so they wanted to increase their subscribers and their profit margin, but also to understand their customer. So what’s driving this user to subscribe?

Is it because they get a mystery box that they don’t know what they’re going to get every month? Is it the intrigue? Is it about the quality of the products they’re getting? Is it about being part of a social community? Is it about having a selection of products or the savings they’ll get because there are multiple things combined into one and the scale they get? What’s driving their user?

Now you can ask users, but they won't know; your shoppers can't tell you what drives them to act. The only way to really understand is to test, and that's why I called my book "You Should Test That", right? Because these hypotheses can only be tested when people don't know they're in an experiment. So, what we did: I thought our strategist did a great job of structuring a smart test that went through cycles to drill into insights.

And again we're going to start with a button test. It's actually very rare for us to do much button testing, to be honest, but this one shows an interesting insight that I wanted to share.

This was the original, and you'll see the "Subscribe" button we've highlighted here. One isolation said "Join now", which implied becoming part of a community, and we wanted to understand whether that was potentially a persuasional trigger.

The next variation was “select your plan” which is more tangible, it’s about the tangible aspects of what they’re buying.

Our question was: is this target audience more aspirational, about community and inclusion, or more fact-based and tangible? We had a lot of votes; a lot of people voted for variation C here.

When it went into the test, it turned out that variation actually hurt their sales. If we had focused on just clearly communicating the product features and the subscription features, we could have really hurt their sales; of course, that's why we test. But we also got a slight indicator that perhaps "Join now" and social inclusion were important.

That becomes the new control, and we wanted to push the envelope even further to find out if this was a robust insight or just a fluke. In this case, variation B combined "Join now", the social part, with a more tangible aspect: "a monthly box of geek and gamer gear", very clear on what you're going to get, and now you can join. Or could we push the needle on social inclusion even further with a social proof headline that says "Join 110,023 Geek + Gamers Just like You!", "Join now"? Which one do you think won?

Well, what was interesting is that we revalidated the insight about the customer. Variation B, of course, was much clearer than the control; the headline was clearer, and so that lifted sales. But the social proof headline, which didn't include the clarity of what they got and only talked about the social proof aspect, improved their sales by 14%. It doesn't even talk about the product; that's the interesting part.

We could test even further from that, and we did, by bringing in more social media elements: images, unboxing videos, pictures of people with their new products and their mystery box, all the things from social, and we increased their sales continually in a series of really radical tests.

What's interesting here is that the testing has not only increased their sales but given them a spotlight right into the customers' minds: understanding why they subscribe and why they want to be part of this community. For this target audience, it's social proof.

Now, what are you going to take away from this? Social proof works! No, that's not the insight; don't take that away, because it's relevant for this audience and context, but it may not work in your context.

How do you understand what works in your context? I'm going to show you one more framework today, just briefly. It'll show you how to think about the unique ideas that can drive your success by understanding what's unique about you.

You've got your prospects' desires, what your prospects want out of your product category; your features; and then your competitors' features. Most people are focused on the central area, where what you do matches what your competitors are doing and matches what your customers want. You're out there with your points of parity saying "hey, we do that too, we've got great service, great products, great shipping", but everyone else is saying the same thing. That's the Red Ocean, where you're fighting over a few small fish. What's much more powerful is over on the right-hand side: your PODs, your points of difference, where what you do overlaps with what your customers, or a segment of your customers, want and where no one else can play. That's your POD.

A lot of people with manufactured products, especially in technology, spend their time down in the points of irrelevance, where they've built a lot of great features that no one cares about.

And you can test within these PODs; this can actually become a brainstorming session. We've done this at Wider Funnel to understand our target audience and our customers, by looking at what we do, what our customers want, what our competitors do, and where there's overlap and where there are differentiators. By testing within a framework like this, and using the LIFT model to understand barriers, the whole point of your optimization program becomes more than just lifting conversion rate; that's just a metric. It becomes about understanding your target audience, your unique customers, how to communicate with them, and how your business can operate and provide new products for those customers.

OK, so we've talked about the Infinity Optimization Process: how Explore and Validate work together, aiming for growth and insights, and that powerhouse nucleus in the middle.

One last thing I want to share with you is how it looks over time and why it’s important to have a dedicated resource.

This is a real customer, a Wider Funnel client, where we started with an experiment that isolated five elements to understand and improve, and got a 7.37% revenue increase. Then we ran another experiment, based on the insights from the first one, to revalidate whether we could push the needle on those elements and whether they were still important.

We found that some of them were validated, some did not revalidate, and others were pulled in from other areas of the website. This second experiment got a 15.92% revenue increase; this was on the homepage. The content page had a potential insight that fed into the second homepage experiment.

Because we're trying to understand the customer, and the customer, of course, is on all the different pages, we can feed those insights throughout the experience. Then we see other layers of experiments coming on, desktop and mobile, and you get what I call the telephone wire diagram, where insights are flying back and forth, up and down, and across between all the elements of the website.

Some experiments get a lift, some are inconclusive, so we just take the insights from those results and feed them into other areas, and it starts to look really complex. It might be overwhelming, and when you think about it, this is only two pages, in two content segments, on two different devices. In reality, there are likely to be dozens of different segments, content areas, devices, and types of pages, so it becomes overwhelming for someone inexperienced in optimization: how do you capture all of this data and all these insights?

And this is why an expert in optimization is so critical. It can't just be a side-of-the-desk, part-time job, because the opportunity is in understanding what you are learning from all of these tests, what you can learn from them, and what potential learnings can feed into organizational strategy.

Do these experiments actually result in business improvement? We went and tested that to validate whether they do, and I'll show you this example. This one is an affiliate marketer where we had done six months of testing, and we wanted to validate whether those test wins were sticky in their revenue. We wanted to answer that question once and for all.

When we looked at the calculated cumulative conversion rate, you can see there are two wins, 18.8% and 14%, and cumulated together the calculated conversion rate lift should be 32.8%. We took the last winner and A/B tested it against the original control from the starting point of the six months.

And in that A/B test validation, we found it actually did stick. The revenue lift was A/B tested, and we ran an AABB test to validate the tool as well, and found a 36.8% revenue lift. We then ran six more months of testing. At the end of 12 months, we had some other great wins, and the calculated revenue lift was 100.8%. So again we took the last winner at the end of the 12 months and the original control from the start, and validated through an AABB test: a 123.7% revenue lift for this affiliate marketer.
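For anyone wanting to reproduce the arithmetic: the "calculated" figure here adds the individual wins (18.8% + 14% = 32.8%). If the effects compound multiplicatively, the combined lift comes out a bit higher, which is one reason running an AABB validation against the original control is worthwhile. A minimal sketch:

```python
lifts = [0.188, 0.14]   # the two wins from the first six months

additive = sum(lifts)        # simple sum, as quoted in the talk
compounded = 1.0
for lift in lifts:
    compounded *= 1 + lift   # multiplicative stacking
compounded -= 1

print(f"additive: {additive:.1%}, compounded: {compounded:.1%}")
# additive: 32.8%, compounded: 35.4%
```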

What we're answering here is: "do the results of tests stick over time?"

The numbers themselves aren't the important part; these are obviously very dramatic numbers, and we had the luxury in this case of testing on one defined funnel that drove direct revenue, so it was a perfect element for us to test the concept of testing and prove it.

But then again after 12 months, someone could say was that just a fluke?

Did that just happen to work out for you? So at the end of 24 months we did the same thing, and we had some more wins. Every month we were testing something new, and at the end of 24 months we had a calculated lift of 259.8%.

And then we took that last winner after 24 months, tested it against the original from two years ago, and got an AABB test win of a 282.2% revenue lift.

And if that doesn't prove to you that conversion optimization works over time, I don't know what will. It absolutely does work, and so I'll leave you with that.

If you want more information or want to learn more about this topic, maybe because you're new to it or you're becoming the optimization champion of your organization, you can download a free chapter of my book "You Should Test That". Or if you have any other questions, email [email protected] and we'll be happy to help you out, point you in the right direction, or answer your questions and see if you qualify for our service, if that's of interest to you.


Valentin: That was great, Chris, a wonderful presentation! We have questions but we don’t have enough time. I think we should at least take one question.

“What is the minimum number of visitors you need in order to start this optimization process?”

Chris: So conversion optimization does take traffic, for sure, to run these kinds of tests. The way we work is to focus on high-traffic websites that can run rapid-cycle A/B tests, so we can get results very quickly. For lower-traffic websites, tests will take a little longer; you might have to test more dramatic variations. Normally we're looking for a minimum of fifty to a hundred thousand monthly unique visitors. You can test with less: if you've got five or ten thousand monthly visitors you can start testing, and your tests just might take a little longer. But if you've got over a hundred thousand or more, you can test much more quickly, get results within a couple of weeks, and run a rapid cycle of testing.
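To make those traffic numbers concrete, here's a rough sketch of the standard sample-size formula for a two-proportion test. The 3% baseline conversion rate and 10% relative lift are illustrative assumptions, not figures from the talk:

```python
import math

def sample_size_per_variation(p_base, relative_lift,
                              z_alpha=1.96, z_beta=0.84):
    """Visitors needed per variation for a two-proportion test
    (95% confidence, roughly 80% power)."""
    p_var = p_base * (1 + relative_lift)     # rate we hope to detect
    p_bar = (p_base + p_var) / 2
    num = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
           + z_beta * math.sqrt(p_base * (1 - p_base)
                                + p_var * (1 - p_var))) ** 2
    return math.ceil(num / (p_var - p_base) ** 2)

# assumed 3% baseline conversion rate, detecting a 10% relative lift
n = sample_size_per_variation(0.03, 0.10)
print(n)  # ~53,000 visitors per variation, so ~106,000 for a two-arm test
```

With roughly 100,000 monthly visitors split between control and variation, a test of this sensitivity completes in about a month, which lines up with the fifty-to-a-hundred-thousand guideline Chris gives above.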

Valentin: Thanks a lot Chris for your time, thanks everyone for attending, this was our first session. Chris, have a great day ahead in Vancouver and we’ll keep in touch! 


If you want to see the whole recorded presentation here it is: