What is the difference between Conversion Rate Optimization and User Experience? Listen and find out. In this episode, you will meet conversion champion Peep Laja, the owner of the CXL Institute and the author of the biggest conversion optimization blog in the world. Peep and Sam will take you down the rabbit hole and into the world of conversion optimisation. They'll discuss how businesses should approach CRO and what to avoid when rebranding your website. You'll learn about the tools you'll need to optimise and the dangers of getting conversion optimisation wrong.
Guest: Peep Laja
Date Added: Apr 11, 2018 1:33:08 PM
Length: 42 min
James Cook Media
Take the StoryTelling IQ-Test and get 5 FREE Personalized Videos on how to tell your Story Online to attract and convert leads into customers.
Podcast moments that will matter to you:
The difference between Conversion Rate Optimisation and User Experience
The most important part of starting conversion optimisation
Never assume, always test
The reason most redesigns fail
Conversion optimisation gone wrong
Resources to help you start testing optimisation solutions
Lessons to remember when testing and optimising
The limits of optimisation
Peep Laja: [00:00] Conversion optimization is not about building anything new. It's just about taking what you have and making it better.
Eight redesigns out of ten fail because a bunch of designers come in and then they have really good intentions, typically, but they are not data people.
Data will tell you what the problems are. And now once we understand what the problems are, now we can start fixing it. If your website is really crappy, it's easy to come up with better ideas based on the data.
Look at the analytics. Understand which pages work and which pages don't. Then based on that data, we'll reimagine those pages and come up with a new design.
The rabbit hole of conversion optimization goes really deep and whoever claims that this is simple and easy is not knowledgeable enough. So, keep on learning guys.
Sam Cook: [00:01:08] Hello again, StoryMatters podcast listener. My name is Sam Cook, the host of the StoryMatters podcast, and I'm here with a guest I've been trying to get an interview with for a while. Peep Laja is the founder of Conversion XL, which runs a high-level agency for optimizing funnels for large clients. But he also runs what is definitely the best conversion rate optimization blog online. It's the one that I follow and the one that I have my conversion rate optimization team follow to understand high-level thinking around user experience design and marketing.
[00:01:52] And if you're not quite sure what that is, we're going to go deep into that topic today with Peep about user experience design and conversion rate optimization. And I-- my background with Peep is James Schramko who was my mentor when I first started out as a digital agency owner, did a podcast around Peep Laja, and what he learned when Peep came in and mercilessly ripped apart his site and told James, who's a very respected marketer, how much he could fix his site and optimize the performance. And James, being a true professional, took that, published it, and showed it to his community, and demonstrated the value of what Peep did for him.
[00:02:48] Now, I already knew about some conversion rate optimization at that point but at that point, based on my mentor James recommending him, I started following Peep's blog, ended up taking his online course where he teaches conversion rate optimization, and definitely have all of my team, project managers, that work on client funnels follow his practices and his free checklist and guide for conversion rate optimization.
[00:03:06] So, without further ado, I'm going to welcome Peep to the show. Peep, thanks for coming on today.
Peep Laja: [00:02:12] Oh, thank you for having me. Pleasure to be here.
Sam Cook: [00:03:15] So Peep, you are Estonian by background. I'm going to let you introduce yourself, and in your introduction, feel free to talk a little bit about what I believe is a very special niche that Europe has, 'cause this is a European audience, around conversion rate optimization and user experience design. So, go ahead and introduce yourself. Tell us where you're from, what you do, and about the team that you work with.
Peep Laja: [00:03:40] Oh, hey everybody. I've been doing all kinds of different types of marketing for 10 to 15 years. I was a fundraiser for nonprofits, ran an SEO-PPC company, did info marketing, all these things, ran a SaaS company. And then in 2011, I started Conversion XL as a blog and a conversion optimization agency. So we work with mid to large size enterprises and help them optimize their sites. These are clients that are making millions of dollars online, with millions of visitors on their sites - big web properties.
[00:04:22] And while growing the business, we were also growing our e-mail list, and we learned that ninety percent of people on our e-mail list are actually unable to afford our agency's services. So two years ago, we launched CXL Institute, which is basically every course you can imagine on data-driven marketing, you know, from product messaging to conversion optimization to digital analytics.
[00:04:49] So now I've been running this online training company, CXL Institute, and today it's already bigger than the agency.
Sam Cook: [00:04:58] Yeah. And Peep, talk a little bit about where you're based and where you're from, because you have a split team, and one of the things we like to feature here is this idea of a European style of marketing that's a bit distinct and different from the American style, which I'm sure you're aware of.
[00:05:17] There's this special talent, I believe, in Europe for conversion rate optimization, and you grew out of a community, I think, that had a lot of strength in that, in Estonia, right?
Peep Laja: [00:05:27] Yes, I'm from Estonia but I haven't lived there full time since 2005. Because then, you know, I discovered that the world is bigger and I've been living in various places. I've lived in Dubai for two years. I was in Panama in Central America for two years. And I guess since 2009, I've been predominantly in Austin, Texas. Of course I go to Estonia every summer and I have a team there.
[00:05:52] So, our agency team is all based in Tallinn, Estonia, around 12 people or so. We do have some freelancers, or remote people rather, in Budapest and Gdansk. And, yeah, I mean, we work with predominantly American companies, because in Europe, in the agency model, Europeans like to shake hands and they most often want their service provider to be in the same city as they are.
This is especially true for, you know, Germany and France - they're more traditional, conservative business cultures - whereas Americans don't give a rat's ass where you are. So, it's really nice to run an agency out of Estonia, 'cause, you know, people are equally smart everywhere. They just cost less in some places. You know, the time difference can be a bitch, but it's manageable.
Sam Cook: [00:06:47] Yeah, and one of the things that's really great about your setup is that America is still on the cutting edge in digital marketing especially, and obviously probably product development and innovation too. But Europe, I think, in many ways has better talent - better volumes of talent and quality - especially on the value side. There seems to be a bit of a shortage of that in the States.
Peep Laja: [00:07:10] Well, I disagree here, I think. I mean, America's population is huge, so it's a numbers game. I wouldn't say that some countries have better talent than other countries. If you have a larger population, you're more likely to find better people, I think. Of course, the competition is intense. I mean, I'm right now trying to recruit people. It's so difficult, especially in Austin, where, you know, unemployment is basically zero percent and that's crazy.
Sam Cook: [00:07:45] Well, I think in America there's certainly plenty of talent, but it's concentrated in tech clusters, and those cities tend to be expensive places to live and to pay people in. And obviously, as you're saying, there's a lot of competition for those people. But you're right - there's plenty of talent in America.
Peep Laja: [00:08:01] Yeah. You know, I think there are good people everywhere. So...
Sam Cook: [00:08:06] Yeah. Well, Peep, one of the things I'd like to hit on with you is really digging into user experience design and conversion rate optimization, and the whole thought process around that, because most marketers who get into this don't really think about data in a rigorous way. It's something that I learned early, always followed in the community, and latched onto your training for when I came around to it.
[00:08:39] What led you into this? And, if you can, explain in layman's terms to marketers and business owners: what are user experience design and conversion rate optimization, and how can they start to use them in their business?
Peep Laja: [00:08:57] Well, UX and CRO - conversion optimization - have a lot of overlap, but there are some key differences. UX, user experience, is about what the experience on the website should be like. But sometimes these have conflicting goals. For instance, let's take pop-ups. UX would say "Never have a pop-up because it's a terrible user experience" - and it is!
[00:09:21] However, if you want to make money and capture e-mails, you should absolutely do pop-ups. You know, so there's a conflict. So how does UX measure success? It measures satisfaction; there are usability benchmarks, things like this. Whereas CRO? It's just about money. Nothing else. Does what we do here make us more money or not - that is how you decide about everything.
[00:09:46] And of course, there is short-term and long-term money: sometimes a tactic that you might want to employ makes more money in the short run but will destroy your brand in the long run, so you still need to take long-term money into consideration. And another key difference is that UX is about creating more than anything. Like, "Let's imagine a new experience," how somebody could shop for shoes or, you know, what have you.
[00:10:16] Then conversion optimization is not about building anything new. It's just about taking what you have and making it better. It's not about starting new businesses or starting a startup. You can't CRO your way to a startup, because it's not about figuring out product-market fit and finding an audience and what they want. So you need to have a working business already, and you bring in optimization when you're at a certain growth stage.
[00:10:48] So, typically I tell my clients or leads that you should have around 1,000 purchases a month online before you can think about CRO. I mean, it's tricky, because you can always optimize - you know, copy is optimization, you can make your copy better. But when you get to A/B testing, you need a certain volume for the stats to work - you can't run statistics on small sample sizes - so hence a thousand purchases is the bar there.
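Peep's thousand-purchases rule of thumb follows from the sample sizes A/B tests need. As a rough illustration (this sketch and its baseline rate and lift are my own made-up numbers, not CXL figures), the standard normal-approximation formula for a two-proportion test shows how quickly the required volume grows:

```python
import math

def sample_size_per_variant(baseline_rate, relative_lift,
                            z_alpha=1.96, z_beta=0.8416):
    """Rough visitors-per-variant estimate for a two-proportion A/B test.

    Uses the standard normal-approximation formula with defaults of
    95% confidence (z_alpha) and 80% power (z_beta).
    """
    p1 = baseline_rate
    p2 = baseline_rate * (1 + relative_lift)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((z_alpha + z_beta) ** 2 * variance / (p2 - p1) ** 2)

# A 2% baseline conversion rate and a hoped-for 20% relative lift
# (hypothetical numbers) already demand tens of thousands of visitors
# per variant - hence the need for real volume before A/B testing.
n = sample_size_per_variant(0.02, 0.20)
```

Note how a higher baseline rate shrinks the requirement: the same 20% lift on a 10% baseline needs only a few thousand visitors per variant.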
Sam Cook: [00:11:26] Would that be leads also? So say you're talking about e-mail opt-ins - that would be a similar number--
Peep Laja: [00:11:32] Yeah. If you are optimizing for lead capture, then for sure it can be leads. If we're optimizing for more purchases, then, you know, you'll need to look at the purchase count.
Sam Cook: [00:12:41] Okay. 'Cause a lot of the people in our audience do information-product-based marketing, and due to the small number of purchases if they're doing high-ticket items versus lower-ticket items, they would want to optimize for leads.
Peep Laja: [00:12:58] Right. So AB testing is kind of out of the picture. You can do all the other components of conversion optimization, just no AB testing.
Sam Cook: [00:12:08] Yeah. So Peep, user experience design, as you said, is basically designing and imagining a website experience, or a customer experience, not only through the website but into the products or the app if it's delivered online or mobile. And conversion rate optimization is really taking whatever has been built and imagined, measuring data, and then taking that product to the next level in terms of optimizing. Would that be a good summary?
Peep Laja: [00:12:45] Yeah, I think so. You got it.
Sam Cook: [00:12:49] So Peep, one of the things that I really learned from you and appreciated about your methodology was the importance of research, both on the qualitative and the quantitative side, for conversion rate optimization. So talk about qualitative versus quantitative research and why research is fundamental to even beginning the process.
Peep Laja: [00:13:15] Yeah. It's everything. You know, without research there is nothing. Because, what is the goal of research? The goal of research is two things. It is one: to understand the problems with whatever you have right now. So you have some sort of a website right now. So, what are the problems? We cannot improve the current website unless we know what the problems are.
[00:14:12] And while we can, to an extent, figure out what the problems are just by looking at the website, using our experience, it's not super accurate. Like, I've been in this business for a long time, and if I have to just tell people "Hey, change this, this, and this, and push it live," I get it right maybe 60, 65 percent of the time. And that's only slightly better than flipping a coin.
[00:15:03] So it's really-- you really need to gather actual data. Qualitative data is talking to the actual buyers, leads, prospects - so this is surveys, interviews, chat transcripts, user testing, things like that. And then the other part is quantitative data, which is numbers. So this is your web analytics: looking at drop-offs at various funnel steps, doing segmentation, looking at your data by device or traffic source or whatever segments make sense for you, and including also mouse-tracking data - heat maps, things like that.
[00:15:03] So you really want to look at the data and the data will tell you what the problems are. And now once we understand what the problems are, now we can start fixing it.
[00:15:14] So the optimization is A, figure out what the problems are; and then B, we need experimentation to figure out what the best solution is. Because even if we understand what the problems are, it's not that we immediately know what the right solution is. So that's why we come up with using our experience and common sense and all these other things, inspiration, to come up with possible viable solutions or how it could be better.
[00:15:42] And then we should-- In an ideal world, we should test it. We run a test because we don't know which experience is better. Of course, if we can't test, we'll just go with our best hypotheses and push it live.
Sam Cook: [00:16:01] Yeah and Peep, one of the big lessons I learned from you was a bit of humility in design and troubleshooting because I think so many business owners have made the mistake where they put up a website, they didn't have a good designer at the beginning and they're getting results from that website but then they just decide "Well, I'm going to do a complete website redesign."
[00:16:26] And, I actually had this happen to a client of mine who I did a performance-based deal. I built a funnel that was working. And I was optimizing everything with the training I got from you. And, I ended up finishing the deal with the client who was paying me on a performance basis so they're anxious to get out of paying me that 30 percent that I was charging them for all their sales.
And what they didn't know, which I tried to explain to the new agency that was taking over, was "Hey, there's a very in-depth funnel that I've tested and optimized, and you better not take this down." This agency was making their money off of doing a rebrand. So they wanted to rebrand the entire site, and as part of that redesign, they sold redesigning the entire site and tore down the funnel that I built - a simple landing page that led to content and e-mails, marketing automation, all kinds of stuff, and a sales page that was optimized.
[00:18:21] And the client's sales went from really good to nothing. They killed their online sales by doing this rebrand. And I even tried to explain to this person, "Please, do not kill the golden goose," which is this funnel. But they basically just made the mistake of "Well, if we make it prettier, it's going to become better," and they had the complete opposite effect.
[00:18:39] And a lot of people do that because, like you said, when you do it without research and data, even with all of your experience and intuition around this, you only get it right 65 percent of the time, which is a little better than flipping a coin. You know, how many times have you seen that?
Peep Laja: [00:18:56] Too many times. Too many times. I mean, there is a time and place for a full redesign and this is when the existing website has just so many problems that it will take too long to fix every issue like one by one. So it's like, "Well, we just got to throw this out and build a new data driven design." So you still need data.
[00:19:21] The problem is, yeah, I would say like eight redesigns out of ten fail, because how is redesigning done? Just a bunch of designers come in, and they have really good intentions, typically, but they are not data people. They don't look at the data. They just think about, you know, the user experience, what it could be, what would make a good-looking website - maybe they eat some mushrooms to get inspiration, I don't know. And, yes, it doesn't sell.
[00:19:02] So, there is a way to do a managed redesign. So, first before redesigning anything, you need to understand what parts of your website are working and what parts are not working.
[00:19:16] So maybe the performance is good enough, but the website doesn't represent the brand, doesn't communicate the brand - maybe it's a little ugly, let's say. You can do a facelift for a website while keeping structurally everything the same - the same funnels, what goes where, all that stuff - and you just make it look better. It's a facelift, right? That's a safe redesign. Keep everything as it is, just a facelift.
[00:19:45] Many people don't want to do it because they feel it's too superficial or boring. The other way to do this is you look at everything - how the current website performs. You do the full research. Look at the analytics. Understand which pages work and which pages don't. You keep the pages that are working well as they are - a facelift might be possible, but in terms of what goes where, the copy, and how it works, they stay the same. And the pages that are not working well, you now reimagine what those could be.
[00:20:19] So based on the data - we understand the data, we've tried to figure out through qualitative research what is wrong with this page, what the source of friction is, why people aren't clicking here or buying this or that - we'll reimagine those pages and come up with a new design. So it's a hybrid version that really works. This is what our agency does: managed redesigns.
[00:20:45] But of course, there are these traditional agencies that are really good at pitching and they have really fancy designs. And there are executives who just want to buy new shiny toys especially if they include, like, new design trends like video backgrounds, ghost buttons, and all that, you know, fun looking stuff that actually isn't going to work at all.
Sam Cook: [00:21:13] Got it. And Peep, how many times have you had to come in and pick up the wreckage from a design that went wrong?
Peep Laja: [00:21:21] A lot. A lot, a lot. Yeah. So this has become one of our bread-and-butter things, where people go for that shiny toy, and then they revert back and come to us and say, "Whoops, can you fix it please?"
Sam Cook: [00:21:41] Got it. OK. So you've seen all of these trends, and one of the reasons I told that story and wanted to get your perspective on it was for the listeners here who have something that's working: understanding the dangers of redesigning a website without looking at the data. And this goes all the way back to educating yourself on data, analytics, conversion rates, testing, and optimization.
[00:22:13] So, you know, what you talked about is one of the first mistakes I'd like everyone in the audience to learn to avoid, which is accidentally killing something that's working in the name of trying to make it work better. And if you really want to do a redesign - we have a client right now who's doing a redesign of their website - hire someone, like our designer, who does look at data, does a data-driven redesign, and can make those kinds of judgments and tell whether we've made something work better or not, because we've looked at it and benchmarked it before and after.
Peep Laja: [00:22:53] Yeah. We have also worked with traditional design agencies where we give them the wireframe, basically with the full copy. So the wireframe, with the basic mock-ups or whatever, basically shows what goes where on the page, with the full copy. And then the designers can go nuts on how to make it look, right? Creating just the design, the decoration of it - that can work just fine.
Sam Cook: [00:23:21] Yeah. And... So Peep, one of the things that I really appreciated from your class and the reason I came to you was when I first started marketing about 10 years ago with my own website, I had an owner of a... or a manager of a multimillion-dollar company in the tourism business in New York City. And I actually hired his web designers and web developers to do my second website, because the first one was an absolute failure.
[00:23:54] And I saw the difference between having an agency that thinks holistically and does user experience design and plans and develops holistically from the ground up thinking this way. And basically they took what had been working for this other big site and applied it to mine and it worked a lot better than my first website which didn't work at all.
[00:24:11] But one of the things he told me, which was very interesting, was that they had an issue on their checkout page. And when they redesigned their checkout page and broke a huge user experience bottleneck, they basically tripled their business overnight which shot them into the stratosphere in terms of revenue and dominance in the industry.
[00:24:33] And I remember that lesson, and just getting deep into research on conversion rate optimization. I bounced around for a while looking into A/B testing - at the time there weren't that many easy off-the-shelf tools to do it - and then I started exploring multivariate testing. I'd taken one statistics class in college, which made me dangerous. My brother has a Ph.D. in Statistics, so I felt like I had the resources, with my own experience and his, to really master this optimization.
[00:25:05] And, the power of that whole experience led me to you where you taught the principles of conversion rate optimization. I wanted to run through some of those principles with you - the things that I learned and then explain later how we apply that to some client funnels which really helped take their campaigns to the next level.
[00:25:28] So, what is the first principle people need to learn after research in conversion rate optimization? After they do their research, the next thing is coming up with hypotheses and test ideas. How do you explain and walk people through that? Because that's a real art.
Peep Laja: [00:25:50] So the way I teach it is... You know, I've developed this conversion research framework called Research XL. There's a blog post about it. Once you go through these research data-gathering and analysis exercises using qualitative and quantitative data, you then gather every identified issue or problem into a spreadsheet - a regular Google Sheets spreadsheet or whatever. It lists every single issue. Usually after the research, depending on the website, we'll have 50 to 250 issues that we have identified. And each of these issues we then categorize: what type of an issue is this?
[00:26:37] Some issues are, like, "Oh, something is broken. We need to fix it," or it's maybe an analytics-related issue like, "We are not tracking when people are clicking here" or "We are not tracking with people scrolling down. So we need to set up measurements." So it's a measurement issue.
[00:26:53] Or some issues are silly, like people can't read the text because it's a light gray font at 8 pixels. So it's easy: just make the font size bigger. And then there are a lot of issues where, "OK, now we understand this is a problem, but we don't know what the optimal solution is, so we need to run a test." So every single issue that you have found, you categorize: fix it right away, it's an analytics issue, or it needs to be tested.
[00:27:28] Once you have that figured out, then you need to decide: who fixes what in your team? And then you proceed. So let's say you want to fix your checkout page. You look at how many different issues you identified for the checkout page - maybe there were seven different things.
[00:27:52] So now statistics come into play. If you are able to run A/B tests - you have enough transaction volume - you want to estimate whether you can test these changes one by one, which requires an insane amount of traffic. If you don't have an insane amount of traffic, you probably want to merge all seven changes into a single test. And so you wireframe different mock-ups of how you can--
'Cause you know, let's say that we know people are feeling anxious about their credit card security on our checkout page. Let's say that from the qualitative data, we know this.
[00:28:35] Now, how many ways are there to design or improve the perception of security on the checkout page - on any checkout page? The answer is infinite. There are infinite possibilities. That is why you need to run experiments. OK, let's increase the perception of security. Well, maybe it's copy. Maybe we should have some trust badges. Maybe we should talk more about our 256-bit encryption. There are so many ways you can design it, phrase the copy, decide what goes where, all those things.
[00:29:09] So that is why you need to run tests. If you can't run tests, because you're doing low volume, you just need to go with your best bet and implement all those changes right away. And then, if you want to know whether things got better or not, those changes need to have an impact of 20 percent or more. So let's say you look at your Google Analytics: you used to get a hundred leads or purchases a week, and now you get a hundred and thirty a week.
[00:29:44] OK, it's probably because of the changes we made. But if it's like 105 or 106? Then we can't be sure, because your data is not static. Your conversion rate is always fluctuating 10, 15 percent up and down on a day-by-day basis.
[00:29:59] So if things got 7 percent better, you can't see it in analytics, because it might be normal fluctuation - you're unable to tell. That is why A/B testing is the only way to really know, unless the improvement from the changes you make is significant. You went from 100 purchases a week to 150 or 200 - these are really big gains, which in real life you don't see that often. But you know, it's possible.
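Peep's point that a 7 percent lift disappears into day-to-day noise can be made concrete with a two-proportion z-test, a common significance check behind simple A/B tests. A minimal sketch (the visitor and conversion counts below are invented for illustration):

```python
import math

def two_proportion_p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value for the difference between two conversion rates,
    using the pooled two-proportion z-test (normal approximation)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = abs(p_b - p_a) / se
    # Normal CDF via the error function; doubled for a two-sided test.
    return 2 * (1 - 0.5 * (1 + math.erf(z / math.sqrt(2))))

# A 50% lift on decent volume is clearly significant...
big_lift_p = two_proportion_p_value(100, 5000, 150, 5000)
# ...while a 7% lift on the same volume is indistinguishable from noise.
small_lift_p = two_proportion_p_value(100, 5000, 107, 5000)
```

With 5,000 visitors per arm, going from 100 to 150 conversions passes the usual 0.05 threshold easily, while going from 100 to 107 does not - exactly the "you're unable to tell" regime Peep describes.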
Sam Cook: [00:30:31] Yeah. You know, Peep, you've come up with tests and one of the things that you taught me was the humility of putting your stuff out there and how often tests fail. How often do tests normally fail? AB tests.
Peep Laja: [00:29:49] Well, "it depends" is the correct answer, because it depends on how optimized your website is. If your website is really crappy, it's easy to come up with better ideas based on the data. If your site is super optimized and thousands of tests have already been run on it, then the win rate can be really low.
[00:31:10] So like for a super optimized website, a 10 percent win rate is pretty good. If it's a website where, you know, you haven't run any tests on or maybe just a few, then I would say that 40 to 50 percent is more likely.
Sam Cook: [00:31:39] OK. So 40 or 50 percent of tests on a well-optimized website will actually fail and lose you money.
Peep Laja: [00:31:36] Well, it's... Losing tests are also rare. It's mostly that the changes you made make no difference whatsoever. That's the typical outcome.
Sam Cook: [00:31:47] Got it. So basically, a win, especially the more optimized your website gets, just gets harder and harder to lock in and to find.
Peep Laja: [00:31:57] Exactly right. And you need better and better skill level to be able to squeeze more wins out of a highly optimized website.
Sam Cook: [00:32:05] Yeah. And Peep, one of the things that I like to tell my coaching clients about tests is, "Well, if it doesn't win, what did you learn?" You know, you have a hypothesis that you make when you test. You have a theory, and you have something that you believe or assume about your ideal customer that underpins the reason you put all the time and energy into that test.
[00:32:34] Talk about assumptions, and learnings from assumptions and tests. And would you say this learning is sometimes even more valuable than the numerical outcome?
Peep Laja: [00:32:45] It definitely can be. 'Cause also, let's say traffic is not a limitation. Let's say you're able to test 20 headlines on a sales page. You have good copywriters and you worked hard to come up with 20 new variations - 20 headlines - and there's no difference at all. Well, then we can learn. We know for sure that the headline doesn't really matter. And that's a big learning.
[00:33:16] So, like, stop wasting our energy on the headline; let's focus on other parts of the page. The same goes for whatever element. So knowing what matters and knowing what doesn't matter is huge. That being said, if you only test one variation - let's come back to the checkout page example. We want to increase the perception of security. We put up "Oh, you know, we have 256-bit encryption on this page," we put some graphics and text on it. And it loses. That doesn't mean that trust really wasn't an issue or doesn't matter. It might still matter; just your implementation was wrong.
[00:33:59] So, you can only claim that something works or doesn't work if you've tested it a bunch of times. You tested different ways to tackle the same issue.
[00:34:10] Another way to learn from tests - and this again is when you have large volumes - is you look at segments. Let's say you run an A/B test and the result is flat; there's no difference. But then you split. You zoom in by device and say, "Oh my God! On desktop it's winning by 20 percent, but on mobile it's losing by minus 20 percent." So they cancel each other out. And then you go, "OK, so these changes we made really resonated with people on desktop computers, whereas for mobile this was a bad idea." And then you start thinking - that can lead to new insights, and then you start building different things for different devices.
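The desktop-up, mobile-down cancellation Peep describes is easy to reproduce with per-segment arithmetic. A toy sketch (all visitor and conversion counts are invented) where the pooled test looks flat but the device split tells the real story:

```python
# Hypothetical A/B test results, broken out by device segment,
# as (conversions, visitors) pairs.
results = {
    "desktop": {"control": (100, 1000), "variant": (120, 1000)},  # +20%
    "mobile":  {"control": (100, 1000), "variant": (80, 1000)},   # -20%
}

def lift(segment):
    """Relative conversion-rate lift of variant over control in a segment."""
    c_conv, c_vis = results[segment]["control"]
    v_conv, v_vis = results[segment]["variant"]
    return (v_conv / v_vis) / (c_conv / c_vis) - 1

# Pooled over all segments, the test looks like a dead flat result...
total_control = (sum(r["control"][0] for r in results.values())
                 / sum(r["control"][1] for r in results.values()))
total_variant = (sum(r["variant"][0] for r in results.values())
                 / sum(r["variant"][1] for r in results.values()))
overall_lift = total_variant / total_control - 1

# ...but the per-segment lifts reveal two real effects cancelling out.
desktop_lift, mobile_lift = lift("desktop"), lift("mobile")
```

Here `overall_lift` comes out to zero even though desktop is up 20 percent and mobile is down 20 percent, which is why segmenting a "flat" test by device or traffic source is worth the extra volume it requires.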
Sam Cook: [00:34:53] Got it. Yeah, I think that's really when you get into the advanced level: understanding the different experiences people have based on where they are - mobile or desktop - and that's a huge thing that I think people overlook. Especially watching people look at their websites on desktops without even bothering to use Chrome or some other extension to preview how it's going to look and feel on mobile.
Peep Laja: [00:35:20] Yeah! And the same goes for traffic sources, because some sources are warm traffic - maybe traffic from your own e-mail list - and some is cold traffic from PPC and so on. They need different types of information and assurances.
Sam Cook: [00:35:36] Yeah. Yeah, we've definitely seen that with Facebook ads, for whatever reason, for us and our funnels for free video series. Leads almost always seem to convert better on mobile. With the Facebook experience - videos leading to our landing page - we have a better cost per lead from Facebook on mobile, and it's quite interesting. We still haven't come up with a definitive reason or hypothesis behind that, but we have some general ideas which we think are useful.
[00:36:11] But I think one of the points we like to emphasize is that you never know for sure. You just have some level of certainty, based on a lot of experience and other research, about whether that learning is true, but there are so many factors going into it that it's hard to ever nail it down.
Peep Laja: [00:36:33] Yeah. I'm with you.
Sam Cook: [00:36:37] So Peep, one of the other things I'd like to talk about is the local and the global maximum, and the idea behind testing and optimizing a small hill, as it were. There's actually a cool diagram - maybe you can send it to me for the show notes - or a blog post that illustrates this: you can only optimize so much in one direction before you've maxed it out, and you don't want to ignore other, greater opportunities. Talk about that a little bit.
Peep Laja: [00:37:06] Yeah, sometimes what happens is that you keep working on the website, you keep optimizing it, and then you reach a point where you can't improve the performance anymore. Whatever changes you make or whatever you test, nothing works. It's just flat. You've reached the maximum possible results out of this website. This is a concept called the local maximum.
[00:37:34] So that means that the way your website is currently structured, that's the maximum it can do, but there is a higher maximum if your website's architecture, if its structure, were different. This sometimes calls for a radical redesign, a radical rethinking of what the experience on the website might be like. And of course you don't want to go down that path we talked about, where you just redesign and see what happens. You still need to test your radically new approach, but this is split-path testing: the funnels or websites you're testing against each other are fundamentally different - different funnel steps, different copy, different design - everything's different. And this is how you can sometimes dramatically increase your results.
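The local-versus-global-maximum idea can be sketched as a toy hill-climbing exercise. The "conversion landscape" function below is invented purely to illustrate why incremental tests plateau at the nearest peak, while a radical redesign (a restart elsewhere in the landscape) can reach a higher one:

```python
import math

def conversion(x):
    """Toy conversion landscape with two humps: a local maximum
    near x=2 (height ~3) and a global maximum near x=8 (height ~5)."""
    return 3 * math.exp(-(x - 2) ** 2) + 5 * math.exp(-(x - 8) ** 2)

def hill_climb(x, step=0.1):
    """Incremental optimization: keep making small tweaks
    as long as each tweak improves the metric."""
    while conversion(x + step) > conversion(x):
        x += step
    while conversion(x - step) > conversion(x):
        x -= step
    return x

# Starting from the current design (x=1), small tests plateau
# at the nearby local maximum...
local = hill_climb(1.0)
# ...while a radical redesign (restarting at x=7) reaches the higher peak.
radical = hill_climb(7.0)
print(conversion(local), conversion(radical))  # ~3.0 vs ~5.0
```

Incremental A/B tests are the small `step` moves: once every step goes flat, you are at *a* peak, but not necessarily the highest one, and only a fundamentally different starting point can reveal that.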
Sam Cook: [00:38:27] Yeah. And for people familiar with some of our work, we do a lot of video-based funnels. So the story and the production quality inside those videos - the sound, the music, and all these other things - have had big impacts on the conversion rate of our funnel, and it makes sense. I mean, our pages are very minimal. We get people's e-mails and we drive them through videos.
[00:38:58] And I've actually found, looking at some of the data - and I didn't want to believe it for a long time, because I had spent a lot of time redesigning the talk that we delivered and I felt the content was definitely better - but we're starting to look at it and really think that maybe the first talk I gave, and the first videos we made, were actually better converting.
[00:39:24] It's taken me a while to come to terms with that because, you know, I put so much time and energy into the new videos. But we're actually going to run what I would call a big test, where we create two different paths for the videos. There are so many intangibles in a video - the lighting, my mood at the time, the delivery of the content. Even if the content was better, all those other intangibles could make the experience better for the user. For those of you familiar with our information-based education funnels, that's something we've learned, and it's been really interesting.
[00:39:58] So, Peep, I know you have to wrap up, and I just wanted to give everyone one final short story or insight from the funnels that we do. I've taken Peep's training, and when we did our last million-dollar funnel with Peter Sage as a client - and we're working on our own agency funnel that's approaching that - we optimized every step and tested everything, and we were able to bring our lead costs down from an initial cost of around 70 dollars per phone number - a qualified lead who wanted to speak to us - to around seven dollars.
[00:40:36] And it was just through applying the methodology Peep's talking about, where you examine assumptions, talk to users, look at the qualitative data and the quantitative data, and come up with ideas and hypotheses about what would work better. That's something we're going to be talking about at the StoryMatters Live Conference and the workshops that we deliver. We're going to talk about funnel optimization. Peep's going to do one session at the workshop, which we'll record for those there, and then I'll be giving some other sessions from my agency on our learnings, to emphasize the power of testing assumptions around your stories and your funnel.
[00:41:13] So Peep, I just wanted to thank you for coming on and lending your insights to the StoryMatters community. I know this will definitely spark some ideas and insights for people to start researching and following your blog, and we'll put the link to conversionxl.com in the show notes.
[00:41:32] And Peep, any last thoughts you'd leave people with before you depart here from the podcast?
Peep Laja: [00:41:38] The rabbit hole of conversion optimization goes really deep, and whoever claims that this is simple and easy is not knowledgeable enough. So, keep on learning, guys.
Sam Cook: [00:41:50] Well, Peep, thank you. I'm really looking forward to seeing you at StoryMatters Live, and I definitely look forward to having you back in the neighborhood in Europe this summer. I know you also have your own event in Estonia, which we'll also link to. And thanks again for the rigorous, systematic approach you've developed - which is quite easy to follow in your free guide to conversion rate optimization - and for the education you provided me online. I know it's available to other people who really want to master this.
[00:42:23] So, your blog is a great free resource and I highly encourage everyone to go visit it, download the guide, and get on your newsletters. So thanks for joining us, Peep, and I look forward to seeing you in July.
Peep Laja: [00:42:34] Oh, it's been fun. Thank you.
Sam Cook: [00:42:35] All right. And thank you, StoryMatters podcast listener, for joining us for another episode. If you would like to get notified of future episodes, please go to iTunes and subscribe. Also, please don't forget to leave a review if you enjoyed the podcast, so you can let other people know - especially in the European market community - about StoryMatters. And if you're interested in the StoryMatters Live event, please go to the website and sign up for our free masterclass on StoryTelling in the Digital Age, and you'll get all the links you need to StoryMatters Live and the special pricing, which goes up at the end of the month. Looking forward to seeing you on the next episode and hopefully meeting you in person this summer.
StoryTelling in the Digital Age MasterClass
3 Hour Video MasterClass, EBook, Slides PDF, Resource Guide,
2 x 1-on-1 FREE Marketing Discovery Calls,
14-Day Free Trial of the StoryMatters Academy.