#34: What do 34,000 landing pages have to teach you about yours? Featuring Sahil Patel from Spiralyze

Are your landing pages truly working for you, or are you caught in a cycle of driving traffic to pages that fail to convert effectively?

Today, we’re excited to welcome Sahil Patel, CEO of Spiralyze. Sahil leads a company that has analyzed over 34,000 landing pages for SaaS leaders, drawing on insights from more than 130,000 A/B tests. With his deep expertise, he’s here to share actionable strategies for improving conversion rates, debunk common testing myths, and help you escape the trap of underperforming landing pages.

About Sahil Patel

Sahil Patel is the CEO of Spiralyze, a leading company specializing in predictive CRO and data-driven landing page optimization. Prior to Spiralyze, Sahil was the CEO and founder of ER Express, a successful SaaS company, which he led for 11 years before its sale in 2021.

With over 23 years of experience in sales, operations, software development, and finance, Sahil also holds an undergraduate degree from Emory University and an MBA from Harvard.

Throughout his career, Sahil has worked with renowned organizations such as Netflix, Podium, the NBA, Lowe’s, Harvard University, and Gusto. He specializes in landing page conversion optimization, drawing on more than 130,000 A/B tests, and offers unique insights into best practices for efficient A/B testing.

Sahil is based in Atlanta, where he lives with his wife and two daughters, and pursues his passion for music through his Rush cover band.

Resources

Spiralyze website: https://www.spiralyze.com

The B2B Agility podcast website: https://www.b2bagility.com

Sign up for The Agile Brand newsletter here: https://www.gregkihlstrom.com

Get the latest news and updates on LinkedIn here: https://www.linkedin.com/showcase/b2b-agility/

Check out The Agile Brand Guide website with articles, insights, and Martechipedia, the wiki for marketing technology: https://www.agilebrandguide.com

Transcript

Note: This was AI-generated and only lightly edited.

Greg Kihlstrom:
Are your landing pages truly working for you, or are you caught in a cycle of driving traffic to pages that fail to convert effectively? Today, we’re excited to welcome Sahil Patel, CEO of Spiralyze, a company that’s crawled 34,000 landing pages to find everyone else’s A/B tests. He’s here to share actionable strategies for improving conversion rates, debunk common testing myths, and help us escape the trap of underperforming landing pages. Welcome to the show, Sahil.

Sahil Patel: Greg, thanks for having me. Really glad to be here.

Greg Kihlstrom: Yeah, looking forward to talking about this with you. Before we dive in, why don’t we start by you telling us a little bit about your background and your current role at Spiralyze?

Sahil Patel: Happy to do that. I’m the CEO at Spiralyze. We’re a conversion rate optimization company headquartered in Atlanta. Personal stuff about me: I’ve been in Atlanta a long time and love it here. I moved all around the country, and this has been home. I have a great family, a wife and two daughters. And in my spare time, I do music. I play in a Rush cover band, which is like the nerdiest thing ever.

Greg Kihlstrom: Nice. What instrument?

Sahil Patel: I play guitar.

Greg Kihlstrom: Nice. Awesome. Cool. Well, let’s dive in and talk a little bit. I want to start with the 34,000 landing pages. Can you talk a little bit about what the process was, and what the learnings were from looking at so many A/B tests?

Sahil Patel: Yeah. So the learning here is, first of all, you should borrow from the best. If you’re doing any kind of website design, rewriting your homepage, or, the gold standard, running an A/B test that produces a statistically significant outcome for your website, a great place to start is just to see what other people have done. And that’s not a new insight. Lots of people do it. There are swipe files. There are tons of people posting on LinkedIn: do this, do this. And here’s the reality: a lot of those are thinly disguised anecdotes dressed up as best practice. Why is it a best practice? Because someone calls it a best practice. It’s certainly one way, and it’s probably better than doing nothing, or just sitting at a whiteboard going, well, could we do this? Could we do this? That’s how you end up with, like, the dog CEO on the homepage. Someone’s like, oh, our competitor did it, let’s do it too. Or, boy, wouldn’t that be nice?

I think a better way is to get a sample size of more than, like, two. Maybe 10. That’s a good starting point. What’s even better is a sample size big enough that you can actually say, let’s find the ones that are relevant to us: companies like us, solving problems like we do. Okay, great. Now, how do you know which ones to follow? Because if you look at five companies, they might have five different homepages, and presumably they’re all pretty good or they wouldn’t be in business.

So what we do that’s unique is we crawl thousands of websites, 34,000 in fact, to find everyone else’s A/B tests. These are other companies, not customers of ours, running tests where they have two versions of their homepage, for example, or of a landing page. Half of the visitors see one version, half see the other. Visitors don’t know they’re part of an experiment, and the company measures the difference. What is the difference? It might be whether someone clicks to put the shoes in the shopping cart. In an enterprise context, and most of the companies we work with are enterprise, Fortune 1000 and above, they’re trying to get someone to take a high-intent action: click that talk-to-sales button, click see demo now, start that free trial, sign up for that webinar. In the B2B, enterprise world, those are the things you’re trying to get people to do, and that’s the gold standard: can you get more people to do them?

So we go and find everyone else’s A/B tests, and then we find out which ones work for what kinds of companies. Because maybe for cybersecurity companies that sell into the really big enterprise, the Fortune 100, some things work well, and for HR tech selling into the mid-market, different things work. That’s what we do.
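To make the split mechanics Sahil describes concrete, here is a minimal sketch in Python. The function names and the experiment key are hypothetical, not Spiralyze’s implementation; it just shows deterministic 50/50 bucketing and tallying of high-intent actions.

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str) -> str:
    """Deterministically split visitors 50/50, so a returning
    visitor always sees the same version of the page."""
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

# Tally visitors and high-intent actions (talk-to-sales clicks,
# trial starts, webinar signups) per variant.
counts = {
    "A": {"visitors": 0, "conversions": 0},
    "B": {"visitors": 0, "conversions": 0},
}

def record_visit(visitor_id: str, converted: bool) -> None:
    variant = assign_variant(visitor_id, "homepage-hero")  # hypothetical key
    counts[variant]["visitors"] += 1
    counts[variant]["conversions"] += int(converted)
```

Hashing the visitor ID, rather than randomizing on every page load, keeps the experience stable for repeat visitors while still splitting traffic evenly.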

Greg Kihlstrom: So then essentially you’ve built a prediction engine based on this, right?

Sahil Patel: That’s exactly right. That’s the data for the prediction engine. And we’re doing two things. We’re using the wisdom of the crowd to find the signals in all the noise. And then when we find those signals, we run them with our clients, with our customers. When we do that, we know exactly what works and what doesn’t, because we’re actually working with those companies to run those A/B tests, those experiments, on their websites.


Greg Kihlstrom: Yeah, so it seems you’re benefiting both from your own engine improving itself based on customers, and from when someone out there who isn’t a customer has some crazy idea and it really worked.

Sahil Patel: It works. And the question is, will it work again? I think the fallacy of A/B testing, or one of the fallacies, is: because it worked for someone, it’ll work for me. Now, that doesn’t mean you shouldn’t at least try it. But there’s no way to know from the outside what made it work, why it worked, if you’re not inside the company. And what does “work” even mean? Was it a lot of lift? Was it a little lift? Sometimes people say, hey, I have a page that just looks better. It’s a better UX. It didn’t improve conversions, but it also didn’t hurt them. That’s good enough; I’m just trying to modernize it, make sure I don’t do harm and tank my sales pipeline. That could also be “work.” If you take that idea hoping to get a 10% increase in conversions, you might be sorely disappointed. So the question is: is it repeatable? And is it repeatable in a set of circumstances that apply to you? Because if you’re a cybersecurity company selling into the big, big enterprise and you take a big swing on some really out-there idea and it works, I don’t know that it necessarily works if, to use my earlier example, you’re selling into mid-market HR tech, and vice versa.

Sahil Patel: Yeah. So by all means, do out-there things; be an experimenter. But your best chance of success comes when the result has been repeated.

Greg Kihlstrom: Yeah. Well, those that are struggling to get those results, a lot of them are just pouring money into driving traffic, right? If you don’t know how to improve your page, or lack the capability to, I guess the solution becomes: let’s throw money at the problem with a lot of advertising. So we’ve got a lot of companies investing heavily in paid traffic to landing pages but not seeing results. What would your advice be to these companies that are throwing good money after, potentially, not-so-good results?

Sahil Patel: First of all, I would tell them they’re in good company. A lot of really good companies spend a lot on traffic and haven’t invested in optimizing the landing page. And the first problem, most often, is a dog food ad with a cat food landing page. If you have a great hook on your ad, it means people clicked on it, and the right people clicked on it. You should just repeat that hook, and it should be obvious the moment someone reaches the landing page. That’s number one. Number two: why would you send that traffic to your homepage? Think about it this way; I’ll use a restaurant analogy. Think of your homepage as the front door to your restaurant, and think of landing pages as the drive-thru window. The drive-thru does a very specific thing in a very specific way, and you go there because you want a certain kind of experience: you stay in your car, you pull up, you speak to someone, you pay, and you get the food. That is not the place where you want someone to welcome you to the restaurant, ask how your day is, and encourage you to have a drink and linger.

Greg Kihlstrom: Yeah, yeah. Some of this also speaks to something you touched on already: what is the measure of success? In other words, if the measure of success is a high click-through rate on ads, then to your point, if lots of people click on dog food ads but you’re really selling cat food, the advertising team says, yay, we did a great job, we got a lot of people to click. The landing page team? Eh, not so successful, right?

Sahil Patel: Yeah.

Greg Kihlstrom: And the sales team, even worse: where are the leads? Right, right, exactly. So is some of this at least indicative of the need to close the loop and see what actually drives results? In other words, the landing page is only as successful as its inputs and outputs?

Sahil Patel: I might put it a little differently than “it’s only as good as the inputs and the outputs.” Let’s take the scenario of a paid landing page, where a search engine ad or a social media platform brings people there. Say it sends you 100 people a day. If you’re lucky, one of them will convert. The data shows that somewhere in that other 99 there are people who want to convert; you just didn’t give them an experience they found compelling. Now, you could spend money so that instead of 100 people coming to your page, you get 200. That might get you one more conversion. But before you do that, I would say: optimize the landing page. You’ll have the same inputs, the same traffic, the same quality. You’re not spending more. You’re not changing your keyword bidding strategy. You have a great dog food ad? Give them a dog food landing page.

Now let’s apply that to the enterprise, the Fortune 1000. No one’s selling dog food here; you’re selling enterprise software that’s complex, with a long consideration cycle. No one’s buying a cybersecurity vulnerability scanning or perimeter security platform based on an ad, or even a landing page. But if they find something compelling, they go: oh, I clicked on the ad because I was looking for something in this category, especially if it’s a non-branded search term, and this looks interesting. I do want to talk to them, or go to their webinar, or read this article or white paper. Somewhere in that 99 out of 100 who didn’t convert, there’s a lot of opportunity. There’s going to be a mix of higher intent, middle intent, and lower intent. The game is mostly won in the middle intent. The high intent are almost certainly going to convert; you could give them the world’s worst landing page and they’re so high intent they’ll do it anyway. The low intent aren’t there to convert today. They’re learning; this might be the first time they’ve seen you, you might be a challenger brand to them, they might just be doing research. Totally fine. You don’t want to turn them away, but that’s not where the game is. The game is won in the middle intent.
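Sahil’s arithmetic here is worth making explicit. A rough sketch, with a hypothetical $5 cost per click: doubling traffic doubles conversions and spend, so the cost per conversion stays flat, while doubling the conversion rate on the same traffic halves it.

```python
def conversions(daily_visitors: int, conversion_rate: float) -> float:
    """Expected conversions per day."""
    return daily_visitors * conversion_rate

def cost_per_conversion(daily_visitors: int, conversion_rate: float,
                        cpc: float = 5.0) -> float:
    """Ad spend divided by conversions; the $5 CPC is an assumption."""
    return (daily_visitors * cpc) / conversions(daily_visitors, conversion_rate)

print(cost_per_conversion(100, 0.01))  # baseline: $500 per conversion
print(cost_per_conversion(200, 0.01))  # double the traffic and spend: still $500
print(cost_per_conversion(100, 0.02))  # optimized page, same spend: $250
```

Same budget, same keywords, same traffic quality; only the page changed.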

Greg Kihlstrom: So you’ve seen a lot of landing pages, obviously. Your software has seen tens of thousands, and probably many variations of each. What are some of the myths and misconceptions you’ve encountered when customers approach you to do A/B testing on their landing pages?

Sahil Patel: Yeah, I think there are three myths I can share specific to the enterprise world of A/B testing. Number one is about showing the product. Number two is about making the page easy to skim. And number three is... well, I’m not giving you the myths, I’m giving you the truths. The truth is: show the product. The truth is: make your page easy to skim. The truth is: video often cannibalizes conversions. Actually, Greg, you know the audience; this is your audience. Which of those three do you think you’ve seen most often, or that the people listening today will find most interesting, so we can unpack it?

Greg Kihlstrom: Yeah, I mean, I think the video one might be interesting.

Sahil Patel: Yeah, video is a good one. So, our data shows video tends to cannibalize conversions. That’s surprising to the companies I work with. It’s not that their videos aren’t well produced or compelling; they are. And they hear me saying video cannibalizes conversions and ask, how could that be? Here’s how: video uses up people’s attention. Visitors have a finite reservoir of attention. They might be willing to spend 30 to 60 seconds on your website, and if you have a 40-second video, you’ve left them 20 seconds for everything else. And the answer is not, okay, let’s make the video 15 seconds. Short video of course has a place. What video does is send your highest-intent audience off onto this thing that eats up their attention.

So here’s what I’d say. It doesn’t mean kill the video, or that all video is bad, or that video doesn’t work. I was literally on a call before we got on today, looking at an A/B test where video beat something else. You should test it; that’s why you test everything. But especially on a landing page or a high-intent page, keep the video and move it lower on the page. What works instead, what the data shows, is that a single, beautiful, crisp image of your software beats video. Why? Number one, it’s easy to scan. Number two, it doesn’t eat up your attention. Number three, the brain immediately sees it and goes, ooh, this looks interesting, and wants to learn more. For the highest-intent audience, the image then releases their attention and lets them continue on the journey; they’re there to convert. Your medium and lower intent aren’t quite ready. They’re going to scroll, and that’s a great place to give them the video, where it absolutely can work. Think of someone who has never heard of your brand. They put in a category search, say, vulnerability scanning software, not looking for any one particular brand, and they come to your landing page. They see a beautiful product screenshot and copywriting that lines up with the ad they clicked on. The highest intent go: great, I want to learn more, or I’m ready to do this. The medium intent go: this looks interesting, I need to read a little more, watch a little more, and they scroll. Maybe the first or second section down the page, with the video, how it works, how it integrates with your tech stack, a customer testimonial: that’s a great place to put the video.

Greg Kihlstrom: Nice. Definitely some things to think about, and I like the recurring theme here: you’ve got to test this stuff for yourselves too, because no two cases are exactly the same. And along those lines, in testing and research there’s often an emphasis on achieving statistical significance. How important is achieving optimal statistical significance versus moving quickly and learning? Is there an equilibrium there, or a point where there’s too much focus on one or the other?

Sahil Patel: I think so. There are some CRO people who are going to cringe at what I say, and that’s okay. I have a commercial sensibility. Look, I think the job of CRO, especially in the enterprise, is to make the cash register ring.

Sahil Patel: Full stop. So when you run a test, three things can happen: the test is a big loser, the test is roughly neutral, a little good or a little bad, or it’s a big winner. If it’s a big loser, you should run it long enough to know that it’s a real loser, but I don’t think you need 95% statistical significance to know something’s losing, and I rarely see it flip. If something’s neutral, some edge cases aside, same thing. The big winners are the ones I want to be skeptical of. I want to run those longer, and I think you should put a lot of scrutiny on any test that’s delivering double-digit lift. But if you apply that scrutiny and it still wins, you have a real winner. That’s the stuff dreams are made of; that’s why we play the game, to get that kind of double-digit lift. So that’s my take: go fast, kill the losers, kill your darlings, find the winners, and be skeptical of the winners.
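For readers who want to pressure-test a “big winner” the way Sahil suggests, the usual tool is a two-proportion z-test. This is a generic statistics sketch, not Spiralyze’s methodology, and the traffic numbers are invented.

```python
from math import erf, sqrt

def ab_significance(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-sided p-value for a two-proportion z-test: is variant B's
    conversion rate genuinely different from variant A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Convert |z| to a two-sided p-value via the standard normal CDF.
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# A claimed double-digit lift: 60/2000 (3.0%) vs. 82/2000 (4.1%).
p = ab_significance(60, 2000, 82, 2000)
print(f"p = {p:.3f}")  # ~0.06: suggestive, but not yet under 0.05
```

A lift this large on this much traffic still doesn’t clear the conventional 0.05 bar, which is exactly the kind of scrutiny Sahil is arguing for on apparent big winners.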

Greg Kihlstrom: Yeah. Well, and then when’s the right time to scale? You kind of answered this, but when’s the right time to scale a successful test? What do you need to know?

Sahil Patel: Tell me more, Greg, about what you mean by scale a successful test.

Greg Kihlstrom: Some of this could be throwing more ad dollars into driving traffic, or potentially rolling out copycat pages, something like that.

Sahil Patel: Yeah, so we’ll take the first one. You find a winning test. Should you then put more fuel in the tank and really drive more traffic? I would be cautious about scaling on one winner. I like to run at least five different experiments on a page first, because what you really want to do is find the winner of the winners. I’ll give you an example. I’ve seen two very different tests: one that goes strong on showing the product, big product screenshots, really giving people that visual enticement, and one that goes heavy on social proof, showing logos, trust badges, testimonials, case studies. Both beat the original control, because the original control was just not that good. But you have to pick one version. So you ought to test both, and then maybe run them head-to-head against each other.

Sahil Patel: I would do those things before I say, okay, now I’m going to add 30% more spend and get more traffic here. That’s number one. Number two, again, I would be cautious before saying, okay, we did this and therefore we quote-unquote learned something, and now we’re going to do it on all of our landing pages. I’m very skeptical of learnings, of the whole world of learnings. And again, some very smart, very capable people who know more about CRO than I do are going to strongly disagree with this. I think most of the time, what you learn from an A/B test, whether it won or lost, is that it worked, or didn’t, for this page, for this audience, in this time period.

Sahil Patel: And that’s it. Full stop.

Greg Kihlstrom: I would agree with that too. That’s the whole idea behind continuous improvement: the world doesn’t ever stay the same.

Sahil Patel: It’s deeply unsatisfying, because there’s a whole cottage industry, I think, of people who have built up this idea of learnings. Oh, we’re going to learn something from this losing test, and that means we should never do this again on our website. And I just don’t think you know that. We can tell ourselves that, and we want to impute the learning: we ran a test, and when we showed the customer logos below the call to action, it failed to beat the control, so the learning is that social proof with logos doesn’t work. And I’m like, I don’t think you know that.

Greg Kihlstrom: Well, yeah. Institutional knowledge in a very quickly changing world can, I think, be a dangerous thing in that regard.

Sahil Patel: Yeah, it is. And admitting that is just not satisfying. It’s much nicer to say, well, we learned something.

Greg Kihlstrom: Right.

Sahil Patel: It’s dynamic. The whole point of A/B testing is that you’re trying to run a kind of clinical experiment, and, I want to say this carefully, you’re attempting to control the variables. You can do some things to give yourself control, but don’t fall for the illusion that it’s a perfectly controlled experiment from which you can extrapolate and say we learned something. Not from one experiment, or even a few. Now, I do think there’s a longitudinal learning that absolutely happens if you run a lot of experiments and have a lot of data. I’m not talking about an outside third party; they have a different kind of learning, because they just get reps, repetitions. But if you work inside a company, an enterprise, Fortune 1000 company, and you run experiments for a year, I think you can draw some learnings from that. You can say these kinds of things tend to work, and these kinds of things tend not to work.

Greg Kihlstrom: So, one last question before we wrap up. I wanted to get your thoughts on what the future looks like. We’ve barely talked about AI, which is a surprising thing for a podcast in 2024, but what’s coming down the pike? What are you looking at as far as emerging trends, or technologies that are going to help companies like Spiralyze do your work and do it better?

Sahil Patel: Okay. So one, I don’t want to make this an infomercial about Spiralyze, but our business has a network effect built into it, and that’s one of the reasons I’m excited about it. It’s not just based on us acquiring more customers and running more tests for those customers. We have to do that; it’s how we make money and how we stay in business. But there are many more people running A/B tests who don’t work with us than who do, and we’re crawling the internet to find all those A/B tests. So our customers benefit from everyone else’s A/B tests, and I think that’s pretty cool. It’s the definition of a network effect: the value to the users grows as the number of users grows. Okay, I’m geeking out a little bit there on network effects.

I also think that in the era of AI-generated stuff, it’s just harder to cut through the noise. And I think good landing pages, not ones that surprise people by being bizarre, but ones that deliver user delight, are a way to cut through it. And there’s a false dichotomy out there. It makes for good clickbait, and it’s kind of like having a beef on LinkedIn, which I don’t do, but there’s a whole school of thought where you want something to rail against: brand on the one side, performance marketing on the other, and somehow they’re at odds. I fundamentally disagree with that. You might say they’re flip sides of the same coin, but they’re a hundred percent the same coin. Performance is brand, and brand is performance. You can’t have one without the other; they’re both there to make the cash register ring. The way you do that: performance marketing tends to be a little more near-term, you’re running ads, doing demand gen and demand capture, and brand tends to be a little more of a longer-term play. You ought to be doing some of both. But if you run A/B tests that don’t honor the brand, I don’t think it works. And if you let branding gatekeep and calcify your landing pages, where no one is willing to try anything different, you damage the brand too, because both of those are going to hinder your ability to make the cash register ring.

B2B Agility with Greg Kihlström