split testing Archives - Done For You
https://doneforyou.com/tag/split-testing/

Split Testing Software: Optimize And Improve Your Website
https://doneforyou.com/how-to-use-split-testing-to-optimize-and-improve-your-website/
Fri, 05 Jun 2020

The post Split Testing Software: Optimize And Improve Your Website appeared first on Done For You.


Get This Sales Funnel Custom Built >> Click Here!

Video Transcript:

Jason Drohn:
All right. Hey, what's up? This is Jason Drohn, welcome to today's episode of Sales System Experts. How are you doing, Aaron?

Aaron Parkinson:
We are live.

Jason Drohn:
We are live. Yes, we are.

Aaron Parkinson:
I'm doing great.

Jason Drohn:
That's fantastic. Any good news from the week?

Aaron Parkinson:
So much good news. Did a webinar yesterday, had 2,300 people registered, 700 of them, ish, showed up.

Jason Drohn:
Wow.

Aaron Parkinson:
Did about 50,000 in sales in the first wave. Now, the replays will go out and all that good stuff, which usually doubles the conversions, so about $100,000 in sales. A really good client of ours, I was helping them do it. We kind of partnered up on it. That was super, super cool. Yeah, man, just so much going on like you, but let me ask you, how was your week? What cool thing happened in your week?

Jason Drohn:
Oh, geez. My cool thing, let's see, I've been on the phone all week.

Aaron Parkinson:
Not that cool really, but cool that so many people want to talk to you and give you money.

Jason Drohn:
Yeah, kind of. Yeah. Yeah. I've been on the phone all week, so it's been a lot of calls, a lot of client calls, just a lot of calls, you know?

Aaron Parkinson:
Did that make it feel like the shortest week in history or the longest?

Jason Drohn:
Oh, it was the longest. I'm an introvert by nature, so yeah, these long call weeks are just not necessarily something that I get into. I'm going to share something that I have been getting into if that's okay?

Aaron Parkinson:
Please do, the topic of our call today.

Jason Drohn:
The topic of our call today is scaling and optimization. This is split testing.

Aaron Parkinson:
Are we going to do 42 calls about this? This could go long.

Jason Drohn:
Yeah. This is more probably just an optimization thing. It's funny because I haven't even shown Aaron this, so we're using this as an example, just as an excuse to have a conversation.

Aaron Parkinson:
Real live example of our stuff that you're filling me in on that we're going to use for training... I love it.

Jason Drohn:
Yeah. Let's see. Okay, so here's a good one. We're testing two different variations of a... So this is a piece of software called Visual Website Optimizer. I was in love with Visual Website Optimizer, and then I wasn't in love with Visual Website Optimizer. Now, I'm back in love with it again.

Aaron Parkinson:
Sounds like a girlfriend.

Jason Drohn:
Right, I know. It was the very first split testing tool that I ever played with. By and large, it was something I was just kind of really... I enjoyed. It also works very, very well. Then, they kind of added a bunch of features to it, and then it just got really like Microsoft. They did what Microsoft does to the software. They just move in, they fucking destroy it by adding a bunch of stupid shit to it, and then it doesn't work as well.

Aaron Parkinson:
Take that Microsoft.

Jason Drohn:
Since they have cleaned it all back up, they've reined in all their new features and now it's pretty cool again.

Aaron Parkinson:
They've hired somebody from Apple.

Jason Drohn:
Right. This is one test. Basically, in this test I'm split testing our old sales funnel page against a new sales funnel page. The only thing that is different on this page, which typically converts at 2%, is a new video. It kicked me to the new variation, but all it is is this video. This video is an updated video that is 50 pounds lighter than the previous video that I recorded a year and a half ago.

Aaron Parkinson:
This is going to be an evaluation of human nature. Do people trust the fat guy more or the skinny guy more? We're going to find out. This is going to be interesting. I can't wait to hear the results of this. That's the only thing that's changed, right?

Jason Drohn:
It's the same script and everything.

Aaron Parkinson:
Wow. This is going to be interesting.

Jason Drohn:
The winner is the skinny guy or the skinnier guy.

Aaron Parkinson:
Oh my God. You, humans, are so shallow.

Jason Drohn:
The variation one, which is the new video, has gotten three out of 58 conversions so far. That's three successful applications compared to zero out of 51 of the control. Now, this is still the early days. It's not like a 200-click test or anything like that. By and large, three to zero is ridiculous, so 6.9% conversion on the new video, which is saying something, right? Here's the preview. You can see, this is the screenshot of the old video, and then this is the... It looks like this is just kind of a Vimeo screengrab or whatever, as an automated thing.

Aaron Parkinson:
The same thumbnail on both?

Jason Drohn:
Yeah, no, it's a different thumbnail. Both of them are playing, so you have to hit play in both instances, but the rest of the content is the same. The form is the same, the content's the same, the headlines are the same. Everything besides the video is the same.

Aaron Parkinson:
What would be interesting, if the thumbnails are different, is to look at the data of how many people played it.

Jason Drohn:
Oh yeah. The other thing we're tracking is engagement. Both of them do have... You have to play, it's not an automated play. In the thumbnail here, check this out. If we look at the previews, the thumbnail is this overview shot of a laptop. Then, the thumbnail on the new one, Vimeo has this weird capture thing, but with Vimeo, it's me. It's a thumbnail of me, my face, full-motion graphic. They have to click play in both scenarios. If we look at engagement, then engagement is... So we're at 34% when it's a full-motion shot, they have to click play on a thumbnail of me, versus 20% when it's a thumbnail of that overview stock photo thing. It's kind of an interesting little tidbit.

Aaron Parkinson:
Interesting. Are we getting... Because we've got 34% versus 20%.

Jason Drohn:
Mm-hmm (affirmative), engagement.

Aaron Parkinson:
Yeah, which means that video is being played almost 100% more than the other one.

Jason Drohn:
Right, and then this has gotten... It's three to zero in terms of the number of applications submitted from the page. Let's look at the click map. Let's just check this baby out. This is control.

Aaron Parkinson:
Yeah.

Jason Drohn:
Then, we see, so we have a click there, a couple clicks through the page.

Aaron Parkinson:
Will this heat map software show us how many clicks were done on that video? Sometimes they'll summarize them in a popup.

Jason Drohn:
I don't know. Currently showing 10 clicks total with an intensity... There's a click map.

Aaron Parkinson:
Maybe the click map will show us.

Jason Drohn:
It doesn't look like it's doing anything. All right, so let's go look at the other one.

Aaron Parkinson:
See, now what would be interesting is put the same thumbnail on both of them.

Jason Drohn:
Yeah, to split test...

Aaron Parkinson:
To see if the thumbnail just got more people to click it, or if that video was the better-performing video.

Jason Drohn:
Look at this. If we look at the heat map for the control, we got a couple of clicks, just random clicks here. This is the heat map for the new video, which is, I mean, it's lit up compared to the other one.

Aaron Parkinson:
It is.

Jason Drohn:
Then, we've got a couple of stuff here. It looks like people are just watching that video more.

Aaron Parkinson:
Yeah. That's what I think. I think that thumbnail of you there talking, whatever you want to call it, a thumbnail of you or whatever is just driving dramatically more traffic through. I'm not sure if I can make a conscientious evaluation of the shallowness of the human race.

Jason Drohn:
It might just be the thumbnail, human being.

Aaron Parkinson:
Yeah. Skinny Jason is more credible than a fat Jason. I think that it might just be a direct correlation to the volume of clicks-through based on the preview image.

Jason Drohn:
Yeah, might be. We should probably just split test the thumbnail. We'll just test the thumbnail and see, maybe we'll update in a couple of weeks and see how shallow the human race is.

Aaron Parkinson:
Yeah. There's a lot of things that I won't share on here that are just so shallow. With split testing, it's like standard marketing 101, there's stuff I test and you're just like, "Oh man."

Jason Drohn:
That can't work.

Aaron Parkinson:
It's so disappointing, the human race, but it's so predictable. It's the go-to every time.

Jason Drohn:
The other one that we're going to look at is the Funnel Factor Lander, which you've already seen in a couple of things. We're doing a couple of different tests. So I split test the main headline. Just to kind of catch everybody up, basically I tested this headline right here and this is the winner by a long shot: convert more clicks to customers. It was a way, way, way long-shot winner over the other three tests that we ran. I locked that one in, and now I'm at this. I'm testing this black box, the text in this black box, so the updated master guide reveals the step-by-step process to building blah, blah, blah.

Then, this is a slightly different variation of that. Then, we have a shorter variation, the master guide reveals the top converting sales funnels. Then, we have another kind of just a tweaked version, so no major, major changes. It's a very prominent space on the landing page, but not necessarily major language changes there. Do you know? Here's the report. One of them came in... So I disabled version one because that was a fricking dud, so nothing going on there.

Then, we have these other versions. Our control is currently the winner. Our control is outperforming the rest from a lead standpoint, so when we're looking at leads. Now, Aaron was like, "Wow, that first sentence is grammatically weird." Yeah, dude, I get it, but it also converts better than everything else, so whatever.

Aaron Parkinson:
In my defense, I did say it could outperform because it's grammatically weird. People will read it two or three times...

Jason Drohn:
Yeah, but here's the crazy part. Here's the anomaly that I was like, "This shouldn't be right. This totally shouldn't be right." All right. Our variation two is converting at 39% and our variation three is converting at 37%. If you're only looking at lead conversion, you would immediately whack these two and then start another split test, wouldn't you? Okay, so watch this.

If we actually kind of focus on a different conversion point, watch what happens. This is our add to cart, so this is how many people add to cart in their variations. Our control has an expected conversion rate of 1%. This is the best performing lead magnet. Check variation two out though, 12.7% add to cart, which means that our... So let me just flip back real quick. I see Aaron zeroing in, he's like, "What the fuck just happened?"

Aaron Parkinson:
I'm all about it right now. I just heard 12% add to cart, got my attention.

Jason Drohn:
The control is winning on the front side. When we're optimizing per lead, the control is converting at 45%, which is awesome. Variation two is converting at 39%, which is less awesome than the control. If we take a step back and we zero in on an add to cart conversion, so the number of people who are going through the confirmation page, buying the video course, or at least attempting to purchase the video course, 12% of variation two clicks through and adds it to the cart. 7% of variation three clicks through to the cart, three out of 51. In this, when we focus on add to cart, variation two is the winner of the split test, not the control.
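The trade-off described here, where the variant that wins on opt-ins loses on sales, can be sketched as a quick calculation. The visitor counts below are invented for illustration; the lead and add-to-cart rates only approximate the figures mentioned in the conversation.

```python
# Compare split-test variants on the down-funnel metric (add-to-cart),
# not just the front-end lead conversion rate.
# Rates approximate the conversation above; visitor counts are invented.
variants = {
    "control":     {"visitors": 100, "leads": 45, "add_to_cart": 1},
    "variation_2": {"visitors": 100, "leads": 39, "add_to_cart": 13},
    "variation_3": {"visitors": 100, "leads": 37, "add_to_cart": 7},
}

def rate(numerator: int, denominator: int) -> float:
    """Simple conversion rate, guarding against a zero denominator."""
    return numerator / denominator if denominator else 0.0

for name, v in variants.items():
    print(f"{name}: {rate(v['leads'], v['visitors']):.0%} leads, "
          f"{rate(v['add_to_cart'], v['visitors']):.0%} add-to-cart")

# The "winner" flips depending on which metric you optimize for:
best_by_leads = max(variants, key=lambda k: variants[k]["leads"])
best_by_cart = max(variants, key=lambda k: variants[k]["add_to_cart"])
```

Optimizing on leads picks the control; optimizing on add-to-cart picks variation two, which is exactly the flip being discussed.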

Aaron Parkinson:
Then, it all comes down for us to ...

Jason Drohn:
Revenue.

Aaron Parkinson:
Return on ad spend. Therefore, the metric that we care about most is the sales metric, not necessarily the lead metric. Now, for people watching this, the three people that will watch it, what would you look at as a base number of conversions for statistical relevance in this case?

Jason Drohn:
I'm usually like 95%. If I can get a probability to be best at 95%, then that's a pretty good number for me, but sometimes it's flat-out wrong. At this point, I'm looking at zero out of 49. I'm like, "Well, there's no way that thing is going to win. Not against these two, so let's let these two duke it out."
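The "probability to be best" number referenced here is a Bayesian metric. VWO's exact computation isn't public, but a common way to estimate that kind of quantity is Monte Carlo sampling from Beta posteriors. A rough sketch, assuming uniform Beta(1, 1) priors and reusing the 3-out-of-58 versus 0-out-of-51 video-test numbers from earlier in the conversation:

```python
import random

def prob_to_beat(conv_a: int, n_a: int, conv_b: int, n_b: int,
                 draws: int = 20000, seed: int = 42) -> float:
    """Monte Carlo estimate of P(variant A's true conversion rate exceeds
    variant B's), using Beta(1 + conversions, 1 + non-conversions) posteriors."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(draws):
        a = rng.betavariate(1 + conv_a, 1 + n_a - conv_a)
        b = rng.betavariate(1 + conv_b, 1 + n_b - conv_b)
        if a > b:
            wins += 1
    return wins / draws

# New video (3/58) vs. control (0/51):
p_best = prob_to_beat(3, 58, 0, 51)
```

With samples this small the estimate is still noisy, which is why the transcript calls these results "early days."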

Aaron Parkinson:
Yeah, I agree.

Jason Drohn:
This is just added to the cart. This is just the number of people who added to the cart. Let's take this one step further and see how many people-

Aaron Parkinson:
Uh-oh, are you going to screw up my mind now?

Jason Drohn:
No, actually. Actually kind of. Right now, variation two and variation three are neck and neck, which is, I mean, it is what it is. Three out of 51, in each case, ended up clicking through to purchase, so these are successful buyers since the split test was rolled out. Just goes to show you that your front-end metric, when you're testing, is not always the best metric to be looking at. We're going to disable this variation.

Being that this is the control and the base, we can't just disable it. The tool can't disable the base variation, so we have to change the base variation and try again. We're going to change the base variation to variation two, the one the split test originally ran against, and then from there, we disable the old control variation because that one sucked. Even though it had the best lead conversion, we don't care about lead conversion. We care about sales. These two are our best from a sales standpoint. Make sense?

Aaron Parkinson:
I like it.

Jason Drohn:
What do you think?

Aaron Parkinson:
One, I think that the initial results are very exciting. Number two, that's why we're always split testing.

Jason Drohn:
Right.

Aaron Parkinson:
Always be split testing, tests running 24/7, because the more testing, the better. As long as you don't overdo it. For me, you can over-test.

Jason Drohn:
Yeah, totally.

Aaron Parkinson:
If you do it in an intelligent, controlled, organized way, like in the agency, we only test with 10% of the ad budget, right? We don't mess up the overall return on ad spend. Then you should always be split testing. You should always be trying to beat the control. Always. I do want to point something out, correct me if I'm wrong: the grammatically incorrect one was attracting stupid people who don't buy.

Jason Drohn:
You could make that case. You could make that case.

Aaron Parkinson:
We might have been getting the highest opt-ins there, but they were dumb-dumbs.

Jason Drohn:
Yeah. You could be making... I mean, because the smart ones buy because they recognize quality, right?

Aaron Parkinson:
That's what I'm saying. The buyers were like, "I ain't buying this. This guy can't even speak English." The dumb ones were like, "I resonate with this guy. He speaks like me."

Jason Drohn:
Oh, that's funny.

Aaron Parkinson:
Thank you.

Jason Drohn:
That is some of the big statistical stuff that has kept me sane throughout this process amidst my 35 calls.

Aaron Parkinson:
I love it. I love it, call this a wrap. I have a hard stop at another meeting and you and I have another meeting later today.

Jason Drohn:
We do.

Aaron Parkinson:
Thank you for joining us again on Sales Systems Experts. We'll see you next week.

Jason Drohn:
All right, see you. Bye.

A/B Testing: Must-know Tips And Recommendations For Insightful A/B Testing
https://doneforyou.com/a-b-testing-tips-and-recommendations/
Wed, 20 Jun 2018

The post A/B Testing: Must-know Tips And Recommendations For Insightful A/B Testing appeared first on Done For You.

If you are not doing A/B testing, or split testing as it's also called, you are leaving money on the table. It's one of the greatest tools available to internet marketers, after the internet itself. It's a shame if businesses and marketers don't use A/B testing the way it's meant to be used.

A/B testing is the method of comparing two versions of the same web page (mostly landing pages, in our case), to figure out which version performs better, given that everything is the same, except for one single element that’s being tested.

According to Optimizely,

A/B testing allows individuals, teams, and companies to make careful changes to their user experiences while collecting data on the results. This allows them to construct hypotheses, and to learn better why certain elements of their experiences impact user behavior.

In another way, they can be proven wrong — their opinion about the best experience for a given goal can be proven wrong through an A/B test.

Split testing is at the very core of data-driven marketing, and it essentially makes sure that you inch towards a more positive ROI, while making precise decisions that lead to less spending for more results.

A/B testing is awesome, but there’s an issue: How do you determine what to test?

A/B testing or split testing

Before we get there, here are a few things you shouldn’t even waste your budget testing:

  • Background color: If you are not using white as your background color, and if you got around to testing the background colors of your pages, there’s something wrong. Get on a strategy call with us if this is the case (we need to talk).
  • Hero section: Image or no image, visuals are important for us humans to process information. You can either use a background image on your landing page, keep the hero section plain, use gradients, or colors. There’s no need to test this.
  • Video vs. No Video: Video just works. If you have a sales letter or a landing page with a video on it, that page is bound to work better than a page without a video.

Now with that out of the way, here’s what you should split test on — from traffic sources to your sales funnels.

Start with testing headlines

At least 8 out of 10 people who arrive on your landing page will read the headline. The headlines you use are the heavyweights of your funnel; they have the capacity to either make that conversion happen or not.

In fact, if there's only one thing you'd bother A/B testing, start and end with headlines and leave everything else alone. Use one version of your headline on the first variant of your landing page and a different version of the same headline on the other variant.

Test your offer type

You might think that the offer you are making is out of this world, but you never know if it's the best offer. The only way to find out what kinds of offers work and what kinds don't is to test them.

If you are giving away coupon codes for your eCommerce store as one kind of offer, try to see if a non-discount offer, such as a tip sheet or a cheat sheet, works better.

Customers in different niches respond to offers differently. While some prefer discounts, others prefer videos. A few others might demand a lot more than just a little “free something.” Offers for other verticals such as SaaS products are almost always like a “free trial.”

Test call to action buttons

First, never ever use “submit” on your call to action buttons. You are better off without any call to action than to use “submit.”

Don’t write off the power of a button. According to Wingify, more than 30% of all A/B tests done are for CTA buttons. About 1 in 7 A/B test campaigns produce results that are statistically significant. When that happens, the conversion rates go up by 49%.

Flow with the context of your copy and use call-to-action text on your button that's contextual and colloquial.

Now, how exactly do you determine what copy is appropriate for your CTA button? Michael Aagaard has been doing CTA button tests for years, and Peep Laja of ConversionXL points out that Michael uses just two questions to arrive at the right copy for CTA buttons:

  1. What is my prospect’s motivation for clicking this button?
  2. What is my prospect going to get when (s)he clicks this button?

Answers to these two questions are going to be the basis for the button copy and for the copy surrounding it.

A/B testing tips wrap up

That’s it. You only have three different elements on which to do your A/B testing. If your budget permits and if you have the time and resources, you can fine-tune your testing and go deeper if you like. Or you could choose to do A/B testing for several other elements.

If you’re looking to dig in even further, check out this A/B Testing Guide by our friends over at CXL.

But if you are like most businesses strapped for time and resources, we only recommend that you start with the big three elements (headlines, CTA buttons, and offers) and take it from there.

If you need help building custom funnels and layout specific A/B testing strategies relevant to your business, get on a call with us now.

Landing Page Checklist: Get Your Landing Pages Ready For Lead Generation
https://doneforyou.com/landing-page-checklist-lead-generation/
Mon, 16 Apr 2018

The post Landing Page Checklist: Get Your Landing Pages Ready For Lead Generation appeared first on Done For You.

Recently we published a 27-point landing page checklist (with a downloadable sheet) because we wanted to provide you with a handy tool that you can use every time you create a new page for your sales funnels.

Now in this article, we want to get deeper into some technical aspects of creating an optimized landing page.

Some businesses are in a rush to launch paid campaigns or start sending some traffic to landing pages to start generating leads. Other businesses tend to build complicated funnels trying to reach perfection before they start driving traffic to the top of the funnel.

The palpable excitement and rush are understandable. The tendency for perfection as well. But without due diligence and a well-thought-out plan, you are likely to end up wasting time and money.

While there are a lot of success factors that are out of your control when launching a campaign, it's inexcusable to be clumsy with the essential aspects of a landing page that you are indeed in control of.

To help you build a solid foundation for a successful lead generation campaign, and before you start changing creatives, optimizing, and engaging in ongoing split testing, here's a simple landing page checklist.

Landing page checklist

Unmissable landing page elements

Follow these design best practices and conversion optimization guidelines, and include the essential landing page elements:

  • Use a single form and a single button (one call to action per page).
  • Have as few form fields as possible.
  • Be sure to have a clear and large headline. Learn more about writing headlines that work.
  • If possible, put some real faces on the landing page (like team photos, headshots of the founding members, images of previous customers), etc.
  • Add social proof in the form of client logos or testimonials or both.
  • Ensure that your landing page is mobile responsive (there’s just no excuse for leaving this one out).

Collect and insert tracking tags and scripts

Facebook pixel - tracking code for landing pages

Depending on the traffic source, there are various tracking tags or scripts that you need to inject into your landing page code to facilitate website tracking. Website tracking, by the way, can help you monitor the behavior of the visitors to your site.

Let’s say you choose to run Facebook ads. You’ll need the base Facebook Pixel code to go on every page that visitors are likely to visit. Such pages are: the main landing page, the form lightbox modal, the thank-you page, and interstitial pages if any.

Then, you’ll have to add the “event” code of the Facebook Pixel to your “thank you” page or the page that shows up after leads sign up. this way you’ll be able to track successful conversions (opt-ins in this example).

If you choose to use Google AdWords, you'll need to insert the Google remarketing tag. With remarketing, you are able to run campaigns that target people who have already visited your site at least once.

Are you using third-party services such as Adroll or Perfect Audience for retargeting? You’ll then have to add these tracking pixels to your landing page code too, as advised.

For Facebook Ads, Google Adwords, and for third-party services like Adroll, you’d also want to attribute a numerical value to your leads and sales “event codes” so that you have an accurate record of leads and sales.

It also makes sense to add Google Analytics — while using UTM tracking — to track each of your landing pages so you have a holistic view of traffic being sent to your landing pages.
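UTM tagging is essentially query-string bookkeeping. Below is a small sketch of tagging a landing page URL; the source/medium/campaign values are made up for illustration, but `utm_source`, `utm_medium`, and `utm_campaign` are the standard Google Analytics parameter names.

```python
from urllib.parse import urlencode, urlsplit, urlunsplit

def add_utm(url: str, source: str, medium: str, campaign: str) -> str:
    """Append UTM parameters so analytics can attribute the visit."""
    parts = urlsplit(url)
    # Preserve any query parameters already on the URL.
    params = dict(p.split("=", 1) for p in parts.query.split("&") if p)
    params.update({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    return urlunsplit(parts._replace(query=urlencode(params)))

tagged = add_utm("https://example.com/landing", "facebook", "cpc", "spring_launch")
```

Each ad platform then gets its own tagged URL, so every visit in Google Analytics carries its source with it.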

One URL, two versions

Split Testing

If you had only one landing page for your campaign, and let’s say you got a 20% conversion rate at the landing page level, how would you know this is the best result you can get? You couldn’t know.

That’s precisely why you need to run split tests for your landing pages. By doing A/B testing or split testing, you can test and compare two variations of the same page.

Every landing page will have a single URL, but this URL will show different content to each visitor at random. For example, 50% of the visitors will see "CTA A" and the rest will see "CTA B." Then you will be able to tell which CTA (and landing page variation) performed better in terms of conversion. This can be accomplished with A/B testing software.
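Under the hood, the "one URL, two versions" behavior is usually a deterministic 50/50 assignment, so a returning visitor keeps seeing the variant they saw first. Your testing tool handles this for you; here's a minimal sketch of the idea, hashing a visitor or cookie ID into a bucket:

```python
import hashlib

def assign_variant(visitor_id: str, variants=("A", "B")) -> str:
    """Deterministically map a visitor to a variant.
    The same visitor ID always lands in the same bucket."""
    digest = hashlib.sha256(visitor_id.encode("utf-8")).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# Same cookie/visitor ID -> same variant on every visit.
v1 = assign_variant("visitor-123")
v2 = assign_variant("visitor-123")
```

Hashing, rather than a fresh coin flip on every page load, is what keeps the experience consistent across visits.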

Start by testing a single element of your landing page (such as the headline, the CTA button, or the CTA button color, or a background image).

Gather data and make an informed decision to continue with the champion while discarding the losing versions.

Connect with your email service provider

One of the most important aspects of your sales funnels is the series of autoresponders that you’ll send out to new subscribers or segments of your email list.

Your autoresponders trigger immediately after leads sign up, and they continue to nurture your new subscribers with automated messages, like a simple welcome email, a strategic autoresponder created for sales, or a full-fledged customer onboarding series.

To set up and connect your landing pages with your email marketing platform you can either use some integration plugin or a piece of code that the platform will provide. This is a case-based scenario and the difficulty of implementing such an integration depends on the technology you choose. So please think about this factor before you decide on the exact software you will use for building landing pages and sending marketing emails.
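The "piece of code" integration usually boils down to an authenticated POST of the new lead to the email platform's API. The endpoint URL, API key, and payload fields below are entirely hypothetical; check your provider's documentation for the real subscribe call.

```python
import json
import urllib.request

# Hypothetical ESP endpoint and API key; substitute your provider's
# real values from its API documentation.
ESP_ENDPOINT = "https://api.example-esp.com/v1/subscribers"
API_KEY = "YOUR_API_KEY"

def build_subscribe_request(email: str, list_id: str) -> urllib.request.Request:
    """Build (but don't send) a JSON subscribe request for a new lead."""
    payload = json.dumps({"email": email, "list_id": list_id}).encode("utf-8")
    return urllib.request.Request(
        ESP_ENDPOINT,
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
        method="POST",
    )

req = build_subscribe_request("lead@example.com", "newsletter")
# urllib.request.urlopen(req) would actually submit it.
```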

Once you connect, sign up on your own landing page to test that everything fires correctly and that you are receiving your email messages, notifications, etc.

Trace the path from traffic source to email

website tracking - user path

You’d want a complete picture of everything that happens in your sales funnel for every user that has your cookie in their browser. What you want to track is every aspect of the user behavior throughout their journey from the source of the click all the way to your email autoresponder, and the users’ reactions to your messages.

For instance, after a click on the ad, you’d want to check whether or not your Google Audiences is building up. You’d also want to see if your Facebook Pixel is firing correctly and is recording conversions for your Facebook and Instagram ads.

Is your landing page loading fast enough? Are all the interstitial pages and the final thank you page loading? Are your email marketing workflows triggering correctly?

Get to active optimization, after the campaign goes live

Soon after your campaign launches, you'll start getting all sorts of feedback. Your landing pages might seem to not convert at all. Is it because you have missed one of the steps on this landing page checklist? Revisit the tips we listed in this post, go right into your setup, deep dive, and fix what's not properly configured.

You’d want to tweak headlines, copy, CTA button copy, replace images, add tracking codes or maybe work on the mobile version of your landing page.

If one of the steps on the landing page checklist is missing altogether, it is advised that you stop your campaigns and go build what’s not there.

If everything is working fine, gather enough data to fine-tune your campaigns. Get busy with optimizing, A/B testing, and using the numbers you have to optimize every stage of your sales funnel.

Taking care of these steps will ensure that you don’t miss out on any of the necessary due diligence.

How are your campaigns performing? If you’d like to talk to us about your funnels, landing pages, marketing campaigns or strategy, get on a strategy call with us. We’ll be happy to discuss how we can put together a Done For You Lead Generation System for you or help deploy a high-converting Done For You Sales Funnel.

A/B Testing: Performance Testing And Landing Page Optimization
https://doneforyou.com/a-b-testing-performance-testing-landing-page-optimization/
Tue, 23 Jun 2015

The post A/B Testing: Performance Testing And Landing Page Optimization appeared first on Done For You.

]]>
A/B testing, or split testing as it's called in some circles, is one of those terms that's often used but rarely understood. It involves performance testing and landing page optimization, carried out through a series of scientific-style tests.

In the next few days, count the number of times you hear the word "test" or "testing" as it relates to marketing, and it'll shock you! I know I say it all the time, and I hear it from a lot of different people in their podcasts, webinars, and blog posts…

The thing about A/B testing is that, by being diligent about the pages you include in your tests, you can test your way to a winning campaign, provided you have the patience and a steady traffic source.

What I mean by that is that most marketing fails.  It's a fact of business.  Product offers, startups, ad campaigns; the deck is stacked against us most of the time…  But, by A/B testing your marketing and sales material, you can make iterative improvements to the messaging, target markets, images, and design, all adding up to a dramatically improved campaign over time.

That’s the trick to landing page optimization…  It’s not one test – oftentimes it’s multiple!

What I want to share with you today is what I call the “Split Test Evolution.”

In this iterative A/B testing process, you’ll see how we test one thing first, find the winner, and then start a second test based on that one control.

After Test 2, Test 3, and so on, you start to truly zero in on your ideal conversion rate because your landing pages are as close to optimized as possible…

Most of the time when you get started with A/B testing, you put up a few variations of a page and figure out which one converts best…  Maybe there’s a method to your madness – maybe there isn’t.  But then what?  What happens after you find a winner?

What you DO with the results is what matters in your hunt for landing page optimization.

First, let’s establish some ground rules:

  • Split testing takes patience and practice.  You should only be testing out ONE thing per page, per test.  That might be colors, buttons, headlines or images.
  • For every additional variation you add, you need to send that much MORE traffic!  If you're testing 2 variations, you might need 200 clicks.  3 variations, 300 clicks.  8 variations, 800 or 1000 clicks.
  • Disable the losers when you think they're losing - not when the software tells you they are.  You can always re-enable variations as the apparent winners start to drop (and they always do!)
  • Expect pretty high conversions right after you start a campaign.  The true test of your variations is what they do when you start to scale them...  From 20 clicks to 200 and 500 clicks or more!
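The traffic rule above can be sketched in a few lines of code. Note that "100 clicks per variation" is this post's rule of thumb, not a statistical guarantee; the function below simply encodes that guideline:

```python
def min_clicks(num_variations, clicks_per_variation=100):
    """Rough traffic needed before judging a split test:
    about 100 clicks per variation (per the rule of thumb above)."""
    return num_variations * clicks_per_variation

print(min_clicks(2))  # 200 clicks for a 2-way test
print(min_clicks(3))  # 300 clicks for a 3-way test
print(min_clicks(8))  # 800 clicks for an 8-way test
```

In practice you'd treat these as floors, not targets — the more variations you run, the longer you should let the test breathe before disabling anything.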

Now, to do split testing, you need to have split testing software.  My pick is Visual Website Optimizer because you can test any page you have online, and it’s super easy to use.

Click here to check out VWO >>

Now, let’s look at some tests…

Here’s a landing page optimization campaign that we’ve been running for quite a while.  The traffic source is Facebook Ads driven directly to an opt-in page.  From Facebook Ads Manager, this is a website clicks campaign to a cold audience.

As you can see, Variation 2 is the clear winner, with 37.74% conversions after we finished the test.  We discovered that pretty early, so the majority of traffic went to that page…

This is actually where the A/B testing started out though…

 

In our very first test, we were getting 24.47% conversions from cold, paid traffic.

Here’s what that page looked like:

 

From there, we tested different headlines to see which one resonated with our audience…

 

The headlines were:

  • 100 "Plug & Play" Subject Lines
  • 100 "Most-Opened" Subject Lines
  • "Tested For You" Subject Lines
  • Subject Lines That Get Your Emails Opened

And as a refresher, here were our stats at the beginning of the A/B testing:

 

As you can see, Version 1 was our winner.  The one that said “100 Most-Opened Subject Lines.”

Now, a 2% bump doesn’t sound like a lot (and it’s not!), but it did give us some very valuable intel…

We knew which headline to use on our next test!
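One way to judge whether a small bump like that is real or just noise is a two-proportion z-test. The post doesn't publish visitor counts, so the numbers below are purely hypothetical, made up to illustrate the check with a roughly 24.5% vs. 26.5% split:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates.
    conv_* are conversion counts; n_* are visitor counts."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical counts, NOT from the post: 98/400 vs. 106/400 opt-ins
z, p = two_proportion_z(conv_a=98, n_a=400, conv_b=106, n_b=400)
print(f"z = {z:.2f}, p = {p:.3f}")
```

At these sample sizes the p-value comes out well above 0.05, which is exactly why a 2% bump is better treated as intel for the next test than as a final verdict.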

For Test 2, we used the winning variation, Variation 2, and changed the look of the landing page itself.  Now, if you’re familiar with our Scriptly Page Builder, one of these examples is inside Scriptly for you to use…  Largely because of this A/B test!

 

The text is the same – the headline, body copy, book image, and top headline.

What we changed were the colors of the background, the button and the button location.

  • In Control and Variation 1 - the difference is the primary background color.
  • In Variation 2 and Variation 3 - we moved the button to the other side of the page.
  • And in Variation 3, we changed the button color.

Here are the results:

 

As you can see, Variation 2 was the HUGE winner of the A/B test, at 37.74%!

Here’s the winning landing page:

 

That's a 13% bump in opt-in conversions, and landing page optimization that rivals most other landing pages out there, considering this traffic was cold Facebook Ads…  All from two simple tests…

That's some pretty impressive performance testing for only a few weeks' worth of time!  Notice, too, that a 13% bump in conversions means 13 MORE people out of every 100 who hit that page opt in for the lead magnet…  Meaning, my lead cost dropped and my ad budget is going further for building our list.
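The lead-cost math is simple enough to sketch. The CPC below is a hypothetical $1.00 figure (the post doesn't publish its click costs); only the two opt-in rates come from the tests above:

```python
def cost_per_lead(cpc, conversion_rate):
    """Cost per lead = ad cost per click / opt-in rate."""
    return cpc / conversion_rate

# Hypothetical $1.00 CPC; opt-in rates are the before/after from the tests
before = cost_per_lead(1.00, 0.2447)  # 24.47% opt-in rate
after = cost_per_lead(1.00, 0.3774)   # 37.74% opt-in rate
print(f"${before:.2f} -> ${after:.2f} per lead")  # $4.09 -> $2.65 per lead
```

Whatever your actual CPC is, the ratio holds: lifting the opt-in rate from 24.47% to 37.74% cuts cost per lead by roughly 35%.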

Now, the next thing I need to do after this A/B test is test subheadlines to see if there’s a noticeable bump in conversion from them.  I’m guessing that there isn’t, but I’ve been wrong in the past :0)

At the end of the day, make small changes, see what’s working, and use that data to keep improving your conversions!  If you need help with performance testing and landing page optimization, make sure to book a call with us here!

The post A/B Testing: Performance Testing And Landing Page Optimization appeared first on Done For You.

]]>
https://doneforyou.com/a-b-testing-performance-testing-landing-page-optimization/feed/ 19 2192