A/B Testing: A Full Guide to Split Testing for Lead Generation
What is A/B testing?
A/B testing (aka split testing) is a randomized experimentation process in which two or more versions are sent to different target users at the same time to determine which one drives better results and influences the desired business action.
Whether you are writing sales copy, sending an Invite to Connect on LinkedIn, or launching a new website, it is tricky to rely solely on your gut feeling. It could cost you time, money, and some valuable leads for that matter. A/B testing, on the other hand, is a reliable path to making a data-backed decision.
However, A/B testing can be confusing. If you are not sure how to set it up and read the data correctly, it could completely mislead your strategy.
In this blog you will learn:
- The advantages of A/B testing;
- How to conduct split testing properly;
- The right way to analyze A/B testing results so they don’t mislead your strategy;
- How to apply split testing to your Lead Generation strategy in Skylead’s Smart Sequences.
Let’s go.
P.S. If you are here because you know that A/B testing rocks, skip the first part and jump right away to our guide to successful analysis of your leads’ behavior.
Why should you consider conducting an A/B test?
The advantages of split testing are numerous.
Let’s go through its benefits when it comes to generating leads on LinkedIn or outside of it.
Simple analysis
Probably the best thing about A/B testing is its simplicity of analysis.
Numbers don’t lie.
If you’ve done everything properly (keep reading, more on that down below), determining the winner is a piece of cake.
Relatively small sample
You don’t need to do massive research to realize what drives desired results.
True, more data is usually better for analysis. However, with split testing that is not quite necessary.
After the test has been brought to an end, it will be crystal clear which version your leads found more engaging and responded to more readily. Sometimes, you can put a timeframe on your test and not even wait until it’s over. One of the options might already turn out to be an outright winner. However, we do recommend waiting till the end (check out the most common mistakes when A/B testing down below).
It’s multi-functional
You can A/B test pretty much anything.
For example, in Skylead, you can create up to 5 variants of Invites to Connect, InMails, Emails, and LinkedIn messages, and measure what triggers your leads to carry out the action you consider desirable.
Furthermore, you can experiment with the message copy, subject line, message body, signature, links, call-to-action, or any additional material you would like to include.
Keep in mind that you don’t need to test everything right away. You can gradually improve the successful option and A/B test it again down the road.
Increased conversion rates
Last but not least, it increases conversion rates.
By using A/B testing, you are researching the best way to present yourself and your product to the market.
Maybe you will find out something you did not know before. Maybe you will discover that leads react more to certain qualities and don’t even consider your unique selling point in the first place.
Use A/B testing to experiment and market those characteristics. This way you will provide your customers, or the leads you found via prospecting on LinkedIn, with a better experience, and build trust and confidence in your brand and company.
Once your customers find that you provide value, your conversion rate will skyrocket.
How to set up A/B testing correctly
In Skylead, you can test the entire message copy (LinkedIn message, Email, InMail, and Invite to Connect), one part of it (such as a certain paragraph, subject line, signature, or image & GIF), or the time delay between the steps of your sequence.
Our recommendation is to test only one of the above steps or elements at a time. This is simply because it is harder to draw conclusions if you are testing several different messages at once. You simply cannot know what influenced a certain action and to what degree.
The same goes for testing a single element of a message. For example, if you would like to check which call-to-action works best, make sure the rest of the message body stays the same across all of the variants. If you would like to test another part of the message, the subject line, or the signature, conduct a separate A/B test for each.
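To make “one element at a time” concrete, here is a minimal sketch in Python. The message copy, the sender name, and both call-to-action lines are made up purely for illustration; this is not Skylead’s code. Both variants share an identical message body and differ only in the call-to-action, so any difference in results can be traced back to that single element:

```python
# A call-to-action test: everything except the CTA stays identical,
# so any difference in replies can be attributed to the CTA alone.
# All copy below is hypothetical.
BODY_TEMPLATE = (
    "Hi {first_name},\n\n"
    "I noticed we are both in the B2B SaaS space and thought it made sense to connect.\n\n"
    "{cta}\n\n"
    "Best,\nJane"
)

CTAS = {
    "A": "Would you be open to a quick 15-minute call next week?",
    "B": "Here is my calendar if you'd like to pick a slot that suits you: <link>",
}

# Build the two variants; only the {cta} placeholder changes between them.
variants = {
    name: BODY_TEMPLATE.format(first_name="{first_name}", cta=cta)
    for name, cta in CTAS.items()
}

for name, copy in variants.items():
    print(f"--- Variant {name} ---\n{copy}\n")
```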
What can you split test in Skylead?
Let’s go through everything you can do in the split test in Skylead and why you should do it.
- Subject line (Emails and InMails): Play with the subject line copy, length, emoji, caps lock, etc. Test which subject line drives the desired open rate.
- Images & GIFs (Emails, InMails, LinkedIn): Change images and GIFs, play with personalization, add or change only one of the custom elements, etc. Check which images & GIFs influence response/conversion rates.
- Writing style (across all formats): Test different writing styles and tones of voice depending on your target audience to increase the response rate.
- Formatting (across all formats): Check if breaking the message body into smaller pieces, with or without headlines, draws the desired results.
- Call-to-action (across all formats): Check which call-to-action draws the highest number of conversions.
- Content depth (across all formats): Check if your audience prefers long-form content that covers even the minutest of details, or something shorter.
- Invites to Connect: Send blank Invites to Connect, generic messages, or personalized messages to see what people respond to most.
- Paragraph (across all formats): Change the key paragraph in your message body to check your leads’ behavior and see if it draws the desired results.
- Message body (Invite to Connect, LinkedIn message, InMail, Email): Send a completely different message body to see which one affects response/conversion rates.
- Time delay: Check if the time in between messages affects the response/conversion rates.
- Links: Insert different links and see if your leads show interest in them.
- Signature: Insert links or other copy into your signature that might increase demand for calls and demos.
How A/B testing works
In Skylead, you can create up to 5 different variants to test.
Sure, you can edit any message at any moment once you’ve started a campaign. However, this is not recommended unless it is a typo or a minor change that cannot significantly influence the results.
In the following sequence, we tested how three variants of Invites to Connect affected the acceptance rates.
Invite to Connect variant A was sent blank, variant B with a generic message, and variant C with a personalized copy.
We’ve sent a total of 100 Invites to Connect. The system randomly distributed variant A to 33 leads, variant B to another 33 leads, and variant C to 34 leads.
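If you are wondering how an even random split like that can work behind the scenes, here is a simple illustrative sketch in Python (not Skylead’s actual code). It shuffles the lead list and deals the leads out to the variants in round-robin fashion, which is how 100 leads end up split roughly 33/33/34 across three variants:

```python
import random

def split_leads(leads, variant_names, seed=None):
    """Randomly distribute leads across variants as evenly as possible."""
    rng = random.Random(seed)
    shuffled = leads[:]  # copy so the original list stays untouched
    rng.shuffle(shuffled)
    assignment = {name: [] for name in variant_names}
    for i, lead in enumerate(shuffled):
        # Round-robin over the shuffled list gives near-equal group sizes.
        assignment[variant_names[i % len(variant_names)]].append(lead)
    return assignment

# Example: 100 leads and three Invite-to-Connect variants (A, B, C).
leads = [f"lead_{i}" for i in range(1, 101)]
groups = split_leads(leads, ["A", "B", "C"], seed=42)
print({name: len(group) for name, group in groups.items()})
# -> {'A': 34, 'B': 33, 'C': 33}
```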
And then we waited for the campaign to finish.
How to read results
Go to Skylead’s Reports Page.
Select the campaign you wish to check results for.
Scroll down and check the step you were testing.
From this analysis, it is clear that variant C drew the highest acceptance rate.
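If you want to double-check that a gap like this is not just random noise, a quick statistical sanity check helps. The acceptance counts below are hypothetical (the exact numbers from our campaign are not listed here), and the sketch uses SciPy’s chi-square test on the accepted vs. not-accepted counts per variant:

```python
from scipy.stats import chi2_contingency

# Hypothetical results of the Invite-to-Connect test above:
# variant -> (invites sent, invites accepted)
results = {"A": (33, 9), "B": (33, 13), "C": (34, 21)}

for name, (sent, accepted) in results.items():
    print(f"Variant {name}: acceptance rate {accepted / sent:.0%}")

# Contingency table: rows = variants, columns = [accepted, not accepted]
table = [[accepted, sent - accepted] for sent, accepted in results.values()]
chi2, p_value, dof, _ = chi2_contingency(table)

print(f"p-value: {p_value:.3f}")
if p_value < 0.05:
    print("The difference between variants is unlikely to be random noise.")
else:
    print("The sample is too small to call a clear winner yet.")
```

A small p-value suggests the gap is real; a large one means you probably need more data before calling a winner, which ties into the mistakes covered below.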
This was a simple example of A/B testing.
However, in other, more complex cases, consider other factors as well. For example, maybe a message with a certain call-to-action got the highest number of responses, but another one booked the highest number of demos/calls. The end result is the most important one.
Most common split testing mistakes
Testing too many elements at once
As mentioned above, don’t get too excited and try to test several elements or steps at once.
You will be in a situation where you will not know what factors influenced the result. Therefore, your deduction might be completely misleading.
Test one step or element, and then, if needed, A/B test another step or element until you get all the necessary answers.
Calling A/B tests too early
We always recommend that our users wait until the entire campaign is over and only then call the “winner”.
If you have a bigger sample and you are in a rush, you could set a timeframe and check the results once the time is up.
However, sometimes the leading variant at the beginning turns out to be the less engaging one down the road. Therefore, the best practice is to wait.
Completely ignoring other results
As mentioned before, depending on what you are testing, don’t be blindly led by the numbers.
Wait, what do we mean by that?
For example, say you are testing an email subject line. Variant A has the highest open rate, while variant B has the highest response rate or the most demos booked. Which one is the winner then? The one that gets you demos and conversions, naturally. That is your goal, isn’t it? Therefore, take the entire set of results into account, not just the element you are testing.
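To make that concrete, here is a tiny, hypothetical comparison (all numbers are invented): variant A wins on opens, but variant B wins on the metric that actually matters, booked demos:

```python
# Hypothetical subject-line test: judge variants by the end goal (demos),
# not only by the metric attached to the element being tested (opens).
variants = {
    "A": {"sent": 200, "opened": 120, "replied": 18, "demos": 3},
    "B": {"sent": 200, "opened": 95,  "replied": 24, "demos": 7},
}

for name, stats in variants.items():
    open_rate = stats["opened"] / stats["sent"]
    demo_rate = stats["demos"] / stats["sent"]
    print(f"Variant {name}: open rate {open_rate:.0%}, demo rate {demo_rate:.1%}")

# Pick the winner by demos booked, the metric closest to revenue.
winner = max(variants, key=lambda name: variants[name]["demos"])
print(f"Winner by demos booked: variant {winner}")
```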
Not testing time delays
You think that when and how often you reach out and follow up with your leads is not important? Think again.
This is exactly why we introduced this option in Skylead.
A timely follow-up is crucial. On the other hand, you need to be careful not to spam your leads. So, follow the Goldilocks principle: “Everything needs to be just the right amount”.
If you have two or more assumptions about what “just the right amount” could be, A/B test them.
Wrapping Up
At the end of the day, A/B testing can quickly resolve your doubts and help you make a data-backed decision.
If conducted properly, it can dramatically improve results and even help you discover something you did not know about your product or target audience.
To check out the A/B testing feature at Skylead and hear more about how it can bring valuable insights to your Lead Generation, schedule a demo call with our Sales Representative.