A/B testing lets you create two versions of the same sequence (A and B) so you can compare performance and keep the variant that gets the best results (typically replies).
Learning objective
By the end of this tutorial, you’ll know how to create an A/B test for a lemlist sequence, build two variants (content and/or structure), and choose a winning version.
Why this matters
Small changes (first step type, timing, CTA, personalization, or channel mix) can significantly impact reply rate. A/B testing helps you validate what works with your audience using real campaign data, instead of guessing.
Prerequisites
You have a campaign with a sequence created (at least one step). If you don’t, follow Phase 0 below.
You know which element you want to test (e.g., first touchpoint, copy, delays, or channel order).
Core lesson — step-by-step workflow
Phase 0 (optional): Create a campaign and add your first step
Go to Campaigns, then click Create campaign.
Select Create manual campaign.
In Build my campaign manually, choose your first step (for example, Email) to create your sequence.
Phase 1: Create your A/B test variants
Start an A/B test from your sequence
In your campaign sequence builder, open the three-dot menu at the top of the sequence.
Select A/B test this sequence.
Choose how to create Sequence B
When prompted, choose whether you want to:
Fill sequence B (duplicate Sequence A as a starting point), or
Don't fill sequence B (start from scratch).
Duplicating is usually best when you want to change only 1–2 variables.
Edit Sequence B
Once Sequence B is created, switch to it and update the copy and/or steps you want to test (keeping the rest as consistent as possible with Sequence A).
(Optional) Remove Sequence B
If you want to cancel the test and go back to a single sequence, open the sequence menu and select Delete sequence B.
Phase 2: Decide what to test (content vs. structure)
For a clean test, change only one major variable at a time (e.g., first step type, messaging angle, or delays). You can test both content and structure, but the more differences you introduce, the harder it is to attribute any change in results to a specific cause.
Content tests: subject line, CTA, opening line, personalization, offer positioning.
Structure tests: first step type (email vs. LinkedIn action), wait times, channel order, number of steps.
Practical application (recommended approach)
Pick one hypothesis: e.g., “Starting with a LinkedIn action will increase replies.”
Keep everything else consistent: same lead list, same sending window, similar total duration.
Create your A/B test before launching: if you launch a campaign first and create an A/B test afterward, all leads already in the campaign will stay in Sequence A. Only leads added (and launched) after the A/B test is created will be split between Sequence A and Sequence B.
Run the campaign long enough to collect meaningful results before deciding (see the significance-test sketch after this list).
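One way to judge whether results are "meaningful" is a two-proportion z-test on reply rates. Below is a minimal Python sketch using only the standard library; the reply and send counts are placeholder numbers, not values pulled from lemlist, so substitute the figures from your own campaign report.

```python
# Quick check of whether an observed reply-rate difference is likely real,
# using a two-proportion z-test (normal approximation). The counts below
# are placeholders -- substitute the numbers from your own campaign report.
from math import erfc, sqrt

def reply_rate_z_test(replies_a, sent_a, replies_b, sent_b):
    """Return (z, two-sided p-value) for the difference in reply rates."""
    p_a = replies_a / sent_a
    p_b = replies_b / sent_b
    pooled = (replies_a + replies_b) / (sent_a + sent_b)
    se = sqrt(pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b))
    z = (p_b - p_a) / se
    p_value = erfc(abs(z) / sqrt(2))  # two-sided tail probability
    return z, p_value

z, p = reply_rate_z_test(replies_a=18, sent_a=250, replies_b=35, sent_b=250)
print(f"z = {z:.2f}, p = {p:.3f}")  # here p is about 0.013
```

A p-value below 0.05 is a common (if somewhat arbitrary) threshold for treating the difference as real rather than noise.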
Choose a winner (stop A/B testing)
Important: Choosing a winner is irreversible. Once you choose the winning sequence, A/B testing is stopped and the choice can’t be undone.
When you’re ready to stop the test and keep one version going forward, open the three-dot menu at the top of the sequence.
Select Choose sequence A or Choose sequence B, depending on which variant performed best.
Read the warning, then click Confirm to apply your choice.
Note: If you duplicate a campaign after choosing a winner, the duplicate contains only the winning sequence (not both A and B).
Troubleshooting & pitfalls
Issue: Results are inconclusive (no clear winner)
Root cause: Not enough leads or not enough time running.
Fix: Increase sample size, run longer, or simplify the test to one variable (a rough sample-size estimate is sketched below).
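If you want a ballpark for "enough leads," the standard power calculation for comparing two proportions gives a planning estimate. The sketch below is a generic statistics formula, not a lemlist feature; the baseline and expected reply rates are assumptions you supply.

```python
# Rough per-variant sample size needed to detect a given lift in reply rate
# (two-sided alpha = 0.05, power = 0.80). Baseline and lift are assumptions
# you supply; this is a planning estimate, not lemlist functionality.
from math import sqrt, ceil

Z_ALPHA = 1.96  # two-sided 5% significance
Z_BETA = 0.84   # 80% power

def sample_size_per_variant(baseline_rate, expected_rate):
    """Leads needed in each sequence to detect baseline -> expected."""
    p1, p2 = baseline_rate, expected_rate
    p_bar = (p1 + p2) / 2
    numerator = (Z_ALPHA * sqrt(2 * p_bar * (1 - p_bar))
                 + Z_BETA * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / (p2 - p1) ** 2)

# e.g. detecting a lift from a 5% to an 8% reply rate
print(sample_size_per_variant(0.05, 0.08))  # about 1,058 leads per variant
```

Detecting small lifts takes surprisingly large lists, which is why a single-variable test with a clear hypothesis pays off.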
Issue: You changed too many things between A and B
Root cause: Large differences make it hard to attribute performance to a specific change.
Fix: Duplicate the winning sequence and test one variable next (subject line, CTA, first step type, wait time, etc.).
Issue: You launched the campaign before creating the A/B test
Root cause: Leads already in the campaign were started on Sequence A before the A/B test existed.
Fix: Create the A/B test before launch next time. For the current campaign, only leads added (and launched) after you create the A/B test will be split between Sequence A and Sequence B.
Review results and pick a winner
After your campaign ends, review the metrics for Sequence A vs. Sequence B and reuse the best-performing variant for future campaigns targeting the same audience (a minimal export-analysis sketch follows).
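If you export your campaign results to CSV, a few lines of Python can tally per-sequence reply rates. The column names used here ("sequence", "replied") are assumptions for illustration; match them to the headers in your actual lemlist export.

```python
# Sketch of comparing Sequence A vs. B from a campaign CSV export. The
# column names ("sequence", "replied") are assumptions -- adjust them to
# match the headers in your actual export file.
import csv
from collections import Counter

sent = Counter()
replied = Counter()

with open("campaign_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        variant = row["sequence"]  # e.g. "A" or "B" (assumed column name)
        sent[variant] += 1
        if row["replied"].strip().lower() in ("true", "yes", "1"):
            replied[variant] += 1

for variant in sorted(sent):
    rate = replied[variant] / sent[variant]
    print(f"Sequence {variant}: {replied[variant]}/{sent[variant]} replies "
          f"({rate:.1%})")
```

You can feed the resulting counts straight into the z-test sketch from the Practical application section to check whether the winner's edge is statistically meaningful.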