
Choose a winning A/B test sequence

Learn how to choose a winning sequence in your A/B test and set it for all future leads in your campaign.


Learning Objective

By the end of this guide, you'll know how to select a winning sequence after A/B testing, understand what happens to existing and future leads when you make the selection, and learn best practices for applying winning insights to optimize your campaigns permanently.

Why This Matters

Running an A/B test is only half the battle. Once you've identified which version performs better, you need to lock in that winner so all future leads benefit from your learnings. Choosing a winning sequence ensures you're always sending the highest-performing version, maximizing reply rates and campaign effectiveness going forward.

However, this decision is irreversible. Once you select a winner, you cannot switch back or run new A/B tests on that campaign. Understanding the implications helps you make a confident decision and avoid costly mistakes.

Prerequisites

Before choosing a winning sequence:

  • Your A/B test has run long enough – Wait until at least 100-200 leads have received each version

  • You've analyzed the results – Review campaign analytics to confirm which version truly performed better

  • The performance difference is significant – Small differences (e.g., 8% vs. 9% reply rate) might be random; larger differences (e.g., 8% vs. 14% reply rate) indicate a real winner

  • You're confident in your decision – Once confirmed, this cannot be undone

What Happens When You Choose a Winner

When you select a winning sequence:

✅ All future leads receive the winning version – New leads imported after selection only see the winner

✅ Existing leads continue with their original version – Leads already in the campaign keep the version they started with (A or B)

✅ The losing version is archived – You can still view it, but it won't send to new leads

✅ A/B testing ends permanently – You cannot re-enable the losing version or create new A/B tests in this campaign

✅ Campaign structure locks – The winning sequence becomes the permanent structure

💡 Key insight: This doesn't affect leads already in progress. If 100 leads received Version A before you chose Version B as the winner, those 100 continue with Version A through their entire journey. Only new leads get Version B.

Core Lesson — Step-by-Step Workflow

Phase 1: Verify Your Results

Step 1: Go to campaign analytics

Before making the final decision, review your A/B test results one last time.

Go to Campaigns → Select your campaign → Analytics or Reports.

Step 2: Compare performance metrics

Look at the key metrics for each version:

  • For subject line tests: Compare open rates

  • For email content tests: Compare reply rates

  • For CTA tests: Compare click rates or conversion rates

Example:

  • Version A: 100 sends, 35% open rate, 7% reply rate

  • Version B: 100 sends, 48% open rate, 12% reply rate

Clear winner: Version B (higher on both metrics)

Step 3: Confirm statistical significance

Make sure the difference is meaningful, not random chance.

Guidelines:

  • If Version B is 30%+ better than Version A → Strong winner

  • If Version B is 10-20% better → Likely winner, consider testing longer if possible

  • If Version B is < 10% better → May be random variation, test longer or accept tie

💡 Pro tip: If you're uncertain, wait for more data. Better to delay the decision than lock in the wrong version.
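If you want a more rigorous check than the rules of thumb above, you can run a quick two-proportion z-test on your counts yourself. This is an illustrative Python sketch (not a feature of the product), using the open-rate numbers from the example in Step 2:

```python
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is version B's rate meaningfully higher than A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)            # pooled rate under the null
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 1 - 0.5 * (1 + erf(z / sqrt(2)))          # one-sided, via normal CDF
    return z, p_value

# Example from Step 2: 35 opens/100 sends (A) vs. 48 opens/100 sends (B)
z, p = two_proportion_z(35, 100, 48, 100)
print(f"z = {z:.2f}, p = {p:.3f}")  # → z = 1.87, p = 0.031
```

A p-value below 0.05 suggests the gap is unlikely to be random chance; a p-value well above that is a signal to keep the test running before locking in a winner.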

Phase 2: Select the Winning Sequence

Step 4: Open your campaign

Go to Campaigns → Select the campaign with the A/B test.

Step 5: Navigate to the sequence

Click the Sequence tab to view your A/B tested step(s).

You'll see both versions displayed.

Step 6: Choose the winning version

Click on the version you want to set as the winner.

If Version A won: Click on Version A, then select Choose Sequence A.

If Version B won: Click on Version B, then select Choose Sequence B.

Step 7: Review the warning message

A confirmation dialog appears with an important warning:

"This action is final and cannot be undone. Version B will become the permanent sequence for all new leads. Leads already contacted with Version A will continue with Version A."

Read this carefully. Make sure you understand:

  • The selected version becomes permanent

  • You cannot switch back

  • Existing leads are unaffected

  • Future leads only see the winner

Phase 3: Confirm and Apply

Step 8: Confirm your choice

If you're confident in your decision, click OK or Confirm.

The winning version is now locked in as your campaign's permanent sequence.

Step 9: Verify the change

After confirming, verify the sequence now shows only the winning version.

The losing version should be archived or marked as inactive.

Step 10: Import new leads to test the change

Add a small batch of new leads (5-10) to confirm they receive the winning version.

Check their journey in the Leads tab to ensure the correct sequence is being used.

What Happens to Existing Leads

Leads already in the campaign before you chose a winner:

Continue with their original version (A or B) for the entire sequence

Are NOT switched to the winning version mid-campaign

Complete their journey exactly as planned when they entered

Why this matters: If 200 leads received Version A and you later choose Version B as the winner, those 200 leads don't suddenly switch to Version B. They finish with Version A. Only leads imported after your selection get Version B.

Example timeline:

  • Day 1-10: Campaign runs A/B test, 200 leads split 100/100 between A and B

  • Day 11: You choose Version B as winner

  • Day 12 onward: New leads receive only Version B

  • Original 200 leads: Continue with their assigned version (100 with A, 100 with B)

This prevents disrupting active campaigns and ensures data integrity.
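The routing rule described above can be modeled in a few lines. This is a toy sketch of the behavior for illustration only, not the product's actual implementation:

```python
import random

class Campaign:
    """Toy model: version assignment before and after a winner is chosen."""
    def __init__(self):
        self.winner = None        # None while the A/B test is still running
        self.assignments = {}     # lead_id -> "A" or "B", locked at entry

    def add_lead(self, lead_id):
        if self.winner:                            # after selection:
            version = self.winner                  # new leads always get the winner
        else:                                      # during the test:
            version = random.choice(["A", "B"])    # 50/50 split
        self.assignments[lead_id] = version
        return version

    def choose_winner(self, version):
        # Permanent: affects future leads only; existing assignments are untouched
        self.winner = version

camp = Campaign()
for i in range(4):
    camp.add_lead(f"lead-{i}")    # split between A and B
camp.choose_winner("B")
print(camp.add_lead("lead-new"))  # → B (every lead added from now on)
```

Note that `choose_winner` never rewrites `assignments`: that is exactly why the original 200 leads in the timeline above finish with the version they started on.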

What Happens to A/B Testing After Selection

Once you select a winning sequence:

You cannot run new A/B tests in this campaign

You cannot re-enable the losing version

You cannot switch the winner later

Workaround if you want to test again:

Create a new campaign

  • Build a new campaign from scratch

  • Test different variations in the new campaign

  • Apply learnings to both campaigns

Practical Application / Real-Life Example

E-commerce Brand Tests Two Outreach Approaches

An e-commerce brand targeting boutique retail stores ran an A/B test comparing two email approaches:

Version A: Long-form email (200 words) explaining product benefits, case studies, and pricing

Version B: Short-form email (60 words) with a single question and calendar link

Results after 300 leads (150 each):

  • Version A: 32% open rate, 4% reply rate, 1.3% meeting booking rate

  • Version B: 46% open rate, 11% reply rate, 4.7% meeting booking rate

Clear winner: Version B (nearly 3x the reply rate and over 3.5x the booking rate)

Their decision process:

  1. Analyzed results – Version B dramatically outperformed on all metrics

  2. Saved both versions as templates – Preserved Version A for reference

  3. Selected Version B as winner – Locked it in for all future leads

  4. Imported 500 new leads – All received Version B

  5. Monitored performance – Version B continued outperforming across new batches

Outcome over 3 months:

  • Before: 4% reply rate with original approach (similar to Version A)

  • After: Consistent 10-12% reply rate with Version B across 2,000+ leads

  • Business impact: 150% increase in booked demos, resulting in $200K+ additional revenue

Key takeaway: Choosing the winning sequence and applying it to all future leads compounds performance gains over time. Small improvements scale massively.
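The compounding claim is simple arithmetic. A quick sketch using the illustrative numbers from the example above shows how a reply-rate lift scales across a full lead list:

```python
# Reply-rate lift from the e-commerce example, scaled over the full list
leads = 2000
rate_before, rate_after = 0.04, 0.11     # ~Version A vs. Version B reply rates
extra_replies = leads * (rate_after - rate_before)
print(int(extra_replies))                # → 140 extra conversations
```

The same 7-point lift yields 7 extra conversations per 100 leads, 140 per 2,000: the win is fixed once, but it pays out on every future import.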

Troubleshooting

Issue: I chose the wrong winner and need to switch back

Root cause: Winner selection is permanent and cannot be undone

Fix:

  • You cannot reverse this decision

  • Option 1: Duplicate the campaign, manually rebuild the losing version as the main sequence, and migrate leads

  • Option 2: Create a new campaign using the other version saved as a template

  • Prevention: Always triple-check analytics before confirming winner selection

Issue: Existing leads are still receiving the losing version

Root cause: This is expected behavior. Existing leads continue with their original version

Fix:

  • This is not an issue; it's by design

  • Only new leads imported after winner selection receive the winning version

  • If you need existing leads to switch (not recommended), you'd have to manually pause them and re-import them as new leads, which disrupts analytics

Issue: I want to test a new variation after choosing a winner

Root cause: A/B testing is disabled once a winner is selected

Fix:

  • Duplicate the campaign to create a new version where you can run fresh A/B tests

  • Create a new campaign from scratch and test there

  • Use the winner from Campaign 1 as Version A in Campaign 2, and test it against a new Version B

Issue: The winning version doesn't show improved performance with new leads

Root cause: The initial A/B test results may have been influenced by factors that don't apply to new leads (timing, audience segment, market conditions)

Fix:

  • Continue monitoring performance over a larger sample size (500+ leads)

  • If performance degrades significantly, investigate whether your audience or market has changed

  • Consider running a new A/B test in a duplicated campaign to validate current best practices

Issue: I can't find the "Choose Sequence" button

Root cause: The campaign may not have an active A/B test, or you're looking in the wrong place

Fix:

  • Make sure you've set up an A/B test first (see "A/B test a step" guide)

  • Go to the Sequence tab and click on the A/B tested step

  • The "Choose Sequence A/B" option appears within the A/B test interface, not in general settings

Optimization Tips

Document why the winner won: Write a note explaining what made the winning version succeed (e.g., "Version B's shorter copy reduced friction" or "Version A's specific pain point resonated more"). This builds institutional knowledge.

Apply insights across campaigns: If Version B won because of a shorter email, apply that lesson to other campaigns immediately. Don't wait to retest the same principle.

Test sequentially: After choosing a winner for Step 1, move on to A/B testing Step 2, then Step 3. Optimize each step independently for compound performance gains.

Monitor long-term performance: Just because Version B won initially doesn't mean it will win forever. Track performance over months and be ready to test again if results plateau.

Use the winner as the new baseline: In future tests, use the winning version as Version A (control) and test new ideas as Version B (variation). This ensures continuous improvement.

Consider audience segments: If Version B won overall but performed poorly with a specific segment (e.g., enterprise vs. SMB), consider creating separate campaigns for each segment with different approaches.

Don't overthink small differences: If Version A and Version B perform within 10% of each other, pick one and move on. Focus on testing bigger swings that produce clearer results.

Retest after major changes: If your product, messaging, or target audience changes significantly, re-run A/B tests. What won last quarter may not win this quarter.
