A/B testing your website personalization
RightMessage automatically A/B tests your personalization campaigns so you can measure whether personalized content actually improves conversions. No extra setup is required: attach a goal to a campaign and testing begins.
This guide covers campaign-level A/B testing (testing personalized vs. default content on your website). If you're looking to test different versions of questions, offers, or forms inside a Flow, check out our guide on split testing questions and offers.
How campaign A/B testing works
When you add a goal to a personalization campaign, RightMessage splits your traffic into two groups:
Personalized group (90% by default): Sees your targeted, segment-specific content.
Control group (10% by default): Sees your original, unmodified content.
This split is configurable. You can adjust the control group percentage in your campaign settings depending on how aggressive you want the test to be.
Visitors are assigned to a group for the duration of their session. If someone lands in the control group, they'll stay in the control group for every page view in that session. This prevents confusion when your personalization changes things like headlines or offers across multiple pages.
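Session-sticky assignment like this is typically done by bucketing on a stable session identifier, so the same visitor always lands in the same group without any server-side state. Here's a minimal sketch of the general technique; this is purely illustrative (the `assign_group` function and its parameters are hypothetical), not RightMessage's actual implementation:

```python
import hashlib

def assign_group(session_id: str, control_pct: float = 10.0) -> str:
    """Deterministically bucket a session into control or personalized.

    The same session ID always hashes to the same bucket, so every
    page view in a session sees consistent content. Illustrative only;
    not RightMessage's actual code.
    """
    # Hash the session ID to a stable number in [0, 100)
    digest = hashlib.sha256(session_id.encode()).hexdigest()
    bucket = (int(digest, 16) % 10000) / 100.0
    # Sessions hashing below the control percentage see default content
    return "control" if bucket < control_pct else "personalized"
```

Because the hash is deterministic, no cookie or database lookup is strictly needed to keep a session in its group, and the control percentage can be changed without re-bucketing logic.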
Strategy tip: Start with a 10% control group. Once you've confirmed personalization is working, you can drop it to 5% to maximize the benefit while still tracking performance over time.
Setting up a test
Create a personalization campaign with at least one variant and targeting rule.
Add a campaign goal (click, pageview, or custom event). Without a goal, there's nothing to measure.
Set your control group percentage in the campaign settings.
Publish your changes.
That's it. RightMessage starts collecting data immediately.
Where to see your results
Campaigns dashboard (aggregate view)
The main Campaigns page shows high-level stats across all your campaigns:
Traffic: Total pageviews on campaign pages
Personalized experiences: Unique visitors who saw personalized content
Conversions: Total conversion count and overall conversion rate
Improvement (lift): Percentage change compared to the control group, along with a confidence level
This is useful for a quick pulse check, but for the real insights you'll want to dig into individual campaigns.
Individual campaign view (where the good stuff is)
Click into any campaign to see detailed reporting. This is where you can see what's actually driving results.
Variant-level breakdown: For each audience variant you've set up, you'll see:
Unique visitors who saw that variant
Traffic percentage (as a visual bar)
Conversions and conversion rate
Lift vs. the control group
The control group is shown separately at the bottom, so you can compare each personalized variant against the baseline.
Segment combination breakdown: This is the most useful view. It shows your top segment combinations ranked by traffic, so you can see exactly how visitors with specific segment combos are converting. For each combination you'll see:
Which segments the visitor belongs to (e.g., "SaaS + Enterprise")
Which variant(s) targeted them
Traffic, conversions, and conversion rate
This lets you answer questions like "are freelancers converting better with our freelancer-specific headline, or would the default have been fine?"
Statistical significance
RightMessage calculates statistical significance automatically for each campaign:
Confidence level: How certain we are that the difference is real (shown as a percentage)
Lift: The percentage improvement of personalized content over the control
Results are highlighted in green when confidence reaches 95% or above
Strategy tip: Don't call a winner too early. Wait until confidence reaches at least 95% before making permanent changes. Low-traffic sites may need several weeks of data.
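To make "lift" and "confidence" concrete, here is how a standard two-proportion z-test derives both numbers from conversion counts. This is a common approach to the calculation, not necessarily the exact method RightMessage uses internally:

```python
from math import sqrt, erf

def lift_and_confidence(conv_p: int, n_p: int, conv_c: int, n_c: int):
    """Lift of personalized over control, plus a two-sided confidence
    level from a two-proportion z-test. Standard textbook method;
    RightMessage's exact calculation may differ.
    """
    rate_p = conv_p / n_p  # personalized conversion rate
    rate_c = conv_c / n_c  # control conversion rate
    lift = (rate_p - rate_c) / rate_c * 100  # percent improvement

    # Pooled rate and standard error under the null of equal rates
    pooled = (conv_p + conv_c) / (n_p + n_c)
    se = sqrt(pooled * (1 - pooled) * (1 / n_p + 1 / n_c))
    z = (rate_p - rate_c) / se

    # Convert |z| to a two-sided confidence level via the normal CDF
    confidence = erf(abs(z) / sqrt(2)) * 100
    return lift, confidence
```

For example, 120 conversions from 2,000 personalized visitors against 45 from 1,000 control visitors gives roughly a 33% lift, but confidence of only about 91%: a big apparent win that hasn't yet cleared the 95% bar.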
Best practices
Always have a goal attached. No goal means no data. Attach a goal to every campaign you want to measure.
Test one thing at a time. If you change the headline, CTA, and testimonial all at once, you won't know which change drove the result.
Let tests run long enough. A 20% lift based on 50 visitors doesn't mean much. Give your test time to accumulate enough data for reliable results.
Check the segment combination view. Your overall lift might be modest, but personalization for a specific segment could be driving outsized results.
Reduce your control group over time. Once you've confirmed personalization works, shrink the control to 5%. Keep a small control running to catch regressions.
Publish changes before expecting data. Campaign, goal, or control group changes won't take effect until you click Publish to website.
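The "let tests run long enough" advice above can be roughly quantified with a standard sample-size formula for comparing two conversion rates. The function below (a hypothetical helper, not part of RightMessage) estimates how many visitors each group needs to reliably detect a given relative lift at 95% confidence and 80% power:

```python
from math import ceil, sqrt

def sample_size_per_group(baseline_rate: float, min_relative_lift: float) -> int:
    """Rough per-group sample size to detect a relative lift in
    conversion rate, using the normal-approximation formula for two
    proportions at 95% confidence and 80% power. Illustrative only;
    use a dedicated power calculator for real planning.
    """
    z_alpha = 1.96  # two-sided 95% confidence
    z_beta = 0.84   # 80% power
    p1 = baseline_rate
    p2 = baseline_rate * (1 + min_relative_lift)
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / (p2 - p1) ** 2)
```

With a 3% baseline conversion rate, detecting a 20% relative lift takes on the order of 14,000 visitors per group, which is why low-traffic sites often need weeks of data before calling a winner. Larger lifts need far fewer visitors.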