Let's say you have a landing page. It's fine. Traffic comes in, some people sign up, most don't. You've got a feeling the headline could be better but you're not sure what "better" looks like. So you've been staring at it, rewriting it in your head, maybe asking a friend what they think.
Here's a faster way to figure it out: test both versions and let actual visitors decide.
This is a walkthrough of setting up your first A/B test in SplitPea. The whole thing takes about five minutes. You don't need to know anything about statistics or have any technical setup beyond the ability to paste a script tag into your site's HTML.
Step 1: Add the snippet
Copy the SplitPea script tag from your dashboard and paste it before the closing </head> tag on your site. It looks like this:
<script async src="https://app.splitpea.co/snippet/sp.js" data-site="your-site-id"></script>
That's the entire install. It works on any site that lets you edit HTML: static sites, WordPress, Webflow, Squarespace, Shopify, whatever. The script is under 8KB and loads asynchronously, so it won't block your page from rendering.
Once the snippet is live, go back to your SplitPea dashboard. You'll see a confirmation that your site is connected.
Step 2: Create an experiment
Click "New experiment" and fill in the brief. Here's what you'll need:
Name. Something you'll recognise later. "Homepage headline test" is fine. "Test 1" is fine too, but you'll regret it when you have ten experiments in your history and can't remember which was which.
Hypothesis. Optional, but worth writing. Even one sentence helps. Something like "A shorter headline that mentions the free plan will get more signups." You're writing this for your future self, the person who'll look at the results in two weeks and try to remember what the point was.
Page. The URL you want to test. SplitPea needs to know which page the experiment runs on.
Goal. What counts as a conversion. Usually a button click ("Start free," "Book a call," "Add to cart") or a page visit (like a thank-you page after a form submission). Pick the action that matters most for this test. One goal per experiment keeps things clean.
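Under the hood, a goal is just an event that gets counted at most once per visitor, no matter how many times they click. A minimal sketch of that dedup logic (illustrative only, not SplitPea's actual implementation):

```javascript
// Count each visitor's conversion at most once per experiment.
function makeGoalTracker() {
  const converted = new Set(); // visitor IDs that have already converted

  return {
    recordConversion(visitorId) {
      if (converted.has(visitorId)) return false; // already counted
      converted.add(visitorId);
      return true;
    },
    total() {
      return converted.size;
    },
  };
}
```

Counting each visitor once is what keeps the conversion rate honest: someone who clicks "Start free" three times is still one conversion.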
Step 3: Create your variant
This is where you decide what to change. You've got two options.
Visual editor. Click "Edit variant," and SplitPea opens your actual website with an overlay. Click the element you want to change (your headline, a button, a paragraph) and type the new version. You'll see it update live on the page. When it looks right, save it.
CSS selector. If you're comfortable with code, you can target an element with a CSS selector and define the change manually. Same result, different workflow.
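Either way, what ends up applied on the page is ordinary DOM manipulation: find the element by selector, swap its text. A rough sketch of the idea (the selector, copy, and function name here are made up for illustration):

```javascript
// Hypothetical variant definition: a CSS selector plus replacement copy.
const variant = {
  selector: "h1.hero-headline", // assumed class name, for illustration
  text: "A portfolio site that turns visitors into paying clients",
};

// Apply the change against a document-like root (defaults to the real page).
function applyVariant({ selector, text }, root) {
  const doc = root || (typeof document !== "undefined" ? document : null);
  if (!doc) return false;
  const el = doc.querySelector(selector);
  if (!el) return false; // element not on this page; leave it unchanged
  el.textContent = text;
  return true;
}

applyVariant(variant);
```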
For your first test, keep it simple. Change one thing. The headline is a good starting point because it's the first thing visitors read and it's easy to write two versions of. Don't try to test a completely different page layout on your first go.
Some ideas for a headline test:
- Short vs. long ("Get more clients" vs. "A portfolio site that turns visitors into paying clients")
- Benefit vs. feature ("Grow your freelance business" vs. "Custom portfolios with built-in booking")
- Direct vs. question ("Build a better portfolio" vs. "Is your portfolio losing you clients?")
Pick two versions you genuinely think could both work. The point isn't to test something obviously bad against something obviously good. It's to find out which of two reasonable options your visitors prefer.
Step 4: Launch and wait
Hit publish. SplitPea will start splitting traffic between your control (the original page) and your variant. Each visitor gets randomly assigned to one version, and they'll keep seeing the same version if they come back.
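The "same version on repeat visits" part usually comes from deterministic bucketing: hash a stable visitor ID together with the experiment ID and use the hash to pick a group. The result looks random across visitors but never changes for any one visitor. A sketch of the idea (not SplitPea's actual code):

```javascript
// FNV-1a string hash: small, stable, good enough to illustrate bucketing.
function fnv1a(str) {
  let h = 0x811c9dc5;
  for (let i = 0; i < str.length; i++) {
    h ^= str.charCodeAt(i);
    h = Math.imul(h, 0x01000193) >>> 0;
  }
  return h;
}

// 50/50 split: even hashes see the control, odd hashes see the variant.
function assign(visitorId, experimentId) {
  return fnv1a(`${visitorId}:${experimentId}`) % 2 === 0 ? "control" : "variant";
}
```

In practice the visitor ID lives in a first-party cookie or localStorage, so the assignment survives page reloads without the server having to remember anything.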
Now you wait. This is the hardest part, honestly.
How long depends on your traffic. If your page gets a few hundred visitors a day, you might have a result in a week. If it gets fifty visitors a day, it could take a few weeks. That's normal. The math needs a certain number of data points before it can tell you anything useful.
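If you want a rough sense of the timeline in advance, a common back-of-the-envelope is Lehr's rule: visitors needed per version ≈ 16·p(1−p)/Δ², where p is your current conversion rate and Δ is the absolute difference you hope to detect (roughly 80% power at 95% confidence). This is a planning estimate, not SplitPea's internal calculation, and the numbers below are examples:

```javascript
// Visitors needed per version to detect a given relative lift,
// using Lehr's rule of thumb: n ≈ 16 · p(1−p) / Δ².
function visitorsPerVersion(baselineRate, relativeLift) {
  const delta = baselineRate * relativeLift; // absolute difference to detect
  return Math.round((16 * baselineRate * (1 - baselineRate)) / (delta * delta));
}

// Example: 5% baseline conversion, hoping to catch a 20% relative lift.
const n = visitorsPerVersion(0.05, 0.2); // ≈ 7,600 visitors per version
// At 100 visitors/day split between two versions, that's ~152 days. The
// practical takeaway: at low traffic, test bigger changes — a larger
// expected lift needs far fewer visitors to resolve.
```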
You can check your experiment page whenever you want. You'll see:
- Visitors in each group
- Conversions for each version
- Conversion rate (conversions divided by visitors)
- Lift (how much better or worse the variant is compared to control)
- Confidence (how sure SplitPea is that the difference is real)
Confidence is the number that matters most. Below 90%, the result is inconclusive. SplitPea can see a difference but can't be sure it's not just random noise. Above 90%, you've got a winner.
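For the curious, those dashboard numbers can be reproduced with a standard two-proportion z-test, one common way tools turn raw counts into a confidence figure (SplitPea's exact method may differ). A sketch, with made-up counts:

```javascript
// Conversion rate, relative lift, and two-sided confidence from a
// two-proportion z-test. One common approach; not necessarily SplitPea's.
function summarize(control, variant) {
  const pC = control.conversions / control.visitors;
  const pV = variant.conversions / variant.visitors;
  const lift = (pV - pC) / pC; // e.g. 0.4 means "40% better than control"

  // Pooled standard error of the difference between the two rates.
  const pooled =
    (control.conversions + variant.conversions) /
    (control.visitors + variant.visitors);
  const se = Math.sqrt(
    pooled * (1 - pooled) * (1 / control.visitors + 1 / variant.visitors)
  );

  const z = Math.abs(pV - pC) / se;
  const confidence = 2 * normalCdf(z) - 1; // P(|Z| < z), two-sided
  return { controlRate: pC, variantRate: pV, lift, confidence };
}

// Standard normal CDF for x >= 0, via the Abramowitz–Stegun erf approximation.
function normalCdf(x) {
  const t = 1 / (1 + 0.3275911 * (x / Math.SQRT2));
  const poly =
    ((((1.061405429 * t - 1.453152027) * t + 1.421413741) * t -
      0.284496736) * t + 0.254829592) * t;
  return 0.5 * (1 + (1 - poly * Math.exp(-(x * x) / 2)));
}

// Made-up numbers: 1,000 visitors per group, 50 vs. 70 conversions.
const result = summarize(
  { visitors: 1000, conversions: 50 },
  { visitors: 1000, conversions: 70 }
);
// result.lift ≈ 0.4, result.confidence ≈ 0.94 — a winner at ~94% confidence.
```

A two-sided confidence of 90% corresponds to |z| ≈ 1.64, which is why close races between two similar versions need so many visitors to resolve.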
Don't end the test early because one version is ahead after a day. Early leads are unreliable. Let it run until SplitPea tells you it has enough data.
Step 5: Read the result
When confidence hits 90% or higher, SplitPea will mark a winner. If you want to understand what that number actually means, we wrote a plain-English explanation. On our platform, you'll see something like:
Variant B outperforms control with 94% confidence.
At that point, you've got a few options. You can end the experiment and apply the winning version to your site permanently. You can keep running it to get even higher confidence. Or, if the result is inconclusive after a reasonable amount of time, you can stop it and try a different test.
Either way, the experiment stays in your decision history. A month from now, when you're wondering "didn't we test that headline already?" you can look it up and see exactly what happened.
What to test next
Once you've run one experiment, the process starts to feel familiar. You'll start noticing things on your site and thinking "I wonder if a different version of that would work better." That's the right instinct.
Some good second experiments:
- Button text on your main CTA ("Start free" vs. "Try it out" vs. "See pricing")
- The first line of your product description
- Social proof placement (above the fold vs. below)
- Form length (fewer fields vs. more fields with better context)
Each test gives you one more data point about what your visitors actually respond to. Over time, that adds up. Your site gets better because you're making decisions based on evidence instead of guesswork.
That's the whole idea.