Your First Test
In this guide, you'll create a button color A/B test from start to finish. By the end, you'll have a live experiment running on your site that measures which button color gets more clicks.
What you'll accomplish:
- Create an experiment in the CADENCE dashboard
- Add the test code to your site
- Track button clicks
- Start the test and verify it works
Watch: Creating your first A/B test (8:30, video coming soon). Follow along as we build a button color test, from dashboard setup to live results.
What you need before starting:
- CADENCE installed on your site (Installation guide)
- Access to edit your website's code
- A button you want to test (like a "Sign Up" or "Buy Now" button)
Step 1: Create the experiment in your dashboard
Before writing any code, you'll set up the experiment in the CADENCE dashboard. This tells CADENCE what you're testing and how to split traffic.
- Go to your project in the CADENCE dashboard
- Click Create Experiment
- Fill out these fields:
| Field | What to enter | Why |
|-------|--------------|-----|
| Name | signup-button-color | This exact name goes in your code — they must match |
| Variants | control and green-button | Control = your current button. The other is what you're testing. |
| Traffic | 100% | Percentage of visitors who enter the test |
| Split | 50 / 50 | Half see the original, half see the new version |
| Targeting Rules | (optional) | Restrict the test to users with specific URL parameters (e.g., only ?utm_source=google) |
- Click Create — but don't click Start yet (we'll do that after adding the code)
Targeting by URL parameters
You can add targeting conditions so the test only runs for visitors with specific URL parameters. For example, add a condition where utm_source equals google to test only paid search traffic. All conditions must match (AND logic). Leave targeting empty to include all visitors.
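Conceptually, the AND logic works like the sketch below. The real evaluation happens inside CADENCE; the function name and condition shape here are purely illustrative.

```javascript
// Illustrative only: CADENCE evaluates targeting itself.
// This sketch shows the AND semantics for URL-parameter conditions.
function matchesTargeting(url, conditions) {
  var params = new URL(url).searchParams;
  // Every condition must match (AND logic); an empty list matches everyone.
  return conditions.every(function (c) {
    return params.get(c.param) === c.value;
  });
}

// Example: only include paid search traffic from Google
var conditions = [{ param: 'utm_source', value: 'google' }];
matchesTargeting('https://yoursite.com/?utm_source=google', conditions); // true
matchesTargeting('https://yoursite.com/?utm_source=email', conditions);  // false
matchesTargeting('https://yoursite.com/', []);                           // true
```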
Names must match exactly
The experiment name you enter in the dashboard (signup-button-color) must be the exact same string you use in your code. If they don't match, every visitor will see the default version.
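Nothing will catch a typo for you at runtime, so a small guard in your own code can make a mismatch loud during development. This helper is not part of the CADENCE API, just an illustrative pattern:

```javascript
// Illustrative helper (not a CADENCE API): fall back to control and warn
// if the SDK hands back a variant name you didn't expect.
function safeVariant(variant, knownVariants) {
  if (knownVariants.indexOf(variant) !== -1) {
    return variant;
  }
  console.warn('Unexpected variant "' + variant + '" - check that the ' +
    'experiment name in the dashboard matches the one in your code.');
  return 'control';
}

safeVariant('green-button', ['control', 'green-button']); // 'green-button'
safeVariant(undefined, ['control', 'green-button']);      // 'control'
```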
Step 2: Add the test code to your site
Now add code to your site that asks CADENCE which variant to show each visitor, then changes the button color accordingly.
Add this after the CADENCE initialization code you added during installation:
```html
<script>
  // Wait for CADENCE to finish loading before running the test
  Cadence.ready().then(function () {
    // Ask CADENCE which version this visitor should see
    // Returns either 'control' or 'green-button'
    var variant = Cadence.getVariant('signup-button-color');

    // Find the signup button on the page
    // Change '#signup-btn' to match your button's ID
    var button = document.querySelector('#signup-btn');
    if (!button) {
      return; // selector didn't match anything; double-check the ID
    }

    // If this visitor is in the 'green-button' group, change the color
    if (variant === 'green-button') {
      button.style.backgroundColor = '#16a34a'; // Green
      button.style.color = '#ffffff'; // White text
    }
    // If variant is 'control', the button stays exactly as-is

    // Track when someone clicks the button
    // This is how CADENCE measures which color gets more clicks
    button.addEventListener('click', function () {
      Cadence.trackConversion('signup-click');
    });
  });
</script>
```
How to find your button's ID or selector
Right-click your button on the page and select Inspect. In the code that appears, look for id="something" — use #something as your selector. No ID? Look for class="something" and use .something instead. For more details, see the CSS Selectors guide.
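The rule of thumb above (prefer the id, fall back to a class) can be written down as a tiny helper. It's purely illustrative; in practice you'll just read the selector off the Inspect panel.

```javascript
// Illustrative: derive a CSS selector from what you see in the Inspect panel.
function selectorFor(attrs) {
  if (attrs.id) {
    return '#' + attrs.id; // ids are unique per page, so prefer them
  }
  if (attrs.className) {
    // use the first class; a class may match several elements
    return '.' + attrs.className.split(' ')[0];
  }
  return null; // no id or class: see the CSS Selectors guide
}

selectorFor({ id: 'signup-btn' });             // '#signup-btn'
selectorFor({ className: 'btn btn-primary' }); // '.btn'
```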
Check it worked
Save your changes and refresh your page. About half the time you should see your original button, and the other half, a green button. Each visitor consistently sees the same version. To force a specific variant for testing, add ?cadence_variant=green-button to your URL.
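Under the hood, the forced variant is just a query parameter on the URL. A sketch of how such an override could be read (the real parsing is done by the CADENCE snippet; this only shows the mechanics):

```javascript
// Illustrative: read a ?cadence_variant=... override from a URL string.
function variantOverride(url) {
  return new URL(url).searchParams.get('cadence_variant'); // null if absent
}

variantOverride('https://yoursite.com/?cadence_variant=green-button'); // 'green-button'
variantOverride('https://yoursite.com/');                              // null
```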
Step 3: Start the test
- Go back to the CADENCE dashboard
- Find your signup-button-color experiment
- Click Start
- Your test is now live: visitors are being split between control and green-button
Check it worked
Visit your site and click the button a few times. Then go to your experiment's detail page in the dashboard. Within 10 seconds, you should see exposure events (visitors who saw the button) and conversion events (visitors who clicked it).
Step 4: Preview each variant
Before letting the test run, make sure both variants look right.
From the dashboard (easiest): Open your experiment's detail page and look for the Preview & QA card. It generates a preview link for each variant with a copy button — just click to open any variant in a new tab.
Manually with URL parameters: You can also force a specific variant by adding ?cadence_variant=Name to any page URL:
- See the control version: https://yoursite.com?cadence_variant=control
- See the green button: https://yoursite.com?cadence_variant=green-button
Preview links bypass targeting rules, so they always work even if you have URL param conditions set.
Check both versions in your browser. Does the green button look good? Is the text still readable? If something looks off, update the code and re-check before letting the test run.
How long should I let it run?
For reliable results, a test needs enough visitors and enough time:
- At least 7 days — people behave differently on weekdays vs. weekends
- At least 100 clicks per variant — not 100 visitors, but 100 people who actually clicked
- Until CADENCE says "Significant" — the dashboard will tell you when the result is reliable
Don't stop your test early
Checking results after one day and seeing one variant ahead doesn't mean it's actually better. Early results are unreliable — like flipping a coin 5 times and getting 4 heads. Wait for CADENCE to confirm the result is statistically significant (meaning it's a real difference, not luck).
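The coin-flip analogy is easy to check with a little arithmetic: with a fair coin, getting 4 or more heads out of 5 flips isn't rare at all.

```javascript
// Probability of >= k heads in n fair coin flips, via the binomial formula.
function choose(n, k) {
  var result = 1;
  for (var i = 1; i <= k; i++) {
    result = result * (n - k + i) / i;
  }
  return result;
}

function probAtLeast(n, k) {
  var p = 0;
  for (var j = k; j <= n; j++) {
    p += choose(n, j) * Math.pow(0.5, n);
  }
  return p;
}

probAtLeast(5, 4); // 0.1875 - nearly a 1-in-5 chance from pure luck
```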
When your test is done, head to Reading Results to learn how to interpret the data and decide on a winner.
More test ideas
Once you're comfortable with button color tests, try these patterns:
Copy variations
Test different wording on headlines, buttons, or descriptions:
```javascript
Cadence.ready().then(function () {
  var variant = Cadence.getVariant('hero-headline');
  var headlines = {
    control: 'Start your free trial',
    urgency: 'Start your free trial — limited time',
    social: 'Join 10,000+ teams who use CADENCE',
  };
  document.querySelector('#hero-title').textContent =
    headlines[variant] || headlines.control;
});
```
Pricing experiments
Test different price points or presentation:
```javascript
Cadence.ready().then(function () {
  var variant = Cadence.getVariant('pricing-display');
  if (variant === 'annual-focus') {
    document.querySelector('#price').textContent = '$19/mo (billed annually)';
  } else if (variant === 'monthly-focus') {
    document.querySelector('#price').textContent = '$29/month';
  }
});
```
Important for pricing tests
The price shown on the page should only be for display. Your actual payment system should always use the server-side price. A savvy visitor could inspect the page and see different prices.
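A sketch of the separation, with hypothetical names: the page can show whatever the experiment dictates, but the amount actually charged comes from a server-side table keyed by plan, never from the DOM.

```javascript
// Illustrative: server-side source of truth for prices (hypothetical values).
var PLAN_PRICES = { starter: 2900, pro: 9900 }; // cents, lives on the server

// What gets charged ignores anything rendered on the page.
function amountToCharge(planId) {
  var price = PLAN_PRICES[planId];
  if (price === undefined) {
    throw new Error('Unknown plan: ' + planId);
  }
  return price;
}

// Even if the experiment displayed '$19/mo', the charge uses the server value.
amountToCharge('starter'); // 2900
```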
Multi-variant tests (A/B/C/D)
Test more than two versions by adding more variants in the dashboard and handling them in your code:
```javascript
Cadence.ready().then(function () {
  var variant = Cadence.getVariant('onboarding-flow');
  // Could be: 'control', 'short-form', 'video-intro', or 'interactive-tour'
  if (variant === 'short-form') {
    showShortOnboarding();
  } else if (variant === 'video-intro') {
    showVideoOnboarding();
  } else if (variant === 'interactive-tour') {
    showInteractiveOnboarding();
  }
  // 'control' — show the default onboarding
});
```
Best practices
- Test one thing at a time. If you change the button color AND the headline at the same time, you won't know which change made the difference.
- Let the test run for a full week. Visitor behavior varies by day of week. Monday through Sunday captures the full pattern.
- Don't peek at results and stop early. Looking at results repeatedly and stopping when you see a winner leads to false conclusions. Set your test duration upfront and stick to it.
- Use consistent names. Use kebab-case for experiment names (signup-button-color, not SignupButtonColor). Be descriptive; your future self will thank you.
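If you want to enforce the naming convention, kebab-case can be pinned down with a simple pattern check, for example in a code review script (illustrative only; nothing here is enforced by CADENCE):

```javascript
// Illustrative: validate kebab-case experiment names.
var KEBAB_CASE = /^[a-z0-9]+(-[a-z0-9]+)*$/;

KEBAB_CASE.test('signup-button-color'); // true
KEBAB_CASE.test('SignupButtonColor');   // false
KEBAB_CASE.test('signup--button');      // false (no empty segments)
```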
Don't change tests mid-flight
Changing variant weights, adding variants, or modifying traffic allocation while a test is running invalidates the results. If you need changes, stop the test, make your changes, and start a new one.
Next steps
- Tracking Events — learn what to track beyond button clicks
- Reading Results — interpret your data and pick a winner
- Visual Editor — make visual changes without writing code
- CSS Selectors — find the right selector for any element