RapidDev - Software Development Agency

How to run A/B testing in Bubble

Run A/B tests in your Bubble app by randomly assigning users to test variants, tracking conversion events per variant, and comparing results to determine which version performs better. This tutorial covers building a simple split testing system using custom states, database tracking, and conversion analytics.

What you'll learn

  • How to randomly assign users to A/B test variants
  • How to show different UI versions based on variant assignment
  • How to track conversion events per variant
  • How to calculate and compare conversion rates
Beginner · 6 min read · 20-25 min build time · All Bubble plans · March 2026 · RapidDev Engineering Team

Overview: Setting Up A/B Testing in Bubble

A/B testing lets you compare two versions of a page element to see which drives more conversions. This tutorial shows you how to build a split testing system natively in Bubble — no external tools required. You will randomly assign users to variants, show different content, and track which variant leads to more signups, clicks, or purchases.

Prerequisites

  • A Bubble account with an app ready to edit
  • A specific element or page you want to test (e.g., a headline, button color, or layout)
  • Basic understanding of custom states and workflows
  • Enough expected traffic for meaningful results (ideally 100+ visitors per variant)

Step-by-step guide

1

Create the experiment data model

Go to the Data tab and create a Data Type called Experiment with fields: name (text), variant_a_label (text), variant_b_label (text), is_active (yes/no), and start_date (date). Create another Data Type called ExperimentAssignment with fields: experiment (Experiment), user (User), variant (text — 'A' or 'B'), converted (yes/no), and assigned_date (date). This tracks which variant each user sees and whether they converted.

Expected result: Two data types ready to manage experiments and track user assignments.
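Outside the Bubble editor, the same schema can be sketched as plain records. This is an illustrative sketch only: the class and field names mirror the Data Types described above, not any actual Bubble storage format.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Experiment:
    name: str
    variant_a_label: str
    variant_b_label: str
    is_active: bool = True
    start_date: datetime = field(default_factory=datetime.now)

@dataclass
class ExperimentAssignment:
    experiment: Experiment
    user_id: str                 # stands in for Bubble's built-in User reference
    variant: str                 # "A" or "B"
    converted: bool = False
    assigned_date: datetime = field(default_factory=datetime.now)
```

Keeping `converted` on the assignment record (rather than on the User) is what lets the dashboard in step 5 group conversions by variant with a single search.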

2

Randomly assign users to variants on page load

On the page you want to test, add a Page is loaded workflow. Add a condition: Only when Do a search for ExperimentAssignments where user is Current User and experiment is your test returns empty. This ensures each user is assigned only once. Add a Create new ExperimentAssignment action. Set variant to a random value: use the expression Calculate formula with random number. If the random number is less than 0.5, set variant to A; otherwise set variant to B. Store the result in a custom state called user_variant on the page.

Pro tip: Use a 50/50 split for simple tests. For more advanced tests, adjust the random threshold (e.g., 0.7 for 70/30 split).

Expected result: Each new visitor is randomly assigned to variant A or B, and the assignment is saved permanently.
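The assignment logic in this step boils down to a threshold check on a uniform random number, plus a "create only if none exists" guard. A minimal sketch, with `assignments` standing in for the ExperimentAssignment table (function names are illustrative, not Bubble actions):

```python
import random

def assign_variant(threshold: float = 0.5) -> str:
    """Return 'A' when a uniform draw falls below the threshold, else 'B'.

    threshold=0.5 gives a 50/50 split; threshold=0.7 gives a 70/30 (A/B) split.
    """
    return "A" if random.random() < threshold else "B"

def get_or_assign(assignments: dict, user_id: str, threshold: float = 0.5) -> str:
    """Assign once and persist, mirroring the 'only when the search for
    ExperimentAssignments returns empty' condition on the page-load workflow."""
    if user_id not in assignments:            # no existing assignment: create one
        assignments[user_id] = assign_variant(threshold)
    return assignments[user_id]               # returning user: reuse the stored variant
```

The guard is the important part: without it, a returning visitor could be re-randomized on every page load, which is exactly the mistake the condition in this step prevents.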

3

Show different content based on variant

Create two versions of the element you are testing. For example, two Group elements: Group Variant A with a blue call-to-action button and Group Variant B with a green button. Set conditional visibility: Group Variant A is visible when page's user_variant is A. Group Variant B is visible when page's user_variant is B. Enable Collapse when hidden on both groups so they do not take up empty space.

Expected result: Users assigned to variant A see the blue button; variant B users see the green button.

4

Track conversion events

When the user performs the desired action (e.g., clicks the CTA button, completes signup), add a workflow action: Make changes to the ExperimentAssignment where user is Current User and experiment is the active test. Set converted to yes. This records that the user in their assigned variant completed the goal action.

Expected result: Conversions are tracked per user and linked to their assigned variant.

5

Build a results dashboard

Create an admin page called ab-results. Add text elements showing: Variant A conversions (search for ExperimentAssignments where variant is A and converted is yes, then count), Variant A total (search where variant is A, count), and calculate the rate: conversions divided by total times 100. Repeat for Variant B. Display both rates side by side so you can compare performance.

Pro tip: Wait until each variant has at least 100 assignments before drawing conclusions. Small sample sizes produce unreliable results.

Expected result: A dashboard showing conversion rates for both variants with total counts.
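The dashboard formulas above reduce to counting conversions and totals per variant, then dividing. A sketch of that aggregation over (variant, converted) pairs, which is what the two Bubble searches per variant compute:

```python
def conversion_rates(records):
    """records: iterable of (variant, converted) pairs, e.g. [('A', True), ('B', False)].

    Returns {variant: (conversions, total, rate_percent)} -- the three numbers
    each variant's row on the ab-results page displays.
    """
    stats = {}
    for variant, converted in records:
        conv, total = stats.get(variant, (0, 0))
        stats[variant] = (conv + (1 if converted else 0), total + 1)
    return {v: (c, n, round(100 * c / n, 1)) for v, (c, n) in stats.items()}
```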

6

End the experiment and apply the winner

Once you have statistically significant results (typically 200+ assignments per variant), set the experiment's is_active field to no. Remove the variant B group and keep the winning variant as the permanent version. Optionally, delete or archive the ExperimentAssignment records to clean up your database.

Expected result: The winning variant becomes the permanent version and the experiment data is archived.

Complete working example

Workflow summary
A/B TESTING WORKFLOW SUMMARY
================================

DATA MODEL
  Experiment:
    - name (text)
    - variant_a_label (text)
    - variant_b_label (text)
    - is_active (yes/no)
    - start_date (date)

  ExperimentAssignment:
    - experiment (Experiment)
    - user (User)
    - variant (text: A or B)
    - converted (yes/no)
    - assigned_date (date)

WORKFLOW: Assign Variant (Page is loaded)
  Only when: No existing assignment for this user + experiment
  Step 1: Create ExperimentAssignment
    - experiment: current test
    - user: Current User
    - variant: if random < 0.5 then A else B
    - converted: no
    - assigned_date: Current date/time
  Step 2: Set state user_variant = Result's variant

WORKFLOW: Load Existing Assignment (Page is loaded)
  Only when: Assignment exists for this user
  Step 1: Set state user_variant = existing assignment's variant

UI CONDITIONALS
  Group Variant A: visible when user_variant is A
  Group Variant B: visible when user_variant is B

WORKFLOW: Track Conversion
  Trigger: CTA button clicked (or goal action)
  Step 1: Make changes to user's ExperimentAssignment
    - converted: yes

DASHBOARD FORMULAS
  Variant A rate: (A converted count / A total count) * 100
  Variant B rate: (B converted count / B total count) * 100

Common mistakes when running A/B testing in Bubble

Mistake: Not persisting variant assignments in the database

Why it's a problem: Returning visitors can be re-randomized on each visit and see different versions, contaminating your results.

How to avoid: Always save the variant assignment to the database and load it on subsequent visits.

Mistake: Drawing conclusions from too few visitors

Why it's a problem: Small samples produce noisy conversion rates, so an apparent winner may be pure chance.

How to avoid: Wait for at least 100-200 assignments per variant before comparing conversion rates.

Mistake: Running multiple A/B tests on the same page simultaneously

Why it's a problem: Overlapping tests interact, so you cannot attribute a change in conversions to any single variant.

How to avoid: Run one test per page at a time. If you must test multiple elements, use multivariate testing with a single experiment tracking all combinations.

Best practices

  • Persist variant assignments in the database so users always see the same version
  • Wait for statistical significance (200+ per variant) before choosing a winner
  • Test only one variable at a time for clear attribution
  • Run tests for at least 1-2 weeks to account for day-of-week variations
  • Use the 50/50 split for most tests to reach significance faster
  • Archive experiment data after completion to keep the database clean
  • Document what you tested and the results for future reference

Still stuck?

Copy one of these prompts to get a personalized, step-by-step explanation.

ChatGPT Prompt

I want to A/B test my Bubble.io landing page headline and CTA button. How do I randomly assign visitors to variants, show different content based on assignment, track conversions, and compare results?

Bubble Prompt

Create an A/B testing system for my landing page. Add Experiment and ExperimentAssignment data types. On page load, randomly assign users to variant A or B. Show different headline text based on variant. Track when users click the signup button as a conversion.

Frequently asked questions

Can I A/B test without requiring user login?

For logged-out users, store the variant in a browser cookie using the Toolbox plugin's JavaScript action. However, cookies reset if the user clears their browser, so logged-in testing is more reliable.

How do I know when my test has enough data?

A rough rule: each variant needs at least 100 conversions for reliable results. Use an online significance calculator to check if the difference between variants is statistically significant.
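If you would rather compute significance yourself instead of using an online calculator, the standard test for comparing two conversion rates is a two-proportion z-test. A self-contained sketch using only the standard library (this is a common statistical formula, not anything Bubble provides):

```python
import math

def z_test_two_proportions(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for a difference in conversion rates.

    Returns (z, p_value); a two-sided p_value below 0.05 is the usual
    threshold for calling the difference statistically significant.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)            # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    if se == 0:
        return 0.0, 1.0
    z = (p_a - p_b) / se
    # two-sided p-value from the standard normal CDF, via math.erf
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value
```

For example, 120/1000 conversions on A versus 80/1000 on B yields z near 3, comfortably significant; 100/1000 on each yields z = 0 and no evidence of a difference.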

Can I test more than two variants?

Yes. Instead of A/B, create A/B/C tests by dividing the random number into thirds (0-0.33 = A, 0.33-0.66 = B, 0.66-1.0 = C). This requires more total traffic for significance.
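The thirds-based split described above is the same threshold trick as the 50/50 case, just with two cut points on the uniform draw. A sketch (function name is illustrative):

```python
import random

def assign_three_way() -> str:
    """Split a uniform [0, 1) draw into thirds for an A/B/C test."""
    r = random.random()
    if r < 1 / 3:
        return "A"
    elif r < 2 / 3:
        return "B"
    return "C"
```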

Should I use an external tool like Google Optimize instead?

Google Optimize was sunset in 2023. Building A/B tests natively in Bubble gives you more control and avoids external dependencies. For advanced statistical analysis, export your data to a spreadsheet.

How do I handle users who visit on multiple devices?

If users log in, their assignment follows them across devices via the database. For logged-out users, each device gets an independent assignment.

Can RapidDev help set up advanced experimentation in my Bubble app?

Yes. RapidDev can build sophisticated testing frameworks including multivariate tests, feature flags, gradual rollouts, and statistical significance calculators.
