A/B Test Design
Run simple experiments to test what works
What You'll Learn
- What A/B testing is
- How to set up a test
- Sample size basics
- Common mistakes to avoid
What is A/B Testing?
Simple definition: Show version A to some users, version B to others. See which wins!
Example:
- Version A: Blue "Buy Now" button
- Version B: Red "Buy Now" button
- Which gets more clicks?
Common uses:
- Website buttons and colors
- Email subject lines
- Product features
- Pricing options
Why Random Assignment?
The key rule: Randomly assign users to A or B
Why this matters:
❌ Bad: Show A on Monday, B on Tuesday (Monday visitors might be different from Tuesday visitors!)
✅ Good: Randomly split each day's visitors into A and B (now the groups are comparable!)
Bottom line: Random = fair comparison
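A minimal sketch of random assignment in Python (the function name and the 50/50 split are illustrative choices, not a fixed rule):

```python
import random

def assign_variant() -> str:
    """Flip a fair coin for every visitor: 50% see A, 50% see B."""
    return "A" if random.random() < 0.5 else "B"

# Each visitor gets the same 50/50 chance regardless of day, device,
# or anything else -- that is what makes the comparison fair.
assignments = [assign_variant() for _ in range(10_000)]
print(assignments.count("A"), assignments.count("B"))  # roughly 5000 / 5000
```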
Simple A/B Test Steps
Step 1: Pick what to test. Example: button color
Step 2: Decide what success means. Example: more clicks
Step 3: Calculate how many people you need. Use an online calculator (see below!)
Step 4: Randomly show A or B to users
Step 5: Wait until you have enough data
Step 6: Check if B is significantly better (p < 0.05)
Step 7: Pick the winner!
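The seven steps above can be sketched end to end with simulated visitors (every rate and count below is made up for illustration; a real test uses live traffic and a calculator's sample size):

```python
import math
import random

random.seed(42)

# Steps 1-2: test a button change; "success" = a click (simulated below).
# Hypothetical true click rates -- unknown in a real experiment.
TRUE_RATE = {"A": 0.10, "B": 0.12}

# Step 3: target sample size from a calculator (illustrative value).
N_PER_GROUP = 3_800

# Steps 4-5: randomly assign each visitor and record clicks until
# both groups reach the target size.
shown = {"A": 0, "B": 0}
clicks = {"A": 0, "B": 0}
while min(shown.values()) < N_PER_GROUP:
    variant = random.choice(["A", "B"])
    shown[variant] += 1
    if random.random() < TRUE_RATE[variant]:
        clicks[variant] += 1

# Step 6: two-sided two-proportion z-test on the observed rates.
p_a = clicks["A"] / shown["A"]
p_b = clicks["B"] / shown["B"]
p_pool = (clicks["A"] + clicks["B"]) / (shown["A"] + shown["B"])
se = math.sqrt(p_pool * (1 - p_pool) * (1 / shown["A"] + 1 / shown["B"]))
p_value = math.erfc(abs(p_b - p_a) / se / math.sqrt(2))

# Step 7: declare a winner only if p < 0.05.
print(f"A: {p_a:.3f}  B: {p_b:.3f}  p-value: {p_value:.4f}")
```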
Sample Size - How Many People?
Don't guess! Use a calculator:
- optimizely.com/sample-size-calculator
- evanmiller.org/ab-testing/sample-size.html
You need to know:
- Current rate: 10% of visitors click now
- Target improvement: you want 12% (a 2-percentage-point lift)
- Confidence: use 95% (standard)
- Power: use 80% (standard)
Calculator tells you: need roughly 3,800 people per group
Pro tip: More people = more reliable results!
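Under the hood, those calculators use (approximately) the standard two-proportion sample-size formula. Here is a hand-rolled sketch assuming 95% confidence and 80% power; calculators differ slightly in their defaults and approximations, so expect small disagreements with any particular site:

```python
import math

def sample_size_per_group(p_base, p_target):
    """Approximate visitors needed per group to detect a change from
    p_base to p_target at 95% confidence and 80% power
    (normal-approximation formula for a two-proportion test)."""
    z_alpha = 1.96    # two-sided 95% confidence
    z_beta = 0.8416   # 80% power
    p_bar = (p_base + p_target) / 2
    num = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
           + z_beta * math.sqrt(p_base * (1 - p_base)
                                + p_target * (1 - p_target))) ** 2
    return math.ceil(num / (p_base - p_target) ** 2)

# 10% baseline, aiming for 12%:
print(sample_size_per_group(0.10, 0.12))  # -> 3841, i.e. roughly 3,800 per group
```

Note how sensitive the answer is to the target: halving the lift you want to detect (2 points down to 1 point) roughly quadruples the required sample.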
Common Beginner Mistakes
Mistake 1: Stopping too early
❌ "We have 100 clicks, let's check!"
✅ Wait for your calculated sample size!
Mistake 2: Testing during different times
❌ Version A on Monday, B on Friday
✅ Run both versions at the same time!
Mistake 3: Changing things mid-test
❌ Tweaking version B while the test runs
✅ No changes once the test starts!
Mistake 4: Not being random
❌ "Show A to new users, B to returning users"
✅ Randomly assign everyone!
Mistake 5: Testing too many things at once
✅ Start with ONE change at a time!
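A common way to avoid mistake 4 in practice is hash-based assignment: each user ID maps to a stable but effectively random variant, which also keeps returning visitors in the same group for the whole test. The experiment name and helper below are illustrative:

```python
import hashlib

def variant_for(user_id: str, experiment: str = "button-color") -> str:
    """Stable 50/50 split: the same user always lands in the same group,
    but across users the assignment behaves like a fair coin flip."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

print(variant_for("user-42"))                    # same answer on every call
print(variant_for("user-42", "other-test"))      # may differ per experiment
```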
Real Example
Scenario: E-commerce checkout button
Current: Green button → 15% checkout rate
Test: Orange button → ???
Setup:
- Use calculator: need 8,500 people per version (17,000 total)
- Run for about 2 weeks (~1,200 visitors/day)
- Track: checkout completion rate
Result:
- Green: 15.0% (8,500 people)
- Orange: 16.5% (8,500 people)
- p-value: ≈ 0.007
Conclusion: Orange wins (p < 0.05 = significant). Switch to the orange button!
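You can double-check a result like this yourself with a two-proportion z-test, the same kind of calculation most A/B tools run (exact p-values vary a little with the test and rounding used):

```python
import math

def two_proportion_p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test: is the gap between two conversion rates
    bigger than chance alone would explain?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return math.erfc(abs(z) / math.sqrt(2))

# Green: 15.0% of 8,500 = 1,275 checkouts; Orange: 16.5% ≈ 1,403
p = two_proportion_p_value(1275, 8500, 1403, 8500)
print(f"p = {p:.4f}")  # comfortably below 0.05, so the result is significant
```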
Next Steps
Learn about Metric Selection!
Tip: Good experiment design prevents analysis headaches!