Kogan.com

Making Smarter A/B Testing Decisions with Event Tracking and Session Replays

2024-09-12

Ryan Barker

A/B tests are simple in theory, but they become hard to interpret when too little data is collected or too few events are tracked. In e-commerce, conversion rate is often highlighted as the key metric, but on its own it tells you little about what caused a change or why. By leveraging tools that provide event tracking and session replays, such as FullStory, we can attach context to the numbers and understand what users are actually doing, allowing us to make data-driven decisions, which is crucial in modern business.

Event Tracking: What Happened

In e-commerce, every click matters, which is why an event-tracking mechanism is essential. Event tracking records user actions across your website, such as adding an item to the cart, clicking a call-to-action button, or proceeding to checkout. This data is the foundation for understanding how behavior differs between your A/B test variants.
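To make that concrete, here's a minimal sketch of what a tracking call might look like. The `sendToAnalytics` helper, the event names, and the experiment/variant labels are all placeholders for whatever your analytics tool (FullStory or otherwise) actually provides; the important part is that every event carries the variant the user saw.

```typescript
// Minimal event-tracking sketch. `sendToAnalytics` is a stand-in for whatever
// SDK is actually in use (FullStory, GA4, an in-house queue, ...); here it just logs.
type EventProperties = Record<string, string | number | boolean>;

function sendToAnalytics(name: string, properties: EventProperties): void {
  console.log("analytics event:", name, properties);
}

// Experiment context is attached to every event so results can later be split by variant.
interface ExperimentContext {
  experiment: string;                // e.g. "product-page-redesign"
  variant: "control" | "treatment";
}

function trackEvent(ctx: ExperimentContext, name: string, properties: EventProperties = {}): void {
  sendToAnalytics(name, {
    ...properties,
    experiment: ctx.experiment,
    variant: ctx.variant,
    timestamp: Date.now(),
  });
}

// The same events fire in both variants; only the attached context differs.
const ctx: ExperimentContext = { experiment: "product-page-redesign", variant: "treatment" };
trackEvent(ctx, "add_to_cart_clicked", { productId: "SKU-123" });
trackEvent(ctx, "checkout_started", { cartValue: 189.0 });
```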

For example, if you’re testing two versions of a product page, event tracking helps you see:

  • Click Rate: Which of the two variants gets more clicks on the ‘Add to Cart’ button?
  • Engagement: How long do users spend on the page, and how many elements, such as images, product descriptions, or reviews, do they interact with?
  • Form Submissions: Which user feedback method leads to the best uptake of optional benefits?

This approach is beneficial because it allows you to understand what’s happening behind the scenes, beyond just looking at the final sales numbers.
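Once those events carry the variant label, comparing variants is mostly counting. The sketch below uses made-up numbers to compute the ‘Add to Cart’ click rate for each variant, plus a rough two-proportion z-score as a sanity check that a difference isn't just noise.

```typescript
// Compare 'Add to Cart' click rates between variants with a two-proportion z-test.
// The counts are illustrative; in practice they come from the event pipeline.
interface VariantCounts {
  visitors: number;   // users who saw the product page
  clicks: number;     // users who clicked 'Add to Cart'
}

function clickRate(v: VariantCounts): number {
  return v.clicks / v.visitors;
}

// Two-proportion z-score: how many standard errors apart the two rates are.
function twoProportionZ(a: VariantCounts, b: VariantCounts): number {
  const pooled = (a.clicks + b.clicks) / (a.visitors + b.visitors);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / a.visitors + 1 / b.visitors));
  return (clickRate(a) - clickRate(b)) / se;
}

const control: VariantCounts = { visitors: 4_200, clicks: 378 };
const treatment: VariantCounts = { visitors: 4_150, clicks: 436 };

console.log("control rate:  ", clickRate(control).toFixed(3));
console.log("treatment rate:", clickRate(treatment).toFixed(3));
// |z| above roughly 1.96 corresponds to ~95% confidence that the difference is real;
// below that, keep collecting data rather than acting on it.
console.log("z-score:       ", twoProportionZ(treatment, control).toFixed(2));
```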

Session Replays: Why It Happened

While event tracking shows what happened, session replays reveal the why. Watching a replay of a customer’s experience (with sensitive data masked) often uncovers behaviors and friction points you, as the developer, didn’t anticipate or encounter during testing. It’s an insight you simply can’t get from final sales numbers, and it’s invaluable when trying to identify behavioral patterns or usability issues.

For example, if event tracking shows a significant drop-off for Variant A, with users not reaching the checkout page, session replays might reveal that the layout is confusing or that error messages aren’t clear enough.
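One way to decide which replays to watch first is to let the event data point at the step where users disappear. Here's a rough sketch with hypothetical funnel counts; in practice these would be queried from your event store, split by variant.

```typescript
// Find the funnel step where the treatment variant falls furthest behind the control,
// to decide which sessions are worth replaying first. All numbers are hypothetical.
type StepCounts = Record<string, number>;

const funnelSteps = ["product_viewed", "add_to_cart_clicked", "checkout_started", "order_placed"];

// Users who reached each step, per variant (normally queried from the event store).
const reached: Record<"control" | "treatment", StepCounts> = {
  control:   { product_viewed: 5000, add_to_cart_clicked: 900, checkout_started: 610, order_placed: 480 },
  treatment: { product_viewed: 5000, add_to_cart_clicked: 950, checkout_started: 340, order_placed: 265 },
};

// Step-to-step retention: the fraction of users from the previous step who reached this one.
function retention(counts: StepCounts, i: number): number {
  return counts[funnelSteps[i]] / counts[funnelSteps[i - 1]];
}

let worstStep = "";
let worstGap = 0;
for (let i = 1; i < funnelSteps.length; i++) {
  const gap = retention(reached.control, i) - retention(reached.treatment, i);
  if (gap > worstGap) {
    worstGap = gap;
    worstStep = funnelSteps[i];
  }
}

// With these numbers, the treatment loses most of its users before "checkout_started",
// so replays of treatment sessions that stall on the cart page are the place to start.
console.log(`Biggest gap at "${worstStep}": treatment retains ${(worstGap * 100).toFixed(1)} percentage points fewer users`);
```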

Here are some of the key benefits:

  • See Drop-Off Points: Identify the exact actions that lead to user drop-offs. Are customers struggling to find important information? Are they hesitating after reading a specific detail or policy that prevents them from moving forward?
  • Spot UX Issues: Observe how users interact with different elements and locate potential friction points. Is adding items to the cart not as intuitive as you thought?
  • Analyse Navigation Patterns: Understand how users move through your site, and see if anything disrupts their journey. Are there too many steps to complete a purchase? How many clicks does it take before users finish their session? Does your marketing align with their needs?

Making Data-Driven Decisions

The key to making smarter decisions from A/B tests is combining event tracking data over a sufficient period with the visual insights gained from session replays. Some changes show clear trends in just a few days, but for more subtle tweaks, you may need weeks or even months to avoid making decisions based on daily fluctuations. Here’s the typical process:

  • Implement Variants: Roll out the new feature to 50% of users while keeping the original version as a baseline (a minimal bucketing sketch follows this list). It's best to test one factor at a time, but you can run multiple variants if you’re working with short development cycles, as long as you have a clear understanding of the baseline performance.
  • Gather Data: Track key metrics like clicks and form submissions to gauge how users interact with each variant. In e-commerce, the conversion rate is often the most important data point, but depending on your test, other interactions may be more insightful.
  • Watch Replays: If the data shows significant changes or unexpected differences, watch session replays to uncover the reasons behind user behavior. This helps you reach the root of issues that numbers alone can't explain.
  • Prioritise Changes: Use the data and replays to decide what improvements will have the biggest impact. It could be a simple design tweak, a clearer call-to-action, or even a complete layout overhaul. The key is to be data-driven, avoiding assumptions or personal biases.
  • Iterate Quickly: Roll out small, incremental changes, monitor the results, and keep iterating. A continuous testing and improvement cycle is essential for long-term success.
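For the first step in that list, variant assignment needs to be random across users but stable for each individual user, so a returning shopper always sees the same version. One common way to sketch this is to bucket on a hash of a stable user identifier; the hash function and identifiers below are illustrative choices, not a prescription.

```typescript
// Deterministic 50/50 bucketing: hash a stable user id together with the experiment
// name, so assignment is random across users but stable for any given user.
// FNV-1a is used here purely as a simple, dependency-free illustration.
function fnv1a(input: string): number {
  let hash = 0x811c9dc5;
  for (let i = 0; i < input.length; i++) {
    hash ^= input.charCodeAt(i);
    hash = Math.imul(hash, 0x01000193);
  }
  return hash >>> 0; // force unsigned 32-bit
}

function assignVariant(userId: string, experiment: string): "control" | "treatment" {
  // Including the experiment name decorrelates buckets across different experiments.
  return fnv1a(`${experiment}:${userId}`) % 2 === 0 ? "control" : "treatment";
}

// The same user always lands in the same bucket for a given experiment.
console.log(assignVariant("user-42", "product-page-redesign"));
console.log(assignVariant("user-42", "product-page-redesign")); // identical result
console.log(assignVariant("user-43", "product-page-redesign")); // may differ
```

Bucketing on a hash rather than calling Math.random() on each page load keeps the experience consistent for a user across visits, and lets you recompute any user's assignment later when slicing the event data.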

Wrapping It Up

While conducting an A/B test, it's essential to combine event tracking with session replays. Event tracking shows what users are doing, while session replays reveal why they behave that way. Together, they provide the knowledge needed to develop data-driven strategies, address user problems, and improve the quality of the product. For developers, this means being able to move quickly and, most importantly, to stay aligned with the business strategy.