Take a deep dive into how Vercel helps businesses experiment at scale without sacrificing performance with Kylie Czajkowski, Engineering Manager of Growth at Vercel. Grow, scale, and stay performant: https://vercel.com/home
Hi, my name is Kylie, and today I'll be talking about A/B testing: a deep dive into how Vercel achieves experimentation at scale without sacrificing performance. We're going to cover what A/B testing actually is, why your team should consider introducing A/B tests, how to design a balanced experiment, and how to execute it. At the end, we'll do a demo of A/B testing using Edge Middleware in a Next.js application. But first, let me tell you a little more about my team. I'm Kylie Czajkowski, engineering manager of the growth team here at Vercel, and I work with six other engineers, partnering with growth managers, analysts, and a designer to design and implement data-driven experiments. That means my team takes the time to think about user journeys in the application and which metrics help us understand them better.
We also work with other teams within Vercel to improve features and iterate on ideas. So let's talk about what A/B testing actually is. You might be familiar with this concept already: A/B testing, also known as split-run or bucket testing, is the strategy of comparing two or more versions of an interface against each other to determine which performs better. Otherwise known as experiments, A/B tests present a variant to a user at random, which helps you unlock a better understanding of user behavior: how people interact with that variant, and whether they prefer the control, the variant, or certain elements of the variant. The best A/B tests, or experiments, are simple, focused, and measured. Simple in that they limit the variables in play, changing only a subset of variables or a single component at any given time. Focused on one or two key metrics, so you really understand the user journey through that metric and how that conversion impacts performance in your application. And closely measured and driven by that data: the metrics you're assessing should drive the decision-making at the end of your experiment. Now that we've defined A/B testing, let's ask why you might be interested in it. I've seen a number of benefits to A/B testing firsthand, but I'd like to highlight a few that I think are the most impactful for dev teams. The first benefit I'd like to highlight is improved user understanding.
By deepening your understanding of user goals, you can improve your product to suit those needs. If you understand what people are looking for in your application, you can better highlight those features and make the user journey cleaner, tighter, and better understood, so your users aren't having a frustrating experience and you're providing the best possible experience within your application. Another great benefit of A/B testing is that your results are backed by data, which makes them much harder to argue with. By gathering a wide range of metrics, you empower your team with the data they need to make critical decisions. Using empirical evidence, you can make larger, overarching changes to features within your application, with a clear idea of which measurements you'd like to see, how they should perform, and whether you'd like to turn the variant off, end your experiment early because a metric is underperforming, or ship it directly to Alpha because the experiment is performing so well at the time of statistical determination. Another benefit of A/B testing that I've seen
is increased conversions. While this might not be your dev team's own metric, it's something that benefits your organization as a whole, no matter what application you're building. By targeting and improving the metrics that matter most to your organization, you make the experience better for the end user of your platform. Finally, a benefit of A/B testing I've seen firsthand is the ability to get out of your own way. You and your dev team might get caught up in what you should be shipping or what you should focus on next.
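Before the demo, here is a minimal sketch of the kind of random bucketing the talk describes, in the style of the Next.js Edge Middleware approach mentioned at the start. The cookie name, bucket labels, and page paths here are hypothetical, not Vercel's actual setup; the real middleware wiring is shown only in comments.

```typescript
// Sketch of sticky A/B bucketing, assuming a hypothetical "ab-bucket" cookie.
const BUCKETS = ["control", "variant"] as const;
type Bucket = (typeof BUCKETS)[number];

// Assign a bucket at random on first visit; reuse the stored bucket afterwards
// so a given user sees a consistent experience across requests.
function getBucket(existingCookie: string | undefined): Bucket {
  if (existingCookie && (BUCKETS as readonly string[]).includes(existingCookie)) {
    return existingCookie as Bucket;
  }
  // 50/50 split between control and variant.
  return Math.random() < 0.5 ? "control" : "variant";
}

// In actual Next.js Edge Middleware you would rewrite the request to a
// bucket-specific page and persist the cookie on the response, roughly:
//
//   export function middleware(req: NextRequest) {
//     const bucket = getBucket(req.cookies.get("ab-bucket")?.value);
//     const res = NextResponse.rewrite(new URL(`/home/${bucket}`, req.url));
//     res.cookies.set("ab-bucket", bucket);
//     return res;
//   }
```

Because the rewrite happens at the edge before the page renders, the user never sees a flicker between versions, which is how this pattern avoids sacrificing performance for experimentation.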