Multivariate Email Testing: 7 Steps to Mastery

Multivariate testing (MVT) is a method for experimenting with several elements of a webpage at once. At its core, it’s a way to modify and test multiple components of your page to measure which version yields the best possible outcome.

Just think about all the different possibilities for combinations of page elements to test. Say you’re a car company and you want to test a set of new images and corresponding copy. Should the copy go above the image? What about a video component?

For situations like the one we’re describing here, MVT is the most systematic way to efficiently test which option would get you the results you’re looking for. This won’t be the methodology to use if you’re just trying to decide whether the background should be red or blue (which we’ll get into in a minute), but if you’re thinking about making more drastic upgrades, MVT is the way to go.

We’ve broken MVT down into seven simple steps:

  1. Identify your pain points: As a manager, business owner, or decision-maker, you know your audience’s pain points when engaging with your material. Be honest with yourself and identify these problems.

  2. Make a plan to resolve them: Digest qualitative and quantitative information to create a calculated hypothesis that addresses your pain points.

  3. Choose a control: Be sure to have a control version to compare your variants against. Your control could be whatever you currently have on the website that you’re looking to optimize. Identify different alternatives to test and prepare to measure their results using MVT.

  4. Segment your audience: Divide your audience into measurable segments. This will help you track how different audiences receive different designs.

  5. Identify your conversion goals: What should the main takeaway be from this page? Do you want users buying a product or subscribing to a newsletter? Make sure you’re prioritizing the right goal.

  6. Begin the MVT: Have each audience segment engage with the control and the variants, and track the results until you reach your goal.

  7. Conclude: Identify which decisions led to statistically significant results so you know which to implement.

When using MVT, you’ll learn more about your target audience, and after completion, you’ll have more insight into how to optimize your content to be more attractive to them going forward.

So you’re working on an incredible campaign to boost your sales, and it has a fantastic page layout, hero image, and header. But you want more people to click through to your landing page by making your call-to-action (CTA) button even more enticing. Let’s say you want to test two elements at once: the CTA’s text and its position.

To test which option works best, you would perform an MVT. For that, you’ll create and send four variations (two texts x two positions) of the same email:

  1. CTA “Buy now” before the fold

  2. CTA “Buy now” at the bottom

  3. CTA “Discover our amazing prices!” before the fold

  4. CTA “Discover our amazing prices!” at the bottom

Although this seems relatively straightforward, the test gets a little more complicated when you add a third element. But don’t worry, because this also means the results are even more powerful.

Let’s take this unbelievably creative campaign and your big CTA dilemma. You’ll test the position and text, but what about the color? Think about it. The button color alone adds a whole new dimension of combinations. You could use red, blue, or maybe no color at all.

So now we’ve got these three things to test:

  • The text in the CTA: “Buy now” or “Discover our amazing prices!”

  • The position of the CTA: Before the fold or at the bottom of your email

  • The color of the CTA: Red or blue?

Now that we’ve got our three considerations, it’s time to create eight variations of our email: 2 texts x 2 positions x 2 colors. It might have seemed overwhelming before, but with MVT, we’ve simplified the process by narrowing our choices down to these options:

  1. CTA “Buy now,” before the fold, in red

  2. CTA “Buy now,” before the fold, in blue

  3. CTA “Buy now,” at the bottom, in red

  4. CTA “Buy now,” at the bottom, in blue

  5. CTA “Discover our amazing prices!”, before the fold, in red

  6. CTA “Discover our amazing prices!”, before the fold, in blue

  7. CTA “Discover our amazing prices!”, at the bottom, in red

  8. CTA “Discover our amazing prices!”, at the bottom, in blue
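The eight variations above are just the Cartesian product of the three elements. As a minimal sketch, here is how you could generate that list in Python (the option values are taken straight from the example):

```python
from itertools import product

# Each element of the email has two options to test
texts = ["Buy now", "Discover our amazing prices!"]
positions = ["before the fold", "at the bottom"]
colors = ["red", "blue"]

# itertools.product yields every combination: 2 x 2 x 2 = 8 variations
variations = list(product(texts, positions, colors))

for i, (text, position, color) in enumerate(variations, start=1):
    print(f'{i}. CTA "{text}", {position}, in {color}')

print(len(variations))  # 8
```

Adding a fourth element with two options would double this to 16 variations, which is exactly why the audience-size considerations discussed below matter.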

The next step is to simply send these different variations to your audience and gauge which combinations work best, revealing any interaction effects between elements and resulting in the most conversions.

You can see which of your CTA combinations encouraged more contacts to click when analyzing the results. Just like that, your decision will lead to better content optimization and a higher conversion rate.

Think of MVT tests and A/B tests as different versions of a way to help you decide on which option to choose. The main difference between multivariate testing and A/B testing (often referred to as split testing) is that A/B testing is the technique you should use when you’re only deciding between two versions of one element.

Let’s say that you’re deciding whether to wear black or blue jeans. A/B testing is the trick up your sleeve to help you quickly decide that black makes way more sense. But when you have multiple variables to test across your user experience (UX), email marketing, or web design, stick to MVT to test them all at once.

Hypothetically, if you were trying to test the effectiveness of the color of the CTA button at the bottom of your re-engagement email, you would use the A/B testing approach to help you decide which option to go with. After seeing that option 2 yielded a 7% conversion rate, while option 1 only resulted in 5%, you could make the educated decision to run with option 2’s color scheme for your CTA button.
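Whether a gap like 7% vs. 5% is statistically meaningful depends on how many people saw each option. A hedged sketch of the standard two-proportion z-test, using hypothetical counts of 5,000 recipients per variant (most testing tools run this check for you):

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)            # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))        # two-sided p-value
    return z, p_value

# Hypothetical: 5,000 recipients per variant, 5% vs. 7% conversion
z, p = two_proportion_z(conv_a=250, n_a=5000, conv_b=350, n_b=5000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With these sample sizes the p-value comes out well under 0.05, so the 7% variant’s lift would be a safe basis for a decision; with only a few dozen recipients per variant, the same 2-point gap could easily be noise.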

With A/B testing, you’re also choosing between options that are noticeably more different from one another. Conversely, with MVT, each version you decide between has only slight differences. You might not even notice the impact of some of the variants since the changes are so minute.

Evidently, each of these two mechanisms requires different amounts of time. Since A/B tests only have two variations, you can use a smaller audience to achieve the statistical results you need to make an informed decision between your two options. A/B testing wouldn’t produce reliable information when testing a larger number of variations. So, although it requires more time and a larger audience sample size to evaluate more variations, MVT is more efficient for making multiple small adjustments. What key performance indicators (KPIs) you want to track and how drastic a change you’re making can determine which type of test you need.

MVT helps you understand the complex relationship between combinations of elements and how they work together to achieve the best results possible. As we mentioned earlier, sometimes you might have eight or more options to choose from, and the differences between them are so slight that they seem negligible. But with MVT, you can get a better sense of how these minor tweaks combine into a more successful option. It makes more sense to use this when you’re enhancing a landing page or a homepage and you don’t want to lose time or revenue.

Your amount of traffic limits MVT’s usability. A/B testing splits your audience in half, and 50% of website traffic goes to option A and the other to option B.

With MVT, however, that audience is fragmented further. For example, if you have eight options, your audience is split into eighths. The smaller the group viewing each variance, the less reliable the results of the MVT will be.

Let’s say you have an audience of 50 people. If you’re using A/B testing, two groups of 25 people will engage with options A and B. But if you use MVT, each of your eight options will be seen by just over six people. If you don’t have enough traffic, there’s a strong likelihood you’ll have trouble completing the MVT because you won’t have a meaningful audience for each variation.
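The same back-of-the-envelope arithmetic works for any audience size and variation count; a tiny helper, using the numbers from the example above:

```python
def per_variant_audience(total_audience, n_variations):
    """People who see each variation when traffic is split evenly."""
    return total_audience / n_variations

# A/B test: 50 people split across 2 versions
print(per_variant_audience(50, 2))  # 25.0 per group
# MVT with 8 variations: the same audience fragments badly
print(per_variant_audience(50, 8))  # 6.25 per group, far too few to be reliable
```

Before launching an MVT, it’s worth running this division in reverse: decide how many people each variation needs for a trustworthy result, multiply by the number of variations, and check that your list is actually that big.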

Now, we know you’re about to get carried away planning all kinds of wild ideas for your next email campaign. Hold on there! First things first: we need to set some ground rules.

  1. Consider why you’re testing. No, ticking a box isn’t a valid answer. Tests are only helpful if you can identify the problem you want to solve. For example, “I’m not getting as much traffic to my website as I predicted, and I want more people to click through to my landing page” is a valid reason.

  2. Identify the elements that need to be tested. Now, if your open rate is around 35% and you’re still seeing a minimal click-through rate (CTR), the problem might be in the elements that your contacts see when they open their email. Consider what features can factor into your test results and develop ideas to shake things up.

  3. Always think about your metrics. Your email stats will provide the information you’re looking for after your test, but it’s essential to understand what those results mean. If you’re testing different combinations of subject lines and names, then the best performance indicator will be your open rates, whereas if you’re playing around with the color and text of your CTA, you’ll want to look at the CTR.

At Mailjet, we don’t believe in testing just two email variations. Our customers have been boosting their email campaign conversion rates for over a decade. Our email testing tool is championed as an industry-leading solution for testing many email elements, including subject lines, sender names, images, and buttons.

With Mailjet’s AB Testing, you can test up to 10 versions. You can either come up with 10 really witty subject lines to perform a simple test or create totally different variations, altering several elements within your message. If your email provider doesn’t offer something like this, perhaps it’s time to consider switching to Mailjet.

But, we don’t stop there. You can also leverage Campaign Comparison to test different combinations across time, like sending day and time. Does early morning on a weekday work better than midday during the weekend? Or is lunchtime on Mondays best?

Love it? Can’t wait to get started? Play around with our A/B testing tool and tell us how you’ve found your winning combination on Twitter.

With Mailjet, send the email that will generate the best engagement from your lists. Guessing can be costly – design and test up to 10 versions of your emails to maximize your chances of reaching your targets and optimize your campaign’s results.


