How Meta’s Algorithmic Audience Targeting Impacts Ad Distribution: A Test

A long-running mystery in the era of algorithmic Meta ad delivery can finally be answered: How much do our targeting inputs matter?

I’ve run a test that reveals how Meta splits ad delivery between my remarketing audiences and prospecting when relying on algorithmic targeting and expansion. The results are surprising, encouraging, and enlightening.

This post is a bit of a rabbit hole, but it’s worth it. Let’s get to it…

Background and Historical Context

Years ago, targeting was simple. We made a series of selections using custom audiences, lookalike audiences, detailed targeting, and demographics. We then expected that our ads would reach people within those groups.

But, that all began to change with the introduction of Advantage audience expansion. At first, it was an option. Then expansion became the default for detailed targeting and lookalike audiences with certain objectives. And finally, Meta introduced the next level of hands-off, algorithmic delivery: Advantage+ Shopping Campaigns and Advantage+ Audience.

Luckily, Meta made audience segments available to provide important visibility into how Advantage+ Shopping Campaigns were delivered. We could then see how much of our budget went to our engaged audience, existing customers, and new audience (or prospecting). This was critical since these campaigns didn’t allow for any of the audience inputs we typically expected.

Meanwhile, advertisers confronted with the unknown of how Advantage+ Audience delivered their ads often chose the greater control of original audiences. But even then, audiences often expanded. The mystery remained unsolved.

And then Meta expanded access to audience segments for all campaigns that utilize the Sales objective (this feature is still rolling out). While this includes Advantage+ Shopping Campaigns, it also applies to any manual campaign that utilizes the Sales objective. And this doesn’t require optimizing for a purchase.

This new option opened up a world of possibilities for testing and transparency. I recently wrote a blog post about the test I was starting. And now I’m ready to share my initial results.

My Test

The basis of this test was simple. I wanted to use audience segments to get a better sense of how my ads were delivered when using the following targeting setups:

  1. Advantage+ Audience without suggestions
  2. Advantage+ Audience with suggestions using custom audiences that match my audience segments
  3. Original Audiences using custom audiences that match my audience segments — with Advantage Custom Audience turned on

This was all part of a single campaign that utilized the Sales objective and a website conversion location.

Since the purchase conversion event isn’t required for this objective, I used this test to promote a lead magnet that utilizes the Complete Registration standard event.

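For context, Complete Registration is one of Meta’s standard events, typically fired by the pixel when someone signs up. If you track registrations server-side through the Conversions API, a minimal Python sketch looks something like this. The pixel ID, access token, API version, and email are all placeholders:

```python
import hashlib
import time

import requests

# Placeholders -- substitute your own pixel ID, access token, and API version.
PIXEL_ID = "YOUR_PIXEL_ID"
ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"
URL = f"https://graph.facebook.com/v19.0/{PIXEL_ID}/events"

# Meta requires customer identifiers such as email to be SHA-256 hashed.
hashed_email = hashlib.sha256("subscriber@example.com".encode()).hexdigest()

payload = {
    "data": [
        {
            "event_name": "CompleteRegistration",  # the standard event used in this test
            "event_time": int(time.time()),
            "action_source": "website",
            "user_data": {"em": [hashed_email]},
        }
    ]
}

response = requests.post(URL, json=payload, params={"access_token": ACCESS_TOKEN})
print(response.json())
```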

In terms of demographics, I targeted all ages in the United States, Canada, the United Kingdom, and Australia. These are the four countries that make up the largest share of my customer base.

I initially ran the ad sets concurrently before quickly switching gears and running them one at a time, so each could deliver without competition from the others. I spent a modest budget of roughly $270 on each ad set.

I contend a large budget wasn’t necessary for this test since my questions were answered rather quickly. My focus wasn’t on whether any of these ad sets were “successful” in terms of generating conversions. Far too many factors impact Cost Per Conversion (the ad, the offer, the landing page), and that just wasn’t a concern here.

Granted, spending thousands of dollars would give me more confidence in these results. And I’ll certainly be monitoring whether what happened here continues with my advertising in the future. But, there were very clear learnings here, even with a modest budget.

My primary questions were simple:

  1. How will ads get delivered?
  2. How will my budget get spent?
  3. How will it be distributed between my engaged audience, existing customers, and new audience?

We have answers.

Defining My Audience Segments

A critical piece of this test is how I’ve defined my audience segments. This is done within your ad account settings.

1. Engaged Audience. These are people who have engaged with my business but have not made a purchase. I’ve used a website custom audience for all visitors during the past 180 days and a data file of all of my newsletter subscribers.

2. Existing Customers. These are people who have made a purchase. I used website custom audiences and data file custom audiences for those who have bought from me before.

There will be overlap between these two groups, of course. A Meta representative confirmed that if anyone is in both groups, they will only be counted as an existing customer.

Once these are defined, we’ll be able to use breakdowns by audience segments in Ads Manager to see results of sales campaigns for each group.

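If you’d rather crunch those numbers outside of Ads Manager, you can export the breakdown report to a CSV and compute each segment’s share of delivery yourself. Here’s a minimal Python sketch; the filename and column names are assumptions, so adjust them to match your actual export:

```python
import pandas as pd

# Assumed filename and column names -- match these to your Ads Manager export.
df = pd.read_csv("audience_segment_breakdown.csv")

# Total spend and impressions for each audience segment.
totals = df.groupby("Audience segment")[["Amount spent (USD)", "Impressions"]].sum()

# Each segment's share of overall delivery, as a percentage.
shares = totals / totals.sum() * 100
print(shares.round(1))
```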

Test Group 1: Advantage+ Audience Without Suggestions

This may have been the biggest mystery of all. When you use Advantage+ Audience without suggestions, who will see your ads?

Meta gave us some clues in their documentation, indicating that remarketing was likely a big part of where delivery starts.

But this passage isn’t definitive, and I wanted to prove this actually happens — or doesn’t. Well, it happens. Boy, does it happen.

[Screenshot: audience segment breakdown, Advantage+ Audience without suggestions]

I didn’t provide any audience suggestions, yet a very large chunk of my budget was spent on remarketing to my defined audience segments. More specifically, here are the percentages dedicated to my engaged audience and existing customers…

1. 35.4% of amount spent
2. 23.7% of impressions

That’s incredible. I never would have expected the percentages to be that high. Note that the percentage of impressions is lower because the CPM to reach my audience segments is nearly twice as high as that for the new audience.
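
In fact, you can back that CPM gap out of those two percentages alone. Here’s the quick arithmetic as a Python sketch, using only the shares reported above:

```python
# Shares reported above for Advantage+ Audience without suggestions.
spend_share, impression_share = 0.354, 0.237

# Within a single ad set, relative CPM is proportional to spend share / impression share.
segment_rel_cpm = spend_share / impression_share                    # ~1.49
new_audience_rel_cpm = (1 - spend_share) / (1 - impression_share)   # ~0.85

ratio = segment_rel_cpm / new_audience_rel_cpm
print(f"Segment CPM is ~{ratio:.2f}x the new-audience CPM")  # ~1.76x
```

That ratio of roughly 1.76x lines up with the “nearly twice as high” CPM I saw in Ads Manager.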

This is a relief. While I’ve trusted in Advantage+ Audience up until now, I generally provide audience suggestions because of that small amount of doubt in the back of my mind. But, this proves that Advantage+ Audience doesn’t require suggestions to reach a highly relevant audience.

Test Group 2: Advantage+ Audience With Suggestions

This got me thinking. If Advantage+ Audience without suggestions results in spending 35.4% of my budget on remarketing to my audience segments, what would happen if I provided suggestions? More accurately, what if I provided suggestions that were custom audiences that exactly match the definitions of my audience segments?

It’s reasonable to assume that even more of my budget would be dedicated to these groups. If we take Meta’s explanation of how Advantage+ Audience works at face value, that’s exactly what should happen. Meta says that if you provide an audience suggestion, they will “prioritize audiences matching your suggestions, before searching more broadly.”

Well, here’s what happened…

[Screenshot: audience segment breakdown, Advantage+ Audience with suggestions]

So that you don’t have to do any math, here are the percentages dedicated to my engaged audience and existing customers when using audience suggestions that matched those audience segments…

1. 32.4% of amount spent
2. 29.0% of impressions

By comparison, here are the percentages when not providing any suggestions:

1. 35.4% of amount spent
2. 23.7% of impressions

So, a higher percentage (by 3 percentage points, but still higher) of my budget was spent on reaching my audience segments when not providing suggestions than when using custom audiences that matched those segments as suggestions. The percentage of impressions dedicated to those groups was higher with suggestions, but only because the CPM to reach my new audience was higher with that approach.

If we hadn’t first tested Advantage+ Audience without suggestions, we’d say this test proved that Meta did in fact prioritize reaching the audience suggestions before going broader. But since providing no suggestions at all resulted in at least as much delivery to those groups, the result is inconclusive.

My take: Audience suggestions are optional, and in some cases they are unnecessary. If you have an established ad account with extensive conversion and pixel history like I do, you probably don’t need them. In fact, they may even be (slightly) detrimental.

Test Group 3: Original Audiences Using Advantage Custom Audience

Many advertisers have chosen to use original audiences instead of Advantage+ Audience because they don’t trust the lack of transparency of algorithmic targeting. So, I wanted to test one more thing that could be proven with audience segments.

Audience segments won’t help us better understand ad distribution with Advantage Detailed Targeting or Advantage Lookalike. While they will show how many of the people reached were already connected to us, they won’t answer questions about how much the audience is expanded, or how that compares to using Advantage+ Audience with or without suggestions.

But, we can learn a lot from how expansion works with Advantage Custom Audience. In that case, Meta should prioritize the custom audiences we provide before expanding and going broader. Technically, it may not have to go broader, and we don’t know how much broader it goes when it does.

So, I ran a test that was similar to the one where I used Advantage+ Audience with suggestions. In this case, I used original audiences and provided the custom audiences that match my audience segments. And then I turned Advantage Custom Audience on.

Here are the results…

[Screenshot: audience segment breakdown, original audiences with Advantage Custom Audience]

Here’s how the budget spent and impressions break down for the original custom audiences…

1. 26.4% of amount spent
2. 24.1% of impressions

Interesting! Going in, we’d have assumed that this approach would expand the least and that a higher percentage of the budget would be spent on the custom audiences. Instead, it resulted in the lowest percentage of budget spent on those groups. The percentage of impressions dedicated to those groups is about the same as when using Advantage+ Audience without suggestions.

Another point to note is that the overall CPM was highest with this approach, though it’s not much higher than when using suggestions. That’s largely driven by a higher CPM to reach the new audience.

The Results: Overall Evaluation

To recap, here are all three ad sets in one view…

Test group                                        Spend % to segments    Impressions % to segments
Advantage+ Audience (no suggestions)              35.4%                  23.7%
Advantage+ Audience (with suggestions)            32.4%                  29.0%
Original audiences + Advantage Custom Audience    26.4%                  24.1%

There’s no reason to split hairs here about which approach led to spending more or reaching more of my audience segments. It’s within a margin of error related to randomness that could flip if we tested again — or continued the test.

The main takeaway is this: The overall breakdown in distribution between my remarketing audience segments and new/prospecting audiences was virtually the same for each approach. It made very little difference when using Advantage+ Audience without suggestions, Advantage+ Audience with suggestions, or original audiences using custom audiences and Advantage Custom Audience turned on.

This provides strong evidence that Advantage+ Audience does exactly what Meta says it does. At least in my case, it also shows that using suggestions is unnecessary, or at most marginally impactful.

I’m also a bit surprised that using the original audiences approach resulted in as much expansion as it did. I expected delivery to hold closer to the custom audiences that I provided — at least in comparison to using Advantage+ Audience.

I didn’t want Cost Per Conversion results to be a distraction in this test because they were not a priority when evaluating distribution. But in case you’re wondering, those results followed very similar trends. Each ad set generated virtually the same number of conversions (within a range of randomness). But, Advantage+ Audience without suggestions provided the most conversions, followed very closely by the other two approaches.

Contributing Factors

It’s important to remember that while these results are generally reflective of how algorithmic ad delivery distributes our ads, they are also unique to me and how this test was set up. There are several factors that may have contributed to what I saw, and you may get very different results.

1. Budget. As I’ve said before, a lower budget still gives us meaningful information here. But, it’s reasonable to expect that the more money I spend, the smaller the share that goes to my audience segments, audience suggestions, or custom audiences. Those audiences will become exhausted, and more would likely be spent on the new audience (see the rough sketch after this list).

2. Audience segment sizes. Very closely related to budget, but this clearly impacts the volume of results I can see from remarketing to these groups. The total size of these groups for my test is just over 200,000 people, but closer to 100,000 when limited to the four countries I targeted. The smaller this pool, the less that can be spent there.

3. Time elapsed. It’s reasonable to assume that the greatest distribution to these audience segments and custom audiences happens at the beginning, before expansion to new audiences ramps up. This is again related to the sizes of the audiences and the rate at which they’re exhausted. None of these ad sets ran for a full week, so those percentages would likely drop over time.

4. Conversion event. Since I’m still in the very early stages of analyzing results using audience segments, it’s not clear how much the conversion event used for optimization impacts distribution. We know it does — Meta will make algorithmic changes to find people willing to perform the action that you want. But, it’s not clear how much the event impacts distribution to audience segments, if at all. I used Complete Registration for the conversion event here. Distribution may be different for purchases or custom events.

5. Ad account history. There’s a strong argument that can be made that I should use Advantage+ Audience and there’s no reason to provide audience suggestions. But, that doesn’t mean that’s the case for everyone. It’s possible this is viable for me because of an extensive ad account history with pixel and conversion data to pull from. New accounts, new pixels, or websites that get minimal activity may not have the same advantage. They may see much different results here.

6. Campaign construction. I went back and forth on how to run this. I didn’t run this as an A/B test because I wanted to evaluate natural distribution, rather than forcing delivery without overlap. I also chose to run these ad sets at separate times, rather than concurrently. Even though they ran separately, overlapping delivery was likely (some people may have seen the same ad from multiple ad sets). These decisions likely impacted my results.
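
To make the interaction between budget, audience size, and time concrete, here’s a rough back-of-the-envelope sketch. Every number in it is an assumption for illustration (pool size, CPM, and a tolerable frequency), not a figure from my test:

```python
# Rough sketch of how quickly a remarketing pool saturates at a given budget.
# All numbers are illustrative assumptions, not figures from this test.
pool_size = 100_000      # reachable people in the audience segments
cpm = 30.0               # assumed CPM for reaching those segments (USD)
max_frequency = 2.0      # average impressions per person before fatigue sets in

max_useful_impressions = pool_size * max_frequency
max_useful_spend = max_useful_impressions / 1000 * cpm
print(f"Spend before the pool saturates: ~${max_useful_spend:,.0f}")  # ~$6,000
```

Past that point, the algorithm has little choice but to push a growing share of spend toward new audiences, which is why I’d expect these percentages to drop with bigger budgets and longer runs.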

Overall, this has been a fun test, but it’s also incomplete. These are numbers I will continue to monitor with my ads going forward to see how it plays out in the future.

Your Turn

Have you run a similar test of manual sales campaigns to see how ads are distributed for you? What did you learn?

Let me know in the comments below!
