I think I’ve discovered the EASIEST way to earn more money from Facebook Ads.
I spent yesterday working on a case study that will be shared on Facebook’s blog. They asked me to share campaign examples that produced the best results for our company.
I thought I would end up sending a case study on one of our dynamic remarketing campaigns, but as I looked back through our numbers, I found that ALL of our best results in this past year have come from 1 campaign type...
Tune in to today's podcast to find out which campaign type it is and how it can help you earn more from Facebook Ads.
As always, if you have any questions or suggestions, please feel free to leave a comment below. And don't forget to share this episode with someone who needs to hear it.
What's up everybody? Anton Kraly here, and welcome back to the eCommerce Lifestyle Podcast. If you're new, just tuning in, know that this show comes out twice a week, with a new episode every Monday and Thursday morning. Every episode is designed to help eCommerce store owners increase their revenue, automate their operations, and become the authority in their niche. Now, this show is called eCommerce Lifestyle, and my main training company is called Drop Ship Lifestyle. The "lifestyle" part means that everything we do when building and working on our businesses is designed to get us the biggest upside possible while putting in the smallest effort. That doesn't mean we don't work hard, but it means that everything I share with you, whether on this podcast, at our live event every year (the Drop Ship Lifestyle Retreat), or in our training programs, is all designed to make sure your time and your investment of money are maximized and optimized to get you the best results.
And what I want to share with you in today's podcast should definitely help you achieve just that. Because as I was looking through our Facebook ads, going back over the past six to nine months or so, this is what has helped us the most in terms of being able to spend more while getting a cheaper cost per result. Basically, it has let us scale, and scale even more profitably than the numbers we were doing at a smaller ad spend. I'll also say before we get into this: this episode is designed for people who are already familiar with the whole Facebook Ads ecosystem and are, ideally, already running ads. If you're not, obviously feel free to still listen in and get some takeaways, but these aren't things you'll want to do from day one.
These are things that you'll want to do as your ads are already running, you're getting results, and again, you want to get better. You want to make your current ads perform better. So, basically what happened, when was it? Maybe, I don't know, a few weeks ago I was on a bi-weekly call that I do with our Facebook account rep. And she said, "I see you guys are getting great results. You've been growing. I would like to basically send you this form that you can complete and it's for you to have a case study for your business published on Facebook's official blog." So, they have an advertising blog. And I was like, "Okay, that sounds awesome. I'd love to do it." And she sent over basically what they would need from me, all the requirements, the date ranges, the links to accounts, our objectives and everything that they want to basically polish it up and publish it on their blog.
So I was looking through what they're looking for and I spent a few hours yesterday going through our ads and what I thought I was going to end up submitting and using for this case study really was our dynamic remarketing ads, because those are by far what get us our best return on ad spend. So money in versus money back, our dynamic remarketing ads do extremely well. So I thought that would make a great case study for Facebook.
But again, I pulled up our ads and dove deep. And what I realized is that the thing that has had the biggest impact, the thing that has allowed us to scale (spending more money overall, but actually spending less to get each result, so spending more even more profitably, which is almost unheard of in advertising) is using Facebook Experiments.
So, what I want to do in this episode is talk through the few experiment options there are within Facebook. Then I'll talk about the ones that we use and how we use them to get really amazing results. I would hope they're better than most if we're going to be writing about it for Facebook's blog. Basically, when you go into your Facebook Business Manager, there is a section called Facebook Experiments. Now, I don't know when they renamed it that, maybe a year ago or so. It used to be called Split Test, then Test and Learn, and then there was some other verbiage. Only within the past year, I think, did they rebrand it to Facebook Experiments. What it gives you when you click into it is the option to test things natively within Facebook, instead of having to use some expensive third-party tool or get a developer to figure it out for you.
So, when you go to Facebook Experiments in your ad account, you'll see three different test types. The first is an A/B test, the next is what's called a holdout test, and the third option is a brand survey. Now, with A/B tests, we personally use them to split test creatives. So, if I wanted to see how, let's say, a remarketing ad would do that was a carousel ad showing dynamic products to the people who just saw them on our website, maybe that would be one version I was testing. And maybe I would test that against, instead of a carousel, just one static image or one video. So, that could be a creative split test. Another A/B test could be: okay, both versions will use the product carousel for the dynamic remarketing.
But the thing that we're going to test is the copy above it. So, maybe version A is, "Seems like you forgot to complete your order. Come back now and save 10%." And maybe another version is something, I don't know, a little bit funnier, or we add more scarcity to it. Like, "Hey, we saw you didn't complete your checkout for [product name]. We have X amount left in stock. Come back now and secure yours." So, whatever it may be, whatever you want to test for your business, those are the types of things you can test within the A/B testing tool inside Facebook Experiments. And that's where we're really putting, I would say, the majority of our experiments. Now, the holdout test. This one's pretty cool too. It's more useful if you're currently advertising on multiple platforms. If you're listening to this podcast and you're part of Drop Ship Lifestyle, I would assume you're most likely advertising at least on Facebook Ads, definitely Google Ads, and most likely Bing as well.
And most likely you're getting organic traffic, so from Google and Bing on the organic side, not the ad side. What the holdout test does is allow you to choose an audience. Let's just say your best audience is, I don't know, a 1% lookalike audience of previous customers. If you run a holdout test, Facebook will continue showing your ads to 90% of the people in that audience, but it will hold out 10% of them from ever seeing your Facebook Ads. And what's cool about this is that once the experiment is completed, you'll be able to see how much your Facebook Ads contributed to converting people, what kind of lift they caused. Because sometimes all we're looking at inside Facebook Business Manager, inside just the Facebook ads view, is the return on ad spend on everything.
So, it really helps to see Facebook's overall impact on your business. And again, I found this really helpful because we use so many different ad channels. We were able to run this and see how much Facebook actually impacted our sales, even for the people who don't click the ads but just scroll past them in their feed, for that brand recognition. And we found that it's definitely worthwhile, and that we're making even more money than we thought we were from Facebook Ads. So, I would recommend the holdout test for anybody who is spending a significant amount of money or has been running ads on multiple platforms for a while.
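To make the holdout idea concrete, here's a minimal Python sketch of the lift math behind it. All the numbers are hypothetical, purely for illustration; Facebook reports lift for you inside Experiments, so you never have to compute it by hand:

```python
# Hypothetical holdout-test numbers -- purely illustrative, not real results.
# A holdout test shows ads to ~90% of the audience (test group) and
# withholds them from ~10% (holdout group), then compares conversion rates.

test_group_size = 90_000      # people eligible to see the ads
test_conversions = 1_350      # purchases from the test group

holdout_size = 10_000         # people held out from the ads
holdout_conversions = 100     # purchases from the holdout group

test_rate = test_conversions / test_group_size      # 1.5% convert with ads
baseline_rate = holdout_conversions / holdout_size  # 1.0% convert without ads

# Lift: extra conversions attributable to the ads, relative to the
# no-ads baseline that the holdout group represents.
lift = (test_rate - baseline_rate) / baseline_rate
incremental_conversions = (test_rate - baseline_rate) * test_group_size

print(f"Conversion lift from ads: {lift:.0%}")                    # 50%
print(f"Incremental conversions: {incremental_conversions:.0f}")  # 450
```

In this made-up example, 450 of the test group's 1,350 purchases would never have happened without the ads, which is exactly the kind of contribution a plain return-on-ad-spend view can't show you.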
Now, the third option you get inside Facebook Experiments is called a brand survey. This one we have never used, but what it does is basically allow you to show ... they're not even ads, but surveys, in the news feeds of people on Facebook who have seen your ads in the past. And basically it asks them a couple of questions. The first is, "Do you recall seeing an ad from Antonstore.com?" Right? So, are your ads memorable? And the person can say yes or no. If they say yes, it will ask about their experience: did they like the ad, basically? Were they annoyed by it? And it'll give you some of that feedback from the people who decide to click through and take these surveys. Again, I've never done that. It could be helpful, but I'd rather just look at the numbers and see if the ads are converting or not. Because I know that for the general feedback scores we see in Ads Manager, sometimes the feedback is definitely not great, but the ads have an insane return on ad spend.
So, I don't know. For me, with the brand survey, I don't see how I would use that data and make sense of it, because I would rather just see the return on ad spend and be able to make our ads better and better, because they obviously work. So, those are the three types. Again, most of the money that we spend goes to the A/B tests. Not overall in our campaigns, but between the experiment options, most of our money and our experiments go to A/B. Now, why do we do this? Because in our normal Facebook setup, our normal campaign structure, we have CBOs running. I talked about this a lot, maybe 18 months ago, in some podcasts, but those are campaign budget optimization campaigns. They by far outperform anything else we do.
But the thing is, when you have them running for a long time, here's what happens. Let's just say you have a campaign budget optimization campaign for remarketing, right? And let's say inside that campaign you have three different ad sets. One ad set is two-day website visitors who didn't purchase, so people in that past two-day window. One of them is four-day, and one of them is ten-day. And let's say inside each of those ad sets you have two different ads running, right? Then you have an idea a month later and you're like, "You know what? I really want to test this new creative. I want to test this new copy in the ad." And you go in there, into your two-day remarketing ad set, which is probably your most profitable, and you add a new ad.
Facebook, I've never seen it give that new ad a fair chance. Because what Facebook knows, what its algorithm knows, is that maybe ad one is doing amazing; it's getting you the results you're looking for. So, when you add ad three into that ad set, Facebook is not just going to spend a bunch of money there to see how it performs. Maybe it'll give it 100 or 200 impressions, meaning maybe 100 or 200 people see it in their Facebook feed. And unless it gets insane results right there, which it won't because that's too small of a sample size, Facebook is just not going to spend money on that ad anymore. The money will continue to be spent on ad one in that example, the winner. So, this was always a problem: even if ad one is doing great, we want to do better.
And just adding more ads into the existing ad sets is not a sufficient way to test, because they won't get any money spent on them. So, with the A/B test, what we can do is go into Facebook Experiments, create a new experiment, and choose A/B test. We can choose the targeting, the audience, as the same thing, let's say two-day website visitors who didn't complete a purchase, and then we can have version A be the control. So, version A could be the winning ad; in that example, it would be ad one. And then version B of the split test could be the new thing that we want to test: the different headline, the new idea we had, or whatever it was. And because we're setting it up as a Facebook experiment, Facebook is going to evenly split our budget between the two.
So, we don't have to worry about Facebook not giving it a fair chance. It's basically starting from scratch. It's going to split the budget, whatever that may be, 50/50. And this is what has gotten us by far the best results over these past six or nine months or so. Sometimes we don't find winners, but other times we've been able to take already great results from version A and watch version B totally outperform them. And then we're able to go back into our main CBOs, our campaign budget optimization campaigns, turn off the current winner, the control, put in version B if that's the winner, turn it on, and basically carry those great results over.
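Under the hood, deciding whether version B truly beat version A on an even 50/50 budget split comes down to comparing two conversion rates. Here's a rough Python sketch using a standard two-proportion z-test on hypothetical numbers. To be clear, this is a textbook approximation, not Facebook's actual internal methodology, which isn't public:

```python
import math

# Hypothetical A/B-test results -- purely illustrative.
a_impressions, a_conversions = 20_000, 260   # version A (the control / old winner)
b_impressions, b_conversions = 20_000, 330   # version B (the new idea)

p_a = a_conversions / a_impressions
p_b = b_conversions / b_impressions

# Pooled two-proportion z-test: could the gap between the two
# conversion rates plausibly be random noise?
p_pool = (a_conversions + b_conversions) / (a_impressions + b_impressions)
se = math.sqrt(p_pool * (1 - p_pool) * (1 / a_impressions + 1 / b_impressions))
z = (p_b - p_a) / se

# Normal CDF via erf: a confidence-style probability that B's higher
# rate reflects a real difference rather than chance.
confidence = 0.5 * (1 + math.erf(z / math.sqrt(2)))

print(f"A: {p_a:.2%}  vs  B: {p_b:.2%}  (z = {z:.2f})")
print(f"Confidence that B beats A: {confidence:.1%}")
```

With these made-up numbers the confidence comes out well above 99%, which is the kind of result that would justify swapping version B into the main campaign; a confidence near 50% would mean the test hasn't separated the two yet.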
So, it's a really good way to make even failing ad sets do better sometimes, but for us it's been a way to make already successful ad sets and campaigns perform even better. Because now we know confidently, and Facebook will actually give you a percentage for how confident it is that version B will beat version A, or version A will beat version B, whatever that may be. And we're able to confidently transfer that into our main campaigns. We're able to turn off the control, the old winner, and the money gets spent on the new winner. This has resulted in up to 80% savings on cost per result for us. So, 80% off things that were already successful.

Again, it's something I don't often think about; I just do it as a habit when we want to test new things. But while I was looking into this case study and what is working for us, that is definitely what has gotten us the biggest results. It's allowed us to find new winners, transfer them over, and go in confidently knowing they will perform better. So, I don't know when the case study will be shared on Facebook's blog. According to my rep, Q4 is obviously super busy for them too, so it might be a few months. But I wanted to record this quick podcast and get it out to you, so that if you're running Facebook ads, you can start using Facebook Experiments. Again, we've found the most opportunity with the A/B tests on creatives, but also with the holdout test to see how Facebook ads lift your overall conversion rate. That one's especially good for people who have been running ads for a while and are using multiple ad sources. So, I definitely recommend those two.

That's it, guys. Just wanted to get this quick episode out on this Monday. Hope everybody has a great week. Hope you have a very successful week, make a bunch of sales, make a bunch of money, and create a new Facebook experiment.
Also as always, if you are not subscribed to this podcast yet, go to your favorite podcast player, search for eCommerce lifestyle, click subscribe, so you can get notified every Monday and Thursday when the new episode goes live. And if you're listening right now and you don't have a store yet, you're just getting started, you want to know our whole process, A to Z, for how we build highly profitable semi-automated stores, go to dropshipwebinar.com. D-R-O-P-S-H-I-P webinar.com for a free training that shows you exactly how we do this. So, I will link to that in the podcast description as well. Click that, go to the training, and see how we start and finish the whole process. So, thank you. I appreciate you. And I will talk to you on Thursday for the next episode of the eCommerce Lifestyle Podcast.