One of the readers of this blog was discussing his Facebook ad strategy and mentioned that he had to increase the sale price of his product because of expensive shipping.
This quickly reminded me of my personal experience with variable pricing and Facebook ads, and I thought I'd write a bit about that.
When you start advertising a product on Facebook at a specific price point, e.g. $19, and have a ton of qualified events stored in your pixel and ad account, a change of pricing can sometimes be disastrous. When you record hundreds of add-to-carts, checkouts, and purchases at a $19 price point, you've trained the Facebook ad algorithm to bring you buyers who are comfortable spending money in that range. Facebook looks into the historical purchase patterns of buyers and their average cart value in order to serve your ads to the right audience.
Increasing the price mid-campaign, with a lot of recorded data, has a compounding negative impact: you don't just get a lower conversion rate because of the price hike, your ads also stop being served to the right audience, reducing your conversion rate further. For this reason, I personally like to start my ads at the final sale price and not something lower.
It was Father's Day yesterday. I sat down with my father and we spoke for a few hours on various subjects. We briefly discussed a topic that reminded me of something I learnt way back in college. It's a concept from physics. I was actually quite awful at physics, so I apologize in advance if my interpretation is incorrect. But here goes anyway.
In physics, “work” happens when a force is applied to an object such that it moves from A to B. If you exert the force, but the object doesn’t move, the work done is zero.
In regular life, we don't think of work the same way. But I think we should. If our actions aren't bringing about a change or a result, I'd like to think the work done is zero. It doesn't matter how hard you try or how many hours you put in; if you couldn't move the needle, the work comes down to zero.
I feel that in normal life, work should be measured as Force x Displacement, and we should move away from our current definition of work as force alone. We should also make our best effort to get maximum displacement for the force we put in. If employee A works longer hours than employee B but delivers the same value in the end, it may seem that A worked more, but in my opinion they did the same amount of work, because they moved the object just as far. Here's an interesting case study.
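To make the analogy concrete, here's a trivial sketch of the physics definition in Python (illustrative numbers only):

```python
def work(force, displacement):
    # Physics definition, simplified to one dimension with force and
    # motion in the same direction: work done = force x displacement.
    return force * displacement

# Pushing hard against a wall that never moves: zero work done.
print(work(1000, 0))  # -> 0

# A modest force that actually moves the object beats a huge force
# that moves nothing at all.
print(work(5, 2))     # -> 10
```

However long you push, if the displacement term is zero, the product is zero.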
Savannah Sanchez, a Facebook marketer, made an interesting video that I don't completely agree with, but I find it interesting enough to share here today. Her thesis is that Facebook's AI has gotten so advanced that a human marketer working over 10 hours a week on an ad account delivered just about the same ROAS as another ad account that wasn't touched at all during the same period and was optimized only by Facebook's algorithm. If you want to watch the video at the exact time, you can do so here. Or you could watch the full video below.
The reason I shared this here is that when a robot can deliver the same value, your working hours, i.e. force, are worthless. It's the displacement that counts, and if a robot can do that better than you, you'll be replaced. I don't think that time has come yet, but I know it isn't far.
I've used Facebook as a marketing tool for about 10 years now, and the idea that there's no more human tweaking possible on Facebook is an alien concept to me. 10 years ago, you could do a thousand tweaks and the system would play along. Now it's increasingly harder, but I feel expert marketers, i.e. a small percentage of all marketers, can still do better than the system. In a few more years, though, I wouldn't be surprised if that number shrinks to a mere fraction.
When we ran our network of content sites that leveraged Facebook's influencer marketing, I realized that we were working against the force. Each day, Facebook would do something to cap, limit, or control our business or business model in some way. I knew they would take over the business model: so much outbound traffic making tons of ad revenue, all outside the Facebook platform.
They let us run for a while, a long while, because their users engaged very well with the content. But eventually they launched Instant Articles to get a piece of this pie: a native Facebook article-reading experience where Facebook serves the ads and also takes a cut.
They did the same with videos. In the early days of Facebook, YouTube videos appeared as embedded content on the platform, not as links. As Facebook started to prioritize its own videos, it began treating YouTube videos as regular links. Eventually it launched in-stream ads, a video monetization program just like YouTube's.
Now they are doing this for e-commerce. Every day, hundreds of new D2C brands are launched, and their primary source of customer acquisition has been Facebook. It is estimated that most indie e-commerce brands acquire over 50% of their customers through Facebook advertising. While Facebook already takes a massive chunk of the revenue generated (often 40-50%), having more control by hosting a native e-commerce experience on Pages could mean more revenue for the company, a better user experience, and higher conversion rates.
Just like Instant Articles, each product will now need to be approved by Facebook when you import your catalog.
How do I perceive this news? Troubling. I know this will improve the user experience, but Facebook already exercises more control than I appreciate, and the counter-party risk continues to increase.
While you used to have full autonomy over your WooCommerce and Shopify stores, now you'll be at the mercy of Facebook. In the worst-case scenario, which by the way is often the normal-case scenario for me, I'll no longer just lose ad accounts; I could also lose my “shop”, because of course my “shop” has to adhere to Facebook's TOS. If the AI, which isn't very fond of me, constantly throws ban hammers for allegedly violating advertising policies, why wouldn't it do the same for shops?
Facebook's robots are great at finding customers. You can create a CBO with 10 ad-sets targeting 10 different things, with 5 creatives for each ad-set, and the robots will start optimizing on the first day to get you purchases before you've even spent $50. That's great, isn't it?
When Facebook's AI gets mad at you, though, it's quite like Skynet from The Terminator. It tries to erase you, and it goes to great lengths to do so. Let me explain.
What happens when you use a credit card on Facebook that was issued in your name, but by a different country than the one you're residing in? You're accused of credit card theft, of course. You go to great lengths to justify it, but it doesn't work. Then, after a few days of effort, you make it work. It doesn't matter that you made it work, though, because that's strike #1, and the AI is going to be tougher on you.
What happens next? One of your ads is rejected for “circumventing Facebook's advertising policies”. You appeal it, and the ad gets approved. It doesn't matter though, because that's strike #2.
And just like that, before you know it, you lose your Business Manager. Because you as a person were the cause of trouble, you also lose your personal ad account. In addition, you lose the ability to advertise on Facebook or to add or remove anyone or anything in any Business Manager.
If this sounds bad, let me tell you what else happens.
Any pages you have previously advertised on Facebook get penalized too, including the one for this blog that you're reading. It doesn't matter if the advertising was done months ago; you still get the hammer. The page gets zero-reach syndrome.
I have been following some fellow marketers running this strategy for a while, and I tested it out myself a few weeks ago too. I've seen decent results with it. Unfortunately, I didn't have a lot of data to work with, so I couldn't test it at full scale, but I believe in the concept for sure.
The strategy is called lookalike stacks, and it's dead simple to implement. You create 1% lookalikes of various events, like view content, add to cart, initiate checkout, and purchase, and stack them together in 1 ad-set. You can do the same for 2%, 3%, and so on.
There are two reasons why I think this strategy works better than most other lookalike configurations.
Firstly, Facebook prefers broader audiences over narrow ones. When you let Facebook work with broader audiences, it has more room to play with, finds buyers, and generates cheaper sales in turn. When you stack 1% of everything together, the audience size is much larger than any individual 1% lookalike; it is often 2-3 times larger. Sure, you could use 3% of purchase or 3% of view content individually to achieve the same audience size. However, 3% is never going to be as good as 1%.
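The audience-size arithmetic is easy to see with set unions. Here's a sketch with made-up, scaled-down user IDs and overlaps (not real Facebook data), showing why the stack reaches far more people than any single 1% lookalike even when the lookalikes overlap heavily:

```python
# Hypothetical audiences represented as sets of user IDs, purely to
# illustrate stacking. Sizes and overlaps are invented and scaled down.
view_content  = set(range(0, 20_000))        # 1% LLA of ViewContent
add_to_cart   = set(range(15_000, 32_000))   # 1% LLA of AddToCart
init_checkout = set(range(28_000, 45_000))   # 1% LLA of InitiateCheckout
purchase      = set(range(40_000, 58_000))   # 1% LLA of Purchase

# Stacking in 1 ad-set is effectively a set union: people in several
# lookalikes are not double-counted, but combined reach is much larger.
stack = view_content | add_to_cart | init_checkout | purchase

print(len(view_content))  # 20000 in the largest single lookalike
print(len(stack))         # 58000 in the stacked ad-set, roughly 3x
```

The union stays at the 1% quality level while approaching the reach you'd otherwise only get from a much less precise 3% lookalike.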
Secondly, the best lookalikes are created when you not only use highly qualified events but also have a large amount of data in them. For example, if you have 1000 ATCs and 100 purchases, your ATC lookalike is likely to be better than your purchase lookalike, because Facebook had 1000 people to create a lookalike from. Although purchase is a more qualified event than ATC, it will only create a more qualified lookalike once there is a large number of purchases in the dataset. When you create a stack, you're able to leverage the best of VC, ATC, IC, and Purchase together in 1 ad-set. This kind of ad-set is the best of both worlds, as it has both lookalikes of the most qualified events (IC, PUR, etc.) and lookalikes of the events with the most data (VC, ATC, etc.).
I hope I was able to explain myself clearly. If I didn't, please feel free to ask me questions.
I conducted an experiment a few years ago while trying to gain more insight into how Facebook's algorithm worked, and today I thought I'd share the results.
My timing couldn't be worse, though, as Facebook has gone after and sued a cloaking company that allowed advertisers to hide their real landing pages from Facebook's review systems and serve a “safe page” instead. So while I publish the results of this experiment, please know that I conducted it for research purposes, have not used the outcome for monetary benefit, and am publishing the findings here for educational purposes only. I also encourage all of you to always use these platforms as defined by their terms of service. Now let's dive in.
Since I knew that Facebook was capping the reach of certain domains and giving extraordinary mileage to others, I set out to understand deeply how these systems worked. We eventually found a way to leverage this while staying well within the TOS by using sub-domains. However, we didn't just come up with sub-domains on day 1. It took many other tests, one of which I'm going to publish here.
To conduct this test, I used a programmable link-shortening service called Tr.im. I picked an article from a domain that Facebook had given preferential treatment to and shortened its URL using tr.im. I then created a rule in the tr.im shortening system to keep forwarding all desktop traffic to the original link, but send mobile traffic to a dummy website hosting an exact replica of the same article.
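The routing rule itself is simple device detection. Here's a minimal sketch of the logic (this is not tr.im's actual API; the URLs, marker list, and function name are all hypothetical, for illustration only):

```python
# Hypothetical redirect targets standing in for the real article and
# the dummy replica used in the experiment.
REAL_ARTICLE  = "https://whitelisted-domain.example/article"
DUMMY_REPLICA = "https://dummy-site.example/article-replica"

# Crude substrings commonly found in mobile User-Agent strings.
MOBILE_MARKERS = ("Mobile", "Android", "iPhone", "iPad")

def resolve_short_link(user_agent: str) -> str:
    """Pick a redirect target for a visitor based on their device."""
    if any(marker in user_agent for marker in MOBILE_MARKERS):
        return DUMMY_REPLICA   # mobile traffic goes to the replica
    return REAL_ARTICLE        # desktop traffic continues as normal

# Desktop browser -> original article on the whitelisted domain.
print(resolve_short_link("Mozilla/5.0 (Windows NT 10.0; Win64; x64)"))
# Mobile browser -> dummy replica.
print(resolve_short_link("Mozilla/5.0 (iPhone; CPU iPhone OS 14_0)"))
```

Because Facebook's newsfeed only sees the whitelisted domain behind the short link, the shortened post inherits that domain's reach while the rule silently diverts the mobile majority elsewhere.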
I then picked 2 Facebook pages with similar demographics, size, and performance. I shared the dummy link directly on Page 1 and the tr.im shortened link on Page 2. Within the next couple of minutes, we saw 987% more traffic coming in from the shortened link, as it leveraged the organic reach of the whitelisted domain but sent all mobile traffic our way.
This experiment laid the foundation of our understanding of how the Facebook newsfeed treats different domains, and we eventually scaled up and moved forward using the sub-domain method.
If you're familiar with Facebook advertising, you may have seen that some people always run multiple copies of the same ad in an ad-set. Those unfamiliar with this strategy always wonder: why would someone create 2 identical copies of the same ad and place them in one ad-set? Here's the reason why.
When you target a large audience (for example, 1 million to 100 million people), which Facebook also encourages you to do, not every person in your audience (interest/behavior) is going to be identical.
When you place two identical ads in an ad-set, you're hoping that your first copy will be seen by one small pocket of your large audience and your second copy by a different small pocket. Based on the performance of the audience in those pockets, Facebook will continue to find similar audiences using its machine learning capabilities.
One of the pockets is bound to be superior to the other, and by having multiple copies you're giving the machine learning a better chance of spending your budget in a more optimal manner.
I found this difficult to convey in text, but I hope I was able to do so. If you have any questions, please feel free to ask in the comments.
I have profitably spent hundreds of thousands of dollars on Facebook ads, and I have been doing this heavily since 2016. I attribute most of my basic learning to the free resources Travis put up on YouTube.
Travis has been playing this game for over a decade, so he's pretty good at what he does. Much of his content may be dated, but it's still very useful. This is still my favorite resource for getting the basics right.
For intermediate strategies, I'd recommend that you check out Verum. To understand what he's saying, you need to be well aware of the basics. If you are, you'll love his content and find it very easy to digest. Otherwise, you probably won't understand much of it.
The most advanced players, however, are the AdLeakers. I don't think there's any value here for anyone who isn't already spending a lot of ad budget profitably and looking to further up their game with cost-reduction strategies to achieve a lower cost per acquisition.
I don't suggest investing in any course if you're just starting. Investing in the actual ad budget might be a much better idea. But before you even do that, I strongly recommend that you consume Travis's KingPinning tutorials.
Facebook launched campaign-level budgets in mid-2019. Initially I was skeptical, but I've come to like CBOs a lot. By using a campaign-level budget, I can now test 5-10 adsets with the same budget I previously needed to test 1 adset.
Facebook simply spends more of the budget on the adsets within a CBO that are more worthy of it, and less on the adsets that are more likely to burn cash.
There is always a risk of missing out on a potentially winning adset, but the reward outweighs the risk. In addition, you can still define a minimum spend per adset within a CBO to ensure that each adset gets a bare minimum of spotlight, although I generally advise against that.
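Facebook's real allocator is opaque, but the idea can be sketched as performance-weighted splitting with an optional per-adset floor. This toy function (my own illustration, with invented adset names and ROAS numbers, not anything Facebook exposes) shows how a fixed daily budget might flow toward the stronger adsets:

```python
def allocate_budget(daily_budget, roas_by_adset, min_spend=0.0):
    """Split a campaign budget across adsets in proportion to ROAS,
    after reserving an optional guaranteed minimum for each adset."""
    # Reserve the guaranteed floor for every adset first.
    remaining = daily_budget - min_spend * len(roas_by_adset)
    total_roas = sum(roas_by_adset.values())
    return {
        adset: round(min_spend + remaining * roas / total_roas, 2)
        for adset, roas in roas_by_adset.items()
    }

budgets = allocate_budget(
    daily_budget=100.0,
    roas_by_adset={"broad": 3.0, "lookalike": 1.5, "interest": 0.5},
    min_spend=5.0,
)
print(budgets)  # {'broad': 56.0, 'lookalike': 30.5, 'interest': 13.5}
```

The weak "interest" adset still gets its $5 floor plus a small share, which is exactly the trade-off described above: the floor guarantees some spotlight, but it also keeps money in an adset that would otherwise be starved.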
My only problem with CBOs thus far is the organizational structure. Prior to CBOs, I only had to create 1 campaign per product. My campaign could then have hundreds of adsets.
Now I have to create multiple campaigns per product, with each campaign grouping similar adsets together. Because of this, I end up with tens of campaigns per product. The downside is that I can't group the data for 1 product together without using filters, which is just an added inconvenience.
If Facebook introduced something above the campaign level just for the sake of categorization, I'd really like that.
Facebook has very advanced machine learning capabilities. More often than not, you're better off reaching your customers at a cheaper cost with a broader audience instead of a narrowly targeted one. But how is that possible? In theory, a targeted audience should work better. Yet with strong ML, the broader audience delivers better and cheaper results, provided the initial customer dataset was correct.
But what happens if you get the initial data wrong? It sends their ML chasing your customers in the wrong direction. Let me explain.
When building a Facebook page, growth is going to depend a lot on your first hundreds or thousands of likes. Getting your first subscribers or customers wrong can put you in the wrong direction altogether. I can think of 2 reasons for that. Firstly, your future page subscribers are likely to come from the networks of your existing subscribers, through sharing and other engagement. Secondly, how that first set of subscribers engages with your content will define how engaging your page looks, and eventually determine its placement in the newsfeed and other Facebook algorithms.
So getting the initial dataset of subscribers/customers right is extremely important. It is why I'm generally far more careful at the start when building a Facebook page or an e-commerce store through Facebook ads, but later take the liberty to test all kinds of traffic. It keeps my seed data clean: the data that is going to be used to build the entire user base later on.
If you have a question, please feel free to ask in comments.