Google recently announced the end of adsenseformobileapps.com, a site that allowed advertisers to exclude all mobile apps from Display campaigns at once. Excluding apps was one of the good practices I adopted after reading this article in 2014 …
Adsenseformobileapps.com has, however, been replaced by an option in the campaign settings, so it is not a drama and Google is not taking control away from us (this time).
However, this change made me want to answer a question: is it still good practice to exclude mobile applications from Display campaigns in 2018? The answer is no! But there is more!
I’ve studied more than 50 million GDN impressions in the last 90 days through dozens of accounts, and this is what I found.
#GoogleAds: 50M impressions studied. Let’s redefine Display best practices in 2018
Mobile, the leading contributor to volume (and fraud?)
Brace yourself for the scoop: the number one traffic driver is mobile.
On the sample studied, mobile generates:
- 6.6 times more clicks than desktop;
- 4.15 times more sessions than desktop;
- 3.54 times more users than desktop.
What is striking is the strong disparity on mobile between the number of clicks and the corresponding number of sessions and users.
In other words, on mobile, almost 2 clicks out of 3 do not generate a visit.
The phenomenon is far less marked on desktop.
This obviously has a strong impact on the cost per user.
Indeed, while the mobile CPC is 200% cheaper than the desktop CPC, the cost per acquired user is only 56% cheaper.
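To make the mechanism concrete, here is a minimal sketch with invented figures (not the study's actual data): the CPC advantage shrinks once you pay for clicks that never turn into users.

```python
# Minimal sketch, with invented figures (not the study's numbers), of how
# wasted clicks inflate the cost per user:
# cost per user = CPC x clicks needed to get one user.
mobile_cpc, desktop_cpc = 0.10, 0.30     # assume mobile CPC is a third of desktop
mobile_clicks_per_user = 2.9             # ~2 of 3 mobile clicks never become a visit
desktop_clicks_per_user = 1.5

mobile_cpu = mobile_cpc * mobile_clicks_per_user
desktop_cpu = desktop_cpc * desktop_clicks_per_user
print(f"mobile cost per user:  {mobile_cpu:.2f}")
print(f"desktop cost per user: {desktop_cpu:.2f}")
```

Even though the mobile CPC is a third of the desktop CPC in this toy example, the mobile cost per user ends up only about a third cheaper, because almost three clicks are needed per mobile user.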
What is the reason for this huge difference between the number of clicks and the number of users?
This is a difficult question. In my opinion, there are 3 possible causes: fraud, loss, or attribution.
We know, for example, that in many cases, when a user comes to a website from a mobile application, the referrer information and/or the gclid do not pass through, so the traffic is (wrongly) attributed as direct traffic by Google Analytics.
So let’s start by validating that the phenomenon is more marked for mobile applications than for other placements.
If we compare the click-to-user ratio for the different placement types, the difference is obvious:
- Mobile: YouTube app: 1 user per 4.85 clicks;
- Mobile: apps (excluding Gmail and YouTube): 1 user per 4.5 clicks;
- Mobile: web: 1 user per 2.7 clicks;
- Desktop: YouTube: 1 user per 1.78 clicks;
- Desktop: web (excluding Gmail and YouTube): 1 user per 1.54 clicks.
(Gmail is deliberately excluded here. A full section is dedicated to it below, so don’t miss it.)
What do these ratios tell us?
The previous hypothesis seems to be confirmed: on mobile, and even more so in apps, the gap between clicks, sessions and users is much wider.
I then tried to correlate direct traffic with Display ad clicks on mobile, to see whether it was an attribution problem.
For this, I segmented the data from all the accounts used in the study.
I created a segment comparing users who came through Display campaigns on mobile vs. direct mobile traffic.
The goal was to find similar patterns. Here are 2 representative examples, for fun (where’s Waldo?):
What do we see?
No real correlation. And it’s the same across all the accounts studied. For me, that rules out attribution.
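The check described above can be sketched as a simple Pearson correlation between daily Display clicks and daily direct sessions (the two series below are invented, not the study's data):

```python
# Sketch of the correlation check described above: daily mobile Display
# clicks vs. daily mobile direct sessions. The two series are invented
# to mimic the "no correlation" pattern seen in the study.
def pearson(xs, ys):
    """Pearson correlation coefficient, pure Python."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

display_clicks  = [120, 340, 90, 410, 230, 150, 380]   # hypothetical daily clicks
direct_sessions = [505, 501, 495, 500, 500, 500, 499]  # hypothetical daily direct sessions

r = pearson(display_clicks, direct_sessions)
print(f"r = {r:.2f}")  # close to 0 => no meaningful correlation
```

If clicks lost to misattribution were resurfacing as direct traffic, spikes in Display clicks would drag direct sessions up with them; a coefficient near zero argues against that.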
So that leaves fraud and wastage.
According to Google, site speed can have a huge impact. And on mobile devices not connected to wifi, loading times are longer, as we know.
There are also certain formats, such as Gmail and YouTube, which are known to show large differences between clicks and sessions.
I’d like to believe that part of it is explainable. But I find it hard to believe it accounts for such a big difference.
Everyone will draw their own interpretation here.
For my part, I see both loss AND fraudulent clicks. But it’s difficult to know the exact proportion.
In the end, despite all that, if we look at the cost per user by placement type and by device, mobile remains competitive.
And above all, mobile applications perform better than the mobile web.
Applications, the leading source of conversions on mobile
Ok, users are fine, but what about conversions?
Well, same conclusion as above: more volume on mobile, with a lower cost per conversion.
And the majority of mobile conversions come from apps (44%).
They are also the cheapest in terms of cost per conversion:
Ok, I can hear you already. Yes, more conversions. Yes, they are cheaper. But what about their quality? Their value?
Did I go to that effort too?
I systematically assign a value to my Analytics conversions/goals.
So I was able to compare the value generated by each placement.
To avoid being biased by volume, I measured conversion value per impression. And here are the results:
The conversion value per impression of apps is higher than that of all other placement types.
Finally, as counterintuitive as it sounds, apps produce the highest volume of conversions at the best price, and overall the value of those conversions is higher than that of other placement types.
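The normalization used above can be sketched as follows (all figures are invented for illustration, not taken from the study):

```python
# Sketch of the study's normalization: conversion value per 1,000
# impressions, so high-volume placements don't dominate the comparison.
# All figures below are invented placeholders.
placements = {
    "mobile_apps": {"impressions": 1_000_000, "conv_value": 5200.0},
    "mobile_web":  {"impressions": 1_500_000, "conv_value": 4800.0},
    "desktop_web": {"impressions":   800_000, "conv_value": 3100.0},
}
value_per_mille = {
    name: d["conv_value"] / d["impressions"] * 1000
    for name, d in placements.items()
}
for name, vpm in value_per_mille.items():
    print(f"{name}: {vpm:.2f} per 1,000 impressions")
```

Dividing by impressions rather than by clicks or conversions puts every placement on the same footing, whatever its traffic volume.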
So why have I been systematically excluding them for so long!
Special case of Gmail
The Gmail placement (both mobile and desktop) generates many clicks.
But almost no users arrive on your site 😕
At first, I thought of an attribution problem.
I thought that clicking on the call-to-action button of a sponsored email must be counted as direct traffic.
So I ran the correlations again: no result. (I’ll spare you the screenshots this time.)
I then decided to compare the performance of dedicated Gmail campaigns vs. conventional Display campaigns (which, as a reminder, also target the Gmail placement by default).
Let me explain: my study covers classic Display campaigns only. That is why, even though some of the accounts have dedicated Gmail campaigns, I did not include them in my sample.
But now I include them to shed new light. And when comparing them, surprise:
Sessions and users are therefore well recorded and correctly assigned by Google Analytics. This is not an attribution problem.
It’s just that when Gmail is targeted within a classic Display campaign, the results are very poor.
The new best practice is no longer to exclude mobile applications, but rather the Gmail placement, handling it separately in a dedicated campaign.
What about the performance of Display Gmail campaigns compared to other Display placements?
Let’s compare the conversion value per impression:
In my sample, the dedicated Gmail campaigns are the most valuable Display campaigns.
Best practices for your Display campaigns in 2018
Ok, let’s take 5 minutes to summarize the lessons of this study:
Leaving the Gmail placement in your classic Display campaigns (where it is targeted by default) is a mistake. The performance is poor.
#GoogleAds: exclude Gmail from your classic #Display campaigns. The results are very bad
Dedicated Gmail campaigns are the best-performing campaigns of all. => You absolutely must test them in all your accounts.
Mobile apps are one of the best placements. => Give them a chance in your existing campaigns if you had excluded them, and stop excluding them automatically from new campaigns.
#GoogleAds: Stop excluding mobile apps; they’re one of the best placements
The Evolution of Google Display Campaigns
Now I’d like to open up a bit of discussion about the future of Display and where Google’s going.
Mobile apps are automatically included in new Smart campaigns
Now, whether it’s Smart Shopping campaigns or Smart Display campaigns, they target mobile applications without any way to exclude them.
With Google constantly simplifying and automating campaigns, it’s a safe bet that soon we won’t really have any control at all, whatever the campaign type.
Is this a problem? Yes and no. (If you’re afraid for your job, read this.)
You certainly lose control.
But in the end, if we play Google’s game and set up Smart Bidding well, we get almost the same result without having to do all the painstaking work of excluding everything that doesn’t perform.
You have to be pragmatic. The whole industry is moving in this direction. Less control, more artificial intelligence.
And that’s good, because it gives us time to do other things.
Much more important things, like studying customers or going deeper into business analysis to help our clients measure what really matters.
This will improve campaign performance more than all the painstaking optimization work you could have done inside the account.
#GoogleAds: machine to machine work. Bring your value differently
Which will increase your pay if you perform.
Towards performance-based Display (pay for conversions)
Another important development (at Google at least) is that, increasingly, you will no longer pay for clicks but for conversions.
That changes things too.
In this context, click fraud no longer really matters.
What about conversion fraud? It already exists, that’s clear.
In some countries, you get lots of form submissions, for example. But the information is false about half the time.
What can you do about it? Use the system!
Separate the good conversions from the bad by scoring them, and make full use of Target ROAS.
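As a hedged sketch of this scoring idea (the fields, thresholds and values below are invented placeholders, not the author's actual model):

```python
# Hedged sketch of conversion scoring: the quality signals and the
# score-to-value mapping are invented placeholders for illustration.
def lead_value(lead):
    """Map a lead's quality signals to the conversion value reported
    to the ad platform, so Target ROAS optimizes toward quality."""
    score = 0
    if lead.get("email_verified"):
        score += 1
    if lead.get("phone_valid"):
        score += 1
    if lead.get("company"):
        score += 1
    return {0: 0.0, 1: 5.0, 2: 20.0, 3: 50.0}[score]

# A verified lead is worth more; junk form fills are reported as worthless.
print(lead_value({"email_verified": True, "phone_valid": True}))  # 20.0
print(lead_value({}))                                             # 0.0
```

Reporting differentiated values instead of a flat conversion count is what lets a Target ROAS strategy bid away from placements that produce bogus form fills.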
For me it works, no problem. (I’ll explain everything in my next article.)
In this context, the time the algorithm takes to learn doesn’t really matter much anymore either.
It was one of the main obstacles to adopting Smart Display campaigns.
Who could afford to spend 2,000 to 3,000 before the results came in?
But Google got it right …
Display, essential?
I don’t know about you, but even just 2 years ago I really wasn’t a fan of Display:
- Fairly average targeting criteria;
- Visually unappealing ads;
- Results that weren’t really there.
Compared to then, Display has really changed a lot. For the better.
It is now not uncommon for me to cut unbranded Search campaigns because I get better results from Display.
It is almost becoming a best practice when the budget is limited and non-branded Search clicks are too competitive.
#GoogleAds: Test Display rather than Unbranded Search in Competitive Industries
For me, Display is clearly essential in Google’s advertising offering today, and I’m racking up wins every day.
Your turn now. What is your experience with Display?
Have you noticed, like me, that the quality and the results generated have really improved over the last two years?
Did this study convince you to give mobile applications a chance?