Digital advertising has a quality control issue and, as Google’s recent ‘Bad Ads’ report shows, it’s getting worse. Last year, the media giant removed 1.7 billion ads that contravened its online policies, a figure Google states is more than double the number of ads taken down in 2015.
For the industry, the report is a mixed blessing. On one hand, it’s a ‘good guy’ victory; with fewer bad actors around to diminish the user experience and siphon off revenue, there is more space for those trying to deliver relevant and engaging ads. But on the other, it also highlights the ad quality challenge.
Tackling this challenge is an ongoing task. As the industry grows and more money comes into the system, the volume of bad actors rises. To keep them at bay we must keep adapting and anticipating their tactics in addition to maintaining high standards.
So, let’s take a look at what the industry is up against, and how we can beat it.
Bad ads by numbers
The core takeaway of the report is that the scale of bad ads is escalating. Within the overall total were 112 million ‘trick to click’ ads (six times more than in 2015) – which use innocent-looking system warnings to hoodwink users into clicking on ads loaded with harmful software – as well as many illegal, misleading, and fraudulent ads.
In brief, Google disabled 69 million ads that breached healthcare regulations, over 17 million illegal gambling ads, almost 80 million misleading ads (such as miracle diet pills), 23,000 self-clicking mobile ads, seven million ads in disguise (attempting to evade detection), and 6,000 bad sites.
Although these figures provide valuable insight, possibly the most illuminating are the masked bandits and profiteering sites. In just 12 months, 1,300 accounts were identified and apprehended for ‘tabloid’ cloaking, whereby ads pose as news stories about trending topics to get clicks, and 900,000 sites with malware were exposed.
Not only do such ads damage user trust, but they also increase the risk that brands will place genuine ads on sites that harm their audiences and their reputation.
Scale is an issue
Digital advertising has become a super-sized industry powered by the automation, efficiency and scale of programmatic. But the problem with running a vast number of deals at once is that there is more room for error if the correct prevention measures are not taken. To demonstrate: if an ad exchange runs one billion impressions per day with a strong ad quality accuracy rate of 99.9%, the sheer volume of transactions means 1,000,000 bad ads would still slip through daily. Even at a precision rate of 99.99999%, 100 ads a day would be lower quality.
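The arithmetic above can be checked directly – at a billion daily impressions, even vanishingly small error rates translate into large absolute numbers of bad ads:

```python
# Illustration of the scale problem: bad ads per day at a given accuracy.
# The one-billion-impression exchange is the hypothetical from the text.
daily_impressions = 1_000_000_000

for accuracy in (0.999, 0.9999999):
    bad_ads = round(daily_impressions * (1 - accuracy))
    print(f"accuracy {accuracy:.7%} -> {bad_ads:,} bad ads per day")

# 99.9% accuracy leaves 1,000,000 bad ads daily;
# 99.99999% accuracy still leaves 100.
```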
The truth is that no matter how hard we try, no system is foolproof – but that doesn’t mean we can’t get close to eliminating the threat of bad ads. Of course, publishers, brands, agencies, technology partners and third-party verification companies all want to get closer to zero incidents. But to achieve this feat we must work together; only a combined effort can eliminate the threat of poor ad quality.
To do that, the industry should consider these recommendations:
1. Create better standards
Several industry leaders have already set up their own initiatives to stamp out bad ads, such as the Interactive Advertising Bureau’s (IAB) LEAN principles and revamped Ad Unit Portfolio. While these are positive developments, until everyone recognises the need for higher standards, there will continue to be discrepancies.
This means everyone: ad exchanges, networks, demand-side platforms, and publishers. Exchanges and networks need multiple layers of defence, including human and automated review processes to catch bad actors before they enter the system, and increased controls that allow publishers to pick the ad formats, categories, and activities they want on their sites.
Demand-side platforms should also reduce risk by deploying safety measures that identify and remove malicious ads, and publishers must be vigilant. Many are unaware of exactly which ads appear on their sites – and some poor-quality campaigns can offer temptingly high CPMs. It is vital for publishers to obtain a clear view of the ads they run and steer clear of disruptive ads that impinge on the user experience.
2. Guard against site latency
Slow loading times can drastically lower perceptions of ad quality – and boost ad blocking rates – which means minimising latency is essential. Again, slow pages often arise from a lack of clarity: publishers are left in the dark about how long ads will take to load, which makes it hard to tailor ads for different devices and connection types. From a publisher’s perspective, page design is paramount and directly under their control. While there are many elements to consider, the top factors that affect speed include using a content delivery network (CDN), loading only the necessary page elements, simplifying page layout, and combining and compressing static resources.
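As a minimal sketch of the last tactic mentioned above – combining and compressing static resources – the snippet below (hypothetical file names; the function name `bundle_and_compress` is my own) concatenates several text assets into one gzipped bundle, so a browser fetches one small file instead of several larger ones:

```python
# Sketch: combine several static text assets (e.g. CSS files) into a
# single gzip-compressed bundle to cut request count and transfer size.
import gzip
from pathlib import Path

def bundle_and_compress(sources, out_path):
    """Concatenate text files and write the result as a gzipped bundle.

    Returns (raw_bytes, compressed_bytes) so the saving can be logged.
    """
    combined = "\n".join(Path(p).read_text() for p in sources)
    with gzip.open(out_path, "wt", encoding="utf-8") as f:
        f.write(combined)
    return len(combined.encode("utf-8")), Path(out_path).stat().st_size
```

In practice a CDN or build tool (e.g. a minifier plus the web server's own gzip/brotli support) does this automatically; the point is simply that fewer, smaller requests shave load time that would otherwise be blamed on the ads.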
For elements of speed that are outside the publisher's control, it's beneficial to choose advertising partners who have healthy relationships with DSPs and advertisers, strong technical acumen, and a history of innovation. Transparency is paramount and the industry must give publishers more insight and autonomy with the ads they serve.
By putting its fight against bad ads in the spotlight, Google’s ad report provides the industry with renewed motivation to take action. It’s a problem that impacts every stakeholder in the digital advertising ecosystem, particularly the consumers around which it revolves, and is one we must work to address.
Andrew Buckman is managing director EMEA at OpenX