Whatagraph is a leading marketing tool that helps agencies, in-house marketing teams, freelancers, and businesses save time and create aesthetically pleasing, easy-to-digest reports.
This promoted content is produced by a publishing partner of Open Mic, a paid-for membership product for partners of The Drum to self-publish their news, opinions and insights on thedrum.com.
What Is Google Analytics Bot Filtering And How To Use It
July 20, 2021
Bots in your website's traffic: the good, the bad and how to identify and filter them out in your reports. Learn more about Google Analytics bot filtering features.
Imagine that you log in to your Google Analytics account one day and see a spike in traffic to your website. After looking closer, however, you realize that it’s mostly bots, bringing no conversions and only making you worry about the website’s safety. Luckily, Google Analytics bot filtering features are here to sort things out.
Undoubtedly, GA (Google Analytics) is one of marketers’ favorite tools for analyzing and understanding the traffic coming to a business website. And since online marketing efforts are only as good as the data behind them, every marketer should know that it’s essential to keep Google Analytics reports as clean and accurate as possible. The quickest way to filter the noise out of your data and draw valuable insights for your business is to use Google Analytics bot filtering tools.
Simply put, bot traffic is any online traffic that does not come from a real person. A study conducted by Imperva found that in 2020, bots accounted for 37.2% of all online traffic: 13.1% good bots and 24.1% bad bots.
Good bots are responsible for automating tasks: crawling and indexing the websites on search engines or powering smart assistants like Siri or Alexa with the website’s information. The most common good bots, according to Imperva, are:
Monitoring bots (1.2%) - website health checkers;
Commercial crawlers (2.9%) - metric crawlers (AHREFs, Majestic);
Search engine bots (6.6%) like Google bot or Yahoo bot;
Feed fetchers (12.2%) that convert sites to mobile content.
All of these bots, spiders and crawlers play important roles in keeping websites running smoothly. If a marketer decided to block one of them from the site, it could negatively affect traffic.
Bad bots are the ones responsible for things like website scraping and spamming. Some people launch bots to crawl and scrape data off websites so they can upload the content to their own site, or to scavenge for vulnerabilities to exploit. According to the survey, the most common malicious bots were:
Spam bots (0.3%) that automatically post comments and messages on websites;
Web scrapers (1.7%) used to scrape prices and content pieces off sites;
Hacker bots (2.6%) that scan for vulnerabilities on sites;
Impersonator bots (24.3%) that mimic real human traffic and are often used in DDoS attacks.
Bot traffic can distort analytics metrics such as page views, bounce rate, session duration, user location and conversions. These aberrations can create a lot of frustration for site owners and data analysts. It is tough to measure the performance of a website that is being swamped with bot activity, and attempts to improve the site, such as A/B testing and conversion rate optimization, become ineffective because of the noise bots create.
Filtered data will provide a more accurate picture of how users are interacting with the site. Getting objective metrics will allow you to understand which pages, acquisition sources, and campaigns provide the best ROI for the business.
Although Google Analytics provides a data monitoring dashboard, it may be hard for both marketers and clients to understand. A data analytics tool allows marketers to save time and create visual reports, automate them at a preferred frequency, and share them with teammates and clients within minutes.
Bots will usually show up as Direct traffic in GA, so data analysts need to look for patterns in other dimensions to filter them out. Bot traffic mostly arrives with a telltale referrer domain, for example trafficmonster.info, morerefferal.net, et cetera. Here are a few other signs that traffic bots have visited you:
A spike in traffic on a specific day or hour. If the website typically gets 1,000 visitors daily and, without any particular marketing or sales campaign, traffic spikes to 10,000, it’s most likely bot-generated. Andy Crestodina from Orbit Media Studios says, “When you see a spike that can’t be otherwise explained, look for evidence that it’s a bot”.
Another red flag when identifying bot traffic is an unusual spike in visits to a particular page, usually the homepage or about page. When the bots flood the site, they visit the most visited page to scrape all the information users might look for - headlines, pricing tables, et cetera.
High bounce rates and low time on site are the metrics to look at when investigating unusual traffic. Bots are not real people; they come to the website with a clear purpose, automatically scan it and leave immediately. Bot traffic bounce rate is usually near 100% and time on site is close to 00:00.
The traffic comes from a country the business doesn’t operate in. Because competitors, scrapers or hackers can run bots from anywhere, bot traffic often appears to originate in a foreign country where the company doesn’t even do business.
Junk conversions, such as new accounts created with gibberish credentials, or contact forms filled with random names and fake phone numbers to access gated information on the site, if login or contact details are mandatory.
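The signals above can be sketched as a simple check over exported traffic rows. This is a minimal illustration, not a GA feature: the column names (source, bounce_rate, avg_session_duration) are assumptions about what a traffic export might contain, and the 95%/1-second thresholds are illustrative; the spammy referrer domains are the ones mentioned earlier.

```python
# Hypothetical bot check over exported traffic rows; column names and
# thresholds are illustrative assumptions, not a fixed GA schema.
SPAM_REFERRERS = {"trafficmonster.info", "morerefferal.net"}  # from the article

def looks_like_bot(row):
    """Return True if a traffic row shows the bot signals described above."""
    # Telltale spammy referrer domain
    if row["source"] in SPAM_REFERRERS:
        return True
    # Near-100% bounce rate combined with almost zero time on site
    if row["bounce_rate"] >= 0.95 and row["avg_session_duration"] <= 1:
        return True
    return False

rows = [
    {"source": "google", "bounce_rate": 0.42, "avg_session_duration": 120},
    {"source": "trafficmonster.info", "bounce_rate": 0.99, "avg_session_duration": 0},
    {"source": "(direct)", "bounce_rate": 0.98, "avg_session_duration": 0},
]
suspicious = [r for r in rows if looks_like_bot(r)]
```

In this sample, the spammy referrer and the zero-duration direct visit are flagged, while the organic visit passes.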
GA does provide an option to “exclude all hits from known bots and spiders”. If the source of the bot traffic can be identified, data analysts can also provide a specific list of IP addresses to be ignored by Google Analytics. To enable the native filter on reports, go to the “Admin” tab, select the view you want to apply the filter to and click “View Settings”. Then mark the checkbox entitled “Exclude all hits from known bots and spiders”.
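For the IP-based route, GA custom exclude filters accept regular expressions, so a list of known bot addresses is typically joined into one anchored pattern with escaped dots. The sketch below just demonstrates what such a pattern matches; the addresses are placeholders from a documentation-reserved range, not real bot IPs.

```python
import re

# Placeholder bot IPs (203.0.113.0/24 and 198.51.100.0/24 are reserved
# for documentation); a real list would come from your server logs.
BOT_IPS = ["203.0.113.7", "203.0.113.8", "198.51.100.23"]

# Escape the dots so they match literally, join with | (OR),
# and anchor the pattern so partial addresses don't match.
pattern = re.compile("^(" + "|".join(re.escape(ip) for ip in BOT_IPS) + ")$")
```

The anchors matter: without `^` and `$`, the entry `203.0.113.7` would also match `203.0.113.77`.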
While these measures will stop some bots from flooding analytics, they won’t stop all of them. Most malicious bots pursue an objective beyond disrupting traffic data, and these measures do nothing to reduce harmful bot activity itself beyond preserving the analytics. Here are a couple more tips on how to filter out bot traffic in Google Analytics:
Create a new view. Having multiple views in GA lets a data analyst measure a filter’s impact and provides a fallback in case a filter accidentally damages the data. Once a filter is applied in Google Analytics, it only affects data collected from that point forward, meaning the historical data will remain unfiltered even with filters set up on the business account. To create a new view in Analytics, head to the “Admin” tab, click “Create View”, then set up a “Raw Data View”, a “Testing View” and a “Reporting View”.
Data segmentation will help filter out bot traffic once the analyst has identified a spammy domain. Unlike filters, segments affect historical data as well as data going forward. To create a new segment, press “Add Segment”, then “New Segment”, and enter the domains you want to exclude. To exclude multiple domains, add a pipe/vertical bar between each domain.
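The pipe-separated exclusion value can be sketched as a small regex, the same shape you would paste into a segment’s matching-regex condition. The domains are the spammy referrers mentioned earlier; escaping the dots so they match literally is an extra precaution, not something the GA UI requires you to think about.

```python
import re

# Build a pipe-separated exclusion pattern from the spammy referrer
# domains named earlier in the article.
spam_domains = ["trafficmonster.info", "morerefferal.net"]
exclude_pattern = "|".join(re.escape(d) for d in spam_domains)
# exclude_pattern is now: trafficmonster\.info|morerefferal\.net

# Demonstrate the effect: drop any source matching the pattern.
is_spam = re.compile(exclude_pattern)
sources = ["google", "trafficmonster.info", "newsletter", "morerefferal.net"]
clean = [s for s in sources if not is_spam.search(s)]
```

After filtering, only the legitimate sources remain, which is exactly what the segment does to your reports.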
Having real and accurate data is essential for Google Analytics - and for all marketing analytics, for that matter. It’s also vital to filter the data properly, so it contains no artificial, useless information. Marketers must understand that if GA reports contain junk data, they will likely draw the wrong conclusions when deciding on the next steps for the website or business.
After learning what bot traffic is, identifying it and filtering it out, the last step is to get a clear view of the data Google Analytics has collected. Marketers are advised to use a reporting tool to gather the data and present it in a visually appealing and easy-to-digest report.