
How overlooking quality assurance can sabotage data analytics

By César Requena, Marketing sciences associate director

November 17, 2022 | 6 min read

For The Drum’s Data and Privacy Deep Dive, R/GA’s César Requena and Nicolás Fernández make the case for stronger data quality assurance.


Effective data analysis relies on a strong foundation of quality assurance / Adobe Stock

Data analysis is a fundamental part of the world we live in today, responsible for driving any number of important business decisions, so ensuring data quality is of paramount importance. Yet, too often, a crucial step gets overlooked in the process.

Knowing what to measure on websites and applications – and how to measure it – is critical for gathering information about users, predicting behavior and measuring performance, among other variables relevant to a business. Even so, businesses rarely invest enough in ensuring that data is of the highest quality.

In most analytics projects, it’s easy to overlook the importance of adequate testing – which can mean that collected data falls short of expectations, ultimately causing a number of issues: extra hours spent finding and fixing problems, constant bugs, untrackable errors, data mistrust, data leakage and non-retroactive fixes, to name a few.

What separates great analysts from adequate ones is often attention to proper quality assurance (QA) throughout the process of collecting, inputting and analyzing data. Unfortunately, most analysts aren’t familiar enough with QA best practices.

Analytics QA consists of testing and assessing whether inputs and outputs work seamlessly and generate quality data throughout the collection, ingestion and storage stages of an analytics solution, in alignment with business goals and requirements.

There are two main pillars that drive analytics QA. First is data QA, which involves collecting the right data, such as maintaining the complete e-commerce information of each transaction enabled for reporting. Next is implementation QA, which entails verifying that the implementation is working as expected, such as ensuring that the tag management tool loads correctly, tracker scripts are working, custom JavaScript is running without errors and tags are being triggered appropriately.
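As an illustration of the implementation-QA pillar, a lightweight smoke check like the sketch below can confirm that the tag management container has loaded and that tracker scripts are present before deeper manual testing begins. All names here are assumptions for illustration – it presumes a Google Tag Manager-style `dataLayer`, and should be adapted to your own stack:

```javascript
// Hypothetical implementation-QA smoke checks for a tag-managed page.
// Assumes a Google Tag Manager-style setup; adapt names to your stack.
function runImplementationChecks(win) {
  const results = [];

  // 1. The tag management container should have loaded and initialized:
  //    GTM pushes a "gtm.js" event onto the dataLayer when it starts.
  results.push({
    check: "tag container loaded",
    pass:
      Array.isArray(win.dataLayer) &&
      win.dataLayer.some((entry) => entry.event === "gtm.js"),
  });

  // 2. Tracker scripts should actually be present in the DOM.
  const scripts = Array.from(win.document.querySelectorAll("script[src]"));
  results.push({
    check: "tracker script present",
    pass: scripts.some((s) => s.src.includes("googletagmanager.com")),
  });

  return results;
}
```

A check like this can run in the browser console or inside an automated end-to-end test, so regressions in tag loading surface before they reach production data.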

To accomplish this, a solid analytics QA will consist of testing the following elements: workspaces and preparation; the implementation itself; tags, triggers and variables; the payload; data processing; and the stored data.
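Payload testing in particular lends itself to automation. The sketch below shows one way to validate that an e-commerce purchase payload carries complete transaction data before it is reported – the field names follow a GA4-style purchase event, but they are an assumption for illustration and should be replaced with your own measurement plan:

```javascript
// Hypothetical data-QA check: verify a purchase payload is complete.
// Field names are GA4-style assumptions; swap in your measurement plan.
const REQUIRED_FIELDS = ["transaction_id", "value", "currency", "items"];

function validatePurchasePayload(payload) {
  // Top-level fields that must be present for reporting.
  const missing = REQUIRED_FIELDS.filter((f) => payload[f] === undefined);

  // Each line item needs at least an id and a price.
  const itemsOk =
    Array.isArray(payload.items) &&
    payload.items.length > 0 &&
    payload.items.every((i) => i.item_id && i.price !== undefined);

  return { valid: missing.length === 0 && itemsOk, missing };
}
```

Running a validator like this against every transaction event in a test environment catches incomplete payloads early, instead of discovering gaps in the stored data weeks later – when the fix can no longer be applied retroactively.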

Of course, this is not a static process, and the testing cycle should occur at different stages – such as in testing and production environments, when adding new tags and when apps are submitted for approval to the Apple App Store or Google Play Store.

Perfecting data analytics QA practices requires never getting tired of testing. While the level of effort required to deploy QA is higher during a testing phase, data quality analysis efforts actually carry more weight during production, since that’s the final set of data that will be applied at the end of the process.

Needless to say, methods and tools can vary depending on what and how an organization is testing. But if the data cannot be trusted to inform decision-making, it doesn’t matter whether a team uses a free analytics tool or the most expensive one on the market: either will be equally useless.


Luckily, there are a few simple steps that any team can put in place to ensure their analytics QA is always effective.

For one, it’s critical to get a solid grounding in how everything works: read the official documentation for the tools at play. Basic web and app knowledge is fundamental to understanding the process.

Then, create a framework to organize your team based on workload priorities and a shared understanding of the process. Different analytics vendors may differ in capabilities, features, interface and specifications but, in the end, they all follow the same set of rules.

With a solid QA practice in place, the organization will be well-positioned to flourish in our data-driven world.

César Requena is the marketing sciences associate director and Nicolás Fernández is the lead data analyst at R/GA. For more on how the world of data-driven advertising and marketing is evolving, check out our latest Deep Dive.
