Research by consulting firm Gartner has shown that poor data quality costs the average organization $15 million per year. IBM, meanwhile, found that businesses in the US alone lose $3.1 trillion annually due to poor data quality.
Some of these costs are more tangible than others. Mammoth Growth recently helped a 650-person company save over $750k on tool fees by fixing a few data governance issues.
These statistics highlight the waste and poor business performance caused by weak data governance and the importance of investing in a strategy.
The Costs Of Bad Data Governance
Garbage In = Garbage Out
When poor decisions are made using bad data — information that lacks accuracy, timeliness, or relevance — the cost to your business can be immense. For example, our team worked with a European firm that was capturing their marketing data incorrectly, leading them to overestimate the cost per conversion for one marketing channel by 26%. As a result, they invested in another, more expensive channel that was less efficient at acquiring users.
This issue compounds as people lose trust in the data, which ultimately has a material impact on the ability of managers to run their departments. Marketers and product leaders struggle to justify investment requests, and they end up working with suboptimal budgets. Conversely, by improving trust in data and reporting, the long-term results are extremely positive. Mammoth Growth increased the budget of a Demand Generation team at a leading PLG company by 150%.
SaaS Tool Overage Fees
Bad data governance can lead to unnecessary overage (pay-per-use) costs associated with the SaaS tools the company uses. After all, most SaaS tools charge based on the level of usage in some fashion. When data is not properly governed, there’s a risk of unnecessary data being fed into your business tools, inflating those usage-based charges.
Not only is this a privacy risk (most data protection laws have a minimization principle stating that only the data required should be shared) - it can also result in unnecessary charges for data storage within the tool. For example, one of our clients had this issue when all of their analytics data was flowing into one of their tools. Had it not been resolved, it would have led to $260k in data overage charges.
Slow Time To Value For New Analysts
Poor data governance makes onboarding new analysts time-consuming and inefficient. In situations where your employees have cut corners, new staff are often prevented from creating reliable analyses until they are aware of all of the existing data caveats.
We often see companies collect multiple analytics events for the same action, which leaves new analysts confused as to the correct source of truth.
For example, we worked with a client to reduce their number of priority analytics events by 90% without limiting their analysis capabilities. Simplifying the event schema actually unlocked considerably more analysis: the company now had to query just one event, down from the 25 they were dealing with previously.
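To make the consolidation pattern concrete, here is a minimal sketch (the event and property names are invented for illustration). Instead of tracking a separate event per button or page, you track one canonical event and move the variation into properties:

```python
# Hypothetical illustration of event consolidation. Before: many near-duplicate
# events ("Signup Button Clicked", "Pricing Button Clicked", ...), each needing
# its own query. After: one canonical event with descriptive properties.

def track_button_clicked(track, button_name, page):
    """Send one canonical event; the variation lives in the properties."""
    track("Button Clicked", {"button_name": button_name, "page": page})

# Analysts now query a single event and segment by property. Here we capture
# events in a list to stand in for a real analytics client.
events = []
track = lambda name, props: events.append((name, props))
track_button_clicked(track, "signup", "/home")
track_button_clicked(track, "pricing", "/pricing")
```

Because every click flows through one event, a single query grouped by `button_name` replaces dozens of per-event queries.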
Legal And Compliance Risks
Bad data governance can also put a company at risk of legal and compliance issues. For example, if data is not properly protected, it can be sent to third parties without the appropriate consent, which may constitute a violation of data privacy laws.
Breaches can, and do, result in punitive fines and reputational damage - especially in jurisdictions with strict privacy laws, such as the EU. GDPR, for instance, sets a maximum fine of €20 million (about $22 million) or 4% of annual global turnover, whichever is higher.
Data Governance: Best Practices From Working With Over 800 Companies
1) Define and document your centralized analytics and tracking strategy.
The first step is to implement a centralized analytics strategy that covers your preferred business use cases, your tracking plan, and the correct tooling for each use case.
This strategy should include a clear understanding of your business objectives and your plan for tracking the relevant analytics events. The semantic naming of these events should be consistent and obvious so that all stakeholders can understand what is actually being tracked.
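One way to keep event names consistent and obvious is to express the tracking plan as data and pick a single naming convention. The sketch below uses an "Object Action" convention and invented event names purely for illustration; it is one reasonable choice, not the only one:

```python
# A minimal, hypothetical tracking plan expressed as data. Event names follow
# an "Object Action" convention (noun, then past-tense verb) so that any
# stakeholder can read what is being tracked.
TRACKING_PLAN = {
    "Account Created": {"required": ["plan", "signup_source"]},
    "Order Completed": {"required": ["order_id", "revenue", "currency"]},
    "Subscription Cancelled": {"required": ["plan", "cancellation_reason"]},
}

def is_valid_event_name(name: str) -> bool:
    """Accept only names that are in the plan and follow the Title Case style."""
    return name in TRACKING_PLAN and name == name.title()
```

Keeping the plan in a machine-readable form like this also makes it possible to enforce automatically, which becomes important in the next practice.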
It’s also important to select the appropriate tooling for each use case, ensuring that the chosen tools are fully capable of meeting your performance requirements. Once you have established the use cases, you should then focus on understanding the data required for each.
The final element of your strategy should be to map out all of the data flows between your tools, making sure only the relevant data is fed into each. Study the cost and complexity of each data flow, and remember that even the smallest changes can have a material impact.
For instance, Mammoth identified a $70,000 saving in ongoing annual SaaS fees by changing a single data flow, a job completed in one day of engineering work.
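A back-of-the-envelope cost model is often enough to find which flow to fix first. The sketch below uses invented flow names, volumes, and unit prices; the point is the ranking, not the numbers:

```python
# Hypothetical model for comparing data flows by monthly cost. All flow names,
# event volumes, and per-1k-event rates below are invented for illustration.
FLOWS = [
    # (source -> destination, monthly events, cost per 1k events in USD)
    ("warehouse -> email_tool", 40_000_000, 0.12),
    ("app -> product_analytics", 8_000_000, 0.02),
    ("warehouse -> ad_platform", 2_000_000, 0.05),
]

def monthly_cost(events: int, cost_per_1k: float) -> float:
    """Simple linear cost: (events / 1000) * rate."""
    return events / 1000 * cost_per_1k

# Rank flows by monthly cost; the biggest line item is usually the first
# candidate for filtering or trimming.
ranked = sorted(FLOWS, key=lambda f: monthly_cost(f[1], f[2]), reverse=True)
for name, events, rate in ranked:
    print(f"{name}: ${monthly_cost(events, rate):,.2f}/month")
```

Even a crude model like this surfaces the one or two flows where a small change pays for itself many times over.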
Your processes and tooling should be clearly documented in your knowledge management center so that all of your analysts and stakeholders know exactly what’s required of them.
2) Leverage automation and tooling across all stages.
Strategies and documentation are often ignored or forgotten when employees are in a rush to hit deadlines. To address this, we suggest automating your data governance using tools. This should help to streamline processes, and ensure consistency and accuracy in data management.
Automation can help you to enforce data governance policies, monitor data quality and alert administrators to potential issues - all of which can help to improve the overall efficiency and effectiveness of data governance.
Of course, it’s a tricky balancing act, since you need highly sophisticated automated checks and controls in place to ensure best-in-class data governance - but without ‘getting in the way’ of the analysts who use the data.
Here are a few ways that automation can help:
- Ensure that analytics events follow tracking plans. You can apply Segment Protocols and similar tools to confirm that your analytics events match the predefined tracking plan. If an event is unexpected or not in the right format, you can block it from firing or send automated violation alerts to Slack and other tools.
- Make the default to NOT share data. Once you have a thorough understanding of the data each tool needs (and have stopped any unnecessary data flowing into these tools), make this your new default: when new data is created, it is never sent to a destination automatically. Enable sharing on a case-by-case basis.
When using Segment or another CDP, we recommend setting up all non-analytics destinations with Destination Filters to apply ‘Allow Lists’ rather than ‘Block Lists’.
- Monitoring and alerting. Most product analytics tools will inform users of the volume of received data and if/when it deviates from the expected amount. Make sure these alerts are baked into the daily workflow of all relevant users. You can use a dedicated channel in Slack to enable a quick-fire response to any breach in data governance.
- Define roles and permissions. The key is to avoid spreading permissions too far and wide. All tools offer distinct user roles with different permissions. However, it is vital that not everyone is permitted to make changes. You should only allow a select few to make alterations that significantly impact your centralized tracking strategy.
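To show how the first and third points fit together, here is a minimal sketch of automated tracking-plan enforcement. It is written in the spirit of tools like Segment Protocols but is not the Protocols API; the plan contents and alert wiring are invented for illustration:

```python
# Hypothetical tracking-plan enforcement: unplanned events are blocked and a
# violation alert is emitted; planned events pass only if all required
# properties are present. In production, `alert` would post to a dedicated
# Slack channel rather than append to a list.
TRACKING_PLAN = {
    "Order Completed": {"order_id", "revenue"},
    "Account Created": {"plan"},
}

def enforce(event: str, properties: dict, alert) -> bool:
    """Return True if the event may be forwarded to destinations."""
    if event not in TRACKING_PLAN:
        alert(f"Blocked unplanned event: {event!r}")
        return False
    missing = TRACKING_PLAN[event] - properties.keys()
    if missing:
        alert(f"Event {event!r} missing required properties: {sorted(missing)}")
        return False
    return True

alerts = []
enforce("Order Completed", {"order_id": "o1", "revenue": 9.99}, alerts.append)
enforce("Checkout Btn Clk", {}, alerts.append)  # blocked: not in the plan
```

Routing those alert messages into a dedicated Slack channel closes the loop described above: violations surface in the team's daily workflow instead of silently polluting downstream tools.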
A strong data governance policy is imperative for organizations of all sizes. If your team needs support in building or reevaluating a policy for your organization, we can help. Simply reach out to us at firstname.lastname@example.org.