Demand for advanced analytics is increasing, and so, too, is the expectation that management accountants will perform the analytics. This growth in “self-service” analytics has reached the point that research and advisory firm Gartner predicted that, by 2019, more analysis would come from end users than from data professionals.

But how often have we heard of an analysis resulting in a business decision that later proved problematic because of faulty data? As technology makes it easier for employees to access data, write reports, and conduct their own analysis, data governance becomes an even more important safeguard to ensure the integrity of underlying data.

Self-service is the next step in analytics maturity. As businesses become more data-driven, data governance provides the foundation for growth into predictive modeling and automation.


When asked, 99% of leaders of large organizations say they want a data-driven culture to maximize the value of data through analytics. Specifically, they aim to make business decisions faster and more accurately through automation and predictive modeling.

Why emphasize culture? The factor limiting analytics maturity usually isn’t data or technology; it’s people’s reluctance to use data and technology to answer business questions. In other words, organizations struggle to rely on data analytics, rather than intuition, as the driver of business decisions. A shift is needed toward a culture that trusts data-driven decisions to be effective.


Trust in data analytics is the foundation of a data-driven culture. Fortunately, progression through the stages of analytics maturity also builds a foundation of trust, and eventually business decision makers gain enough confidence in predictive models to let the models themselves make the decisions:

  • Descriptive analytics builds trust in data by confirming the current state of business.
  • Diagnostic analytics builds trust in data-driven decisions by people.
  • Predictive analytics builds trust in data-driven recommendations by statistical models.
  • Automated analytics becomes possible only when data, human decisions, and model recommendations are trusted enough for statistical models to make decisions on their own.

For this analytics process to function effectively, the data inputs (“raw materials”) must be consistent and reliable for the information outputs (“finished goods”) to be relevant and comparable. Relevant, reliable, comparable, and consistent are the four desired attributes of accounting information. Effective governance means that data used in decision making is of consistent quality and from reliable sources. Efficient governance leverages connectivity and technology to enable the comparison of data from many different sources and to deliver relevant analysis.


Unfortunately, only about one-third of data-driven culture initiatives succeed in larger firms. The reasons for failure often stem from insufficient data governance. If people in an organization say the data quality is insufficient for decision making, that signals ineffective data governance. If the data can’t be accessed, that’s a symptom of inefficient data governance.

Ineffectiveness. “I think you forgot to adjust your dates for time zone…”; “I’ve heard ‘churn’ defined three different ways today…”; “We can’t do that analysis; we don’t have good data…” Likely every person reading this has heard or said at least one of these phrases.

Issues with data and analyses, from small mistakes in calculations to instability in data sets, build into a culture of mistrust in data and, by extension, in any decision based on that data. The root cause of these issues is often ineffective data governance: the organization may be unaware of critical governance tools such as data dictionaries, data catalogs, and semantic layers, or data owners and end users may simply fail to communicate.
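To make the data dictionary idea concrete, here is a minimal sketch of one in code. The metric name, owner, source table, and wording are hypothetical; the point is that a single governed definition of “churn” replaces the three competing ones quoted above.

```python
# A minimal, illustrative data dictionary: one shared definition per metric.
# All names here (fields, owner, source table) are hypothetical examples.
DATA_DICTIONARY = {
    "churn_rate": {
        "definition": "Customers lost during the period divided by "
                      "customers at the start of the period.",
        "owner": "Customer Analytics",
        "source_table": "warehouse.customer_monthly_snapshot",
        "unit": "percent",
    },
}

def describe(metric: str) -> str:
    """Return the agreed definition of a metric, or flag it as ungoverned."""
    entry = DATA_DICTIONARY.get(metric)
    if entry is None:
        return f"'{metric}' has no governed definition; agree on one first."
    return (f"{metric} ({entry['unit']}, owned by {entry['owner']}): "
            f"{entry['definition']}")

print(describe("churn_rate"))
```

In practice this lives in a data catalog tool rather than a script, but the governance value is the same: anyone who asks gets the one authoritative definition, or learns that none exists yet.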

Inefficiency. Certain phrases signal inefficient data governance: “Will you email me that database extract?”; “Why can’t I access that reporting database? There’s no way for me to get approval?”; “It’s going to take more than a week to get access?”; “I can’t use Python or R? Why not?”; “I’ll just get someone to build out a new data platform for my department.”

Inefficient data governance is usually more difficult to resolve than ineffective governance. Sometimes the inefficiency is unavoidable, imposed by regulation such as the EU General Data Protection Regulation (GDPR) or by security policies that restrict open-source languages; in other cases, it’s a symptom of an organization failing to commit sufficient resources to data governance.

Often the easiest way to govern a data set is to block access, but blocking access creates its own inefficiency: while no access means no one can compromise the data, it also means no one can use it to improve the business.

Nonconnected data sharing is also inefficient. If users constantly receive data via FTP or email, it will be difficult for them to create a report that updates automatically and impossible to automate decisions. Commitment to analytics governance means giving people access to data in a way that facilitates analytics maturity, even when that takes time.

If a department is willing and able to build its own analytics platform, that shows the group is highly motivated to help the business and should be congratulated. But the platform shouldn’t be built in a silo, because the entire business could benefit and learn from it.

A department or employee “going rogue” with analysis is a red flag for governance inefficiencies. When such behavior is discovered, a committed organization should treat it as an opportunity to learn where governance is failing and should give the rogue analyst a sense of agency by making that individual part of the solution.


Efficient and effective data governance provides clear ownership and standards for data and data processes to ensure data quality. Although many approaches exist for data governance implementation, most share the following principles:

Accountability. There must be clearly defined ownership of, and accountability for, different types of data. Interconnected data managed throughout the organization requires consistent practices in order to maintain its effectiveness and value. In most organizations, data oversight doesn’t reside within one department. Human resources is the keeper of employee-related data, for example, while accounting maintains financial data. Shared governance brings consistency by establishing organization-wide policies and procedures.

Standardization. Data is an asset and must be protected like one. Clear policies on access, definitions, privacy, and security standards are needed. The governance committee must define the policies, and each department head must ensure adherence. This approach helps ensure that the organization complies with regulations such as GDPR.

Quality. Analysis is a critical tool for decision making and is only as good as the data upon which it relies. The quality of data should be managed from the time it’s captured. Good data governance includes defining one set of data-quality standards for the organization and establishing consistency in how that data quality is measured and recorded.
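One set of standards, measured the same way everywhere, can be as simple as a shared routine that scores every data set on the same dimensions. The sketch below (field names, sample records, and the two chosen dimensions are hypothetical) measures completeness and validity consistently:

```python
# A sketch of one organization-wide data-quality check, applied the same
# way to every data set. Field names and sample records are hypothetical.
def quality_report(records, required_fields):
    """Score a data set on completeness and amount validity (0.0 to 1.0)."""
    total = len(records)
    complete = sum(
        1 for r in records
        if all(r.get(f) not in (None, "") for f in required_fields)
    )
    valid_amounts = sum(
        1 for r in records
        if isinstance(r.get("amount"), (int, float)) and r["amount"] >= 0
    )
    return {
        "completeness": complete / total if total else 0.0,
        "amount_validity": valid_amounts / total if total else 0.0,
    }

sales = [
    {"customer_id": "C1", "amount": 120.0},
    {"customer_id": "",   "amount": 80.0},   # fails completeness
    {"customer_id": "C3", "amount": -5.0},   # fails validity
]
report = quality_report(sales, required_fields=["customer_id", "amount"])
print(report)  # completeness 2/3, amount_validity 2/3
```

Because every department runs the same check and records the same scores, quality becomes comparable across sources instead of being argued about anecdotally.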

We live in a fast-paced world where management accountants are asked to provide insightful analytics, often on short notice. Data governance helps ensure that our data is readily accessible and accurate. That’s especially true in situations where data is spread throughout disparate systems and departments. Often the analysis we’re asked to perform relies on data that we don’t oversee. By coordinating with other data owners in the organization, we can protect the integrity of data and spend more time on value-added analysis than on scrubbing data.
