
Enterprise Data Governance as the management of data across its pipeline, from production to consumption (and retirement). How do we enable our “data fellows” to leverage data to save money, save time and reduce risk?

What are the key elements of success? What are your thoughts?

Great topic to raise, Luca; it is on the minds of many in organisations, but often not yet established. Unfortunately there is still no “formal” definition of data governance, and I see so many different, conflicting and confusing definitions of it that I can understand the caution many senior leaders in organisations show when the concept of data governance is raised.

When I am explaining data governance to stakeholders, I follow the DAMA model. Data Governance is the coordination of Data Management activities. It ensures that each individual data management activity aligns towards a single common goal.

This then leads into the concept that any activity that impacts data can be grouped under the “data management” banner, be it the population of data in a warehouse, initiatives to improve data quality, or ensuring that appropriate safeguards are in place to keep data secure. Most of these data management activities are already being performed in most organisations; they are just often uncoordinated, resulting in confusion, domain disputes over whose problem it is, and an overall loss of efficiency and productivity.

I am not a huge fan of graphics like the one you have shown, as they often raise more confusion and questions than they answer. If Data Governance is about action (a verb), why are most of the components listed objects, concepts and things (nouns)? For example, having strong, trusted data lineage helps when undertaking a root cause analysis or an upstream impact analysis, but by itself it is not data governance, just as a hammer by itself is not a carpenter. It is a tool to achieve a certain outcome, but it is not the outcome itself.
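To illustrate the tool-versus-outcome point, here is a minimal Python sketch (with made-up dataset names) of the kind of impact analysis a lineage graph enables; a root cause analysis would be the same walk over the reversed edges. Treat it as an assumption-laden toy, not a real catalogue integration.

    from collections import defaultdict, deque

    # Toy lineage graph: each edge points from a dataset to an asset derived from it.
    # All dataset names are illustrative.
    lineage = defaultdict(list)
    for source, derived in [
        ("raw.orders", "staging.orders"),
        ("staging.orders", "mart.daily_sales"),
        ("staging.orders", "mart.customer_ltv"),
        ("mart.daily_sales", "dashboard.revenue"),
    ]:
        lineage[source].append(derived)

    def impact_analysis(dataset):
        """Breadth-first walk: everything affected if `dataset` changes or breaks."""
        impacted, queue = set(), deque([dataset])
        while queue:
            for child in lineage[queue.popleft()]:
                if child not in impacted:
                    impacted.add(child)
                    queue.append(child)
        return impacted

    print(impact_analysis("staging.orders"))
    # {'mart.daily_sales', 'mart.customer_ltv', 'dashboard.revenue'}

The traversal itself is trivial; the governance work is everything around it: agreeing who maintains the graph, keeping it trustworthy, and acting on what it reports.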

As to the key elements of success with data governance: humility. I have almost never heard of an organisation getting data governance right the first time. Some plan, and plan, and plan, trying to design the perfect governance framework, but never get anything into practice. Others try something and, when they find it doesn’t fit the organisation, keep trying to force it to fit.

The best approach I have found is to start small: consider an MDM approach, get something happening at a small scale, and then allow it to grow organically. Allow the framework to shift, change and evolve into what best fits the organisation, and yes, sometimes it may need to fail. That is good, as failure is one of the best ways to learn. Acknowledge that failure is an option, be ready for it, embrace it, and then pick up the pieces and try again.

But most importantly, at the outset define what these terms mean for your organisation. Have them recorded in a public document somewhere; otherwise the risk that different stakeholders hold different understandings, concepts and expectations is too great, resulting in a great deal of lost time.


Amazing explanation, Dan, much appreciated. And yes, I agree: any form of diagram or simplification just shows that the blanket is too short. I like thinking of Data Governance as the sum of the people, processes and technologies around the whole data life-cycle, from production to consumption and retirement. Yes, all manner of sins :)

And your example of starting from Master Data is a very good one indeed.

I also like saying that DG should ultimately aim to save money, save time and reduce risk - to answer concrete business questions, as you say, rather than being an exercise in deploying some technology unsupported by a clear and mature data culture.

I feel it is still a discipline very much in development; over the last 20 years we have seen all kinds of technologies and principles rise and fall.

Personally, one of the biggest challenges I have always found on the customer side is creating and implementing a culture of Data Stewardship - a whole universe in itself.

Thanks very much for your valuable suggestions - indeed, it is one thing to produce a technical solution, and quite another to drive a new way of thinking and working.

Please stay in touch!

 

Luca


One approach that I like to take when looking for buy-in regarding data stewardship and coordinated data management activities is to flip the concept around from “responsibility” and “control” to one of value generation.

 

An organisation has already invested a certain amount of capital in the original generation or capture of its data. This was agreed to with the understanding that the data was needed to deliver a specific outcome. More than likely the organisation did not think of it as a data generation activity, but almost any business activity generates data.

Data stewardship and data management activities help ensure that the data being generated is fit for purpose, and that the organisation is able to extract the desired value from the data asset.

But a good data steward will also see the advantage in the data being re-used by others in the organisation. The data can be shared for a tiny fraction of the original invested capital, and whatever value others may be able to extract is a bonus for the organisation. I sell the idea of a data steward as that of a salesperson, not a controller. Like a good salesperson, if they can get others using their product (the dataset), then mission accomplished. If they can shine up the dataset, make it more accessible and discoverable, and make it fit for another purpose, then the chance that others will engage with and use their dataset(s) increases.

The only issue with this approach is that it is hard to quantify without undertaking a complete data valuation process. Establishing an organisational value metric on datasets sounds great, right up until you try to establish the metric. An approach I’ve used in the past is to baseline all catalogued datasets at a nominal value (100). A snapshot of the upstream impact (i.e. the utilisation of the data) and of certain core quantitative data quality metrics is taken. As time progresses, a revised dataset value can be generated from any changes to the upstream impact and any improvement or degradation in the quality metrics.
This then gives data stewards something more concrete than hand-waving to show for the investment in improving their data. Being able to state that they have seen an X% improvement in the value of their datasets has a big impact when trying to establish a business case for data quality initiatives.
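As a minimal sketch of how that could look in practice (the multiplicative formula, field names and figures below are my own assumptions, not a standard method):

    from dataclasses import dataclass

    BASELINE = 100.0  # nominal starting value assigned to every catalogued dataset

    @dataclass
    class Snapshot:
        """Point-in-time measures for one dataset (field names are illustrative)."""
        consumers: int   # upstream impact: how many downstream assets/users rely on it
        quality: float   # aggregate of the core quantitative DQ metrics, 0.0 to 1.0

    def revised_value(baseline: Snapshot, current: Snapshot) -> float:
        """Scale the nominal baseline by the relative change in utilisation
        and in measured quality since the baseline snapshot was taken."""
        utilisation_factor = current.consumers / max(baseline.consumers, 1)
        quality_factor = current.quality / max(baseline.quality, 1e-9)
        return BASELINE * utilisation_factor * quality_factor

    # Example: consumers grew from 4 to 6 and quality improved from 0.80 to 0.92,
    # lifting the dataset's value from 100 to roughly 172, i.e. a 72% improvement.
    t0 = Snapshot(consumers=4, quality=0.80)
    t1 = Snapshot(consumers=6, quality=0.92)
    print(f"{revised_value(t0, t1):.0f}")

Multiplying the two factors is just one choice; a weighted sum would let an organisation tune how much utilisation versus quality drives the value.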

