
Study on public sector data strategies, policies and governance

Published on: 19/06/2020 | Last update: 28/07/2020

This study deals with data strategies and governance. It covers public sector initiatives both at the strategic level, such as data strategies, data governance frameworks and data management plans, and at the organisational level, aimed at creating dedicated units or departments and at developing new processes and roles. The report is available for download together with the full annex.

The key recommendations stemming from the analysis are:

  1. Start with the problem, not with the technology. Building a data strategy is more than an investment in a technological data analytics platform. Few strategies include such an investment, and those that do are typically vertically focussed on specific sectors or organisations and have strong links with users. On the other hand, there are interesting examples of focussed centralised technological components and services, as shown by the Reproducible Analytical Pipeline case analysed in task three. A common trait of the most advanced horizontal and vertical strategies is a demand-driven approach: providing a variety of support mechanisms, from governance to skills to support services, to address real problems such as health, poverty and urban issues. Focussing on the key questions to be answered and the policy problems to be solved is essential to delivering tangible results.
  2. Analyse user needs continuously. Users include both data holders and data re-users, both internal and external. Too often user needs remain assumed or are based on anecdotal evidence. Not only is it necessary to formally analyse them in the first place, but perhaps more importantly to monitor them constantly over time and adapt to how solutions are actually used. The ongoing collaboration between the Danish Data Mining Unit and the municipalities' frontline case workers is a clear example of this. Iteration of delivery is therefore crucial: no service is designed perfectly the first time.
  3. Co-creation is a fundamental component of the strategy. Bringing internal and external stakeholders onboard is a necessary, but not sufficient, condition for success. It is equally important to keep stakeholders onboard after the strategy is launched, during implementation. Other government agencies need to see the benefit of sharing data and of conforming to the required standards and processes, because there are costs in doing so. Of course, there is a shared perception among decision-makers that data is a strategic resource and that investment is needed, but this is sufficient only for kickstarting the process: the difficult part lies ahead.
  4. It is not sufficient to consult and co-create with stakeholders: what matters is delivering results. There is a lack of business cases for data innovation. Existing strategies should focus, as in the case of the Netherlands and New Zealand, on delivering short-term results via small-scale pilots on topical issues. But pilots should be the beginning of service delivery, as shown by the Findata case, and their results should be well documented and shared. The problem is not only the difficulty of demonstrating impact, meaning the ultimate benefits in terms of quality of public services; it is also the difficulty of demonstrating deployment and adoption, meaning simple projects that work and deliver. Data strategies should balance long-term perspectives on data stewardship with the short-term delivery of pilots.
  5. In order to ensure delivery, it is crucial to take a practitioner-led approach. The most successful strategies are those where data experts in public administrations are brought together and given a visible role in the process, as in the Netherlands with the creation of a cross-departmental sounding board of data analysts and policy experts. There is a permanent gap between data experts and decision-makers, and for data strategies to work, data experts should be empowered. Communities of practice are the fundamental tool to enable mutual learning and the empowerment of practitioners.
  6. Create a data culture at both departmental and institutional level. Data-driven innovation requires cultural change, training and bringing in new resources from the outside. New centres of competence (such as the Dutch labs) have to be created. Data training should be provided to all civil servants, and in particular to decision-makers. It also requires the reinforcement of internal capacity, the creation of effective communities of practice that cut across government silos, and the creation of knowledge and expertise centres to facilitate exchange between data champions and novices.
  7. Because it is a long-term process, expectations need to be managed correctly and hype should be avoided. Delivering data-driven innovation is not easy; it is not low-hanging fruit. Data is not a commodity. It requires extensive work for access, preparation and cleaning, but also for processing and reprocessing. There is a constant risk of disappointment that backfires. It is important for data leaders to set realistic expectations with other stakeholders and to start by focussing on data availability. Pilots should be selected based on two criteria: a genuine need and access to available data. Fortunately, the evolution towards a data culture is visible across society and the economy, and it is here to stay, particularly following the ongoing pandemic crisis. There is no need to overhype the opportunities.
  8. A robust ethical framework is crucial and can be instrumental to innovation. The results are long term, and it is important to avoid crises in the short term that would "put back the clock". Safeguards can work hand in hand with more data reuse by creating a shared data stewardship culture. Actions for data protection compliance should be integrated with those on increased data literacy: in fact, the lack of a data culture is damaging for both data protection and data innovation. But an ethical approach goes beyond compliance with data protection and also includes what is done with the data, for instance avoiding any punitive spirit in services that use the gathered data to fight poverty.
  9. Monitoring should be present and structured, but should not drive the process. Milestones and KPIs should be a core part of any strategy, and this is currently very rarely the case. KPIs should not concern only outputs, but also inputs and processes, such as the percentage of datasets in line with the required standards, access to base registries, and the number of departments taking part in the different activities. In fact, the main compliance mechanism for such soft strategies is monitoring and reporting, as shown by the Dutch case, where the most important control mechanism is reporting to Parliament. Monitoring and reporting also become fundamental in ensuring the long-term collaboration of different stakeholders, as in the Danish case.

Do you agree with the recommendations? Why?

Do you have specific comments or questions on the study?