Integrated data service

The report for the Office for National Statistics (ONS) Integrated Data Service alpha assessment on 22 September 2021

Service Standard assessment report

Integrated data service

From: Central Digital & Data Office (CDDO)
Assessment date: 22/09/2021
Stage: Alpha
Result: Met
Service provider: Office for National Statistics

Previous assessment reports

N/A

Service description

The Integrated Data Service is being created in the cloud, in line with the One Government Cloud Strategy, to support statistical analysis by giving access to a wide range of data. The platform takes us towards the future of data sharing across departmental boundaries and increased research collaboration across government and beyond. The ultimate ambition is to bring together talent in virtual teams drawn from across government (and beyond), working together to conduct research that addresses complex policy questions.

As a key component of the National Data Strategy, the Integrated Data Platform’s vision is to create a safe, secure, and trusted infrastructure for government data, enabling analysis to support economic growth, better public services, and improving the lives of citizens.

The technology platform will utilise, as far as possible, managed and cloud-native services (rather than bespoke development), initially deployed within a single cloud supplier (with data accessed via the virtualisation layer). Over time it will become truly multi-cloud and will therefore seek to adopt equivalent services from other providers where they are available.

A major element of the platform is the use of data virtualisation technology to enable access to different data sources whether hosted on-premises or stored with any of the cloud providers as agreed with the data supplier.
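
The routing idea behind data virtualisation can be sketched as follows. This is a minimal illustrative sketch, not the platform's actual implementation: all class and dataset names are hypothetical. The point is that analysts query one interface, and the layer resolves which backend (on-premises or a cloud provider) actually holds the data.

```python
# Minimal sketch of a data virtualisation layer (all names hypothetical):
# one query interface routes requests to whichever backend holds the
# dataset, so analysts never need to know where the data physically lives.
from dataclasses import dataclass
from typing import Callable


@dataclass
class DataSource:
    name: str
    location: str                  # e.g. "on-premises", "cloud-provider-a"
    fetch: Callable[[str], list]   # backend-specific query function


class VirtualisationLayer:
    def __init__(self):
        self._sources: dict[str, DataSource] = {}

    def register(self, dataset: str, source: DataSource) -> None:
        self._sources[dataset] = source

    def query(self, dataset: str, filter_expr: str) -> list:
        # The caller sees one interface; routing is resolved here.
        source = self._sources[dataset]
        return source.fetch(filter_expr)


# Usage: two datasets hosted in different places, queried identically.
layer = VirtualisationLayer()
layer.register("census", DataSource("census", "on-premises",
                                    lambda q: [("census-row", q)]))
layer.register("trade", DataSource("trade", "cloud-provider-a",
                                   lambda q: [("trade-row", q)]))

print(layer.query("census", "year=2021"))
```

In a real deployment the `fetch` functions would be connectors to live stores agreed with each data supplier; the design choice shown is that physical location is an attribute of the source, never of the caller's query.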

In summary, the value proposition for the service is the access it provides to a wide range of data, the tools to support innovative data analysis and data science and access to a cross-government community of data professionals who collaborate to help solve complex policy questions.

Service users

This service is for:

Current focus:

  • Data Analysts
  • Data Scientists

Others to be added:

  • Data Providers
  • Project Coordinators
  • Policy Makers
  • Chief Digital Information Officers
  • Administrators
  • Technical Operations

1. Understand users and their needs

Decision

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team has delivered a substantial amount of research, carrying out sessions with 36 data scientists and analysts and 51 users from across different government agencies, using various research methods. They have also carried out research with five users with accessibility needs
  • the team has a good understanding of their users and their needs and identified that there are various types of users for this service. The current focus is on Data Analysts and Data Scientists while the following users will also be added shortly: Data Providers, Project Coordinators, Policy Makers, Chief Digital Information Officers, Administrators and also Technical Operations. Personas of users have been identified and created for each of the user types
  • there is a solid plan in place for Private Beta and the team has identified they need to broaden their usability testing and conversations with those with access needs

What the team needs to explore

Before their next assessment, the team needs to:

  • as the service is not going to be used only by civil servants, the assessment panel believes it would be beneficial to test with non-civil servants
  • as already identified by the service team, it would be beneficial if the team was able to deliver more research with users with access needs

2. Solve a whole problem for users

Decision

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team demonstrated an understanding of the whole problem which exists outside a service (finding and accessing research-ready data sources) and has created a platform to bring trusted sources together
  • they have been speaking to and working with other organisations within government, including a cross-government team, to define the metadata standards and tagging strategy, and learning from and collaborating with others in the space, such as the Secure Research Service (SRS), who will help vet external researchers
  • they have broken a complex service into seven microservices, each with a responsible service owner, with service design working across all to ensure cohesion
  • the team considered a multi-tier approach of enable/donate/deposit to address the needs of different levels of users and how they might engage with the service

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to examine the behaviours that make organisations reluctant to release their data, to ensure that they will contribute to the platform. There might be existing knowledge the Behavioural Insights Team can assist with
  • explore how users want to engage with others’ work and have others engage in theirs, how they might comment or flag poor data and how they might be able to contribute to improving the data
  • research what needs people have of the data sources themselves, for example: do data accessors need to exclude by date or source, do data providers need to remove or amend data supplied, and how might the service accommodate those needs

3. Provide a joined-up experience across all channels

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the service has an empowered platform support team familiar with the different elements, such as the service workbench. The project team works closely with the support team to learn about the users’ support needs
  • the team is working with various offline channels, including word of mouth, professional communities and relevant workshop groups to access users but will also leverage these as part of future communications plans
  • while the various services within the platform are at different maturity levels, there is a strong understanding of the remit of each and the user journey across them

What the team needs to explore

Before their next assessment, the team needs to:

  • do more work on the collaboration aspect of how users might interact with each other and the data sources others are working on
  • create a service blueprint for the overarching user journey across all seven services
  • do more work on the service’s start point. Currently it is an Okta page, although it will eventually begin in the as-yet-undeveloped accredit service
  • do more work to consider how users are invited to and onboarded onto the service, as well as testing any associated interactions (eg account set-up, confirmation email). Both the invitation journey and the request-access journey should be considered
  • the panel strongly recommends there be a service wrap in GOV.UK patterns to guide and support the users
  • before the end of private beta, the team should ensure that either all the services are ready or there is a clear and joined up journey for users accessing this service, and a plan to bring the other services online

4. Make the service simple to use

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is working on the information architecture of the data sources to ensure that when the data sources increase, the usability and searchability will scale with the system. This will be helpful for the users when exploring sources
  • the team has done co-creative workshops at data events to learn from their users and ensure the service fits around their mental models
  • the team is focussed on testing the UI to ensure that it is clear for the users

What the team needs to explore

Before their next assessment, the team needs to:

  • test on mobile devices, particularly tablets
  • do more testing with people whose primary roles are not data-focused (product, policy and admin-type roles) to ensure their user journeys can be completed with ease. These may be either data providers or data accessors
  • continue to collect and analyse their Jira tickets and other means of user feedback to ensure they are learning how to improve the service
  • do more testing on how people search for data sources, particularly if they are not aware of what sources are available. Will they need to create alerts for when new datasets or sources on a topic are added? What might prompt them to return?
  • consider learning from places where similar professions are centralising. In particular, user research within the public sector has a shared user research library: the local government research library was developed by Hackney Council, and their learnings around contribution, searching, taxonomy and scaling might be helpful. https://research.localgov.digital/

5. Make sure everyone can use the service

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the team set up an accessibility working group to try to reach users with accessibility needs and learn from others
  • they have engaged the Digital Accessibility Centre (DAC) already
  • the team has created training demos and materials based on questions received, particularly as virtualisation is a new concept to many people, which has received positive feedback

What the team needs to explore

Before their next assessment, the team needs to:

  • continue research with users who have accessibility needs, looking potentially outside government to associated data professions as well as those within the other user groups to include data providers
  • ensure the user needs of data analysts and data scientists are not oversimplified into one group; likewise the needs of more junior, less experienced users in those groups
  • ensure that any training collateral is accessible, for example that videos have transcripts
  • learn from the questions users are asking when they request support, and answer them not just in training but also in system improvements
  • test the system once the selected one is decided, address accessibility fixes and create an accessibility statement with an eye to having as few exceptions as possible
  • explore best practices for making the data accessible and readable. Consider GOV.UK patterns wherever possible for the platform, and ensure that the chosen supplier adheres to WCAG 2.1 standards

6. Have a multidisciplinary team

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has made great progress in recruiting a permanent team to take the capability forward into Beta, and has good ways of working that will make it possible to change the product to meet user and policy needs in the Beta phase
  • the team has developed a good product and tested it with a good number of users through the Alpha phase. The team has responded to the asks of the Data Community and has a plan for developing the product in Beta
  • the team is managing dependencies with others across ONS and with other departments. The team has good governance and has set up cross government design authorities which will help develop the product through Beta

What the team needs to explore

Before their next assessment, the team needs to:

  • plug the resourcing gaps before the Beta phase starts: Performance Analyst, content and developer resources. This will give the team the capacity to respond to needs from the Data Community and to hand-hold users through the out-of-the-box technology
  • think beyond the technology and the product, and about how they are offering a service. A good example of a scenario the team could face is around the quality of data: who does the end user contact if there are data quality issues, the department that owns the data or the service team? This could be resolved through disclaimers and content, but does need to be agreed

7. Use agile ways of working

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has established and demonstrated good agile ways of working which will allow them to change and adapt the product. The team has operated in 3-week cycles using Kanban approaches through Alpha and has got into a good rhythm of design, test and adapt
  • the Delivery Manager gave a good account of ceremonies: daily stand-ups, show and tells, retrospectives, scrum of scrums. This will again put the team in a good place when responding to conflicting priorities in Beta
  • the team demonstrated good practices regarding ways of working and dealt with conflicts in the team well; these are resolved through co-design sessions
  • the team demonstrated a good level of empowerment and has good governance in place to give the team air cover to aid rapid delivery

What the team needs to explore

Before their next assessment, the team needs to:

  • think about structures to deal with support calls after the Beta phase. The team is looking to go live in March 2022 and there will need to be formal mechanisms in place to handle user queries and support calls
  • think how Agile ways of working can improve the process to increase the quality of data and ensure that there are structures in place to make sure any challenges on the quality of data are picked up by the correct team
  • think about how a GDS-designed skin on the out-of-the-box product could be tested. This could help the data community understand and navigate the product, especially if they are not advanced users

8. Iterate and improve frequently

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has tested with a good number of data analysts during Alpha which has allowed them to develop a product that users want
  • the team has been through a number of iterations of the product although they should ensure that the ‘service wrap’ as well as the product is iterated
  • the team has worked well with other departments and communities to develop a product that offers two service lines of data

What the team needs to explore

Before their next assessment, the team needs to:

  • explore developing a GDS-designed front end that can be tested with users. This may help them easily navigate the front end of the product and then enable them to easily query data sources
  • make sure that all the roles needed to iterate and change the product are recruited. This includes content, performance analyst and developer roles (this may require more than just configurers)
  • explore the role of data governance and ensure that the testing of data ownership is part of Beta. Users will challenge the quality of data and they will expect a clear route to challenge if the data quality is poor

9. Create a secure service which protects users’ privacy

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has engaged with the right personnel, security officers and the information assurance team, to ensure the platform is secure and data privacy is protected
  • the project is governed by ONS’s security processes and measures, including penetration tests, Data Protection Impact Assessments (DPIA) and vulnerability assessments
  • the team engages with the policy workstream to consider the ethical and legal aspects of a particular dataset, MoU and DPIA
  • data is protected in transit and at rest

What the team needs to explore

Before their next assessment, the team needs to:

  • explain, with examples of projects and datasets on the IDS, how data risk and privacy are evaluated and managed for different types of data criticality and each use case

10. Define what success looks like and publish performance data

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has a good understanding of what they need to measure including take up, self service of data, as well as the 4 mandatory KPIs
  • the team has tested with many users to inform the development of the product

What the team needs to explore

Before their next assessment, the team needs to:

  • define data governance for the data and datasets on the platform and measure against it
  • think about data quality metrics and success measures
  • think about the service that needs to wrap around the data products. Users will expect a Service to be able to contact and resolve their issues
  • test the service element in Beta and develop a plan for public Beta. If the service takes off, users will expect a service desk they can contact
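
To make the data-quality recommendation above concrete, one candidate metric is completeness: the share of non-missing cells in a dataset. This is an illustrative sketch only; the field names and the choice of metric are assumptions, not the team's agreed measures.

```python
# Illustrative sketch of one possible data-quality metric (completeness):
# the fraction of (row, field) cells that are present and non-empty.
# Field names and data are hypothetical examples.
def completeness(rows: list[dict], fields: list[str]) -> float:
    """Fraction of cells that are present and non-empty across the rows."""
    total = len(rows) * len(fields)
    if total == 0:
        return 1.0
    filled = sum(1 for row in rows for f in fields
                 if row.get(f) not in (None, ""))
    return filled / total


rows = [{"id": 1, "region": "North"},
        {"id": 2, "region": ""},     # empty value counts as missing
        {"id": 3}]                   # absent field counts as missing
score = completeness(rows, ["id", "region"])
print(round(score, 2))  # 4 filled cells out of 6
```

A published measure like this would let the platform track data quality over time and give users an objective basis for challenging poor data.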

11. Choose the right tools and technology

The platform used a mixture of software-as-a-service (SaaS) and self-hosted licensed commercial software in the public cloud to create a proof of concept. It was understood that a procurement exercise was underway to choose the products to be used in beta and onwards.

It was appropriate to use commercial software to help solve the hard engineering/automation problem of data ingestion, integration and transformation, so that the team could focus on delivering the critical values of the project and move at a faster pace. Moving forward, the team should consider how to architect the next solution to reduce the dependence on a particular commercial software product.

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the team used commercial-off-the-shelf products in prototyping to quicken time to market and add value to the project
  • the solution was started with using proven technologies from other projects
  • the team used ONS common toolings to align the skills and capabilities in the organisation

What the team needs to explore

Before their next assessment, the team needs to:

  • consider architecting the solution in a way that abstracts out the dependency on a particular commercial tool as much as possible, to minimise vendor lock-in (eg using APIs with a bespoke frontend rather than the user interface provided in the software)
  • develop a high-level exit strategy for these products
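
The abstraction recommended above can be sketched as an adapter pattern: platform code depends only on an abstract interface, so replacing a commercial supplier means writing one new adapter rather than rewriting the application. All class names here are hypothetical, and the in-memory store stands in for a real vendor API.

```python
# Hypothetical sketch of abstracting a commercial tool behind an interface
# to reduce vendor lock-in: application code depends only on the abstract
# IngestionBackend, never on a specific supplier's API.
from abc import ABC, abstractmethod


class IngestionBackend(ABC):
    @abstractmethod
    def ingest(self, dataset: str, rows: list) -> int:
        """Load rows into the platform; return the number ingested."""


class VendorABackend(IngestionBackend):
    """Adapter around a (hypothetical) commercial product's API."""

    def __init__(self):
        self._store = {}  # stands in for the vendor's storage

    def ingest(self, dataset, rows):
        self._store.setdefault(dataset, []).extend(rows)
        return len(rows)


def load_dataset(backend: IngestionBackend, dataset: str, rows: list) -> int:
    # Platform code is written against the abstraction only, so a
    # different supplier can be swapped in without changing this function.
    return backend.ingest(dataset, rows)


count = load_dataset(VendorABackend(), "census-2021", [{"id": 1}, {"id": 2}])
print(count)
```

An exit strategy then reduces to implementing and testing a second adapter, which is also a practical way to check that no supplier-specific behaviour has leaked into platform code.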

12. Make new source code open

The only code that existed in the Alpha was infrastructure code. The team used infrastructure-as-code practices and a self-hosted tool to integrate and deploy that code.

Decision

The service met point 12 of the Standard.

What the team needs to explore

Before their next assessment, the team needs to:

  • obtain the governance board’s approval to open source their code
  • code in the open from the start for any bespoke application and data architecture code
  • design the platform to enable users to code and collaborate in the open

13. Use and contribute to open standards, common components and patterns

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the team was working with another ONS team in defining the metadata standards, which would be used for tagging and categorising datasets in the platform
  • ONS is working closely with the Data Standard Authority to define and influence related data standards
  • the team contributes to the wider data communities

What the team needs to explore

Before their next assessment, the team needs to:

  • apply API, OAuth, OIDC and other open standards in the ongoing architectural design to allow for interoperability
  • adopt open data standards for ingesting from and exposing data to the users
  • explore whether GOV.UK patterns can be implemented and, if so, whether there are any new patterns that a service embarking on a new concept, virtualisation, might contribute to the shared library
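
As a concrete illustration of the OAuth/OIDC open standards mentioned above, the sketch below builds an OpenID Connect authorisation request URL following the standard OAuth 2.0 authorisation-code flow. The endpoint, client ID and redirect URI are placeholders for illustration, not the service's real configuration.

```python
# Illustrative OpenID Connect authorisation request, built per the
# OAuth 2.0 authorisation-code flow. All endpoint and client values
# are hypothetical placeholders.
from urllib.parse import urlencode


def build_auth_url(authorize_endpoint: str, client_id: str,
                   redirect_uri: str, state: str) -> str:
    params = {
        "response_type": "code",    # authorisation-code flow
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "scope": "openid profile",  # "openid" marks this as an OIDC request
        "state": state,             # CSRF protection, echoed back by the IdP
    }
    return f"{authorize_endpoint}?{urlencode(params)}"


url = build_auth_url("https://idp.example/authorize", "ids-client",
                     "https://ids.example/callback", "xyz123")
print(url)
```

Using standard parameters like these (rather than supplier-specific sign-in mechanisms) is what makes identity interoperable across components and, eventually, across cloud providers.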

14. Operate a reliable service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has considered security and service monitoring from the beginning
  • the team had user acceptance tests (UAT) in progress to test the platform end-to-end
  • the team used metrics available from the COTS products to monitor the performance of the platform

What the team needs to explore

Before their next assessment, the team needs to:

  • understand the interdependency of the different components to the availability of the whole platform, identify bottlenecks and single points of failure, if any
  • use automated tests wherever appropriate
  • define end-to-end reliability targets in beta (availability, latency, etc) and measure against them
  • implement a status page to inform the users on service status

Updates to this page

Published 12 October 2021