Integrated Data Platform - Dissemination service alpha assessment
Report for the Integrated Data Platform - Dissemination service alpha assessment on 8 June 2022
Service Standard assessment report
Integrated Data Platform - Dissemination service
From: Central Digital & Data Office (CDDO)
Assessment date: 08/06/2022
Stage: Alpha
Result: Met
Service provider: Office for National Statistics
Service description
Deliver an improved dissemination service for analysts and researchers through the IDS platform. The dissemination workstream will engage with the broadest range of users, both internally and externally, to ensure we are co-creating a service that meets all needs.
For analysts, it will provide the guidance, support and tools to create data and metadata consistent with the standards set out by the W3C. Following the W3C Data on the Web Best Practices, we have adopted open standards such as CSV on the Web (CSVW), which is recommended by GDS, to achieve benefits in reuse, comprehension, linkability, discoverability, trust, access, interoperability and processability.
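As a rough illustration of what adopting CSVW involves, the sketch below builds a minimal CSVW metadata document in Python. The file name, column names and datatypes are hypothetical, not taken from the service.

```python
import json

# Illustrative only: a minimal CSVW metadata document for a hypothetical
# observations.csv, following the W3C "CSV on the Web" recommendation.
# Real column names, datatypes and URLs would come from the dataset itself.
metadata = {
    "@context": "http://www.w3.org/ns/csvw",
    "url": "observations.csv",
    "tableSchema": {
        "columns": [
            {"name": "period", "titles": "Period", "datatype": "gYear"},
            {"name": "geography", "titles": "Geography", "datatype": "string"},
            {"name": "value", "titles": "Value", "datatype": "decimal"},
        ],
        # Uniquely identifies each row: one value per period and geography.
        "primaryKey": ["period", "geography"],
    },
}

# CSVW tooling conventionally looks for <csv-file>-metadata.json.
with open("observations.csv-metadata.json", "w") as f:
    json.dump(metadata, f, indent=2)
```

A schema like this is what lets downstream tools validate, interpret and link the published CSV without guessing at its structure.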
Additionally, the service will deliver systems that enable a data-driven approach to building more flexible and engaging content. We will target innovations that drive more automation through the production process, freeing time and effort for the value-added aspects of the work. In particular, we will look to align closely with the Government Statistical Service (GSS) recommendations on Reproducible Analytical Pipelines (RAP). These address most of the pain points uncovered in our research that are a barrier to producing analysis that meets users’ needs. Security and accessibility standards will be baked in by design.
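To give a flavour of the RAP practice being aligned with, here is a minimal sketch of one reproducible pipeline step with a unit test. The function, column names and data are invented for illustration, not the team's actual pipeline.

```python
import pandas as pd

def derive_totals(observations: pd.DataFrame) -> pd.DataFrame:
    """One reproducible pipeline step: same input always yields same output."""
    return (
        observations
        .groupby(["geography", "period"], as_index=False)["value"]
        .sum()
    )

def test_derive_totals():
    # A unit test like this is what makes the step verifiable and repeatable,
    # which is the core of the RAP approach.
    df = pd.DataFrame({
        "geography": ["E92000001", "E92000001"],
        "period": [2022, 2022],
        "value": [1.0, 2.0],
    })
    result = derive_totals(df)
    assert result["value"].tolist() == [3.0]
```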
For users and consumers, growing a data landscape built on open standards will aid discoverability and offer full access to the range of government statistics and associated content. We will develop new products that provide an inclusive service, enabling people of all abilities to understand, engage and interact with those outcomes. This addresses the three key tenets of our research: users want data to be easier to find, easier to use and easier to compare.
Service users
Current focus:
- Analysts
- Data producers
- Data publishers
- Government users
Others to be added:
- Statistical output producers
- Policy makers
- Public access
- Academia
- Commercial
- 3rd sector
- Statistical output publishers
1. Understand users and their needs
Decision
The service met point 1 of the Standard.
What the team has done well
The panel was impressed that:
- the team has conducted user research with a variety of users, considering more and less technical groups, for their various products
- the team has conducted task-based testing on their Data Explorer product
What the team needs to explore
Before their next assessment, the team needs to:
- rethink how they write, evidence and use their user needs for beta prioritisation and research planning. Specifically, user needs should generally focus on what users need to do (a verb) to achieve a goal, rather than containing the solution within them
- map their user needs against their core user groups with a scoring system related to, for example, how important they are to the user. This should be accompanied by selected evidence to support the priority and confidence levels, which can in turn be used to plan and prioritise features in the closed beta build. This should include the full onboarding process from the user’s perspective, the extra complexities around data security, and all key supporting interactions (for example, account set-up, confirmation email)
- use the prioritised user needs to plan research to test if and how their products meet them, focusing on those that present higher risk. These could be, for example, those needs that are important, but in which the team has low confidence or those needs that are important, but the user group is varied and non-homogeneous. This approach will allow the team to test early in beta their riskiest hypotheses and recommend appropriate mitigating measures
- do accessibility testing with a wide variety of users, to counter the acute tendency of data work to exclude people. Additionally, consider introducing an accessibility needs capture form into their user research protocols, to get a sense of their participants’ composition
- organise their user categories in a way that more clearly outlines how they map against specific journeys and products. For example, start with the largest ones, such as producers and consumers, and define sub-categories in each according to different goals, tasks or skills
- show more clearly how insights and evidence from user research have influenced and produced change in design, technical capabilities, senior stakeholders’ priorities and decisions
2. Solve a whole problem for users
Decision
The service met point 2 of the Standard.
What the team has done well
The panel was impressed that:
- the team are using the service to solve bigger problems such as the lack of common standards and the challenges in finding data sets
- metadata is built in to ensure that data are validated before output
- the team has tested the service on mobile devices for compatibility, showing an appreciation of user behaviour in this space
- the team has collaborated with other departments to build and test the service, and trialled the concept on timely issues such as the COVID-19 response and COP26
What the team needs to explore
Before their next assessment, the team needs to:
- complete more end-to-end testing of the whole journey. It is important for a programme of work of this scale to articulate the portion they bring to assessment as a service: something that helps somebody (a user) do something
3. Provide a joined-up experience across all channels
Decision
The service did not meet point 3 of the Standard.
What the team has done well
The panel was impressed that:
- the team have plans to leverage data.gov.uk as a central point of access
What the team needs to explore
Before their next assessment, the team needs to:
- make sure they understand and have designed the support and guidance users will need
- articulate the behaviours that cause organisations to be reluctant to release their data, and explain how they know organisations will contribute to the platform
- demonstrate how well they understand how people search for data, particularly when they are not aware of what sources are available. For example, consider an alerts service for when new datasets or sources on a topic are added, so the team can explain what might prompt users to return to the IDP
- consider the onboarding journey and support channels, then test and refine them
4. Make the service simple to use
Decision
The service met point 4 of the Standard.
What the team has done well
The panel was impressed that:
- the team is using GOV.UK patterns and finding ways to apply the patterns to new problems
- they are designing the service to meet the needs of users with various skill levels, from occasional data users (for example, policy professionals) to heavier data users (for example, data engineers)
What the team needs to explore
Before their next assessment, the team needs to:
- share the patterns they create back to the design community, once they are happy with them, so others can use them
- when testing the end-to-end service, ensure they consider the content that was created. While this is not a content-heavy service, they will need to ensure that any text, including guidance text, is clear to users
5. Make sure everyone can use the service
Decision
The service did not meet point 5 of the Standard.
What the team has done well
The panel was impressed that:
- the team have been designing the service with accessibility in mind from the start, and have identified internal staff with access needs who could test the service
What the team needs to explore
Before their next assessment, the team needs to:
- consider who the service could exclude because they cannot access it and prioritise testing the service with users who have accessibility needs
- balance meeting point 2 (how users think) with using simple language. This can be a challenge in a service with users of such varying skill levels, but that is why it is so important to test the content
- run the service through an accessibility audit, noting that an audit is not a substitute for testing with actual users
6. Have a multidisciplinary team
Decision
The service met point 6 of the Standard.
What the team has done well
The panel was impressed that:
- there is a very comprehensive team structure, with effective governance demonstrated through a number of agile methods including daily stand-ups, retrospectives and so on
What the team needs to explore
Before their next assessment, the team needs to:
- demonstrate how they plan to ensure that the roles needed to iterate and develop this service will be in place ready for moving into beta and beyond
7. Use agile ways of working
Decision
The service met point 7 of the Standard.
What the team has done well
The panel was impressed that:
- the team has agile disciplines in place
What the team needs to explore
Before their next assessment, the team needs to:
- demonstrate how using agile has improved, and will continue to improve, data quality
8. Iterate and improve frequently
Decision
The service met point 8 of the Standard.
What the team has done well
The panel was impressed that:
- there is a roadmap in place to identify roles and requirements moving forward
- the team have an awareness of how other organisations (for example, the Scottish Government and DfE) have overcome similar cultural issues around automation
- there is a commitment to co-production to ensure users and producers buy into continuous improvement
What the team needs to explore
Before their next assessment, the team needs to:
- demonstrate how they will manage user feedback to ensure continuous learning and service improvement
- explain how users might choose to comment on or flag poor data, and be able to contribute to improving the data
9. Create a secure service which protects users’ privacy
Decision
The service met point 9 of the Standard.
What the team has done well
The panel was impressed that:
- where possible they are leveraging existing services and platforms including Publish My Data
- the team are using OAuth 2.0 with an OpenAPI-based service called Drafter
- anonymous / unauthenticated web users can search, browse, view, filter and run SPARQL queries only on open, published data
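As a hedged illustration of that unauthenticated query path, the sketch below runs a SPARQL query against a hypothetical open endpoint using the SPARQLWrapper library. The endpoint URL and query are assumptions, not the service's actual API; because only open, published data is queried, no OAuth token is needed.

```python
from SPARQLWrapper import SPARQLWrapper, JSON

# Hypothetical endpoint URL; the real service endpoint would differ.
endpoint = SPARQLWrapper("https://example.gov.uk/sparql")
endpoint.setQuery("""
    SELECT ?dataset ?title
    WHERE {
        ?dataset a <http://www.w3.org/ns/dcat#Dataset> ;
                 <http://purl.org/dc/terms/title> ?title .
    }
    LIMIT 10
""")
endpoint.setReturnFormat(JSON)

# Anonymous request: list the first ten published datasets by title.
results = endpoint.query().convert()
for binding in results["results"]["bindings"]:
    print(binding["title"]["value"])
```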
What the team needs to explore
Before their next assessment, the team needs to:
- prepare a clear diagram outlining the users, groups and roles used by the service, and at which point of the service each is employed, and demonstrate this as part of the private beta assessment
- consider a penetration test for this area of the IDP service
- have a Data Protection Impact Assessment (DPIA) for the dissemination service
- show and explain how leavers and joiners of the service are managed
- ensure there is appropriate documentation and knowledge sharing around security and privacy, as areas of the service are managed by a partner
- understand, track and monitor dependencies on partner organisations
10. Define what success looks like and publish performance data
Decision
The service met point 10 of the Standard.
What the team has done well
The panel was impressed that:
- the team are keen to use existing resources and reports
- the team understand the importance of this area for private beta
What the team needs to explore
Before their next assessment, the team needs to:
- assign a dedicated person to this area of the service
- conduct user research around the KPIs which should be used by the service
- create dashboards and reports with appropriate update frequency and show them at the private beta assessment
- consider where there are gaps around KPIs for this service in the current reporting
- as this service is data-centric, consider data quality and data governance metrics
11. Choose the right tools and technology
Decision
The service met point 11 of the Standard.
What the team has done well
The panel was impressed that:
- wherever possible they are looking at open-source technology, so in theory it should be easier for them to migrate between vendors
- the team are using some common foundation tools and platforms used in government: GCP, Docker, Jenkins
- the primary programming languages used on the project are Python and JavaScript
What the team needs to explore
Before their next assessment, the team needs to:
- provide more extensive documentation at the next assessment around each piece of technology they are using, any customisation or new products they are creating, who is managing the build, and who is developing each component
- given the size of the IDS programme as a whole and of this service, maintain and show a detailed risk log at the next assessment
- assess, as part of private beta, whether there is any risk in the Publish My Data platform being written in Clojure, given the language is not widely used in HMG
- ensure that data can be extracted from core platforms the service will be using: Publish My Data, Plone and Stardog
12. Make new source code open
Decision
The service met point 12 of the Standard.
What the team has done well
The panel was impressed that:
- wherever possible they are publishing on GitHub
What the team needs to explore
Before their next assessment, the team needs to:
- agree an overview page on GitHub that explains the different repositories and what they hold
- ensure the repositories are organised in a consistent fashion, particularly around documentation
- in the overview page, ensure all the documentation in the underlying repositories is linked
- assess where there may be gaps in the documentation of the underlying repositories
- make content in private repositories public wherever possible
13. Use and contribute to open standards, common components and patterns
Decision
The service met point 13 of the Standard.
What the team has done well
The panel was impressed that:
- they are following wherever possible the GOV.UK design system
- they are using CSVW and RDF, two popular standards within areas of government
- they are following the tidy data approach, which aligns statistical data with Codd’s third normal form (see the sketch after this list)
- they understand the importance of metadata and are actively developing and encouraging its use
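To illustrate the tidy data approach noted above, here is a minimal sketch using pandas. The table and column names are invented for the example.

```python
import pandas as pd

# A "wide" table as statistics are often published: one column per year.
wide = pd.DataFrame({
    "geography": ["E92000001", "W92000004"],
    "2020": [100, 50],
    "2021": [110, 55],
})

# Tidy form: one observation per row, one variable per column. This is the
# shape that aligns with third normal form and that CSVW schemas describe well.
tidy = wide.melt(id_vars="geography", var_name="period", value_name="value")
print(tidy)
# Produces four rows, one per geography and period, each with a single value.
```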
What the team needs to explore
Before their next assessment, the team needs to:
- understand and explain the effort required by users to adopt these standards
- agree the relevant metadata that should be displayed in the final CMS shown to users, for example creator, date created and so on
14. Operate a reliable service
Decision
The service met point 14 of the Standard.
What the team has done well
The panel was impressed that:
- they are using partners and tools that have been used previously at ONS
- they are following test-driven development practices
What the team needs to explore
Before their next assessment, the team needs to:
- ensure that end-to-end testing is done across the service
- have a data journey mapped out clearly so this can be tested and the possible points of failure are clear
- provide KPIs from existing services regarding their performance and therefore the benchmarks that this service should meet
- consider various worst-case scenarios and how they will minimise their impact
- as this service integrates with other parts of IDP, explain at the next assessment how they will do integration testing with the IDP service as a whole, and where responsibilities around reliability will reside
- as they are introducing new tools and components, have a detailed plan around how these will be rolled out to minimise reliability risk