Data Marketplace alpha assessment

Service Standard assessment report

Data Marketplace

From: Central Digital & Data Office (CDDO)
Assessment date: 23/03/2023
Stage: Alpha assessment
Result: Not Met
Service provider: Cabinet Office

Service description

The overarching Data Marketplace programme vision is to have a central place to discover and understand how to access data from across government in a legal, ethical, and effective way – built with government for government. This aligns to the ministerial commitment for ‘Transforming for a digital future: 2022 to 2025 roadmap for digital and data’.

The Data Marketplace helps government officials find and access data by providing:

  • a single place to find government data

  • ways to assess the usefulness of data

  • shared support, guidance and tools

  • repeatable and responsible processes to follow

Three products were demonstrated: ‘Discover data’, ‘Design a data share’ and ‘Access and fix API specifications’. For this report, they will be collectively referred to as ‘Discover, Design and Specifications’.

Service users

  • ‘Discover data’ product: Non-technical acquirer, Technical acquirer, Supplier information asset owner, Supplier department catalogue owner

  • ‘Design a data share’ product: Acquirer share lead, Supplier share lead, Expert advisors

  • ‘Access and fix API specifications’ product: Supplier developer, Supplier technical architect

1. Understand users and their needs

Decision

The service did not meet point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has appropriately considered the research findings from the discovery phase

  • the team has developed a set of personas and their associated needs comprising the typical roles involved in sharing data

  • they have engaged with a good number of job roles and government departments

  • they have considered and evaluated existing solutions

What the team needs to explore

Before the next assessment, the team needs to:

  • understand the impact of the fragmented processes across government. Identifying that this fragmentation exists is a good first step, but the team has not yet fully understood its implications for this service.

  • do more research with users in organisations with varying levels of data sharing maturity, from those in organisations with low digital maturity to those that regularly engage in digital data sharing agreements. This is a crucial aspect of inclusion for the service: organisations (and their users) with a low level of maturity could easily be excluded, so making sure their needs are understood is extremely important.

  • do more research with users with varying levels of experience in their role (so far the team has only done research with one user who is new to their role)

  • do more research with users with disabilities. The panel notes that the team has a plan to engage with disability groups, which should bring access to this segment of the population.

  • do more research with suppliers in organisations with a low level of maturity, and with those who might have a high level of risk aversion. This will properly test the team’s assumption that “supplier organisations will be willing and able to share their metadata”. There appears to be a gap here: the compiled user needs make no reference to risk aversion, for example.

  • map the whole journey of data sharing with user pain points and opportunities at various points. The additional research with suppliers suggested in the previous point should help to create those artefacts.

  • do more research with acquirers in the ‘Deliver’ part of the product. So far research on that part seems to have focused on suppliers, which is understandable as they will be the ones developing the APIs; however acquirers will have to understand and work with the technical documentation, so understanding their needs seems important.

  • test the usability of the developed prototypes more thoroughly. So far it is not clear there have been enough cycles of refinement taking into account a variety of roles, levels of expertise and organisations at different maturity levels (for example, it seems the catalogue has only had one round of usability testing).

  • recognise that Civil Servant participant recruitment can be challenging; the team might want to explore including policy and engagement professionals to help with this, as well as engaging with Service Communities

  • clearly communicate how they arrived at their personas using the research data, specifying for each product and persona created the methods used and how many participants the evidence is built on. Review and add to these when more research has been conducted with supplier groups; in particular, prioritise open-ended interviews as a way to get a more holistic perspective on suppliers’ context and aims.

  • develop a plan for how to measure whether the service is meeting users’ needs. This is part of the performance framework but interacts closely with understanding users and their needs.

2. Solve a whole problem for users

Decision

The service did not meet point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has identified that the whole problem could not be solved by one product and put together three products to aim to do this

  • the team has understood the problem space by understanding the legislation and what previous projects had done and why they’d failed

  • the team understands that differing interpretations of data sharing legislation are a problem and a barrier to sharing, but made the right decision to consider this out of scope. They should continue to feed their findings into the relevant working groups.

What the team needs to explore

Before the next assessment, the team needs to:

  • clearly explain the distinct products within the space and how they come together to solve the whole problem. A diagram of the whole service or ecosystem would show how the products fit together and will help explain this more clearly to future panels.

  • create a landscape map (for example, an onion diagram or vectors) that connects not just the three products presented at the assessment but also other products in the programme, departmental data catalogues and other touchpoints (for example, the Office for National Statistics, data.gov.uk or support channels)

  • treat the landscape map as a living document and update it on an ongoing basis as additional touchpoints are understood. The landscape map will provide focus on the specific products being developed and how they aim to solve the whole problem for users. It will also identify improvements needed to other products, outside the delivery team’s scope, to achieve this.

  • understand data suppliers and design for their needs. One way to do this could be to break the supply of data out into a separate step before ‘Discover’, making four stages (for example: Share, Discover, Design, Specifications).

  • map the whole journey of data sharing with user pain points and opportunities

  • consider the barriers to collecting data from departments, for example working with departments that do not yet have an internal data catalogue, and think about the mechanism for sharing data into the catalogue. The team identified on several occasions that finding data was a challenge, and the panel felt the team had not yet fully solved the challenge of gathering good quality, searchable data.

  • focus on the risks of departments not using the products. What are the barriers to users using the products (for example physical, technical, policy, process, motivational or awareness barriers)? Understand these barriers and how they can be overcome to drive success.

  • understand the supplier needs including their reporting needs, for example do they need to monitor how the data is used?

  • plan how the different products feed into each other. For example, could ‘Design a data share’ and ‘Access and fix API specifications’ be used to identify data that is not in the catalogue?

  • think about what other solutions might be needed to meet users’ needs as they move into beta. For example, the current solution assumes the skills to build APIs exist across government; if this is not true, there may be a need for a tool to help build APIs.

  • consider what it will take to increase capability in the data community beyond tools and share those needs with the wider authorities they are working with.

  • appraise the ‘Discover, Design and Specifications’ problem statements over time as more is understood, to make sure the right problem is being solved for users

  • further understand the legal, policy and ethical landscape, including whether primary legislation due in 2024 for sharing data across government departments will influence the project direction

3. Provide a joined-up experience across all channels

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the team considered support that may be needed

  • the team has been realistic that the service will be used by DDaT professionals with higher digital literacy than the general public. For example, civil servants and contractors all have access to relatively modern laptops and need to share data using technology.

  • the team is considering mindsets and behaviour changes needed to shift to a future state

What the team needs to explore

Before the next assessment, the team needs to:

  • understand more about the reality of how data is shared now: what channels are currently used (for example, spreadsheets and email) and what opportunities there are for quick wins in that space

  • make contact with the data science community to understand what solutions they already have in place

  • continue to explore content and language, and how this can remain consistent across the products and wider landscape

4. Make the service simple to use

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has considered the content and interaction of each product well, designing solutions to meet users’ needs

  • the team understands best practice and standards from governments around the world and the private sector, and uses them to inform its designs

What the team needs to explore

Before the next assessment, the team needs to:

  • think about the transitions between the products and what the ecosystem looks like

  • think about the findability of the service. As users will need to be logged in, they will not be able to find it through a search engine, so how will they know it exists, and what other services could and should it fit with? For example, could it be a logged-in state of data.gov.uk?

  • do more rounds of research on the prototypes, particularly the ‘Discover data’ tool

  • consider the volume of use cases and design for the most common use case first. For example, with ‘Design a data share’ there are many memoranda of understanding (MOUs) for simple data shares that teams complete very regularly, so consider designing the tool to ask a few simple questions to understand the complexity of the data sharing agreement needed and, for the simpler ones, generate the MOU from templates that already exist.
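The triage-then-template idea above could work along the following lines. This is a minimal sketch only: the questions, field names and template wording are invented for illustration and are not the team’s actual design.

```python
from string import Template

# Hypothetical MOU template for a simple data share; real templates would
# come from the existing documents the panel mentions.
MOU_TEMPLATE = Template(
    "Memorandum of Understanding\n"
    "Supplier: $supplier\n"
    "Acquirer: $acquirer\n"
    "Data set: $dataset\n"
    "Lawful basis: $lawful_basis\n"
)

def triage(answers: dict) -> str:
    """Classify a proposed share as 'simple' or 'complex' from a few
    yes/no questions (the flag names here are invented)."""
    complex_flags = ("contains_personal_data", "crosses_borders",
                     "novel_lawful_basis")
    return "complex" if any(answers.get(f) for f in complex_flags) else "simple"

def draft_mou(share: dict, answers: dict):
    """Generate a draft MOU for simple shares; complex shares return None
    and would be routed to expert advisors instead."""
    if triage(answers) != "simple":
        return None
    return MOU_TEMPLATE.substitute(share)

share = {"supplier": "Department A", "acquirer": "Department B",
         "dataset": "Road traffic counts", "lawful_basis": "Public task"}
print(draft_mou(share, {"contains_personal_data": False}))
```

The design choice here is that the questionnaire only decides the route; the draft document itself comes entirely from a pre-approved template, which keeps the generated MOUs consistent with what already exists.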

5. Make sure everyone can use the service

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has partnered with DWP to understand how a new tool could pick up metadata from the DWP catalogue

  • the team is allowing users to define which ruleset they want to use to check their API. The panel recommends the team go with as few rule sets as possible for initial beta launch and iterate as needed.

  • the team is using design patterns based on jobs and tasks that need to be done to understand how the products should be designed so they are familiar to the user

  • the team has identified that finding users with accessibility needs was a challenge; however, it should continue to build understanding of the profile of these users
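The user-selectable rule set mentioned above can be sketched as follows. This is an illustration only, with invented rule names and checks rather than the team’s actual rule sets; it shows the shape of the idea of letting users pick how strictly their API specification is checked.

```python
# Each rule inspects a parsed API specification (a dict) and returns a
# list of failure messages; an empty list means the rule passed.

def has_info_title(spec: dict) -> list:
    return [] if spec.get("info", {}).get("title") else ["info.title is missing"]

def paths_not_empty(spec: dict) -> list:
    return [] if spec.get("paths") else ["no paths are defined"]

def has_security_scheme(spec: dict) -> list:
    schemes = spec.get("components", {}).get("securitySchemes")
    return [] if schemes else ["no security scheme is defined"]

# Named rule sets the user can choose between (names are hypothetical).
RULE_SETS = {
    "baseline": [has_info_title, paths_not_empty],
    "strict": [has_info_title, paths_not_empty, has_security_scheme],
}

def check(spec: dict, rule_set: str = "baseline") -> list:
    """Run every rule in the chosen set and collect failure messages."""
    return [msg for rule in RULE_SETS[rule_set] for msg in rule(spec)]

spec = {"info": {"title": "Road traffic API"}, "paths": {"/counts": {}}}
print(check(spec, "strict"))  # the strict set flags the missing security scheme
```

Starting with a small number of named rule sets, as the panel recommends, keeps the choice simple for users while leaving room to add sets later.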

What the team needs to explore

Before the next assessment, the team needs to:

  • do more research on the Find Government Data tool to understand how the search functionality works in reality (for example, do people want to filter by the fields contained in a data set?) and what the needs are of the data science community, who are already sharing a lot of data across government. Reaching out to the data science community to learn from them should be a priority.

  • make sure technology choices are accessible

  • use insights from user research to make sure designs are able to be used by everyone including those with accessibility and digital inclusion needs

6. Have a multidisciplinary team

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • there have been dedicated teams for the ‘Discover, Design and Specifications’ products to focus on the requirements of each

  • despite having three different alpha product teams it was clear each team understood the interconnectivity of the products as part of an overarching journey. This was reinforced by having an oversight team in place.

  • there is a Data Marketplace oversight group and a metadata working group in place for co-design

  • Chief Data Officer (CDO) Council engagement is in place for strategic planning

What the team needs to explore

Before the next assessment, the team needs to:

  • obtain skills from a Performance Analyst as a priority, to develop a detailed performance framework aligned to actual user needs and goals

  • appraise the suggested Oversight, Design and Deliver team structure for beta. It is important to recognise that the ‘Discover, Design and Specifications’ products are individual but also interconnected. Reflect on the team structure and skills needed, making changes when and where necessary.

  • define roles and responsibilities clearly. Currently, the Programme Manager, Product Owner and Product Manager are the same person

  • aim to increase the number of Civil Servants on the project. Based on current beta plans approximately 74% of the team will be non-Civil Servants. There is always an increased risk of churn with non-Civil Servants.

7. Use agile ways of working

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the teams are following agile delivery principles, including the expected ceremonies. They are especially good at sharing information through regular weekly team show and tells, and monthly show and tells with wider stakeholders.

  • the teams are working well together on the individual products they own, working collectively with the oversight team

  • high-level project milestones and a roadmap of associated activities are documented for developing the products in private and public beta, leading to a sustainable end state

  • the teams are actively using collaboration tools such as Miro and Jira. Note: check Miro suitability as security issues have been identified for use in government. Mural may be a preferred option.

  • the team has defined the scope appropriately to their position at the centre of government. They are aiming to do only what the centre can do and balance bespoke user needs with the most popular needs. The scope of the products they’d created demonstrated they’d done this well.

What the team needs to explore

Before the next assessment, the team needs to:

  • prioritise product deliveries, as some are at different levels of maturity than others and additional research and iteration may be required, or other ideas may need to be tested to resolve the riskiest assumptions. Prioritise without holding up other services moving into beta, so value is delivered early.

  • maintain team knowledge and audit trails. For example, logs of research conducted, synthesised insight, design decision logs, landscape maps, stakeholder maps, technical documentation. As there is a high percentage of non-Civil Servants, it is essential that this knowledge is up-to-date and available to upskill new people as quickly as possible.

  • consider using design histories to document what they are designing and why. This will help with joining up between teams.

  • expand upon opportunities to collaborate with other government departments to further understand the data landscape, but also specific features. For example, the HMRC API catalogue team has done extensive research on search behaviour that could translate to the products.

8. Iterate and improve frequently

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the GDS prototyping kit, hosted on Heroku, is being used well to quickly iterate user interface ideas for all three products

  • the team is actively making changes based on user feedback

What the team needs to explore

Before the next assessment, the team needs to:

  • make sure different ideas are fully explored and iterated based on a breadth and depth of feedback from identified user groups across government departments. Prioritise these based on providing the greatest value to the user when in beta.

  • consider the value the ‘Discover, Design, Specification’ products will bring to the user and implement what the user needs, not a set of pre-defined requirements

  • be willing to discard ideas when iterating on the products, justifying why they need to be discarded

  • demonstrate how the beta delivery team will release code frequently, improving the products based on ongoing user research and service design, for example through a DevOps approach to continuous integration

9. Create a secure service which protects users’ privacy

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the team had understood the legal and data protection implications of data sharing and the needs of various users in that space

  • the team has shown there is a plan for data access in transit via API gateways, with clear consideration of credentials handling

  • the team plans to engage with the National Cyber Security Centre when the overall design is understood

What the team needs to explore

Before the next assessment, the team needs to:

  • think about the anti-personas or bad actors to explore vulnerabilities in the design

  • involve a security architect as part of the initial design

  • discuss next steps with the National Cyber Security Centre and Information Assurance

10. Define what success looks like and publish performance data

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have identified the four standard KPIs and some additional areas to measure as success factors

  • the team have created a basic performance framework in the absence of a Performance Analyst

  • the team understands the principles of measurement governance, including the General Data Protection Regulation (GDPR), personally identifiable information (PII) and reporting aggregated data

What the team needs to explore

Before the next assessment, the team needs to:

  • obtain the skills of a Performance Analyst

  • create a detailed performance framework. This should include the overarching problem statement, goals and objectives, performance indicators, actual user needs, metrics used, measurement tools and reporting methods

  • understand how to measure digital take-up in terms of those that could use and are using the products, versus those that won’t use them and those that can’t. Ultimately, the success of what is being developed can only be accurately measured, and continuous improvements made, through actual usage.

  • understand that accessibility is not a success measure. It is a legal requirement for public sector services.

11. Choose the right tools and technology

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • there has been clear analysis of the technology in response to the data sharing requirements, such as transport APIs and data cataloguing, along with the use of linting tools to assess API specifications against best practice, and exploration of data cataloguing tools, including consideration of which operate at field level of detail

  • to inform the key design document, the team explored both open source and commercial tools during its analysis

  • good consideration was given to suppliers with respect to how they are expected to expose their data, the rule set applied, the mediation advice offered and the consistency provided

What the team needs to explore

Before the next assessment, the team needs to:

  • think about how technology can help with the quality of the data in Find Government Data, because usability will depend on the quality of the data in it

  • complete the key design document (KDD) so it can inform the build versus buy decision

12. Make new source code open

Decision

The service met point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the team will be making its code available on GitHub where possible. Commercial tools with intellectual property may prevent some code being made open source; otherwise, the team intends to make everything open source.

What the team needs to explore

Before the next assessment, the team needs to:

  • identify keys, credentials, algorithms used to detect fraud and unreleased policy, to help determine which code will not be made open source

  • clarify which Open Source Initiative licence applies to released source code

13. Use and contribute to open standards, common components and patterns

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the service is using and contributing to open standards for data access across data cataloguing, data transit and data sharing, using recommended standard tools such as the GDS prototyping kit hosted on Heroku

  • the team has used open source technology along with globally accepted tools recommended by CDDO for security assessment of APIs for validity and vulnerabilities, checking uploaded API specifications using OWASP guidance and 42Crunch (at the free tier), and combining shift-left security protection with Swagger file checks

What the team needs to explore

Before the next assessment, the team needs to:

  • use open standards, and engage with the open standards team at the Central Digital and Data Office (CDDO) to propose a new open standard if there is not one that already meets their needs.

14. Operate a reliable service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • a review is underway for placing APIs into department catalogues for self-service, reusable and third-party access, with the ability for clients (for example, developers) to check the APIs in detail, including rule sets and extensions against government standards

  • the team has considered and performed testing processes, using a Ruby on Rails application hosted on Heroku and working across government

  • the team is involved with the cross-government TDA; the importance of this is that elements of the solution will be endorsed against policy and guidelines, reducing risk

What the team needs to explore

Before the next assessment, the team needs to:

  • agree consistent vocabulary between supplier and acquirer

  • further clarify how transit API gateways will provide a wrapper between endpoints

  • make sure there is appropriate monitoring, with a sustainable plan to respond to issues and problems

Updates to this page

Published 23 November 2023