Find UK government data alpha assessment

The report from the alpha assessment for GDS's Find UK government data service on 30 May 2017.

From: Central Digital and Data Office
Assessment date: 30/05/2017
Stage: Alpha
Result: Met
Service provider: GDS

The service met the Standard because:

  • The service team have considered the full range of users trying to find data, and were able to define clearly who these users are and what they need from the service. The team have been responsive to their research findings in the alpha phase, enabling them to iterate the service rapidly.

About the service

Description

The service helps users:

  • find open government data
  • quickly evaluate whether or not it meets their needs

Service users

The users of this service are divided into three segments:

  • people who collect knowledge and resources that exist around a topic

  • people who use data to answer a question and

  • people who use data to build products or services

Detail

User needs

The panel was impressed with the way in which the team has conducted user research. This has given them a firm foundation in the user needs people have when trying to find data. The team has also conducted user testing to develop the service during the alpha. This has involved testing in the workplace, which has enabled the team to develop empathy for their users and to understand further how users search for and use data.

The team has reviewed a number of aspects of the site, including feedback methods, rating methods and refinement of search (for example, by geography), and it is recommended that they continue this work through the beta.

The panel felt that the research will need to increase its reach in the beta phase. However, the plan the team has presented for this is clear and should enable them to test the service with a range of organisations and with users with mixed levels of data skills. This should also include users with access needs, to ensure the service is fully accessible.

Team

There have been recent additions to the team, including a product manager and developers, to fill the gaps in roles previously identified. This has been an intense time for the service, given that the alpha phase has lasted six weeks and there has been a programme of induction and shadowing for the new team members. The responsiveness to the findings of the initial user research, and the subsequent iteration in this short time, impressed the panel, and the team should continue to build on this work in private beta. The intensity and drive of the work was also seen in the weekly sprints, with goals and objectives being set and reviewed. The team also conduct show and tells and use a range of tools to communicate, including Trello, Slack and Google Hangouts. They have also used a change log to share and track knowledge of the design and development of the prototypes.

Technology

Developers are creating some unit tests for their own benefit, and their work is peer reviewed internally between developers. There isn't an automated pipeline to deploy to production, but the iteration cycle is short and effective. While this is acceptable for an alpha, the team highlighted this as a must-do action on their roadmap to beta.
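
As an illustration only, the simplest form of such a pipeline is a deployment step gated on a green test run. The sketch below assumes Python, pytest and a Cloud Foundry-style target such as GOV.UK PaaS; the application name and commands are assumptions rather than the team's actual setup.

```python
"""Illustrative only: gate a deploy on a green test run. The app name and the
Cloud Foundry-style `cf push` target are assumptions, not the team's pipeline."""
import subprocess
import sys

APP_NAME = "find-data-prototype"   # hypothetical application name


def test_then_deploy() -> int:
    """Run the test suite; deploy only if it passes."""
    tests = subprocess.run([sys.executable, "-m", "pytest"])
    if tests.returncode != 0:
        print("Tests failed; not deploying.")
        return tests.returncode
    # `cf push` is the standard Cloud Foundry deployment command.
    return subprocess.run(["cf", "push", APP_NAME]).returncode


if __name__ == "__main__":
    sys.exit(test_then_deploy())
```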

The prototype is built and deployed using open source technologies and common platforms such as the GOV.UK Prototype Kit and GOV.UK PaaS.

The team evaluated Elasticsearch as a search engine and suggested they may look into alternatives such as Postgres for storing user sessions and links to external datasets. The current architecture also uses Elasticsearch as the primary datastore. While that decision is acceptable for an alpha release, it needs to be revisited in beta, particularly if the service has to iterate on the search results while they are accessible to the public.
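
One way to separate those concerns, which this recommendation points towards, is to keep the canonical records in Postgres and treat Elasticsearch as a rebuildable search index. The sketch below is illustrative only, assuming Python with psycopg2 and the Elasticsearch HTTP API; the table, index and connection details are not the team's actual design.

```python
"""Illustrative only: Postgres as the canonical store, Elasticsearch as a
rebuildable search index. Table, index and connection details are assumed."""
import json

import psycopg2
import requests

PG_DSN = "dbname=datasets_dev"      # assumed local database
ES_URL = "http://localhost:9200"    # assumed local Elasticsearch node


def save_dataset(record: dict) -> None:
    """Write the dataset record to Postgres first, then index it for search."""
    # The canonical copy lives in Postgres, so the search index can be
    # dropped and rebuilt while the team iterates on search results in beta.
    with psycopg2.connect(PG_DSN) as conn, conn.cursor() as cur:
        cur.execute(
            "INSERT INTO datasets (id, doc) VALUES (%s, %s) "
            "ON CONFLICT (id) DO UPDATE SET doc = EXCLUDED.doc",
            (record["id"], json.dumps(record)),
        )

    # Elasticsearch holds a derived, searchable copy of the same record.
    resp = requests.put(f"{ES_URL}/datasets/_doc/{record['id']}", json=record, timeout=5)
    resp.raise_for_status()
```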

It was highlighted that the future architecture is worker based, with tasks executed in a decentralised manner. This will allow the team to scale the system more easily and to add more complex tasks, such as checking the validity of links, as they onboard more publishers.
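
As an illustration of that kind of task (not the team's implementation), a link-checking worker could be as small as the following Python sketch; the function name and timeout are assumptions.

```python
"""Illustrative sketch of a link-checking worker task; the function name and
timeout are assumptions, not the team's implementation."""
import requests


def check_link(url: str, timeout: float = 10.0) -> bool:
    """Return True if an external dataset link still resolves."""
    try:
        # HEAD keeps the check cheap; some servers reject it, so fall back
        # to a one-byte ranged GET before declaring the link broken.
        resp = requests.head(url, allow_redirects=True, timeout=timeout)
        if resp.status_code in (405, 501):
            resp = requests.get(url, headers={"Range": "bytes=0-0"},
                                stream=True, timeout=timeout)
        return resp.status_code < 400
    except requests.RequestException:
        return False
```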

The team should consider how they could mitigate the risk of malicious, inaccurate or poor quality data being published. There was no evidence that the team had already begun thinking about this.

Additionally, the problem of maintaining links to third parties is one that warrants further thought. Individuals are responsible for maintaining their own data sets, and there is currently a “high churn” of publishers. The team have committed to investigating whether it would be feasible to provide a data hosting option (e.g. file upload) for the next phase, as there appears to be a strong user need.

The team didn't offer sufficient evidence that they have mitigations for the most common security threats, such as domain theft and malware.

The team didn't offer enough evidence that they have a plan for the service being taken temporarily offline.

Design

The panel were impressed that the team had discovered and focused on the primary user journey - googling for data, working out whether this is the right data set and downloading it. It’s good to see a clear scope for alpha that’s been informed by data.

The redesign of this journey was clearly informed by user research - it was a great example of where taking things away (from the old design) was the best thing to do, to make things clearer for users.

It is still ambiguous whether data.gov.uk is part of GOV.UK, and should therefore use the same patterns and styles, or whether it should be a separate site with its own branding. This question should be resolved as a matter of priority: not just because of branding but, probably more importantly, because a user's journey may well involve both GOV.UK and data.gov.uk, as we saw in the presentation (GOV.UK pages often appear in Google search results for government data). In terms of service design, users need a cohesive journey: 'people shouldn't have to know how government works to interact with government'.

Given that current ambiguity, the prototype makes excellent use of GOV.UK design patterns. The new tables are a much clearer way to present link data, the pages are well laid out, and the typography is clear with a good hierarchy.

It was great to see a clear focus on getting user feedback in order to improve data. Data.gov.uk is an intermediary between publishers and users, and should take on the responsibility of helping publishers improve data and of publishing missing data where there's a user need.

On the home page, it’s good to see new ideas like Trending and Upcoming data sets. In Beta it would be great to see the new design help people discover data that’s useful to them.

Analytics

The panel was impressed that the service team has used analytics, particularly search data, to feed into building the service. We would like to see the metrics mapped back to user needs, to ensure that the team can correctly measure whether they are meeting those needs. The performance analytics community can run a performance framework session to help with this.

Recommendations

To pass the next assessment, the service team must:

  • The service team should continue to review aspects of feedback methods, rating methods and refinement of search into the beta phase
  • The reach of user research needs to be increased and should include users with access needs
  • The panel would expect a breakdown of the risks and mitigations for the service (or evidence that the business is willing to accept the risk). As an example:

  • Risk: a dataset links to an external domain that has expired and has recently been purchased by an attacker with malicious intent. Mitigation #1: only allow dataset downloads from .gov.uk domains. Mitigation #2: alternatively, host all the data on our own servers.
  • Risk: spreadsheets could contain malware. Mitigation: only CSV files can be downloaded. (A minimal sketch of these mitigations follows this list.)

  • The panel would also expect to see a list of key impacts such as:

       If none of the above is mitigated, we could incur:
    
  • Reputational damage to GOV.UK, institutions and commercial organisations
  • Disruption to business

  • The service team will need to provide evidence that they have a plan for the service being taken temporarily offline. The panel would expect the service team to describe user journeys for when the website is unavailable:

  • Escalation path. Is there a call centre? Who notifies whom when the website is down?
  • Handling downtime. Is there a holding page? As a user, how do I know the website is down and someone is fixing it? How do I report a broken page?

  • The service team should decide on the best way to integrate with GOV.UK in order to meet user needs.
  • The service team should ensure the metrics being used can be traced back to the user needs so they can be sure they are being met.
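
As an illustration of the first two example mitigations listed above (not a prescription), a simple allow-list check might look like the following Python sketch; the rules are taken from the examples given, and the helper name is hypothetical.

```python
"""Illustrative sketch of the example mitigations above: only allow dataset
links on .gov.uk domains, and only offer CSV downloads. The helper name and
rules are taken from the examples, not from the service's actual design."""
from urllib.parse import urlparse

ALLOWED_DOMAIN_SUFFIX = ".gov.uk"   # mitigation #1: only government domains
ALLOWED_EXTENSIONS = {".csv"}       # malware mitigation: CSV only


def link_is_permitted(url: str) -> bool:
    """Return True only for CSV links hosted on a .gov.uk domain."""
    parsed = urlparse(url)
    host = (parsed.hostname or "").lower()
    if not (host == "gov.uk" or host.endswith(ALLOWED_DOMAIN_SUFFIX)):
        return False
    return any(parsed.path.lower().endswith(ext) for ext in ALLOWED_EXTENSIONS)
```

For example, a CSV file hosted on a .gov.uk domain would pass this check, while a spreadsheet on an expired commercial domain would not.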

The service team should also:

  • Review the use of Elasticsearch as the primary datastore if the service has to iterate on the search results while they are accessible to the public.
  • Consider how people can discover what data is useful to them when designing the service in beta.
  • Review the metrics collected with the performance analytics community.

Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Not met
8 Making code available as open source Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Not met
12 Creating a simple and intuitive service Met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it Met

Updates to this page

Published 30 July 2018