Data.gov.uk alpha assessment

The report from the alpha assessment for GDS's data.gov.uk service on 31 March 2017.

From: Government Digital Service
Assessment date: 31 March 2017
Stage: Alpha
Result: Met
Service provider: Government Digital Service

The service met the Standard because

  • The team were able to demonstrate a strong understanding of user needs and how these have informed the development of a new publishing process

About the service

Description

The service aims to make it easier to publish and find high quality open public sector data.

Service users

The users of this service are publishers (public sector workers in government departments, arm’s length bodies, local authorities and councils) and data users, a broad spectrum of people who use the data for a variety of purposes.

The assessment team were informed there were two sets of users, “publishers” and “data users”, and user needs were defined for both sets of users. The current scope of the service was explained to the assessment team, as was the future proposition in respect of the “data users”. For this assessment the service was assessed against the “publish data” part of the service, and the assessment team recommend the service team attend an alpha assessment for the “use data/find data” part of the service.

Detail

User needs

The panel was very impressed by how user research has been integrated into the scrum activities and by the team’s continuous engagement with user research. The team has carried out a wide range of user research activities and spoken with a significant number of users.

The panel was pleased to see the plan for ongoing user research and continuous engagement with an extremely diverse set of users. We are also glad that there is more support from researchers to help with the research activities of the service.

The team has clearly articulated the user needs of the service and mapped them against the segments/profiles of users identified for the service. The team should continue iterating the profiles of the service, while considering generating personas or sub-profiles to capture the needs of the diverse user groups in each profile.

More work needs to be done on the access needs of the service. At this stage we need a clearer idea of the user needs of all users, including those with access needs. The team is planning to do usability testing with this group, but it hasn’t yet identified their user needs and how they’d potentially impact the design of the service. The panel recommends the team work closely with the accessibility team at GDS and carry out a couple of rounds of user research to identify the user needs of this group.

Team

All disciplines are present except for a product owner, who is currently being recruited; to date the service manager and delivery manager have taken on the responsibilities associated with this role. There will also be a change of delivery manager in May, but the transfer of knowledge has begun and is being undertaken through shadowing and the use of tools such as Trello. Developer support is also being sought and will be critical to the development of the service. The team were able to demonstrate the way they work, using a regular rhythm of ceremonies, and how stories are worked through from initiation to completion. It was evident there is a strong connection and collaboration between all team members.

Technology

The team justified the decision not to iterate on the current technology stack (CKAN) on the basis that there was a large accumulation of technical debt, that it is difficult to recruit skills for the incumbent technology, that the incumbent technology was unacceptably slow due to the number of loaded extensions, and that it impeded the team’s ability to iterate quickly. It is demonstrably easier to recruit the required skills for the new stack, and its performance is dramatically improved.

ElasticSearch has been chosen while the team investigates possibilities for improving search. This choice will need to be justified in later phases of assessment, as it introduces additional complexity through data duplication and infrastructure maintenance (it is not currently supported by PaaS). PostgreSQL could be used to provide search functionality if the user need does not demand the advanced features of ElasticSearch.
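
To illustrate the simpler option, a minimal sketch of full-text search using Django’s built-in PostgreSQL support is below; the Dataset model and its title and description fields are illustrative assumptions, not the team’s actual schema.

    # A minimal sketch of PostgreSQL full-text search via Django's
    # django.contrib.postgres.search, assuming a Dataset model with
    # "title" and "description" fields (names are illustrative).
    from django.contrib.postgres.search import SearchQuery, SearchRank, SearchVector

    from datasets.models import Dataset  # hypothetical app and model

    def search_datasets(terms):
        """Return datasets ranked by relevance to the user's search terms."""
        vector = SearchVector("title", weight="A") + SearchVector("description", weight="B")
        query = SearchQuery(terms)
        return (
            Dataset.objects
            .annotate(rank=SearchRank(vector, query))
            .filter(rank__gt=0)  # drop non-matching datasets
            .order_by("-rank")
        )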

So far, the focus of the team’s efforts has been on improving the publisher workflow. They have identified issues with the current data consumer workflows but have not yet addressed them. It was demonstrated that it is very difficult to find relevant data because the search functionality is based entirely on the name and description provided by the publisher, which do not always match what the user is looking for. Given the ElasticSearch technology currently being investigated, it would seem feasible to improve the search functionality significantly, including the possibility of indexing the data as well as the metadata - for which it is recognised there is no technical impediment. For this reason the assessment team recommend that the team submit to another alpha assessment for the ‘find data’ service.
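
As an illustration of what indexing the data as well as the metadata could look like, a minimal sketch using the elasticsearch-py client is below; the index name, field names and connection details are all illustrative assumptions.

    # A minimal sketch of indexing a sample of the data itself alongside the
    # publisher's metadata, so that searches can match values inside the data.
    # Index and field names are illustrative assumptions.
    from elasticsearch import Elasticsearch

    es = Elasticsearch(["http://localhost:9200"])  # assumed local cluster

    def index_dataset(dataset_id, title, description, rows):
        es.index(
            index="datasets",
            id=dataset_id,
            body={
                "title": title,
                "description": description,
                # Flatten a sample of cell values so queries can match the data.
                "data_sample": " ".join(str(value) for row in rows[:100] for value in row),
            },
        )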

There was evidence that the team had already begun thinking about mitigating the risks of malicious, inaccurate or poor quality data being published.

Additionally, the problem of maintaining links to third parties is one that warrants further thought. Individuals are responsible for maintaining their own data sets, and there is currently a “high churn” of publishers. The team have committed to investigating whether it would be feasible to provide a data hosting option (e.g. file upload) for the next phase, as there appears to be a strong user need.

The service provides administrators with the ability to remove inactive accounts; more thought to password expiration and account locking policies, perhaps automated, would be warranted in the next phase.
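
As one possible shape for an automated policy, a minimal sketch of a password-expiry check as Django middleware is below; the 90 day limit, the password_changed_at field and the change-password URL are illustrative assumptions, not the team’s design.

    # A minimal sketch of an automated password-expiry check, written as
    # Django middleware (registered after AuthenticationMiddleware so that
    # request.user is available). The 90 day limit, the password_changed_at
    # field and the change-password URL are illustrative assumptions.
    from datetime import timedelta

    from django.shortcuts import redirect
    from django.utils import timezone

    PASSWORD_MAX_AGE = timedelta(days=90)

    class PasswordExpiryMiddleware:
        def __init__(self, get_response):
            self.get_response = get_response

        def __call__(self, request):
            user = request.user
            if user.is_authenticated:
                # Assumed field recording when the password last changed.
                changed = getattr(user, "password_changed_at", None)
                expired = changed and timezone.now() - changed > PASSWORD_MAX_AGE
                if expired and request.path != "/change-password":
                    return redirect("/change-password")
            return self.get_response(request)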

Developers are creating some unit tests for their own benefit, and their work is peer reviewed internally between the two developers. However, there is no involvement from other stakeholders in curating these tests. Given the format in which the user need was demonstrated, there is an opportunity to practise behaviour-driven development, which the team should consider going forward. This would improve confidence in the new technology solution and encourage greater collaboration within the team.
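
A minimal sketch of what a behaviour-driven test of the publish journey could look like is below, written with Django’s test client; a tool such as behave or pytest-bdd could express the same scenario in Gherkin so that non-developers can help curate it. The URL and form fields are illustrative assumptions.

    # A minimal sketch of a behaviour-driven test for the publish journey,
    # written with Django's test client. The URL and form fields are
    # illustrative assumptions, not the service's actual design.
    from django.contrib.auth import get_user_model
    from django.test import TestCase

    class PublishDatasetScenario(TestCase):
        def test_publisher_can_save_a_draft_dataset(self):
            # Given a signed-in publisher
            user = get_user_model().objects.create_user("publisher", password="test-pass")
            self.client.force_login(user)

            # When they submit a new dataset as a draft
            response = self.client.post(
                "/datasets/new",  # hypothetical URL
                {"title": "Road traffic counts", "status": "draft"},
                follow=True,
            )

            # Then the dataset appears in their drafts list
            self.assertContains(response, "Road traffic counts")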

The service is divided into seven interdependent Django applications. To improve confidence in the new solution, and to justify the team’s decision to rewrite rather than iterate on the basis of technical debt, the team should avoid creating inter-dependencies between the applications, and consider refactoring to make the separation of responsibilities clearer, with clearly defined interfaces where necessary.
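
As an illustration, a minimal sketch of a clearly defined interface between two applications is below; the app, model and function names are all illustrative assumptions.

    # A minimal sketch of a clearly defined interface between Django apps:
    # consuming apps call functions in datasets/services.py rather than
    # importing the datasets app's models directly. All names are
    # illustrative assumptions.

    # datasets/services.py
    from datasets.models import Dataset  # internal to the datasets app

    def get_published_datasets(organisation_id):
        """The one supported way for other apps to read dataset records."""
        return Dataset.objects.filter(
            organisation_id=organisation_id,
            status="published",
        ).values("id", "title", "description")

    # publishers/views.py (a consuming app)
    from django.shortcuts import render

    from datasets.services import get_published_datasets  # not datasets.models

    def organisation_datasets(request, organisation_id):
        return render(request, "publishers/datasets.html", {
            "datasets": get_published_datasets(organisation_id),
        })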

Design

The new process to publish data seems to be a significant improvement for users. The panel was impressed by the drafts feature - it meets a clear need and is a great example of doing the hard work to make it simple.

Design and research seem to be well embedded in the team, and it was good to hear that, while there were some early issues with smooth collaboration between developers and design/research, these were discussed and addressed. That’s a healthy team process.

It was good to see clear iteration based on user research. For example, separating the user’s own datasets from all datasets in their organisation. It’s also good to see clear thinking around user journeys - separating out the publisher and user journey as they have different tasks and needs.

Data.gov.uk is not currently eligible to use the GOV.UK branding (including the font and crown logo). Going forward, we recommend that the team do research to be confident that either:

  1. Users would benefit from data.gov.uk being branded as part of GOV.UK, or
  2. Users would benefit from a clear, separate brand that is not GOV.UK

The prototype makes excellent use of GOV.UK design patterns in general. However, a lot of the inputs had quite long hint text.

The idea of ‘link name’ stood out as something with the potential to cause confusion. We recommend looking into, for example, defaulting the name to the dataset title. This might work in the case of a single link; perhaps custom names are only needed when there are multiple links.

There was a good phrase mentioned in the assessment: ‘Open data is a culture change’. This is a great insight, and perhaps the service itself can help with this, for example by linking to useful guidance or blog posts.

It was good to see a wide variety of design and prototyping tools being used - the right tool for the right situation - paper, Sketch, InVision, the GOV.UK prototype kit and a working Django prototype. Keep working closely as a team to make sure that prototyping new ideas and testing them with users is as fast and low friction as possible. For example, you might want to consider another, much more ‘throwaway’ copy of the Django codebase if that means more design ideas can be tried more quickly.

It was also good to see new collaboration tools such as the decision log. The assessment panel would recommend blogging about anything you’ve found useful like that, as other teams could benefit.

Analytics

It was good to see the service team were able to provide information about the data points and indicators they intend to use to measure the service. This included the plan to increase user satisfaction, which involves using performance data already known from the existing service and gathering feedback from a range of sources. The team are in the process of registering with the Performance Platform and are working with the analysis team within GDS.

Recommendations

To pass the next assessment for “publish data”, the service team must:

  • Undertake more research with people with access needs to identify how their needs may impact the service
  • Ensure developer support is secured
  • As the team move into the beta phase, consider how they could mitigate the risks of malicious, inaccurate or poor quality data being published
  • Work with a content designer to see if the long hint text can be made more concise while still being clear and helpful; you may want to test some inputs without hint text to see if there’s really a need
  • Ensure that technical debt is actively mitigated as they progress, and build confidence in the new solution, for example with a good suite of acceptance tests
  • Look into whether an onboarding process could help first time users; try out GOV.UK Notify’s onboarding process: https://www.notifications.service.gov.uk/
  • Undertake further research to confirm whether data.gov.uk should be branded as part of GOV.UK or be a separate brand

The service team should also:

  • Continue to develop personas for all user needs found within the profiles of the service
  • Investigate whether it would be feasible to provide a data hosting option (e.g. file upload) for the next phase, as there appears to be a strong user need
  • Give more thought to password expiration and account locking policies, perhaps automated, in the next phase
  • Involve wider stakeholders when curating unit tests

Next steps

You should follow the recommendations made in this report before arranging your next assessment.

Get advice and guidance

The team can get advice and guidance on the next stage of development by:

Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it Met

Published 24 July 2018