Data, Analytics and Situational Awareness Hub live assessment

Service Standard assessment report

Data, Analytics and Situational Awareness Hub

From: Central Digital & Data Office (CDDO)
Assessment date: 28/04/2023
Stage: Live assessment
Result: Met
Service provider: Cabinet Office Digital and Joint Data and Analysis Centre

Service description

This service provides a place for the Joint Data and Analysis Centre (JDAC) to upload charts, maps and documents so they can be shared with users across government to support policy making, long-term analysis and crisis management.

Service users

This service is for:

  • Administrators / Content creators
    • JDAC Administrators
    • Data providers
  • Cross-government end users
    • Policy makers
    • Briefers / storytellers
    • Crisis responders

1. Understand users and their needs

Decision

The service did not meet point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team researched and tested the usability of pages and functions that had been rapidly developed during the Covid crisis, this time with a more thorough focus on user needs.
  • they prioritised research focus based on the most used areas of the website and paid proportionate attention to the administrative side of the service.
  • the team appears to have gone through several rounds of research activity to ensure user needs are identified, although the structure of the approach, participant numbers and timeframes remained unclear. The team subsequently followed up with detail indicating several rounds of user research with over 80 participants since the emergency circumstances in which the service was spun up.

What the team needs to explore

Before the next assessment, the team needs to:

  • show how qualitative research activity was structured and how the prototype iterations presented to the assessment panel eventually tied back to the resolution of user stories. The team should also seek to maintain logs of the research conducted and design decision logs for the purposes of future knowledge transfer.
  • present a more comprehensive view of user personas. The team could have done more to bring to life their understanding of the value the service provides to each user type and illustrate the main constraints/pain points each persona has experienced.
  • as far as possible, seek out non-core users and convey in more detail the numbers and diversity of participants taking part in each round of research.

2. Solve a whole problem for users

Decision

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has focused their efforts on re-orienting user capability and service value around situation awareness and away from simply dispensing charts and maps. They showed how uncovering findability constraints and reusing menu categories familiar to users from the National Risk Register led them to implement an enhanced and more understandable information architecture across the hub.
  • the team demonstrated how they’ve widened their engagement with stakeholders and data providers to support a comprehensive move away from simply serving policy-driven VIPs and towards meeting the needs of 68 organisations across government and local government in the last 12 months.
  • they have engaged stakeholders beyond the immediate EDS/JDAC cohort and secured their involvement in service development. This included the ONS and the devolved administrations.
  • they have demonstrated the evolution of the service through successive versions of the journey map.

What the team needs to explore

Before the next assessment, the team needs to:

  • take into fuller account any barriers in navigating the wider service journey for users supported with assistive technologies.
  • show a diagram of the distinct services within the civil contingencies ecosystem and explain how they are connecting with others in this space to enhance situation awareness and accelerate the information gathering process for users.

3. Provide a joined-up experience across all channels

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has justified functional enhancements like the website’s content type upload feature and their Data Access Catalogue by gathering feedback and testing with administrative and data provider users on a continual basis. The introduction of automation to onboard charts from the ONS and other sources on a timed-release, 24/7 basis is also a well-considered addition that allows manual intervention to be switched off where required.

What the team needs to explore

Before the next assessment, the team needs to:

  • show more distinctly how the stated ambition for the service to be the single source of truth for locating information on national incidents can be enhanced by touchpoints with other civil contingency services and government intelligence information sources.

4. Make the service simple to use

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the team demonstrated a number of improvements which have resulted in a service that feels simpler and more easily navigable than its crisis-born predecessor.
  • the service demonstrates the use of design patterns and components from the GDS and MoJ design systems which supports its compliance with government service standards.
  • the ability to bookmark frequently used areas of the service and immediately locate incident information via tabular access are beneficial improvements.

What the team needs to explore

Before the next assessment, the team needs to:

  • continue research activity in a more structured way and in collaboration with design expertise to accelerate continuous improvement.
  • consider the service’s potential to increase data literacy by sharing insights with the CO data community and wider government stakeholders.
  • feed their learnings and any limitations in deploying design patterns back into the GDS and MoJ design communities to help improve them for others.

5. Make sure everyone can use the service

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the team took a comprehensive approach to testing the service’s accessibility by checking the most used frontend and admin pages against 48 WCAG success criteria and using assistive technologies and open source tooling to support the work. They tracked and reported on the resolution of the issues uncovered in a consistent and organised way.
  • the team iterated the service using components from the GOV.UK Design System to make sure any accessibility improvements are received automatically.

What the team needs to explore

Before the next assessment, the team needs to:

  • ensure all users are understood, including those with accessibility and/or access needs, and that they are integral to research activity.
  • review the ONS guidance on making analytical publications accessible to make sure the service’s data visualisation features and their surrounding context are perceivable and understandable to as many users as possible.
  • make sure the service’s accessibility statement is updated as soon as possible and that all outstanding issues listed are resolved on behalf of users.

6. Have a multidisciplinary team

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the team represents an appropriate range of expertise across key agile disciplines relevant to the continuous improvement of a live service, including a Performance Analyst and a QA Tester. The latter proved a compensatory advantage against some of the constraints in finding more diverse research participation.
  • the team have capitalised on a measure of resource stability due to the relatively long tenure of some team members, which has been valuable for knowledge retention and is demonstrated in their steady focus on continuous service improvement.
  • the team demonstrates a close and supportive relationship with the JDAC business unit team. This is helped by their mutual efforts to engage in regular on-site workshop sessions. As a consequence, the whole team appears on the same page in the collective effort to observe and honour the perspective of users.

What the team needs to explore

Before the next assessment, the team needs to:

  • engage with other government departments to further understand the data landscape. For example, the HMRC API catalogue team has undertaken extensive research on search behaviour that could be useful to the work the team are doing.
  • consider recruiting business analysis expertise to the team when needed rather than having this covered by other roles on the team.
  • aim to increase the number of Civil Servants on the project to take further advantage of the relatively high degree of knowledge retention on the team.

7. Use agile ways of working

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the team are actively using a range of collaboration tools such as Figma, Jira, Mural and Slack.
  • they have followed a range of agile delivery principles and ceremonies, including daily stand-ups, regular retrospectives and Show and Tell sessions.

What the team needs to explore

Before the next assessment, the team needs to:

  • do more to share good practice and ways of working with others by intermittently opening up Show and Tell sessions to a broader set of delegates, and by feeding workarounds and constraints back into the design system communities and data networks.

8. Iterate and improve frequently

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the team presented evidence of recurring periods of research activity during the live phase to ensure continuous improvement of the service.
  • they iterated the service design to include personalisation in the form of the bookmarks function, and pursued an information architecture that prioritised “situation awareness” over a successive stream of charts and data.
  • the team showed how they work collaboratively with data providers to improve data standards and better understand connections between data sets.
  • the team have weighed the benefits of automation against contextual needs by retaining the option not to use the auto-update data ingest function where this meets some admin user needs.

What the team needs to explore

Before the next assessment, the team needs to:

  • consider streamlining the list of 18 menu categories and testing this with users to help simplify service navigation. This will be particularly beneficial to users supported by assistive technologies.

9. Create a secure service which protects users’ privacy

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have undertaken a personal data risk assessment and the service has an updated DPIA and an available Privacy Notice published on the website.
  • the team understands the risks associated with sharing data and have taken steps to make sure data is encrypted at rest and in transit, limiting data transfer to HTTPS only.

What the team needs to explore

Before the next assessment, the team needs to:

  • N/A

10. Define what success looks like and publish performance data

Decision

The service did not meet point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have worked closely with a Performance Analyst and developed a performance framework that links user needs to measures.
  • they have a plan to test hypotheses for iterations and will use A/B testing to analyse outcomes.
  • the team have considered how to publish mandatory KPIs.

What the team needs to explore

Before the next assessment, the team needs to:

  • increase the breadth of the performance framework, highlighting individual users and their differing needs.
  • consider measures of success in particular. For example, did users find what they needed? Was the item they downloaded the right one? Were users able to solve their problem using the service?
  • consider the needs of admin users and how they will be measured.
  • set benchmarks in the performance framework to better define success and plan for reaching it.
  • show how they have measured the success of iterations and made a difference to users.

11. Choose the right tools and technology

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the reuse and development of common tools, including COLA, has benefited both the product itself and those tools, allowing for better implementation in other digital projects.

What the team needs to explore

Before the next assessment, the team needs to:

  • N/A

12. Make new source code open

Decision

The service partially met point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has a plan to facilitate the open-sourcing of the code.

What the team needs to explore

Before the next assessment, the team needs to:

  • make sure source code on CO GitHub is made open as soon as the current service development work is complete, taking care to resolve any vulnerabilities in the code base before publication.

13. Use and contribute to open standards, common components and patterns

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have utilised a range of common components and patterns including COLA, GOV.UK Notify, AWS CloudFormation templates and CO GitHub.
  • the team have used GOV.UK and MoJ design patterns and components.

What the team needs to explore

Before the next assessment, the team needs to:

  • engage with the open standards team at the Central Digital and Data Office (CDDO) to find out if there is an open data standard that meets users’ needs.

14. Operate a reliable service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have well-rehearsed preventative security measures in place in the form of protective monitoring and a disaster recovery plan.
  • risks to business continuity and service uptime are mitigated by incident monitoring.

What the team needs to explore

Before the next assessment, the team needs to:

  • make sure that service iteration and service support are adequately resourced given the prospect of reduced development team headcount over the medium term and beyond.
  • ensure the safe decommissioning of the legacy Resilience Dashboard before available team resources scale down. This includes engagement with the Data Knowledge and Information Management team to review the data retention and deletion plan.

Updates to this page

Published 23 November 2023