GOV.UK Notify live assessment report

The report from the Government Digital Service's GOV.UK Notify live assessment on 10/11/20

Digital Service Standard assessment report

GOV.UK Notify

From: Government Digital Service

Assessment date: 10/11/20

Stage: Live

Result: Met

Service provider: Government Digital Service


Previous assessment reports

  • Alpha assessment report: 22nd March 2016 - Met
  • Beta assessment report: 1st February 2017 - Met

Service description

GOV.UK Notify is a service that allows UK public sector service teams to send notifications (text message, email and letters) to the people that use their services.

The notifications will typically take the form of status updates, requests for action, acknowledgement of receipt of applications or supporting information, and reminders.

Notify can be used by integrating frontend or back office systems with our API, or manually using a web interface.

Service users

Service teams across the public sector (central and local government, NHS, emergency services, schools). A list of services using Notify can be found on the performance platform.

Service teams use Notify, but the recipients of the notifications don’t know about Notify. As far as they’re concerned, they are simply receiving something from the service team.

Within the service teams there are different types of users, which will vary from service to service:

  • content designers preparing message templates
  • developers integrating their systems
  • case workers and contact centre staff using the admin interface to send one-off messages, or uploading contact details to send batches of notifications
  • team managers, inviting users, setting permissions and accessing usage information

1. Understand users and their needs

Decision

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team demonstrated a good understanding of primary users’ needs and their day-to-day working lives
  • the team has a sound grasp of secondary users’ needs and works with service teams across the sector to ensure this remains current
  • the team take a full end-to-end view of the user journey, as evidenced by their current discovery on the quality of messages
  • the team regularly consider accessibility as part of their work. Linking up with service teams and other Government as a Platform services is an excellent example of meeting these users’ needs across the problem space
  • the support model is well used and is meeting user needs
  • the team demonstrated productive ways of working between user research and other roles, particularly content and interaction design

What the team needs to explore

Before their next assessment, the team needs to:

  • undertake the planned work to better understand and map primary users’ functionality requests. The team should consider how best to manage user expectations against the necessary constraints of the Notify platform (for example, formatting constraints that guarantee messages are accessible)
  • regularly review the needs of those organisations and teams not using Notify. The panel appreciates it may not be practical or desirable to meet all of them, but it’s important for the strategic ambition of the project to continue to strive to attract new users

2. Solve a whole problem for users

Decision

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team demonstrated that the service forms a key part of solving a whole problem for users across many different services
  • the team have a really strong view of the value that their service provides; it was impressive to see this evidenced in their discussion of their work
  • the team have a clear understanding of their users’ needs, and plans to extend this knowledge further through their work to improve the content of communications sent using their service

3. Provide a joined-up experience across all channels

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have a really broad view of where their service starts and ends, and work closely with their primary users to improve the service delivered to citizens. They have a clear plan for how to continue this work on an ongoing basis in Live

4. Make the service simple to use

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has consistently and continually improved the service, making it simpler and simpler for users to do what they need to do - both for their ‘primary users’ (public sector colleagues) who send messages and their ‘secondary users’ (citizens) who receive them
  • the team does frequent usability testing using appropriate research techniques
  • the team tests all the parts of the service that the user interacts with - including the non-digital channels such as letters being sent by the service
  • the service appears to work well on mobile devices and a wide range of browsers - and importantly the messages being sent are tested with a wide range of devices
  • the service is generally consistent with GOV.UK branding and patterns, with notable exceptions

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that where variations from the design system are made, such as different types of buttons or other interactions, these variations are fed into the appropriate place in the design system so that other designers in government can benefit from the design work done on Notify
  • the team couldn’t show an overall completion rate because the user journey is split into several processes/components. If not already monitored, it is recommended that the team develop ways to monitor completion rates for the shorter journeys that align with breakpoints (for example, text verification), and then monitor return rates, providing a way to understand page navigation success and subsequent drop-out rates. This may highlight previously unknown user challenges with the separate parts of each user journey, from registration to design and send
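The kind of shorter-journey monitoring recommended above can be sketched in a few lines. This is an illustrative example only: the event names, log format and `completion_rate` helper are hypothetical, not Notify's real analytics schema.

```python
# Hypothetical sketch: completion rate for one shorter journey, from a
# registration start event up to the text-verification breakpoint.
def completion_rate(events, start_step, end_step):
    """Fraction of users reaching end_step out of those who reached start_step."""
    started = {e["user"] for e in events if e["step"] == start_step}
    finished = {e["user"] for e in events if e["step"] == end_step}
    if not started:
        return 0.0
    return len(finished & started) / len(started)

# Illustrative event log: three users start registration, two verify by text.
events = [
    {"user": "a", "step": "registration_start"},
    {"user": "a", "step": "text_verification"},
    {"user": "b", "step": "registration_start"},
    {"user": "b", "step": "text_verification"},
    {"user": "c", "step": "registration_start"},
]

rate = completion_rate(events, "registration_start", "text_verification")
drop_out = 1 - rate  # the drop-out rate for this breakpoint
```

Computing the same measure per breakpoint (registration, verification, template design, send) would surface exactly the per-segment drop-out the panel describes.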

5. Make sure everyone can use the service

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has done regular accessibility testing, including visiting the Digital Accessibility Centre in Wales to see how people with various access needs use the service
  • the team has an accessibility statement available on their website which shows which accessibility issues they still need to fix

What the team needs to explore

Before their next assessment, the team needs to:

  • fix the accessibility issues highlighted in the accessibility audit - especially those which are small fixes, such as missing page titles

6. Have a multidisciplinary team

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the team are a mature, confident and dedicated multidisciplinary team. It was a pleasure to hear them talk knowledgeably about their work to date and their plans for the future
  • the team’s governance structures were clear and appropriate, and seemed to help the team do their work

7. Use agile ways of working

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the team were able to evidence that they are using agile ways of working and adapt these to better support the team’s needs where appropriate
  • the team have adapted well to fully remote working, had a clear idea of how this had affected their work, and put in place activities to mitigate it

8. Iterate and improve frequently

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the team showed strong evidence of multiple improvements, including changing visuals and adding functionality based on analytics trends and thematic reviews of feedback
  • the team used Google Analytics, Zendesk thematic reviews of comments and User Research feedback to understand pain points and opportunity areas, showing a strong focus on triangulation of findings before creating solution options

What the team needs to explore

Before their next assessment, the team needs to:

  • the team didn’t take advantage of experimentation techniques when making improvements. The nature of the iterations presented showed that experimentation wasn’t necessary; however, for future iterations it is recommended the team consider univariate and multivariate experiments where there are several design options to consider and traffic volumes allow
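A univariate (A/B) experiment of the kind suggested above boils down to comparing two proportions. The sketch below is a generic two-proportion z-test, not anything the team described; the sample counts are invented for illustration.

```python
import math

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """Two-sided z-test comparing conversion proportions of variants A and B."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    p_pool = (successes_a + successes_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical traffic split: 120/1000 completions on A, 150/1000 on B.
z, p = two_proportion_z(120, 1000, 150, 1000)
```

With these invented numbers the difference sits right at the conventional 5% significance boundary, which also illustrates why adequate traffic volumes matter before running such experiments.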

9. Create a secure service which protects users’ privacy

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • everyone in the team is responsible for security; all have done OWASP training and share security knowledge across the team
  • the team has run threat modelling workshops
  • the team works in collaboration with the GDS Cyber Security and IA team, with monthly check-ins to track security issues
  • the team runs pen tests once a year, and uses Snyk and Dependabot for security and vulnerability scanning
  • the team said they don’t keep data for longer than required, and it is deleted as soon as the service no longer needs it
  • the team said that data is encrypted at rest and in transit with TLS 1.2
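Enforcing the TLS 1.2 floor mentioned above can be expressed in one line with Python's standard `ssl` module. This is a generic sketch of the practice, not the team's actual configuration.

```python
import ssl

# Sketch: an outbound client context that refuses anything older than TLS 1.2,
# matching the "encrypted in transit with TLS 1.2" practice described above.
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2  # reject TLS 1.0 and 1.1
```

The same floor would normally also be set on the server side (for example in the load balancer or PaaS routing tier) so that both ends of every connection enforce it.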

What the team needs to explore

Before their next assessment, the team needs to:

  • with PII present on users’ screens and GA events being scripted (rather than tagged via GTM), it is advised that the team design a report that refreshes daily in Sheets and searches all the common PII risk areas for structured PII data. This could include H1 tags, page titles and event text strings. This is a failsafe in case future updates cause PII bleed/contamination in any of the common page elements
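The core of such a failsafe is a pattern scan over common page elements. The sketch below is hypothetical: the regexes, element names and `scan_for_pii` helper are illustrative, and a real sweep would cover more PII types and feed the daily report described above.

```python
import re

# Illustrative patterns for structured PII: email addresses and UK mobiles.
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "uk_mobile": re.compile(r"\b(?:\+44\s?7|07)\d{3}\s?\d{6}\b"),
}

def scan_for_pii(elements):
    """Return (element_name, pii_type) pairs where a PII pattern matched."""
    hits = []
    for name, text in elements.items():
        for pii_type, pattern in PII_PATTERNS.items():
            if pattern.search(text):
                hits.append((name, pii_type))
    return hits

# Hypothetical scrape of common risk areas: H1, page title, GA event text.
sample = {
    "h1": "Your messages",
    "page_title": "Dashboard for jane.doe@example.com",
    "event_text": "Clicked send",
}
hits = scan_for_pii(sample)
```

Any non-empty result would flag a page element leaking structured PII, which is exactly the bleed/contamination case the report is meant to catch.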

10. Define what success looks like and publish performance data

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have multiple dashboards showing a range of KPIs, and were able to illustrate the key KPIs used to monitor the service’s performance in both the positive and risk areas
  • the team use GA and Notify log data, which is dashboarded to monitor performance, as well as audit environments like Grafana for hardware monitoring. The team have built in key KPIs for users - including sent and failed volumes, among many others - enabling users to self-manage their campaigns. Each user’s production data is brought into a high-level admin board so the team can cut national-level data into all the KPIs users have, as well as higher-level segments including directorate, account volume by team and directorate over time, and so on
  • the team look to have established a strong, statistically sound validation process: they appreciate the limitations of GA and therefore split journey measures with Notify performance metrics to ensure reporting and iteration evidence is correctly sourced
  • the team discussed plans to look at the quality of the copy being sent, from a placement, grammar and flow perspective. This is an impressive position to take, as it feels fundamentally outside the remit of the service; however, the team have recognised that the quality of users’ output has a direct impact on the continued use of Notify. Therefore, if the team has the resource and evidence that more automated assistance will clean and improve notification media, this should be encouraged

11. Choose the right tools and technology

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is using HTML, CSS and JavaScript
  • the team is using GOV.UK patterns for the frontend, and Postgres on RDS for the database
  • the team is using various tools, like Selenium for testing, other unit testing tools, and Concourse for continuous integration
  • the team said they use manual as well as automated testing, with tools like Selenium
  • the team is using Python unit testing with pytest, as well as JavaScript testing
  • the team is running smoke tests before code is deployed into production

12. Make new source code open

Decision

The service met point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has published most of their code on GitHub
  • the team is using various open standard tools and systems for Notify
  • the team is using Pivotal Tracker for project management and backlogs

13. Use and contribute to open standards, common components and patterns

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the team said they follow open standards guidelines for frontend and backend development and deployment
  • the team is using GOV.UK PaaS and AWS for hosting
  • the team is using GOV.UK design patterns and guidelines

14. Operate a reliable service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the team plans for 99.9% availability of the service
  • the team has various escalation processes in place, within the Notify team as well as with GOV.UK PaaS and AWS, to resolve issues more quickly
  • the team is well organised to make changes quickly

Updates to this page

Published 16 February 2021