Get help with health and safety

The report for the Get help with health and safety beta assessment on 22 March 2022

Service Standard assessment report

Get help with health and safety

From: Central Digital & Data Office (CDDO)
Assessment date: 22/03/2022
Stage: Beta
Result: Met
Service provider: Health and Safety Executive

Service description

The service enables employers, employees, and members of the public to contact the Health and Safety Executive. Users later receive a response to their query or complaint and, in some instances, further investigation may take place.

The service allows users to:

  • Report workplace health and safety concerns
  • Request technical advice
  • Ask for advice on Covid-19 in the workplace, a route introduced in response to the pandemic

Service users

External users:

  • Employers (duty-holders)
  • Employees
  • Members of the public

Internal users:

  • Call handlers (fill in the form on behalf of external users, offline journey)
  • Secondary internal users: policy colleagues, concerns, and advice team (CAT) members

1. Understand users and their needs

Decision

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team had carried out a large amount of research, and documented findings they were able to discuss with clarity
  • there was a regular cadence of studies, across varied methods and with different users of the service. Various members of the team were well versed in the research findings, which was reassuring
  • the service has iterated based on the research

What the team needs to explore

Before their next assessment, the team needs to:

  • explore a fuller picture of how users get to the service, and what users are searching for online when they land on related content on ‘competitor’ websites. This could open up patterns in language and behaviours, as well as communications opportunities on third-party websites
  • explore more research with users who i) aren’t familiar with, or inclined to engage with, HSE or ii) have assisted digital needs. The service team should explore using the call centre to profile and recruit users
  • commission a piece of service design work to explore the whole landscape of stakeholders and user journeys into, and user flow through, the service

2. Solve a whole problem for users

Decision

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team revisited the problem statement with insight from private beta testing, to test whether it reflected the context of the problem and the user needs identified at earlier stages of the design process
  • the team demonstrated a good understanding of where users learn about HSE, for example, Citizens Advice

What the team needs to explore

Before entering public beta the team needs to:

  • deploy the evidence upload feature for users as scheduled in July 2022, and validate this with CDDO assurance prior to launch
  • consider how the service sets expectations about evidence. The ability to submit evidence within the digital service has been highlighted in both usability testing and the exit survey. The panel understands there is a policy requirement to ask users whether they have photographs, documents or any other evidence, to prevent users becoming an ‘agent’ of HSE if asked to provide evidence later in an investigation. However, asking about evidence in the service prompts users to consider it as an option, while whether or not a user will be contacted for evidence post-submission is left open-ended: will they or won’t they be contacted, will they need to collect evidence where they haven’t already done so, and should they do that now if the risk is a single instance in a finite time period, intermittent, or ongoing?
  • recognise that gathering evidence post-submission adds further complexity to a case, for both the reporter and the agent handling it. Given that users are enquiring about providing evidence, and HSE recognises the benefit of supplementary information in supporting a case, this suggests evidence should be part of the initial submission. The team should review examples of file upload patterns used by other government departments

3. Provide a joined-up experience across all channels

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the telephone journey mirrors the online journey where agents act as proxy users, completing the digital application on behalf of citizens during a call. The team have removed questions which required more specialist knowledge, familiar to agents, but caused difficulty for users. This has allowed the team to use the same application for both journeys leading to consistent data collected via both channels

What the team needs to explore

Before their next assessment, the team needs to:

  • test communications using content which reflects what a user will receive in response to a genuine submission. This can be done using scenarios during usability testing. Whilst the team presented an example of an email communication during the assessment, Lorem Ipsum was used in the body of the email, which limits the insight gained from testing. Consider using the language an agent will use in these communications: does the user understand what they have received, and do they know what to do next? The confirmation email is a good example

4. Make the service simple to use

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed the team:

  • have continued to iterate the service, and explained how research insight, and users’ need to trust how HSE handles personal data, led to a reordering of questions to simplify the flow of the application and build trust
  • have responded to research insight that citizens struggle to identify the local authority when reporting incidents by removing the field from the application. Agents who have access to local authority information are best placed to update the application post-submission
  • are investigating alternatives to manual entry of the business or organisation address, which will help users who are unfamiliar with the location of an issue

What the team needs to explore

Before their next assessment, the team needs to:

  • review question patterns to remove ambiguity. For example, ‘Do you know the business or organisation doing the activity?’ is asking the user for a name but doesn’t explicitly say so, whereas the following question does: ‘Do you know the name of the person in charge?’ Similarly, the words ‘issue’ and ‘activity’, and ‘location’ and ‘business or organisation’, are used interchangeably in the service to refer to the same thing
  • consider delaying the question that asks whether HSE can share the issue with the business or organisation until after the user has disclosed the name of the business or organisation. This would follow a logical flow from disclosing the information to what HSE can do with it
  • progress the recommendations of the follow-up research and design workshop, including the focus on the positive benefit for the user of providing their details. Data shows that a significant number of users (39% as of November 2021) drop out of the application when asked for their names. The team have done well to highlight the disadvantages of not providing name and contact details; however, the content focuses on why HSE needs this information as opposed to the benefits for the user

    • it’s a well-recognised insight that users don’t always read all the content on a Start page before hitting the green ‘Start now’ button. For this reason, it’s important to provide contextual guidance on questions during an application, where evidence suggests it’s needed, or structure the flow to remove ambiguity and make choices clear-cut with routing
    • removing field validation to allow users to proceed without entering data in these fields will result in errors being captured in the application. Either HSE needs users’ contact details to investigate reports, making them a mandatory data item, or users can report an issue anonymously, in which case this needs to be clearer

5. Make sure everyone can use the service

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that the team:

  • conducted usability testing with assisted digital users and users with access needs
  • removed the business sector question and drop-down which caused confusion for users leading to ‘other’ or incorrect data being captured. Agents who understand the sector terminology will update the application post-submission. An additional improvement is the removal of the drop-down component which can be problematic for screen readers
  • have included a free text area for users to include specifics about a location. The team might want to consider making this a separate question to remove the requirement for an optional field

What the team needs to explore

Before their next assessment, the team needs to:

  • consider moving to a two-thirds layout. Pages are currently full-width, resulting in long lines of text and extended components. This can be difficult to read and present problems for users with screen magnifiers who are forced to scroll left and right to view and comprehend the whole page
  • consider mapping the confirmation journey for users who choose not to disclose their email address and phone number. These users won’t receive a confirmation email but can retain the reference number to contact HSE to follow up. Should contact details for HSE be provided at this point to allow users to do this?
  • demonstrate they’ve researched and understand the journey through the whole service and the various ways users arrive at it, particularly for users with assisted digital needs

6. Have a multidisciplinary team

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the team had the right structure and disciplines in place to develop a user-focused service, with a drive to develop people with T-shaped skills
  • there is a plan to develop a sustainable team and increase digital capability within HSE
  • there is an environment that enables the team to learn and develop
  • the leadership deploys a flexible approach

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to build internal capability, reducing cost and reliance on contractors

7. Use agile ways of working

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have flexible governance in place, including Business and Tech Design Authorities, and are consciously trying to align with these governance boards
  • the team are using the right tool sets and ceremonies to drive agile ways of working and good practice
  • the team have gone through an 8-month journey to make better decisions in hybrid environments, building trust and supporting other teams to build on their work
  • the service has the right people sitting on the right boards, good communication between decision makers, a collective ownership model in place, and continues to put the right governance foundations in place

8. Iterate and improve frequently

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the team utilise analytics to understand the impact of their iterations, particularly around the removal of problem questions with high drop-out rates

What the team needs to explore

Before their next assessment, the team needs to:

  • explore using hypotheses to drive out success measures before the team put their iterations into live. This should allow the team to find out ahead of time whether there are any metrics unique to the iteration. It will also allow the team to clearly define whether an iteration has been successful or needs additional work (an illustrative sketch of this approach follows this list)
  • explore the potential to improve the service further as functionality is unlocked after the implementation of a fully cloud-based data architecture
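
As an illustration only of the hypothesis-driven approach suggested above, the sketch below compares a question’s drop-out rate before and after an iteration using a two-proportion z-test. The figures, variable names and significance threshold are hypothetical; in practice the counts would come from the team’s own analytics (Google Analytics and MS Dynamics, as noted under point 10).

```python
# Minimal sketch: a hypothesis-driven success measure for a single iteration.
# All counts below are hypothetical placeholders, not real service data.
from statsmodels.stats.proportion import proportions_ztest

# Hypothesis (agreed before release): "Reordering the personal-details questions
# reduces drop-out at that step."  Success measure: a statistically significant fall.
dropouts = [390, 310]    # users abandoning at the step (before, after the iteration)
sessions = [1000, 1000]  # users reaching the step (before, after the iteration)

# One-sided test that the 'before' drop-out proportion is larger than the 'after' one.
stat, p_value = proportions_ztest(count=dropouts, nobs=sessions, alternative="larger")

before_rate = dropouts[0] / sessions[0]
after_rate = dropouts[1] / sessions[1]
print(f"Drop-out before: {before_rate:.1%}, after: {after_rate:.1%}, p = {p_value:.4f}")

if p_value < 0.05:
    print("Evidence supports the hypothesis: treat the iteration as successful")
else:
    print("No clear improvement: the iteration needs more work or more data")
```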

9. Create a secure service which protects users’ privacy

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has completed their DPIA process and worked closely with the organisation’s Data Protection Officer. The design and data records take into account the need to record only the minimum amount of personal information required
  • the service has had an IT Health Check and all serious issues have been resolved
  • the system is based on Microsoft Dynamics hosted on the Azure cloud platform, which provides the fundamentals of a secure and resilient platform
  • access to production systems and environments is audited

What the team needs to explore

Before their next assessment, the team needs to:

  • automate, as a priority, the process of transferring data from the front-end Dynamics database to the existing back-end processing system, which is currently manual. This is a risk known to the team, and automating the process will enhance the security and privacy of the service
  • demonstrate progress in managing the significant security and privacy risks involved in the plan to enable customer evidence file uploads; the team is aware of these challenges

10. Define what success looks like and publish performance data

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • there is a dedicated performance analyst supporting the team
  • there is a process in place for evaluating the effectiveness of the iterations made on the service. They have utilised data from Google Analytics and MS Dynamics to do this
  • there is a list of key metrics to evaluate the overall effectiveness of the service. These metrics are clearly linked to the success and failure factors of the service
  • the team have implemented a single compliant cookie consent mechanism for analytics cookies

What the team needs to explore

Before their next assessment, the team needs to:

  • explore whether the service requires additional search engine optimisation (SEO) or paid search to better help users find the service, and determine whether this is value for money
  • develop more than just four KPIs to monitor the success of the service. The team need to develop or update their Performance Framework to ensure all aspects of the service are covered
  • get support from the performance analytics community in DWP to implement a hypothesis-based approach to measuring the improvements from their iterations

11. Choose the right tools and technology

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the team chose Microsoft Dynamics for both the data store and the user interface, and are pioneers of the use of this technology in government
  • the team are continuing to build expertise in Dynamics technology so it can be supported and enhanced into the long term
  • the team have recently implemented a continuous integration pipeline and the use of separate environments for development, testing and production. This enabled the team to improve system quality through software engineering standards and better deployment security controls

What the team needs to explore

Before their next assessment, the team needs to:

  • demonstrate that their new development processes and automated pipelines are reliable and have delivered the expected benefits
  • before launch, complete the design of the proposed customer evidence upload feature, prototype this design and complete a tech spike to prove such a system can be implemented and integrated with their Dynamics-hosted user interface
  • complete the postcode lookup feature and integrate it with the Dynamics system (a rough illustration of a postcode lookup call follows this list)
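
As a rough illustration only, the sketch below shows the general shape of a postcode lookup call using the free public postcodes.io API, which returns location metadata such as the local authority and coordinates rather than full addresses. The provider the team will actually use, the data it needs to return, and how it will be integrated with Dynamics are not specified in this report.

```python
# Minimal sketch of a postcode lookup call (postcodes.io is one public provider;
# the service's real provider and required address data may differ).
import requests


def lookup_postcode(postcode: str) -> dict:
    """Return basic location details for a UK postcode, or raise for an invalid one."""
    response = requests.get(
        f"https://api.postcodes.io/postcodes/{postcode.replace(' ', '')}",
        timeout=10,
    )
    response.raise_for_status()
    result = response.json()["result"]
    return {
        "postcode": result["postcode"],
        "local_authority": result["admin_district"],
        "latitude": result["latitude"],
        "longitude": result["longitude"],
    }


if __name__ == "__main__":
    print(lookup_postcode("SW1A 1AA"))  # example postcode only
```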

12. Make new source code open

Decision

The service met point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the team are aware of the need to release their code to an open-source repository

What the team needs to explore

Before their next assessment, the team needs to:

  • work with the rest of the organisation to agree a policy for how to work in the open and how source code will be published
  • automate the process of copying code from the Dynamics environment into a public repository so that it can be reliably published on a regular basis (a minimal sketch of one possible automation follows this list)
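
As a minimal sketch of what such an automation might look like, assuming the team’s pipeline already exports and unpacks the Dynamics customisations to a local folder: the folder names, repository URL and branch are hypothetical, and a script like this would normally run from a scheduled CI job rather than by hand.

```python
# Minimal sketch: sync exported Dynamics solution files into a public Git repository.
# The folder names, repository URL and branch below are hypothetical placeholders.
import shutil
import subprocess
from datetime import date
from pathlib import Path

EXPORT_DIR = Path("dynamics-export")   # where the pipeline unpacks the exported solution
REPO_DIR = Path("public-repo")         # local clone of the public repository
REPO_URL = "https://github.com/example-org/example-service.git"  # hypothetical URL


def git(*args: str) -> subprocess.CompletedProcess:
    """Run a git command inside the public repository clone."""
    return subprocess.run(["git", *args], cwd=REPO_DIR, check=True,
                          capture_output=True, text=True)


def publish() -> None:
    # Clone the public repository on first run.
    if not REPO_DIR.exists():
        subprocess.run(["git", "clone", REPO_URL, str(REPO_DIR)], check=True)

    # Replace the repository's source tree with the latest export, keeping .git intact.
    target = REPO_DIR / "src"
    if target.exists():
        shutil.rmtree(target)
    shutil.copytree(EXPORT_DIR, target)

    # Commit and push only when the export actually changed something.
    git("add", "--all")
    if git("status", "--porcelain").stdout.strip():
        git("commit", "-m", f"Publish solution export {date.today().isoformat()}")
        git("push", "origin", "main")
    else:
        print("No changes to publish")


if __name__ == "__main__":
    publish()
```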

13. Use and contribute to open standards, common components and patterns

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the GOV.UK design system styling and design standards have been implemented on top of the Dynamics system interface components, to provide a user experience comparable with other government sites

  • the team have actively engaged with the DWP digital assessors and the wider government digital community to share expertise and best practice

What the team needs to explore

Before their next assessment, the team needs to:

  • implement a process to integrate new versions of the GOV.UK Design System as it is updated on a regular basis. This could be supported by a Sass compilation process, as used and supported by the design system itself
  • as one of the first services to use Microsoft Dynamics, the team has been limited in how much it can collaborate on and contribute to open standards. The team should reach out to other users of the MS Dynamics platform, for example in the Department for Transport, to share learning and iterate the approach

14. Operate a reliable service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have in place a development and test environment, to ensure only stable and fully tested new versions are released to the production environment
  • the service benefits from the expertise and the suitable SLA provided by the supplier to ensure that the service is reliable
  • the team have established proposed operations for monitoring and managing the service
  • the expected usage level of the service is known and the service has been load tested for that

What the team needs to explore

Before their next assessment, the team needs to:

  • actively test for session loss or user journey disruption when new system versions are released, and put mitigations in place as necessary. Releases are not expected to cause problems for active users of the system, but this is an area of uncertainty, and as the system should be available 24 hours a day there is no option for an out-of-hours maintenance period

Published 15 December 2022