Service Standard assessment report

Environment Agency - Digital Customer Portal

From: Defra - DDTS
Assessment date: 08/09/2022
Stage: Alpha
Result: Not Met
Service provider: Environment Agency

Service description

This service aims to solve the problem that users currently have no consistent way to apply for and manage permissions with the Environment Agency for activities they want to carry out.

The vision is to provide an all-inclusive platform which enables those we regulate to apply for and manage their permits in a smooth and seamless way.

Service users

This service is for businesses and individuals (approximately 54% of the total user base), public bodies, and agents acting on behalf of businesses and individuals.

1. Understand users and their needs

Decision

The service did not meet point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team showed a good understanding of users’ pain points across their external user groups
  • the personas and user needs presented were clear and evidence-based. It would have been helpful to understand how the team was using these to make decisions and prioritise work
  • the team had made the most of the data available to them from elsewhere in the organisation as part of their evidence base
  • the user research methods used so far (remote semi-structured interviews and usability testing) had been executed well

What the team needs to explore

Before the next assessment, the team needs to:

  • clearly articulate the research undertaken in discovery and how the findings led the team to their design decisions. It wasn’t clear how or why medium combustion plants were chosen as the private beta cohort
  • provide evidence to show whether the design meets the needs of all users. The team have chosen to try to solve many of the complex problems first, focusing usability testing on businesses and agents with large numbers of licences. The panel were concerned the design had been over-engineered for users with one or two licences, and there is a risk they will find it complicated and off-putting. The team needs to test the service with all user groups to determine whether this solution is suitable for all needs
  • test the service with a wider range of users, including those with low digital skills and varying access needs
  • conduct research with internal users, including case workers and telephone agents. The panel appreciated this had been difficult to do during the agency’s recovery after the pandemic, but the team need to provide evidence to support their hypothesis that the new service will have little impact on their colleagues’ working lives
  • ensure their participant recruitment and research methods are varied so they do not introduce bias into their sample and data, for example by recruiting through means other than lists from within the agency and by conducting face-to-face research

2. Solve a whole problem for users

Decision

The service did not meet point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team understands the large scope of the service and is using data to influence how it prioritises work

What the team needs to explore

Before the next assessment, the team needs to:

  • do more to solve the problem statement discussed: further work is needed on the problems of completing confusing, inaccessible and out-of-date PDF forms
  • consider how the service will create a joined-up experience when each form may be linked to different policy teams and/or departments
  • focus on how to simplify applications as well as how to manage existing applications
  • have a plan for joining up existing permit services, or for migrating data for existing permits, to solve the whole problem for the user
  • investigate the unhappy path for the transfer element of the service that was demonstrated, and how this will sync with internal staff’s current ways of working
  • research, design and test users’ unhappy paths. The team had theories on how a user who wanted to do something out of scope for private beta would be directed, but without evidence of this the team cannot demonstrate whether they have solved a whole problem for users or provided a joined-up experience

3. Provide a joined-up experience across all channels

Decision

The service did not meet point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the team demonstrated an awareness of providing support to offline users
  • plans are in place to use existing Assisted Digital channels including NCCC
  • the team is taking ideas from existing services to avoid any duplication of effort

What the team needs to explore

Before the next assessment, the team needs to:

  • consider how users will interact with the service if something goes wrong. For example, can an online application be picked up offline? Are internal processes connected?
  • explore mapping out some key user journeys to cover happy and unhappy paths
  • prioritise the recommended activities for this point and point 2 (solve a whole problem for users) as soon as possible, before proceeding to private beta

4. Make the service simple to use

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has thought about accessibility and inclusive design
  • the team is listening to feedback from usability sessions and using this to improve the service
  • the team has recognised the need for a content designer

What the team needs to explore

Before the next assessment, the team needs to:

  • consider the wider journey, not just individual screens. A lot of the iteration and learning so far has been around specific screen designs and components; at alpha the team needs to test the wider journey with all the actors involved, for example applicants, agents and internal staff
  • plan for redesigning legacy permit applications, which will be a large part of this service. The team needs to make sure all these forms are simple to use for a varied user group

5. Make sure everyone can use the service

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has thought about accessibility and inclusive design
  • usability testing has been carried out in an iterative cycle

What the team needs to explore

Before the next assessment, the team needs to:

  • carry out research with users with low digital skills or access needs
  • investigate the impacts of this service on users of existing permitting services, for example waste permissions or water abstraction

6. Have a multidisciplinary team

Decision

The service did not meet point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the team was able to access UCD capability through working with a contractor
  • team members are empowered to raise suggestions and constructively challenge the wider organisation
  • the team was working within different governance layers within the organisation
  • the team was able to conduct many rounds of research and develop a prototype despite having a relatively small team for a service of this size

What the team needs to explore

Before the next assessment, the team needs to:

  • make sure the size of the team matches the ambition of the service. While the team had a variety of DDaT roles, the panel felt the team was too small given the number of users and user journeys the dashboard covers. A dedicated service designer and more user research capability would help ensure that each iteration delivered fits the whole service. Once the scope is agreed, the team should make sure appropriate UCD capability is in place and that they have access to service design capability
  • review the scope of their alpha and MVP to ensure that enough is proved or tested in alpha to deliver the whole service. The panel felt strongly that the team would benefit from a dedicated service designer to lead this review, and a fully dedicated product manager who can build and prioritise the backlog and create and review roadmaps, including dependencies on other services and milestones
  • ensure team members are empowered not only to make suggestions and challenge, but also to make service decisions, prioritise the backlog and revise the roadmap based on learnings and in line with service design thinking

7. Use agile ways of working

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the team was aware of and confident in the ways of working recommended in the Service Standard and has been using agile ceremonies
  • selecting a complex journey to de-risk is a good approach

What the team needs to explore

Before the next assessment, the team needs to:

  • gather evidence that the whole service works for users. The team chose to focus on the most complex path, but this was only one of the service journeys

8. Iterate and improve frequently

Decision

The service did not meet point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the team was able to iterate interaction elements based on user feedback
  • the team had real people using the service as early as possible

What the team needs to explore

Before the next assessment, the team needs to:

  • consider the time and effort invested in prototyping, and limit the iterations. Selecting a complex journey to de-risk is a good approach, but the panel felt the complex journey has been prioritised over checking fit with the overall service, and there is now a risk of rework as other permits and user groups are brought into the MVP for alpha

9. Create a secure service which protects users’ privacy

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is following the Defra CCoE cloud framework and NCSC cloud principles
  • the team is actively working with a security architect and will follow the OWASP principles
  • the team is planning for the IT health check as appropriate for the service
  • the team is carrying out a Data Protection Impact Assessment (DPIA) and has developed the cookie policy and privacy policy
  • the team is planning to encrypt the data in transit and at rest

What the team needs to explore

Before the next assessment, the team needs to:

  • showcase the results from the security risk assessment and penetration test, and any actions taken based on the test reports
  • show the completed DPIA
  • show the approved and published cookie policy and privacy policy
  • demonstrate that data is encrypted in transit and at rest

10. Define what success looks like and publish performance data

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has clearly established success criteria
  • the team has selected additional non-mandatory KPIs (user behaviour, save button use) to monitor alongside the mandatory ones, and has identified key data points to measure the performance of the service, for example submission timeframe. Alongside quantitative data, the team will also collect qualitative feedback through a feedback function

What the team needs to explore

Before the next assessment, the team needs to:

  • gather more data about current use of licences to understand where the biggest traffic is expected. During the assessment it was unclear what percentage of users is expected for each part of the journey
  • as the team moves into the next phases (private beta) and can work with a data analyst, start exploring which data points are of value for each journey or user group and how these could be combined with existing user research. For example, explore whether a particular type of licence has the least favourable feedback or takes the longest to apply for, and how this could be improved through digital means

11. Choose the right tools and technology

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is using the right set of tools and technologies for the front end, backend, and cloud services
  • the team is using appropriate tools for testing and quality assurance
  • the team is using CI/CD and other project management tools

12. Make new source code open

Decision

The service met point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is planning to publish the code in the open
  • the team is using the GOV.UK Prototype Kit
  • the team is using DEFRA IP, which is open source

What the team needs to explore

Before the next assessment, the team needs to:

  • show that the code has been published in the open

13. Use and contribute to open standards, common components and patterns

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is planning to use open-source tools and technologies, and is using the digital portal accelerator for Defra-wide open source
  • the team is planning a strategic approach to an API gateway and data cache solution
  • the team is using ReSP Dataverse and IdM solutions, and GOV.UK Notify for customer communications

14. Operate a reliable service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has analysed that the current portal service does not need to be highly available in order to mirror current service levels; it would be acceptable for the service to resume within 3 business days
  • the team has planned for the SLAs to cover business continuity (BC) and disaster recovery (DR) activities as per the requirements
  • the team has planned for CI/CD tools

What the team needs to explore

Before the next assessment, the team needs to:

  • demonstrate that the BC and DR plans are acceptable and working as required
  • show results and dashboards for the various KPIs and SLAs

Next Steps

The team needs to work on the recommendations for the standards not met in this report before proceeding into private beta. The panel feels that, while the alpha done to date is of good quality for a particular journey and user group, the team needs to take more of a whole-service design approach at a high level, and to broaden the scope of the MVP and user journeys, to show that the chosen approach solves the problem statements and fits within the context of the whole service. The panel understands that, for a large-scale service with multiple permit types and user groups, this needs to be done iteratively or in phases, but feels that further alpha work is required to show that the MVP will be the foundation and a repeatable pattern for the overall new service across many permit types and user groups. Overall, more of a service design approach is needed.

The panel will be happy to provide further information and guidance within their field of expertise or clarity on any of the standards not met.

The panel suggests that the recommendations in this report are addressed in further alpha work for the standards not met, and that a reassessment focused on these areas takes place once the team has established a timeline for doing this. To ensure continuity, the same panel will carry out the reassessment where possible; the recommendation is that, once review and planning have taken place, the reassessment is booked into the calendar as soon as possible.

Updates to this page

Published 14 April 2025