Export green list waste

Report for Defra's Export green list waste alpha assessment on 1 July 2021

Service Standard assessment report

Export green list waste

(previously called: Tell us about the future export of green list waste)

From: Central Digital & Data Office (CDDO)
Assessment date: 01/07/2021
Stage: Alpha
Result: Met
Service provider: Defra

Service description

This service enables exporters to record information about an upcoming export of ‘green list waste’, so that they can share this information with their environment agency and generate the documentation that is legally required to accompany the waste.

‘Green list waste’ is a term used by the waste sector to describe non-hazardous waste that can be exported for recycling to certain countries, as long as it is sorted and uncontaminated.

Agency Officers can use the service to review who is exporting the waste, what is being exported and where it is going, in order to target compliance and prevent illegal exports.

Service users

Exporters are the service users responsible for arranging the export of green list waste. We estimate there are 5,500 exporters of green list waste across the United Kingdom, generating over 510,000 exports of green list waste a year. Each of these exports must by law be accompanied by a physical document called an Annex VII form.

Agency Officers are officers from the four UK environment agencies responsible for regulating green list waste exports: the Environment Agency (EA), the Scottish Environment Protection Agency (SEPA), the Northern Ireland Environment Agency (NIEA) and Natural Resources Wales (NRW).

Agency officers seek to monitor export activity, identify potential illegal exports and either stop green list waste containers at ports or intervene at sites of loading. At the moment only officers in Northern Ireland and Scotland receive documentation on exports of green list waste in advance. Officers in other nations have limited information to act on in advance.

1. Understand users and their needs

Decision

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team directly responded to the user need for speed and efficiency by creating a favourites feature that lets users re-use previous forms
  • the team distinguished between users’ ability to complete the task and their appetite to use the system
  • the team watched user testing closely to understand the preferred user journey, and used the resulting real-world mental model to create a non-linear design

What the team needs to explore

Before their next assessment, the team needs to:

  • explore non-compliant exporters on the fringes of legality and describe them as a persona type. More insight is needed into these users’ motivations, behaviour and expectations - consider using anti-personas

2. Solve a whole problem for users

Decision

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • user testing participants were evidently highly engaged with this prototype and offered to provide even more information, seeing the service as a way to reduce crime and create a level playing field
  • users did not generally see this as an admin burden
  • the team was aware of existing, related services and guidance, and the stages of development or replacement that they are at

What the team needs to explore

Before starting their private beta, the team needs to:

  • prototype and test how the journeys to transfer (for example to a colleague) or recover a Defra account will work with the ‘export green list waste’ service

Before their next assessment, the team needs to:

  • given that Exporters want a level playing field, consider these users’ competitive position and their perceived customer value
  • the panel notes the planned Welsh, Irish and Ulster Scots language capability (a legal requirement), but would like to see capacity for other languages in view of the many countries involved across the delivery chain
  • given the volatile nature of the global market right now, business processes need to be considered in more detail to support this service so that it can be easily reconfigured in response to changes in the external environment or distribution chain
  • as the most significant problem suggested to respondents was “Checking that carriers, brokers and shipping organisations are registered to handle Green List Waste” (51%), this will need to be explored in beta as an agency and exporter user need
  • prototype and test how exporters will move between the ‘export green list waste’ service, government website guidance and related services (such as the waste exports control tool and International Waste Shipments service)
  • prototype and test how agency workers will move between their intranet guidance (and any other key points) and the ‘export green list waste’ service
  • start exploring alternatives to agency officers downloading service data and importing it into their agency system

3. Provide a joined-up experience across all channels

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team evidenced the user journey both online and offline, and on laptop versus handheld devices
  • the service team distinguished between users’ ability to complete the task and their appetite to engage with the system
  • the team created a design template that could be used in other contexts across government

What the team needs to explore

Before starting their private beta, the team needs to:

  • design and usability test an email to alert exporters to an issue with one of their submissions, exploring different delivery methods and security issues, and taking account of guidance on sending emails and links in emails

Before their next assessment, the team needs to:

  • re-design and usability test the introductory email for both exporters and agency staff, considering whether they would open and trust the email, and considering security issues - see the GOV.UK guidance on sending emails and links in emails
  • test a range of options relating to updating submissions and manually updating paper forms - for example only permitting a print out once the submission is complete, or providing a replacement printout if the exporter does an update
  • extend the scope of the user testing to incorporate eye tracking, keystrokes and mouse movements to optimise the user experience

4. Make the service simple to use

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has a clear record of how the design has been iterated, and that the iterations were based on user needs and observed user behaviour
  • the team has made use of existing patterns and components, and was confident enough to move away from a common pattern where it was not working well for their users - the panel would encourage the team to share the research findings with the design system community
  • the team’s design of the service name for exporters is consistent with the service manual guidance
  • the team has arrangements in place for regular design crits and content 2i reviews

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure scannability and consistency in the task list content - the task list pattern uses brief noun phrases, as does the team’s ‘check your answers’ prototype screen
  • explore whether ‘dashboard’ is the most effective title for the service home page
  • determine whether there is a genuine user need not to follow the convention of displaying the user or account name when logged in
  • continue trying to make the declaration content as plain English as possible, whilst ensuring it still meets the legal requirement and is compatible with the hard copy form
  • consider whether a separate, different service name would better serve the needs of agency staff - they are users in their own right, with a very different user journey to exporters, and with a different task to complete

5. Make sure everyone can use the service

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has a plan to test non-digital ways to use the service (phone and post), for assisted digital and business continuity needs
  • the team has a plan to do accessibility usability testing with a range of users

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that the printable form generated by the service is accessible
  • find out at what points users need additional guidance (if any), and design and test solutions - for example, do exporters know how to set up their spreadsheet in the right way for a bulk upload?

6. Have a multidisciplinary team

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is well resourced with a wide range of skills
  • there is a product owner from each country
  • the team has access to additional specialists where necessary
  • there are plans to recruit more permanent staff


7. Use agile ways of working

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has adopted agile working methods and has demonstrated how they have learned as they have moved through alpha

8. Iterate and improve frequently

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • it’s clear that the team has iterated in response to user research

What the team needs to explore

Before their next assessment, the team needs to:

  • think about what continuous improvement will look like in the future

9. Create a secure service which protects users’ privacy

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is aware of GOV.UK Service Manual and NCSC security guidance
  • the team is aware of Defra security assurance and DPIA processes
  • fraud and threat vectors have been identified, along with countermeasures

10. Define what success looks like and publish performance data

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the team identified what good would look like for exporters and agency officials

What the team needs to explore

Before their next assessment, the team needs to:

  • consider how the analytics will support the aim of fewer, better-targeted interventions

11. Choose the right tools and technology

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • build pipeline and deployment technologies have been identified

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that all required dependencies are documented (for example, GOV.UK Notify)

12. Make new source code open

https://github.com/DEFRA/wtp-glw-prototype

Decision

The service met point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • prototype code is open source

What the team needs to explore

Before their next assessment, the team needs to:

  • make the prototype README.md more useful
  • establish GitHub repo(s) for production code
  • consider creating CONTRIBUTING.md to encourage contributions

13. Use and contribute to open standards, common components and patterns

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • Defra Common Platform is being used
  • appropriate CCoE patterns have been chosen

What the team needs to explore

Before their next assessment, the team needs to:

  • check if code reuse is possible from other form-based Defra services
  • show how other Defra teams can learn from what has been implemented

14. Operate a reliable service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the assisted digital support channels can be used to provide business continuity

What the team needs to explore

Before their next assessment, the team needs to:

  • understand what load is expected on the system and load test accordingly

Updates to this page

Published 13 August 2021