Service Standard assessment report
Extended Producer Responsibility (EPR) packaging
Assessment date: 29/05/2024
Stage: Beta
Result: Red
Service provider: DEFRA
Service description
The purpose of the service is to support the implementation of new regulations for packaging and packaging waste.
Extended producer responsibility for packaging is based on the principle that “the polluter pays”. This means producers of packaging become responsible for the full cost that local authorities incur when collecting and disposing of packaging waste.
This is expected to reduce the amount of packaging that enters the waste stream, and to improve the choice of materials used in packaging, both of which will positively impact the environment.
Service users
Current service users are grouped into four categories:
- Large producers of packaging (organisations)
- Third-party consultants (individuals)
- Compliance scheme operators (organisations)
- Environmental regulators (organisations)
Things the service team has done well
Tech
- The platform for the service follows the cloud-first strategy and uses standard patterns to ensure the service is secure and reliable. The team has published all of the source code for the service publicly at https://github.com/orgs/DEFRA/repositories?q=epr
Performance analytics
- The team is aware of the limitations and risks of not having a performance analyst on the team, and has worked well to mitigate this with the work it has undertaken.
- The team has benchmarked against the existing service, built dashboards for the 4 publishable metrics, understood the limitations of Google Analytics and cookie consent, secured the data with access limitations and SC clearance requirements, ensured GDPR and PII policies are followed, and has examples of iterative improvements.
- The team has a roadmap with upcoming improvements, such as an exit survey.
1. Understand users and their needs
Decision
The service was rated green for point 1 of the Standard.
2. Solve a whole problem for users
Decision
The service was rated amber for point 2 of the Standard.
During the assessment, we didn’t see evidence of:
- The panel did not get an understanding of who exactly would complete the work across the different types of organisations that will need to comply with the legislation.
- The panel felt there was a lack of understanding or knowledge demonstrated of the need to design accessible and inclusive services. Responses such as ‘users in these organisations are highly confident with digital technology’ and ‘some users struggle to build a CSV file’ relate to digital skills rather than the need for, and use of, assistive technology. The team said ‘no assisted digital support has been provided so far’, yet the service has a helpline. This suggested the problem was not clearly defined.
3. Provide a joined-up experience across all channels
Decision
The service was rated amber for point 3 of the Standard.
During the assessment, we didn’t see evidence of:
- The panel felt that limited uptake and reporting errors indicate that further iteration is required.
- There also appear to be confusion and issues around the account management part of the journey.
4. Make the service simple to use
Decision
The service was rated red for point 4 of the Standard.
During the assessment, we didn’t see evidence of:
- The panel felt there was a lack of evidence showing a detailed understanding of who would be completing the journeys; further evidence of this is required.
- The panel felt that further evidence of designing inclusive and accessible services was required.
- The panel felt that further work was required around assisted digital approaches and the provision of alternative routes.
5. Make sure everyone can use the service
Decision
The service was rated amber for point 5 of the Standard.
During the assessment, we didn’t see evidence of:
- Testing with users who have accessibility needs or who need assisted digital support.
- While the panel understands that the user group is relatively high on the digital confidence scale, it would be good to see attempts at and approaches to testing for accessibility, for example with proxy users or colleagues within your department.
6. Have a multidisciplinary team
Decision
The service was rated red for point 6 of the Standard.
During the assessment, we didn’t see evidence of:
- There was no performance analyst on the team, and the team suggested there were no immediate plans to recruit anyone into the role.
- The panel felt that whilst there was a general understanding of the need for good analytics, there was no provision to identify and respond to key metrics.
7. Use agile ways of working
Decision
The service was rated amber for point 7 of the Standard.
During the assessment, we didn’t see evidence of:
- Whilst there was good evidence of knowledge transfer and of retaining information relating to third-party supplier input, the panel felt that more interaction with day-to-day activity and agile ceremonies would provide further insight.
- The panel felt that more evidence of service iteration in response to analytical and user feedback was needed.
8. Iterate and improve frequently
Decision
The service was rated amber for point 8 of the Standard.
During the assessment, we didn’t see evidence of:
- As there was no performance analyst, it was difficult to understand how feedback was being monitored and acted on. There was mention of key performance indicators, but we did not get a view of how these impacted the service or what was done in response.
9. Create a secure service which protects users’ privacy
Decision
The service was rated green for point 9 of the Standard.
10. Define what success looks like and publish performance data
Decision
The service was rated red for point 10 of the Standard.
During the assessment, we didn’t see evidence of:
- There is no embedded performance analyst on the team to guide the quantitative data work. A performance analyst is essential for service-level analytics, dashboards, key KPIs and underlying metrics, along with A/B testing in tools such as Optimizely. (Red)
- An exit survey needs to be implemented on the service. (Amber)
- Consider wider KPIs beyond the 4 mandatory publishable metrics, for example analytics around packaging types and how they shift as the service progresses. (Amber)
- Ensure the product analytics are kept up to date, accurate and relevant, to avoid issues with Google Analytics cookie consent; one common consent-gating pattern is sketched after this list. (Amber)
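The cookie consent point lends itself to a short illustration. The sketch below is not taken from the service's codebase: it shows one widely used pattern for loading Google Analytics (gtag.js) only after a user has opted in. The cookie name and the GA4 measurement ID are placeholder assumptions.

```typescript
// Minimal sketch of consent-gated analytics, assuming a consent cookie
// named "analytics_consent" (hypothetical; not the service's real cookie)
// and a placeholder GA4 measurement ID.
const GA_MEASUREMENT_ID = "G-XXXXXXXXXX"; // placeholder, not a real ID

declare global {
  interface Window {
    dataLayer: unknown[];
    gtag: (...args: unknown[]) => void;
  }
}

// True only if the user has explicitly opted in to analytics cookies.
function hasAnalyticsConsent(): boolean {
  return document.cookie
    .split("; ")
    .some((cookie) => cookie === "analytics_consent=granted");
}

// Inject gtag.js and record the initial page view, mirroring the
// structure of the standard Google Analytics snippet.
function loadAnalytics(): void {
  const script = document.createElement("script");
  script.async = true;
  script.src = `https://www.googletagmanager.com/gtag/js?id=${GA_MEASUREMENT_ID}`;
  document.head.appendChild(script);

  window.dataLayer = window.dataLayer || [];
  window.gtag = function gtag() {
    // gtag.js expects the Arguments object itself on the dataLayer.
    // eslint-disable-next-line prefer-rest-params
    window.dataLayer.push(arguments);
  };
  window.gtag("js", new Date());
  window.gtag("config", GA_MEASUREMENT_ID);
}

// No recorded consent means the analytics script is never requested.
if (hasAnalyticsConsent()) {
  loadAnalytics();
}
```

The same gating applies whichever consent mechanism the service actually uses; the key property is that no analytics request is made before an explicit opt-in.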
11. Choose the right tools and technology
Decision
The service was rated green for point 11 of the Standard.
12. Make new source code open
Decision
The service was rated green for point 12 of the Standard.
13. Use and contribute to open standards, common components and patterns
Decision
The service was rated green for point 13 of the Standard.
14. Operate a reliable service
Decision
The service was rated green for point 14 of the Standard.
Next steps
For the service to continue to the next phase of development, it must meet the Standard and get CDDO spend approvals. The service must be reassessed against the points of the Standard that are rated red at this assessment.