Plastic Packaging Tax beta assessment

The report for HMRC's Plastic Packaging Tax beta assessment on 8 March 2022

Service Standard assessment report

Register for Plastic Packaging Tax

From: Central Digital & Data Office (CDDO)
Assessment date: 08/03/2022
Stage: Beta
Result: Met
Service provider: HMRC

Service description

This service aims to solve the problem of users needing to pay the new plastic packaging tax. The amount they need to pay is based on the weight of plastic packaging that they use, subject to certain exemptions. The service enables them to input their information, see how much tax is due and pay the tax.

Service users

This service is for representatives of all the companies (or other organisations) who will have to pay the tax based on the amount of plastic they use.

1. Understand users and their needs

Decision

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • participant recruitment was thoughtful and rigorous, mitigating risks and limitations while increasing the variety of user groups
  • primary user personas and segments have been prioritised for testing, taking a range of considerations into account

What the team needs to explore

Before their next assessment, the team needs to:

  • continue testing for accessibility
  • continue testing with both current and likely future users of the service. The team has focused on Limited Companies as primary users (85% of the user base), and the panel recommends expanding testing to the remaining 15% in the next phase
  • continue testing the other elements of the journey beyond registration
  • reconsider their post-alpha position on assisted digital. The panel felt that some decisions to de-prioritise testing with proxy users at the lower end of the digital literacy spectrum may rest on the risky assumption that users working in business are overwhelmingly confident when using digital services, or even that “assisted digital” users do not exist in the user base. The panel suggests the team reconsiders the impact of this assumption and how it might needlessly increase the burden on some business groups and individuals, and recommends that the team includes additional thinking and potential testing plans to mitigate this risk in private beta

2. Solve a whole problem for users

Decision

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has been engaging with industry groups, policy and other HMRC services to work on the end-to-end service

What the team needs to explore

Before their next assessment, the team needs to:

  • test how well the entire service is working for users, ideally including specific upfront testing to inform the publicity materials and guidance relating to the new tax. The awareness stage of the journey will be critical to how users decide whether to register and to their expectations when they begin
  • in particular, make sure users are given clear instructions about when they should register. This needs to balance users' desire not to pay before they have to with a sensible approach to registration that does not put the onus on individuals to remember to come back to the registration service several months later. The timing issue is particularly important given that users are most likely to hear about their new responsibilities during the burst of communications activity around 1 April
  • keep a critical eye on how much ‘work’ the guidance has to do for the service: proliferating guidance documents are usually a sign of a poorly designed service

3. Provide a joined-up experience across all channels

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the service fits into the landscape of business services, and businesses will be able to see information about their Plastic Packaging Tax (PPT) returns and payments on their business tax account dashboard
  • the team has thought about the support channels in place, for example help with the security questions

What the team needs to explore

Before their next assessment, the team needs to:

  • make sure that they are able to influence every point of the user experience, even where it is owned by another part of the business, for example call centres
  • keep measuring the impact of parts of this journey on other channels and services. For example, the team has noticed that users call the Corporation tax helpline for help with a security question; knowing the scale of this knock-on impact will help with decisions about designing different security routes

4. Make the service simple to use

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has demonstrated an iterative approach deeply informed by user research, simplifying the interaction and the content to make an easy-to-use service
  • the team has a good awareness of its main issues and the points to explore next, for example the kg versus tonne tension, email notifications to users and Welsh translations

What the team needs to explore

Before their next assessment, the team needs to:

  • pay special attention to the unhappy path journeys; the panel feels it did not see enough of these during the assessment
  • explore allowing commas as thousands separators in numbers above 999, especially in form fields for plastic weight, to make entry easier and reduce errors (see the sketch after this list)
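
One way to act on the separator point is to accept and strip commas or spaces server-side before validation, so an entry such as "12,500" is not rejected. The sketch below is purely illustrative, written in Scala on the assumption that the service runs on HMRC's Scala-based MDTP platform; none of the names come from the team's codebase.

```scala
// Hypothetical sketch only: lenient parsing of a plastic-weight field that
// accepts thousands separators (commas or spaces) before validating.
object WeightInput {

  /** Strips commas and spaces, then parses a non-negative whole number of kilograms. */
  def parseWeightKg(raw: String): Either[String, Long] = {
    val cleaned = raw.trim.replaceAll("[,\\s]", "")
    if (cleaned.isEmpty) Left("Enter the weight in kilograms")
    else if (!cleaned.forall(_.isDigit)) Left("Weight must be a whole number of kilograms")
    else if (cleaned.length > 15) Left("Weight is too large")
    else Right(cleaned.toLong)
  }
}

object WeightInputDemo extends App {
  println(WeightInput.parseWeightKg("12,500")) // Right(12500)
  println(WeightInput.parseWeightKg("12 500")) // Right(12500)
  println(WeightInput.parseWeightKg("12.5"))   // Left(...): decimals rejected in this sketch
}
```

Whether to also display the separators back to the user, for example on check-your-answers pages, is a separate design decision the team would want to test.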

5. Make sure everyone can use the service

Decision

The service met point 5 of the Standard.

What the team needs to explore

Before their next assessment, the team needs to:

  • prioritise fixing the accessibility issues raised in the accessibility audit so that the service is WCAG 2.1 AA compliant
  • prepare an accessibility statement in time for the service launch in April

6. Have a multidisciplinary team

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has a strong connection to policy teams and other similar service teams, and is represented on official boards
  • the team has a good mix of roles and we heard impressive presentations from numerous members of the team during the assessment
  • the team has effectively mitigated the risks around their DOS contract being re-tendered
  • there are early-career opportunities in the team, which should build HMRC’s DDAT capability

What the team needs to explore

Before their next assessment, the team needs to:

  • try to increase the proportion of civil servants on the team, to reduce the risk of knowledge draining away
  • keep the current civil servants on the team to ensure a good transition throughout the summer, when registration and returns journeys open

7. Use agile ways of working

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • there was plenty of evidence that the team have made changes based on showing designs to users
  • the service manager had a pragmatic approach to bringing together the dev team’s governance and the wider Departmental governance
  • the team runs well-attended show and tells

What the team needs to explore

Before their next assessment, the team needs to:

  • consider paying more attention to explaining how the whole team benefits from and participates in agile governance

8. Iterate and improve frequently

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • there was evidence of several improvements to the design of the service, including to tackle problems with standard HMRC components
  • the team has a plan for how it will iterate the service beyond the 1 April and 1 July launch dates

What the team needs to explore

Before their next assessment, the team needs to:

  • use real user behaviour to identify opportunities to improve the service. As well as using metrics like completion rates, and in the absence of auto-validation or assessment of the data returned within the service, it is important to cross-check entries to see whether users are actually able to enter their data correctly (see the sketch after this list). This is about service performance rather than checking compliance from a government perspective: the panel could not see how the service team would pick up on users entering incorrect information because the questions were not clear enough
  • follow their hunch about how the service will need to change as users grow more familiar with the tax over time
  • explore whether bulk upload would be useful functionality
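
As a purely illustrative example of the kind of cross-check the panel has in mind, and building on the kg versus tonne tension the team has already identified, the sketch below (hypothetical Scala; the field names and thresholds are invented for illustration) flags returns whose declared weights look implausible, so they can be reviewed as a signal of unclear questions rather than as a compliance check.

```scala
// Hypothetical sketch: flag submitted returns whose declared weights look
// implausible, as a service-performance signal (possibly unclear questions)
// rather than a compliance check. All names and thresholds are illustrative.
final case class SubmittedReturn(registrationId: String, totalPlasticKg: Long, exemptPlasticKg: Long)

object ReturnSanityChecks {

  private val SuspiciouslySmallKg = 10L        // could indicate tonnes entered where kilograms were expected
  private val SuspiciouslyLargeKg = 10000000L  // could indicate grams, or a duplicated figure

  def flags(r: SubmittedReturn): List[String] = {
    val checks = List(
      (r.exemptPlasticKg > r.totalPlasticKg) -> "exempt weight exceeds total weight",
      (r.totalPlasticKg > 0 && r.totalPlasticKg < SuspiciouslySmallKg) -> "total weight unusually small: possible tonne/kg confusion",
      (r.totalPlasticKg > SuspiciouslyLargeKg) -> "total weight unusually large"
    )
    checks.collect { case (true, message) => message }
  }
}
```

Tracking how often each flag fires over time would give the team an early, quantitative view of whether the weight questions are being understood.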

9. Create a secure service which protects users’ privacy

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the team had given serious thought to how much data they need to retain and for how long
  • the sensitive parts of the service are hosted in low-permissions infrastructure by default and given appropriate protections
  • the team balances the need for an audit trail with the inherent risks of holding that data

What the team needs to explore

Before their next assessment, the team needs to:

  • make sure they are able to run periodic IT health checks at 9-12 month intervals
  • integrate automated tooling to keep dependencies up to date, where permitted by the infrastructure platform’s safety rules (see the sketch after this list)
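
As one purely illustrative option, and assuming an sbt-based service (the report does not say which tooling the platform permits), the sbt-updates plugin reports library dependencies with newer versions available; the version number below is indicative only.

```scala
// project/plugins.sbt: illustrative only, check which tooling the platform permits.
// The sbt-updates plugin adds a `dependencyUpdates` task that lists
// dependencies for which newer versions are available.
addSbtPlugin("com.timushev.sbt" % "sbt-updates" % "0.6.4")
```

Running `sbt dependencyUpdates` on a schedule in CI would surface outdated libraries regularly; fully automated pull-request tooling (for example Scala Steward) is a further step where the platform allows it.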

10. Define what success looks like and publish performance data

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that the team:

  • has a performance framework giving them a set of success metrics that directly relates to the aims of the service
  • appears to actively monitor and share its data
  • collects data from a wide range of sources
  • is segmenting data by type of manufacturer to identify problems
  • demonstrated how they had used data to make improvements to their service

What the team needs to explore

Before their next assessment, the team needs to:

  • have a plan for measuring and publishing the 4 mandatory KPIs
  • show how feedback from users (both direct and via help forums) has helped to improve the service

We’d also like to see:

  • the team be more proactive in working with the compliance unit to understand where problems and errors occur (it was explained that this usually happens after a year of live service, but the panel felt that feedback from the compliance teams should be prioritised as it may be the best way of evaluating the quality and accuracy of returns)
  • more research with packaging manufacturers to understand how the tax is influencing decisions on recycling

11. Choose the right tools and technology

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the team chose to use as many pre-built components as possible given their tight delivery deadlines

What the team needs to explore

Before their next assessment, the team needs to:

  • consider whether the shared components are meeting accessibility requirements out-of-the-box
  • evaluate whether they would have made different choices had they not been under a delivery deadline, and whether this can inform future iterations of the service
  • prioritise fixes for high-value pieces of technical debt accrued so far

12. Make new source code open

Decision

The service met point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has kept all code open where possible
  • the only closed-source code is closed because it contains content deemed sensitive

What the team needs to explore

Before their next assessment, the team needs to:

  • investigate whether it’s possible to open the code for the (currently closed) acceptance and performance tests
  • make sure that all open repos have an up-to-date readme, licence and build instructions

13. Use and contribute to open standards, common components and patterns

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the team took advantage of patterns and components offered by HMRC’s infrastructure platform (MDTP) to provide authentication

What the team needs to explore

Before their next assessment, the team needs to:

  • consider whether any of the things they have built can be contributed upstream or extracted into shared components

14. Operate a reliable service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the team had given consideration to ways the service could be impacted beyond code changes, such as volumetric attacks
  • the service’s logging is tied into PagerDuty to inform the in-hours support rota
  • the team is solely concerned with incidents related to their service, with a separate team responsible for platform-related issues

What the team needs to explore

Before their next assessment, the team needs to:

  • plan how to expand their support rota outside of core hours
Published 23 March 2022