Do a tax check when renewing or applying for a licence beta assessment


Service Standard assessment report

Do a tax check when renewing or applying for a licence

From: Central Digital & Data Office (CDDO)
Assessment date: 10 March 2022
Stage: Beta
Result: Not Met
Service provider: HMRC

Previous assessment reports

  • Alpha assessment: 4 August 2021 - Met

Service description

Do a tax check when renewing or applying for a licence will be delivered by 3 micro-services:

  • external service - ‘Do a tax check when applying for a licence’. This service allows an applicant who is seeking to renew certain types of licence (taxi and private hire driver, taxi and private hire operator, scrap metal collector and dealer) to do a tax check. The service checks the applicant’s tax registration status and, if all is in order, provides a tax check code to prove to the licensing body that they are registered for tax. The service is provided for both individuals (PAYE, Self Assessment) and companies (Corporation Tax)
  • external service - ‘Confirm an applicant has done a tax check’. This service allows licensing bodies (LBs) to verify a tax check code that has been provided by an applicant seeking to apply for a licence that is covered by Tax Conditionality legislation
  • internal facing service - ‘Help a customer do a tax check when applying for a licence’. This service allows an HMRC call centre operator to do a tax check on behalf of a digitally excluded applicant who has contacted the HMRC call centre, or is otherwise being supported by HMRC

Service users

  • licence applicants, comprising PAYE, Self Assessment and Corporation Tax customers in the taxi and private hire driver and scrap metal collection sectors in England and Wales
  • licensing bodies for the taxi and private hire and scrap metal collection sectors in England and Wales
  • HMRC call centre operators using the Tax Conditionality internal service to help customers in England and Wales who call in to get a tax check code

1. Understand users and their needs

Decision

The service did not meet point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team had utilised relevant research methods and had a good rhythm of planning, doing, analysing and iterating
  • the team has included authentication through the Government Gateway as part of usability testing and are proactively raising issues it causes for users to the relevant team
  • the team had tested their assisted digital journey using scenarios with participants and contact centre staff
  • the team has worked hard to find and work with people in the scrap metal user groups
  • there are plans to start research early with the new cohorts of users in Northern Ireland and Scotland

What the team needs to explore

Before their next assessment, the team needs to:

  • demonstrate that the service works for everyone who needs to use it. Only 17 applicants have passed through private beta, roughly 0.004% of the estimated user base of 393,000 people. The 17 do not include anyone from the scrap metal industry or anyone with access needs
  • focus user research on people using the service. The team has recently removed access restrictions to the service, effectively making it live, but had no plans to conduct research between now and 4 April to understand the experience of the circa 100 users a day who are now using it. The aim of private beta is to establish whether the service works for everyone in the real world, so this must be the team’s priority
  • ensure users with a variety of access needs are prioritised for research to evidence whether or not the service works for them. The accessibility audit and the work undertaken in alpha are not sufficient to answer this question
  • continue to try to recruit people in the scrap metal sector. Now that the service is effectively live, they could make direct approaches to users, assuming the service allows users to consent to being contacted for research purposes
  • ensure their research sample has a representative geographic spread of users. The team appear to have done a lot of work in London, but there was no evidence of any work being undertaken in Wales

2. Solve a whole problem for users

Decision

The service did not meet point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have a good understanding of their user groups
  • the team have built 3 services

What the team needs to explore

Before their next assessment, the team needs to:

  • demonstrate that the service works for everyone who needs to use it
  • address the challenges and difficulties of doing user research with some user groups, such as the scrap metal sector, which the team described at the assessment
  • continue to explore and test the support channels to ensure that the user needs are taken into account
  • continue to test the user journeys (happy/unhappy paths) and iterate the 3 services

3. Provide a joined-up experience across all channels

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have a good understanding of the user groups

What the team needs to explore

Before their next assessment, the team needs to:

  • explore the email journeys and the use of Notify
  • continue to validate and test the service on mobile, across happy and unhappy paths
  • finalise a Welsh translation of the service
  • work with the GOV.UK team to iterate guidance

4. Make the service simple to use

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has a good understanding of the GDS Design System
  • the team have used common components from the GDS and HMRC design system
  • the team have iterated designs and content to simplify it

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure recommendations from the content review are applied
  • complete an internal content review before coming to reassessment
  • work with the policy team to ensure guidance pages are clear and understood by users
  • user test and iterate guidance pages, feeding back to the content team at HMRC
  • iterate and simplify language within the service, especially microcopy, and apply the guidance at https://www.gov.uk/service-manual/design/writing-for-user-interfaces; some content deviates from this guidance

5. Make sure everyone can use the service

Decision

The service did not meet point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have developed a new pattern
  • the team tested offline journeys by simulating live sessions with users calling the HMRC support line
  • the team have tested the service with accessibility software

What the team needs to explore

Before their next assessment, the team needs to:

  • address accessibility issues highlighted in the accessibility audit
  • use specific user research methods, for example highlighter tests and comprehension tests, to test content, not just usability testing
  • share user research on design patterns with the wider GDS design community
  • continue to audit content and designs against the GDS Design System
  • continue to test the service and user experience on a range of devices e.g. mobile
  • track analytics on error messages and shutter pages within the service to identify problematic content and screens
  • continue to ensure the policy content and service are tested with all user groups, especially users whose first language is not English

6. Have a multidisciplinary team

Decision

The service did not meet point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • all required roles are resourced and there is a core team to take the service forward
  • there are clear routes of escalation for issues and risks
  • the team have managed to overcome a period of high staff turnover

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure stakeholders and business area representatives are included in the agile ceremonies where appropriate
  • ensure knowledge is transferred appropriately by sourcing permanent staff to work closely alongside contractors or third-party suppliers
  • continue to work closely with Policy and stakeholders to influence how the service and approach needs to change as knowledge from real users is gained

7. Use agile ways of working

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the team continues to work in sprint cycles, with sessions to review goals, and issues are escalated appropriately
  • all ceremonies are observed, and retrospectives are undertaken for the team to reflect and continually improve

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure the right level of personnel attend the Project Boards, to allow the team to be empowered to make independent decisions
  • ensure the product roadmap is visible, and the backlog is sufficiently populated to provide stories to the development team

8. Iterate and improve frequently

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has iterated the service with clear enhancements
  • outcomes of user research informed the development
  • analytics were used to inform priorities

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to test and iterate, particularly with those who are not able to use the service and with Scrap Metal licence holders

9. Create a secure service which protects users’ privacy

Decision

The service did not meet point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • citizen accounts are well protected and adhere to GPG 44 and 45, through the use of Government Gateway
  • there is strong validation of tax codes to prevent any possibility of phishing
  • in cases where citizens are able to bypass the real-time tax check (that is, if they declare they don’t have an income from their licensed activity or if they are PAYE), an auditable event is created, and this data is passed to another team in HMRC to check for suspicious activity
  • personal data is encrypted and protected with approved patterns
  • the email addresses of users are validated so that tax codes cannot be generated for users without access to the email address provided
  • there has been a recent ITHC and another is planned for the near future
  • automated security testing (using OWASP ZAP) is in place

What the team needs to explore

Before their next assessment, the team needs to:

  • provide evidence that the measures proposed to flag suspicious tax checks are effective. Although this is not the responsibility of the ‘Do a tax check’ team directly, the effectiveness of the service will, to an extent, depend on how well it can identify users who generate ‘suspicious’ tax codes. During private beta, the service will need to collect more evidence around this; at this point, the number of users who have been through the service (17) does not give the panel enough assurance

10. Define what success looks like and publish performance data

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • they have developed comprehensive performance dashboards including: digital take-up, completion rate, customer satisfaction, cost per transaction, user behaviour and user demographics
  • the metrics are updated in a timely fashion
  • they use common technologies: Splunk and Google Analytics

What the team needs to explore

Before their next assessment, the team needs to:

  • work with the relevant parties to confirm the central data warehouse feed and data are fit for purpose, as service data will go into the warehouse to justify the application
  • estimate the monetary value of each user group; the team are aware of the user numbers in each category, for example scrap metal collectors
  • ensure the reports and data feeds are well documented so they can be maintained over the long run
  • ensure that they can provide metrics around users who start using the service but cannot complete it due to issues around IDV

11. Choose the right tools and technology

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the service has been built with a cloud first approach using a microservices architecture
  • containers are used throughout
  • the front end of the service has been built using a GDS-compliant framework which has been open sourced

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to work with the Government Gateway team to remove the need for users to accept cookies for both the authentication and tax code journeys

12. Make new source code open

GitHub repos published:

Decision

The service met point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the code for the citizen and licensing body services has been released to the department’s public GitHub account
  • the team is coding in the open
  • code is well documented and well structured

What the team needs to explore

Before their next assessment, the team needs to:

13. Use and contribute to open standards, common components and patterns

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the service leverages Government Gateway for authentication and proof of identity
  • there is good use of shared HMRC components and services

What the team needs to explore

Before their next assessment, the team needs to:

14. Operate a reliable service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • monitoring and alerting are following best practice and using appropriate tooling (Kibana and Grafana)
  • the service can be updated quickly and easily with a ‘one-button’ deployment
  • there are appropriate environments in place and good controls of integration and releases of software
  • the team have been working with the HMRC service management team, and there is good evidence that they can cope with the anticipated levels of assisted digital traffic

What the team needs to explore

Before their next assessment, the team needs to:

Updates to this page

Published 15 December 2022