Do a tax check when renewing or applying for a licence beta reassessment
The report for the ‘Do a tax check when renewing or applying for a licence’ beta reassessment on 27 May 2022
Service Standard assessment report
Do a tax check when renewing or applying for a licence
From: Central Digital & Data Office (CDDO)
Assessment date: 27/05/2022
Stage: Beta reassessment
Result: Met
Service provider: HMRC
Previous assessment reports
- Alpha assessment: 4 August 2021 - Met
- Beta assessment: 10 March 2022 - Not Met
Service description
Do a tax check when renewing or applying for a licence will be delivered by 3 microservices:
- external service - ‘Do a tax check when applying for a licence’. This service allows an applicant who is seeking to renew certain types of licences (taxi and private hire drivers, taxi and private hire operators, scrap metal collectors and dealers) to do a tax check. The service checks the applicant’s tax registration status and, if all is in order, provides a tax check code to prove to the licensing body that they are registered for tax. The service is provided for both individuals (PAYE, Self Assessment) and companies (Corporation Tax)
- external service - ‘Confirm an applicant has done a tax check’. This service allows the licensing bodies (LB) to verify a tax check code that has been provided by an applicant seeking to apply for a licence that is covered by Tax Conditionality legislation
- internal-facing service - ‘Help a customer do a tax check when applying for a licence’. This service allows an HMRC call centre operator to do a tax check on behalf of a digitally excluded applicant who has contacted the HMRC call centre, or is being otherwise supported by HMRC
Service users
- licence applicants, comprising PAYE, Self Assessment and Corporation Tax customers in the taxi and private hire driver and scrap metal collection sectors in England and Wales
- licensing bodies for the taxi and private hire and scrap metal collection sectors in England and Wales
- HMRC call centre operators using the Tax Conditionality internal service to help customers in England and Wales who call in to get a tax check code
1. Understand users and their needs
Decision
The service met point 1 of the Standard.
What the team has done well
The panel was impressed that:
- the team has used relevant research methods and has a good rhythm of planning, doing, analysing and iterating
- the team has included authentication through the Government Gateway as part of usability testing and is proactively raising the issues it causes for users with the relevant team
- the team tested their assisted digital journey using scenarios with participants and contact centre staff
- the team has worked hard to find and work with people in the scrap metal user groups
- the team tested the end-to-end service during private beta to demonstrate how someone could successfully get from being aware of the service to receiving a licence
- there are plans to start research early with the new cohorts of users in Northern Ireland and Scotland
What the team needs to explore
Before their next assessment, the team needs to:
- maintain a focus on ensuring the service works for everyone. Their most common users and their journeys are mostly successful so far, but they must keep working hard to ensure other users, especially those who are already disadvantaged in society, can use the service. For example, users with cognitive or mental health-related access needs, or users with low levels of literacy or English skills
- diversify their recruitment methods. There has been a reliance on the survey and other digital means. They could benefit from engaging local communities or campaign groups their users may be involved with. Using a recruiter would help them reach the people they seldom hear from
- understand users’ experience of the unhappy paths through the service, for example, where they have been ‘gracefully exited’. The team needs to demonstrate that these users have been correctly removed and that they understand what to do next
2. Solve a whole problem for users
Decision
The service met point 2 of the Standard.
What the team has done well
The panel was impressed that:
- the team has a good understanding of their user groups
- the team has built 3 services
- the team explained the challenges and difficulties of doing user research with some user groups, for example the scrap metal user groups. They have taken steps to engage with that audience but have so far been unsuccessful. However, they have demonstrated how they have considered and designed the service to accommodate all types of users
What the team needs to explore
Before their next assessment, the team needs to:
- continue to explore and test the support channels to ensure that the user needs are taken into account as the service develops during public beta
- continue developing the web service with any new insights or learning from user contact and feedback
3. Provide a joined-up experience across all channels
Decision
The service met point 3 of the Standard.
What the team has done well
The panel was impressed that:
- the service team has a good understanding of the user groups
What the team needs to explore
Before their next assessment, the team needs to:
- explore the email journeys and the use of GOV.UK Notify
- continue to validate and test the mobile journey, and the happy and unhappy paths
- finalise a Welsh translation of the service
- work with the GOV.UK team to iterate guidance
4. Make the service simple to use
Decision
The service met point 4 of the Standard.
What the team has done well
The panel was impressed that:
- the team has a good understanding of the GOV.UK Design System
- the team has used common components from the GOV.UK and HMRC design systems
- the team has iterated designs and content to simplify them
What the team needs to explore
Before their next assessment, the team needs to:
- ensure recommendations from the content review are applied
- complete an internal content review before the next assessment
- work with the policy team to ensure guidance pages are clear and understood by users
- user test and iterate guidance pages, feeding back to the content team at HMRC
- iterate and simplify language within the service, especially micro content, using and applying the guidance at https://www.gov.uk/service-manual/design/writing-for-user-interfaces; some content deviates from this guidance
5. Make sure everyone can use the service
Decision
The service met point 5 of the Standard.
What the team has done well
The panel was impressed that:
- the team has developed a new pattern
- the team tested offline journeys by simulating live sessions with users calling the HMRC support line
- the service team has tested the service with accessibility software
- all accessibility issues highlighted in the accessibility audit have been fixed and the accessibility audit has been updated
What the team needs to explore
Before their next assessment, the team needs to:
- continue using a range of user research methods and analysis where appropriate to improve the service during public beta, and demonstrate this at the live assessment
- continue to ensure the policy content and service are tested with all user groups, especially users whose first language is not English
6. Have a multidisciplinary team
Decision
The service met point 6 of the Standard.
What the team has done well
The panel was impressed that:
- there’s a core team to take the service forward
- there are clear routes of escalation for issues and risks
- the team have managed to overcome a period of high staff turnover
What the team needs to explore
Before their next assessment, the team needs to:
- ensure stakeholders and business area representatives remain informed
- ensure knowledge is transferred to the delivery centre and the service is maintainable
- continue to work closely with Policy and stakeholders to influence how the service and approach needs to change as knowledge from real users is gained
7. Use agile ways of working
Decision
The service met point 7 of the Standard.
What the team has done well
The panel was impressed that:
- the team continues to work in sprint cycles with sessions to review goals, and issues are escalated appropriately
- all ceremonies are observed, and retrospectives are undertaken for the team to reflect and continually improve
What the team needs to explore
Before their next assessment, the team needs to:
- ensure the right level of personnel attend the Project Boards, to allow the team to be empowered to make independent decisions
- ensure the product roadmap is visible, and the backlog is sufficiently populated to provide stories to the development team
8. Iterate and improve frequently
Decision
The service met point 8 of the Standard.
What the team has done well
The panel was impressed that:
- the team has iterated the service with clear enhancements
- outcomes of user research informed the development
- analytics were used to inform priorities
What the team needs to explore
Before their next assessment, the team needs to:
- continue to test and iterate, particularly with those who are not able to use the service and with scrap metal licence holders
9. Create a secure service which protects users’ privacy
Decision
The service met point 9 of the Standard.
What the team has done well
The panel was impressed that:
- the team is working closely with HMRC’s Knowledge, Analysis & Intelligence (KAI) and Risk & Intelligence Service (RIS) teams, sharing data from the live service
- to date, KAI and RIS have not reported any unusual or suspicious behaviours from users, and the volumes of use are in line with their expectations
10. Define what success looks like and publish performance data
Decision
The service met point 10 of the Standard.
What the team has done well
The panel was impressed that:
- they have developed comprehensive performance dashboards covering digital take-up, completion rate, customer satisfaction, cost per transaction, user behaviour and user demographics
- the metrics are updated in a timely fashion
- they use common technologies: Splunk and Google Analytics (an illustrative sketch of how a metric such as completion rate might be derived from raw events follows this list)
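As an illustration of how one of these dashboard metrics could be derived, the hedged sketch below computes a completion rate from raw journey events. The event names and sample data are hypothetical and are not taken from the service’s actual Splunk or Google Analytics configuration.

```scala
// Illustrative sketch only: deriving a completion-rate figure of the kind shown
// on the team's dashboards from raw journey events. The event names
// ("journey-started", "tax-check-code-issued") are hypothetical.
final case class JourneyEvent(sessionId: String, eventType: String)

object CompletionRate {

  /** Completion rate = sessions that reached the end state / sessions that started. */
  def completionRate(events: Seq[JourneyEvent]): Double = {
    val started   = events.filter(_.eventType == "journey-started").map(_.sessionId).toSet
    val completed = events.filter(_.eventType == "tax-check-code-issued").map(_.sessionId).toSet
    if (started.isEmpty) 0.0
    else (started intersect completed).size.toDouble / started.size
  }

  def main(args: Array[String]): Unit = {
    val sample = Seq(
      JourneyEvent("s1", "journey-started"),
      JourneyEvent("s1", "tax-check-code-issued"),
      JourneyEvent("s2", "journey-started"),
      JourneyEvent("s3", "journey-started"),
      JourneyEvent("s3", "tax-check-code-issued")
    )
    println(f"Completion rate: ${completionRate(sample) * 100}%.1f%%") // prints 66.7%
  }
}
```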
What the team needs to explore
Before their next assessment, the team needs to:
- work with the relevant parties to confirm that the central data warehouse feed and data are fit for purpose, since service data will go into the warehouse to justify the application
- build on their awareness of user numbers in each user category, for example scrap metal collectors, by estimating the monetary value for each of these user groups
- ensure the reports and data feeds are well documented so they can be maintained over the long run
- ensure that they can provide metrics around users who start using the service but cannot complete it due to issues around identity verification (IDV)
11. Choose the right tools and technology
Decision
The service met point 11 of the Standard.
What the team has done well
The panel was impressed that:
- the service has been built with a cloud first approach using a microservices architecture
- containers are used throughout
- the front end of the service has been built using a GDS-compliant framework which has been open sourced (see the illustrative sketch after this list)
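As an illustration of the microservices split described above, the hedged sketch below models the applicant frontend asking a backend service to issue a tax check code. The type and method names are hypothetical and are not taken from the published hec repositories; in the live service this would be an HTTP call between containerised services rather than a direct method call.

```scala
// Illustrative sketch only: a simplified model of the applicant frontend
// requesting a tax check code from a backend microservice. All names below
// are hypothetical and are not taken from hec-applicant-frontend or hec.
import java.time.LocalDate

sealed trait TaxRegime
case object SelfAssessment extends TaxRegime
case object PAYE           extends TaxRegime
case object CorporationTax extends TaxRegime

final case class TaxCheckCode(value: String)
final case class TaxCheckResult(code: TaxCheckCode, expiresOn: LocalDate)

// The frontend depends only on this interface; the backend checks the
// applicant's tax registration status and issues the code.
trait HecBackend {
  def requestTaxCheck(regime: TaxRegime, licenceType: String): TaxCheckResult
}

// Stub standing in for the real backend microservice.
object StubHecBackend extends HecBackend {
  def requestTaxCheck(regime: TaxRegime, licenceType: String): TaxCheckResult =
    TaxCheckResult(TaxCheckCode("ABCD12345"), LocalDate.now.plusDays(120)) // validity period assumed for illustration
}

object ApplicantFrontendDemo {
  def main(args: Array[String]): Unit = {
    val result = StubHecBackend.requestTaxCheck(SelfAssessment, "taxi-driver")
    println(s"Give code ${result.code.value} to the licensing body (valid until ${result.expiresOn})")
  }
}
```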
What the team needs to explore
Before their next assessment, the team needs to:
- continue to work with the Government Gateway team to remove the need for users to accept cookies for both the authentication and tax check code journeys; this will have an impact on the reliability of the data they are able to gather from an analytics perspective
12. Make new source code open
GitHub repos published:
- https://github.com/hmrc/hec-applicant-frontend
- https://github.com/hmrc/hec-licensing-body-frontend
- https://github.com/hmrc/hec-stubs
- https://github.com/hmrc/hec
Decision
The service met point 12 of the Standard.
What the team needs to explore
Before their next assessment, the team needs to:
- share their work more widely across government: HMRC has produced some useful and reusable repos utilising the GDS front end but built with Scala, and this could be shared via the GOV.UK technology blog (https://technology.blog.gov.uk/) or similar. A reference to Scala at HMRC was found at https://opencastsoftware.com/blog/articles/opencast-devs-rise-to-the-scala-challenge/, but it would be useful to share this information more widely in government digital forums, for example on cross-government Slack
13. Use and contribute to open standards, common components and patterns
Decision
The service met point 13 of the Standard.
What the team has done well
The panel was impressed that:
- the team has investigated the suitability of the Companies House login for corporate users (see https://find-and-update.company-information.service.gov.uk/), but found this to be technically incompatible with HMRC systems
- the service team is keeping up to date with current developments on the GDS One Login for Government project and the impact this will have on Government Gateway
What the team needs to explore
Before their next assessment, the team needs to:
- provide an update on how they will continue to support corporate users and provide the enhanced Confidence Level (CL250) which is needed to provide more robust identity assurance
14. Operate a reliable service
Decision
The service met point 14 of the Standard.
What the team has done well
The panel was impressed that:
- monitoring and alerting are following best practice and using appropriate tooling (Kibana and Grafana)
- the service can be updated quickly and easily with a ‘one-button’ deployment
- there are appropriate environments in place and good controls of integration and releases of software
- the team have been working with the HMRC service management team, and there is good evidence that they can cope with the anticipated levels of assisted digital traffic
What the team needs to explore
Before their next assessment, the team needs to:
- revise the content on the service start page which reads ‘Online services may be slow during busy times’ and produce content more in keeping with the performance expectations of the service (see https://www.gov.uk/guidance/complete-a-tax-check-for-a-taxi-private-hire-or-scrap-metal-licence)