Service Standard assessment report
Agent Services
Assessment date: 28/02/2023
Stage: Live
Result: Met
Service provider: HMRC
Service description
Agent Services acts as a gateway to other tax services, which tax agents can access once their clients have authorised them to act on their behalf. The service has two primary user journeys:
Authorisation
This is the first part of the service, where the tax agent creates an agent services account following a series of checks to ensure they meet the criteria to create an account.
They also have the opportunity to map over existing agent-client relationships from legacy VAT and ITSA services.
Relationships
Once the agent has created their account, they are taken to their homepage, where they can start and manage the process of client authorisation. In this section, they can carry out the following transactions:
- request client authorisation
- manage requests from the last 30 days
- copy across VAT and Self Assessment client authorisations
- cancel a client’s authorisation
Along these two primary journeys, there are also additional journeys:
- an overseas journey for agents based outside of the UK
- a route into the Agent Services Account (ASA) from either the Personal Tax Account or Business Tax Account to review client relationships with an agent (Manage Your Tax Agents)
- alternative authorisation for ITSA clients
Service users
The ASA caters for all tax agents in the UK. These range from:
- small or micro agents, with few or no employees, working out of a single office
- medium or large agents, with up to 99 employees and usually more than one office
- enterprise agents, such as the Big Four, which are international and multi-faceted
A secondary user group is formed of the clients these tax agents support. They only interact with the ASA to authorise the agents to deal with their tax affairs on their behalf. They can be broken down into:
- small businesses or sole traders with low or no digital confidence
- small businesses or sole traders that are digitally confident
- mid-size and large businesses with more complex tax affairs
1. Understand users and their needs
Decision
The service met point 1 of the Standard.
What the team has done well
The panel was impressed that:
- the team has spoken to a large number of users in beta - 195 users in total, mainly focusing on agents (173) but also some clients (22), and users with access needs (11)
- the team was able to demonstrate a clear understanding of who its users are. They were able to explain the user segments (split across agents and clients) for the service in detail and had developed a robust set of personas
- the research conducted has provided the team with a solid foundation to understand the needs of their users. This included specific needs for each user type, and where there were overarching needs across agents
- the team has used a broad range of methods (behavioural and attitudinal) to gather user data and develop insights that the team can work with to help improve the service
- the team has spent time with support teams/HMRC helpdesk staff to understand the offline journey, and further support the needs of their users
- the team was open about areas that had proved challenging and had impacted the research - particularly the difficulty of recruiting agents with access needs
- the team has a clear plan of action for research when moving into live - this includes usability testing with agents/clients with access needs and low digital confidence, testing with mobile users and research to support new tax services and features for the ASA
What the team needs to explore
Before the next assessment, the team needs to:
- address ways of conducting research with real users who have access needs or low digital skills. The team understands the limitations of conducting research with proxy users. Moving into live, there should be more avenues to interact with these users (qualitative and quantitative), ensuring their needs are understood
- develop relationships with other research/insight teams across HMRC - not just product teams. This will enable the team to benefit from research conducted in other areas of HMRC (for example, Behaviour Insights, Policy Lab), reducing the load on the team, and using the findings to develop and improve their service further
- the team spoke about agents having an influence on policy at HMRC - particularly the Big Four. The team should use its relationships with policy colleagues (including HMRC Policy Lab) to be involved at the start of policy conversations, enabling the service to adapt more easily to the changing needs of its users
- the team is currently holding personally identifiable data indefinitely so that users can access unlimited historical data. The team should aim to understand more about users' needs to access this type of information, and what retention period would be acceptable while still meeting GDPR requirements
2. Solve a whole problem for users
Decision
The service met point 2 of the Standard.
What the team has done well
The panel was impressed that:
- the team is working closely with adjacent service and policy teams and has a big picture view of services at HMRC
- the team has ensured that users can get through the whole journey and that there are robust support systems in place
- the team understands, and is able to measure, how users access the service
- the team has considered both the needs of small entities and very large accounting operations
- the team is presenting the service to agents at agent forums and soliciting feedback
What the team needs to explore
Before the next assessment, the team needs to:
- monitor the private beta for the guidance page and ensure that they are not falling into the trap of complicating the service and putting more and more guidance elsewhere
- ensure that any additional journeys, particularly if charities are added, are explored through discovery rather than delivered as continuous improvement
- do more research into a multiple agent / broker journey (for example, where a user has more than one service provider). Another organisation that has done this well is Defra, whose farmers often have multiple brokers
3. Provide a joined-up experience across all channels
Decision
The service met point 3 of the Standard.
What the team has done well
The panel was impressed that:
- the team is performing user research in an end-to-end manner
- the team is basing iterations on user research and analytics
- there is a robust support model in place
What the team needs to explore
Before the next assessment, the team needs to:
- ensure that related content, such as the webinars created, stays up to date
- ensure they continue horizon scanning for things that will impact their service - such as One Login
4. Make the service simple to use
Decision
The service met point 4 of the Standard.
What the team has done well
The panel was impressed that:
- there is a style guide in place
- content iterations are grounded in research and moderated with peer reviews and content critiques
- content has a continuous improvement plan
- design iterations, particularly the accordion, have made the service clearer and reduced the need to scroll
What the team needs to explore
Before the next assessment, the team needs to:
- monitor the private beta for the guidance page and ensure that they are not falling into the trap of complicating the service and putting more and more guidance elsewhere
- ensure that there is a focus on simplifying the service wherever possible and not further complicating it over time under continuous improvement
5. Make sure everyone can use the service
Decision
The service met point 5 of the Standard.
What the team has done well
The panel was impressed that:
- there is a Welsh translation team to translate content
- the team has had regular accessibility audits through the Digital Accessibility Centre
What the team needs to explore
Before the next assessment, the team needs to:
- ensure the Welsh content is tested with Welsh users
- continue to prioritise accessibility and assisted digital
6. Have a multidisciplinary team
Decision
The service met point 6 of the Standard.
What the team has done well
The panel was impressed that:
- there is a stable team with very strong expertise in the field, which is largely going to remain the same during live
- although the current Product Manager (PM) is leaving, there has been a thorough handover to a new PM, who comes from a service that has integrated with the ASA and so will be familiar with the service and technology
- the Service Manager has been responsible for the team for the last two years and has helped to ensure the scope for the service is realistic and valuable
- the team is well resourced to continue to support the live service
What the team needs to explore
Before the next assessment, the team needs to:
- ensure that there is more sharing and collaboration between the User-Centred Design (UCD) disciplines and the developers on the team. At the moment there are two tracks of work, with work passed from design to dev as ‘ready’ in a separate scrum board
- ensure that all members of the team engage more with and observe user research, making sure user research is a ‘team sport’
7. Use agile ways of working
Decision
The service met point 7 of the Standard.
What the team has done well
The panel was impressed that:
- agile practices and ways of working are fully embedded in the team
- the team has ensured collaboration and knowledge sharing within the team despite multiple team changes throughout the delivery in early public beta
What the team needs to explore
Before the next assessment, the team needs to:
- ensure that the team attempts to work more as a whole unit and in roughly the same cadence and delivery methodology rather than separate streams for design and tech
- continue engaging and working in the open, in particular by having a long-term roadmap, which they widely promote within the department, and working across stakeholders and services that will need to onboard onto ASA
- collaborate closely with other teams across HMRC who will be looking into authorisation or agent services accounts, as both of these seem to be integral and unique to this service. There is a risk that without collaboration there could be duplication of effort or this service could be deemed redundant without sufficient understanding
8. Iterate and improve frequently
Decision
The service met point 8 of the Standard.
What the team has done well
The panel was impressed that:
- the team is clearly focused on iterating and improving the service based on user feedback and engagement with stakeholders
What the team needs to explore
Before the next assessment, the team needs to:
- increase how much experimentation they do - the team cited Optimizely for defining experiments and running A/B tests to assess changes, but when questioned acknowledged that no experiments have been run in more than a year due to other priorities. Going into live, they should block out some time every quarter for targeted experimentation
- focus on improving the onboarding process for new services to the account, taking into account learnings from previous onboardings and automating elements of the process where possible. The ASA still has a significant number of (mostly legacy) services to onboard, and each could take a large amount of time to complete
- question and ensure they're continuously prioritising work to meet user needs - for example, the decision not to focus on further API improvements (which could significantly benefit their large-scale agent users) was not backed up by sufficiently strong arguments and should be revisited to ensure the service continues to meet user needs
9. Create a secure service which protects users’ privacy
Decision
The service met point 9 of the Standard.
What the team has done well
The panel was impressed that:
- annual penetration tests are carried out, and medium and low risks have been resolved; a recent test picked up a high-level risk on access groups, which was successfully resolved. The team also conducts ZAP testing
- the team has discussed data retention with the Risk Assessor
- data held in the backend Mongo stores is encrypted at field level, with selected data redacted, for example tax identifiers (see the sketch after this list)
- platform-level security is managed by a dedicated team
- email verification has been implemented, along with standard identity checks against Companies House and verification through the security of HMRC's APIs
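The report does not include implementation detail for the redaction described above. As a purely illustrative sketch, the following Scala (the language HMRC's MDTP services are typically built in) shows one way a tax identifier might be masked before a record reaches logs or a non-secure store; the record shape and `redact` helper are assumptions for illustration, not the team's actual code.

```scala
// Illustrative only: mask a tax identifier before it reaches logs or
// any non-secure store. Record shape and field names are hypothetical.
final case class ClientRecord(name: String, taxIdentifier: String)

object Redaction {
  // Keep only the last three characters of the identifier visible.
  def redact(identifier: String): String =
    if (identifier.length <= 3) "***"
    else "*" * (identifier.length - 3) + identifier.takeRight(3)

  def forLogging(record: ClientRecord): ClientRecord =
    record.copy(taxIdentifier = redact(record.taxIdentifier))
}

object RedactionDemo extends App {
  val client = ClientRecord("Example Ltd", "1234567890")
  println(Redaction.forLogging(client)) // ClientRecord(Example Ltd,*******890)
}
```

In a real service, masking like this would typically sit alongside the field-level encryption the panel describes, so identifiers are never exposed in plain text to downstream consumers.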
What the team needs to explore
Before the next assessment, the team needs to:
- identify the Data Protection Officer (DPO), as required under GDPR
- perform information assurance and data protection impact assessments. This should include a review of the data retention policy with respect to client access, and a review of agents not registered with HMRC, for whom API verification is not possible
- continue to keep up to date with security-related best practices
10. Define what success looks like and publish performance data
Decision
The service met point 10 of the Standard.
What the team has done well
The panel was impressed that:
- there is effective and planned use of A/B testing to understand the impact of changes made to the service
- there is good evidence that data is being used to analyse the impact of changes made to the service, and statistical methods are used to ensure that the results of analysis are statistically robust (one illustrative check is sketched after this list)
- there is a range of data being used to understand user behaviour and report on the service's online performance and impacts
- there are dashboards and reports showing the performance of the service and these are shared with the team and stakeholders where necessary
- the Performance Analyst holds regular meetings to discuss performance data and analysis with the service team
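The report does not name the specific statistical methods the team uses. As one hedged illustration, a two-proportion z-test is a common way to confirm that a difference in completion rates between an A/B variant and its control is statistically robust; the figures below are invented for the example.

```scala
// Illustrative only: two-proportion z-test for an A/B comparison.
object AbTestCheck extends App {
  // Returns the z-score for the difference between two proportions.
  def zScore(successesA: Int, totalA: Int, successesB: Int, totalB: Int): Double = {
    val pA     = successesA.toDouble / totalA
    val pB     = successesB.toDouble / totalB
    val pooled = (successesA + successesB).toDouble / (totalA + totalB)
    val se     = math.sqrt(pooled * (1 - pooled) * (1.0 / totalA + 1.0 / totalB))
    (pA - pB) / se
  }

  // Invented figures: 420/1000 completions for the variant vs 380/1000 for control.
  val z = zScore(420, 1000, 380, 1000)
  // |z| > 1.96 corresponds to p < 0.05 on a two-sided test.
  println(f"z = $z%.2f, significant at the 5%% level: ${math.abs(z) > 1.96}")
}
```

A check like this guards against shipping a change on the back of a difference that sits within normal random variation.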
What the team needs to explore
Before the next assessment, the team needs to:
- demonstrate how the online journey impacts offline processes - for example, whether changes to the online service have a positive or negative impact on offline processes such as telephone services
11. Choose the right tools and technology
Decision
The service met point 11 of the Standard.
What the team has done well
The panel was impressed that:
- the team is using common components and the well-known services of the HMRC platform; the Agent Services Account therefore runs on cloud technology
- the service uses only MDTP patterns and technologies
- the latest versions of assets are used
What the team needs to explore
Before the next assessment, the team needs to:
- explore the GOV.UK One Login service for identification where possible
12. Make new source code open
Decision
The service met point 12 of the Standard.
What the team has done well
The panel was impressed that:
- the team has stored source code in GitHub where possible
What the team needs to explore
Before the next assessment, the team needs to:
- confirm which Open Source Initiative licence applies to released source code
13. Use and contribute to open standards, common components and patterns
Decision
The service met point 13 of the Standard.
What the team has done well
The panel was impressed that:
- the team used GOV.UK patterns and, where no suitable GOV.UK pattern was available, found an appropriate one (select from a large list) from MOJ
- common components are used across the HMRC estate with registered APIs, and software is stored in GitHub
- the team used standards such as OAuth for client security integration
- the backend of the service is a completely protected zone
- standard patterns and the latest versions of assets are used
- consideration has been given to the impact of scaling API data transfer volumes, with re-requests made when necessary (see the sketch after this list)
- OWASP guidance has been used
- there is a Change Advisory Board (CAB) process in place for platform readiness assessments, including impact and risk assessment for enhancements
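The re-request behaviour mentioned above is not specified in the report. As a hedged sketch only, the following Scala shows a common retry-with-exponential-backoff shape for re-requesting an API call under load; `fetchPage` is a hypothetical stand-in, not the team's code.

```scala
// Illustrative only: retry an API call with exponential backoff.
import scala.annotation.tailrec
import scala.util.{Failure, Success, Try}

object RetryDemo extends App {
  @tailrec
  def withRetries[A](attemptsLeft: Int, delayMs: Long)(call: () => A): Try[A] =
    Try(call()) match {
      case s @ Success(_) => s
      case Failure(_) if attemptsLeft > 1 =>
        Thread.sleep(delayMs) // back off before re-requesting
        withRetries(attemptsLeft - 1, delayMs * 2)(call) // double the delay
      case f @ Failure(_) => f
    }

  // Hypothetical call that fails twice with a transient error, then succeeds.
  var calls = 0
  def fetchPage(): String = {
    calls += 1
    if (calls < 3) throw new RuntimeException("transient error")
    "page-1"
  }

  println(withRetries(attemptsLeft = 3, delayMs = 100)(() => fetchPage())) // Success(page-1)
}
```

Doubling the delay on each failed attempt keeps re-requests from amplifying the very load spike that caused the failure.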
What the team needs to explore
Before the next assessment, the team needs to:
- engage a Technical Design Authority (TDA) for endorsements and approvals of integrations across the estate, to support impact assessment across the technical estate
14. Operate a reliable service
Decision
The service met point 14 of the Standard.
What the team has done well
The panel was impressed that:
- the team has reliable CI/CD deployment pipelines using Jenkins
- the team has embraced architecture by design, using architects to design the access groups and consulting them regularly on other significant future changes and enhancements
- the Splunk monitoring console is used to help maintain a reliable service, keeping audited secure data in Splunk and only volatile, non-sensitive data in Kibana
What the team needs to explore
Before the next assessment, the team needs to:
- understand which components must be available 24/7 and create a plan for supporting them in business as usual (BAU)