Look up residency status for relief at source live assessment

The report from the live assessment for HMRC's look up residency status for relief at source on 16 August 2018.

From: Central Digital and Data Office
Assessment date: 15 August 2018
Stage: Live
Result: Met
Service provider: HMRC

The service met the Standard because

  • whilst meeting the user needs identified in Discovery, Alpha and Private Beta, the team successfully identified new user needs in Public Beta and iterated the service to meet them as well
  • the service team has been using continuous user research and analytics to identify problems, test new solutions and iterate the service accordingly. They have based their decisions on user needs and testing, and have remained consistent with GOV.UK patterns throughout
  • there is a clear plan for handing over the service to a Live Team that includes the key roles needed to iterate the service in Live. The excellent setup and use of analytics will help the Live Team identify any new emerging problems.

About the service

Description

The service lets a Pension Scheme Administrator (PSA) look up a pension member’s residency status (Scotland or the rest of the UK) so they can claim the right relief at source.

Service users

The users of this service are pension scheme administrators and pension scheme practitioners.

Detail

User needs

The team have researched with 27 users in Public Beta. They continued with usability testing, including live observations with 16 users, and held 2 rounds of usability testing with users of assistive technology.

The team used visuals to evidence how their user needs and personas have continued to evolve as a result of research insights. It was good to see some new user needs that had been identified in public beta, and how the team have improved the user journeys as a direct result of research.

The team used public beta to progress research with new, first time users of their service to evidence that users can complete their journey successfully, at their first attempt. They shared examples of how they have used research sessions to test iterations and then observe how effective these changes were from a user perspective, once implemented into the live service.

Research shows that the majority of users access the service via direct links, which suggests users skip the GOV.UK guidance available to them. Although research has not shown this to be detrimental to users, further research into why users do this, and how they link directly, would provide valuable insights.

Researching how users access and apply the current Relief at Source (RAS) guidance will further develop understanding of how it is used, the level of detail users need, and how relevant it is before and during their journey.

Promotional exercises continue through pension industry forums and newsletters to encourage digital uptake, and it was good to hear the team have done some research with users who are yet to use the service. The team need to ensure that future first time users are aware of the ‘tool’ and are able to use it effectively to meet their need to check the residency status of members.

The team are scheduling a handover of the research to the HMRC live team. This will include an overview of the research conducted to date, and plans for research moving forward. Bulk upload is anticipated at the start of the new tax year, which will be a good opportunity to research across larger volumes of users.

Team

The team is made up of the necessary roles for public beta, and they have all used their expertise well to make decisions that benefit users of the service. The service team has been able to continuously iterate the service through rounds of testing and deployment.

The Digital Service Manager in Live will work with the live services team to continue to iterate the service. HMRC has an established way of making this happen, as exemplified by other live services. Moreover, funding is secured for further iterations, and a clear roadmap has been identified for user research and for potential iterations based on policy changes (Wales may diverge in tax policy for relief at source).

Technology

The technical structure of the service has not changed at all since the beta assessment and the team are continuing to follow good practices around development and deployment. It was good to see that they were able to quickly iterate in response to issues raised by user research during the public beta, making changes to the frontend and adding preprocessing to fix a common issue with uploaded files.

As the team prepare to hand the service over to the live services team they have produced a runbook for handling support queries and are holding a workshop to run through the code with the developers. Similarly, they are in contact with the software support team who will deal with any issues with the API.

The service is dependent on the upstream data matching API and if that goes down the service is shuttered and replaced with a holding page; this happened during the beta and the team were able to exercise the process, which involves running a Jenkins script.

They continue to be conscious of issues around data security. They have carried out a pentest of the service and talked regularly with the data guardian. No fraud vectors have been identified and they are continuing to work with the transaction monitoring team. Since no data is stored long-term, there is no need for a data recovery strategy.

Design

The team explained how research, analytics and design worked together to improve the service. They reviewed research findings and analytics to identify problems, prototyped and tested new designs, then observed people using the new designs in the live service.

The design of the service is robust, based on evidence and meets user needs. In live service testing, one user said “it told you straight away the file was ready – it’s much easier than it was before”.

The team described how they tried to help people who struggled to upload files in the service. They tried guidance, error messages and other help, but none of these worked.

They decided to strip the hidden characters from uploaded files for their users. Usability testing, live observations and analytics show new users succeed first time. This was an excellent decision and a great example of doing the hard work so the user does not have to.
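To illustrate the kind of clean-up described above, here is a minimal sketch in Python. It is not the team’s actual implementation: the specific set of hidden characters (byte order marks, zero-width spaces, non-breaking spaces) and the CSV field format are assumptions for the example.

```python
# Illustrative sketch (not the service's actual code): stripping hidden
# characters that spreadsheet exports commonly add to CSV fields,
# before the fields are validated.
import csv
import io

# Assumed set of problem characters: BOM, zero-width chars, non-breaking space
HIDDEN_CHARS = "\ufeff\u200b\u200c\u200d\u00a0"

def clean_field(value: str) -> str:
    """Remove hidden characters and surrounding whitespace from one field."""
    for ch in HIDDEN_CHARS:
        # Treat non-breaking spaces as ordinary spaces; delete the rest
        value = value.replace(ch, " " if ch == "\u00a0" else "")
    return value.strip()

def clean_csv(raw: str) -> list[list[str]]:
    """Parse a CSV string and clean every field in every row."""
    reader = csv.reader(io.StringIO(raw))
    return [[clean_field(field) for field in row] for row in reader]

# A row pasted from a spreadsheet with a BOM and a zero-width space:
rows = clean_csv("\ufeffAB123456C,Smith\u200b,John\n")
# rows[0] == ["AB123456C", "Smith", "John"]
```

Doing this clean-up server-side, rather than rejecting the file with an error, is what lets first-time users succeed without ever knowing the characters were there.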

The design changes:

  • give users information sooner, in a clearer way
  • prevent users from trying to do things they cannot do
  • use standard patterns like confirmation pages, status tags and a task list
  • make the service more efficient for users who do more than one search.

The team could not find any assisted digital or access needs users. They asked in research sessions, at pension forums, and in the pensions newsletter. To confirm the service was accessible, they had 3 accessibility reviews, a Digital Accessibility Centre audit, and tested with internal HMRC assistive technology users. All these activities have made the service’s design, content and code more accessible.

Guidance on GOV.UK about the service is at the end of a long page after information about a yearly report. The team’s research shows the guidance may not meet user needs. People are reading it less and the order of the page makes it more difficult to find information about looking up new members.

The team tested the offline support model with 2 rounds of mystery shopper calls. They showed people get directed to the right helpline if they call another HMRC helpline. But they found advisers did not give the right advice. Following the first round, they worked with stakeholders to give advisers a revised briefing document. This improved the advice they got in the second round.

Although the team make support available via the pension helpdesk, it is still unclear whether users are aware of it and can access this support easily.

The team expect digital take-up to increase if the Scottish rate of Income Tax changes. They are going to stay involved with the pensions forum calls and newsletters to keep raising awareness. The team confirmed that monitoring analytics, feedback and help requests, along with more research with new users, is in the user research plan for live.

Analytics

The team’s approach to data and analytics was impressive. Since their last assessment the team have addressed limitations in their tracking of users by implementing sophisticated custom dimensions. With these, the team are able to analyse user interaction in depth - joining IDs and timestamps to understand journey time and journey completions. In their analysis the team have moved beyond straightforward journeys and can identify users who are struggling.
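The kind of join described above can be sketched in a few lines of Python. This is an illustrative example only: the event format (session ID, ISO timestamp, event name) and the event names are assumptions, not the team’s actual analytics pipeline.

```python
# Illustrative sketch (assumed event format): joining a session ID custom
# dimension with timestamps to derive journey times and a completion rate
# from raw analytics events.
from datetime import datetime

events = [  # (session_id, timestamp, event_name) - hypothetical export
    ("s1", "2018-08-01T09:00:00", "journey_start"),
    ("s1", "2018-08-01T09:03:30", "journey_complete"),
    ("s2", "2018-08-01T10:00:00", "journey_start"),  # never completes
]

def journey_stats(events):
    """Return per-session journey times (seconds) and the completion rate."""
    starts, ends = {}, {}
    for session_id, ts, name in events:
        t = datetime.fromisoformat(ts)
        if name == "journey_start":
            starts[session_id] = t
        elif name == "journey_complete":
            ends[session_id] = t
    # Journey time only exists for sessions that both started and completed
    times = {s: (ends[s] - starts[s]).total_seconds()
             for s in starts if s in ends}
    completion_rate = len(times) / len(starts) if starts else 0.0
    return times, completion_rate

times, rate = journey_stats(events)
# times == {"s1": 210.0}, rate == 0.5
```

Sessions that start but never complete (like "s2" here) are exactly the "users who are struggling" that this sort of analysis surfaces.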

A combination of Google Analytics reports, Splunk dashboards and Google Sheets reports are used by the team to monitor performance. These cover a variety of metrics, from top level data to granular information, allowing the team to analyse all of their data in one place - outside of the limitations of Google Analytics. As well as analytics data these dashboards also monitor user satisfaction, which is exceeding the target set out in the team’s Performance Framework. When there are anomalies or changes in this survey data the team are able to tie it back to other metrics, for example identifying an increase of traffic to an error page.

The team consistently demonstrated how they have used analytics data to improve the service. For example, when uploading a CSV some users were getting an error because additional characters were being added to the strings, creating invalid entries. By using event tracking the team were able to monitor these errors, implement a fix and confirm it had worked. The team used similar insight to shorten the journey for users doing multiple single searches, ultimately streamlining this journey.

As the service moves to live the team will follow a well established practice at HMRC of handing analysis over to the new service team - with continuity being provided from the current Performance Analyst when needed.

Recommendations

The service team should also:

  • add their hidden character findings to the file upload component issue in the GOV.UK Design System
  • do a content review on the GOV.UK guidance that uses all their research findings and design changes to decide if the guidance is needed in its current form
  • carry out more research on the end-to-end journey from GOV.UK guidance into the service – this should look at reordering the guidance, separating it into 2 guides, and adding information about the pensions helpline
  • hold retrospective research sessions with users who have used Relief at Source to check the residency status of their members. This should include first time users. Doing this will provide direct feedback on user experience and insights into why users chose to use the service when they did, what encouraged them to do so and how successful their journey was

  • research how users gain access to the support channel (pension helpdesk) should they require assistance and continue to monitor use of the service in Live. This will help identify any pain points that haven’t been identified in research so far – and how users overcome these
  • monitor usage, and carry out live observations of users of the service when traffic increases. This should cover new users, single status checks and file upload journeys

  • monitor the communications being used to promote the service to, and inform, PSAs (Pension Scheme Administrators). This will provide insight into user awareness that the service exists and how users became aware of it, specifically for those who are yet to use the service.

Next steps

You should follow the recommendations made in this report as you continue to develop your service.

Submit feedback

Submit feedback about your assessment.

Get advice and guidance

The team can get advice and guidance on the next stage of development.

Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it Met

Updates to this page

Published 3 April 2019