Research and analysis

The feasibility of measuring the Tax Administration Strategy's impact

Published 26 September 2024

Prepared by Verian (formerly known as Kantar Public) for HMRC

Authors: Karen Bunt, Bethany Dokal and Danny Price

Report Number: 746

January 2024

Disclaimer: The views in this report are the authors’ own and do not necessarily reflect those of HM Revenue and Customs.

1. Executive Summary

1.1 Research context and design

Currently there is an evidence gap relating to how best to understand the impact of the Tax Administration Strategy (TAS) on customers’ experience of meeting their tax obligations. This report focuses on the time expended and the mental, emotional and physical effort involved in meeting those obligations. In December 2022, Kantar Public was commissioned by HM Revenue and Customs (HMRC) to undertake qualitative research to explore options for gathering this information.

The overall aim of this feasibility study was to examine options for collecting close to real-time longitudinal data on the impact of HMRC’s TAS reforms on customers with a view to establishing a robust methodology and baseline to underpin the evaluation of the strategy.

This research took a phased approach and included a scoping phase, 8 in-depth interviews with research design and tax experts and 10 focus groups with HMRC customers. The groups included a range of HMRC small business and individual customers, including individuals who sold online or were involved in gig work, sole traders and micro-businesses with up to 4 employees. Customer groups were further divided by the complexity of their tax affairs.

Fieldwork took place from 20 February to 16 May 2023.

1.2 Key findings

The initial brief suggested using a mobile app or a short survey sent by text message to collect a real-time measurement of the time and effort expended by customers in dealing with their tax. However, the desk research and interviews with research design experts indicated that this approach would not be feasible. In particular, the use of apps was seen as too costly, and an SMS survey was deemed unsuitable due to the practical and technical limitations of that mode.

A fully longitudinal design was also advised against, as research design experts questioned the need to collect data to measure change at an individual level, especially considering the additional cost and complexity of longitudinal approaches. Instead, we proposed an alternative design. The main recommendations are:

Design: we recommend utilising a mixed approach which would include a cross-sectional design for those who have low tax complexity and a longitudinal design for customers with high tax complexity. This provides a realistic way of capturing regular data on customers’ experience of meeting their tax obligations. This combination of designs would allow HMRC to monitor population level changes, while also being able to detect individual level changes in customer groups that are most likely to be impacted by TAS reforms.

Data collection and frequency: we recommend varying the data collection modes and the frequency of data collection for each population of interest. For example, PAYE customers would be surveyed less frequently as they have fewer interactions with HMRC and are likely to experience fewer changes as a result of the TAS reforms.

Survey length: we recommend HMRC administer short, sharp surveys to collect customer data, which should take no longer than 5 minutes to complete. This will maximise response rates and ensure customers in the longitudinal design continue to take part in future surveys.

Questionnaire design: this research has confirmed that effort could be used as a measure in the proposed study. The precise wording of the questions, and the accuracy with which customers recall this information, need further development and testing. This would include thinking about the time customers spend and the tasks they complete in order to meet their tax obligations.

Finally, we recommend conducting a pilot study to test key elements of the recommended study design. This would include the cognitive testing of survey questions and monitoring response rates, attrition, and the impact of varying survey frequency.

Although this alternative design is not fully longitudinal and does not incorporate the use of apps, it will still enable HMRC to capture changes in customers’ experience of meeting their tax obligations over time.

2. Background and method

2.1 Background

In July 2020, HM Treasury and HM Revenue and Customs (HMRC) set out a 10-year Tax Administration Strategy (TAS), which aims to build a trusted, modern tax administration system. It describes a programme of modernisation that moves from a paper-based, time-lagged system to fully integrated, digital, real-time processes. There are plans to better align taxes, reliefs and benefits. The programme is fundamental in building and maintaining trust among taxpayers and the wider public.

HMRC’s modernised tax administration system will: 

  • make it easy for customers to calculate and pay the right tax
  • be fair and transparent
  • improve customer experiences, for example, by making it easier to meet tax obligations
  • build trust in HMRC
  • reduce the cost of tax administration, for example, by reducing the time it takes to meet tax obligations

HMRC have a programme of research to examine specific reforms. Currently there is a knowledge gap relating to how best to understand the impact of the TAS on customers’ experience of meeting their tax obligations. This research focuses on the time expended and the mental, emotional and physical effort involved in meeting those obligations. In December 2022, Kantar Public was commissioned by HMRC to undertake qualitative research to explore options for gathering this information.

2.2 Research objectives

The overall aim of this feasibility study was to explore options for gathering close to real-time longitudinal data on the impact of HMRC’s TAS reforms, with a view to establishing a robust methodology and baseline to underpin the evaluation of the strategy.

The core objectives of the research were to:

  • determine how to engage and retain participants’ input over an extended period
  • review the options for using an app, SMS or an alternative service to collect near real-time information
  • determine how best to capture regular data on time spent meeting tax obligations
  • determine how best to capture regular data on the perceived mental, emotional and physical effort of meeting tax obligations
  • examine the best way to collect this information from different types of taxpayers, such as Income Tax Self Assessment (ITSA) and small businesses
  • determine the optimum frequency for requesting this information from different taxpaying groups
  • provide a suggested design for the proposed study (including sample size) and a cost for a pilot which would meet measurement requirements robustly and representatively

2.3 Method

A phased approach was developed. It included a scoping phase, qualitative interviews with experts and focus group discussions with HMRC customers.

2.3.1 Scoping phase

The scoping phase consisted of desk research and a 2-hour online workshop with HMRC stakeholders. The desk research was undertaken between December 2022 and early January 2023 and was used to identify potential tools, measures, and research designs.

The workshop reviewed the design and research parameters, providing a long list of options and criteria to explore further in the discussions with experts and customers. It took place on 16 January 2023 and was attended by 10 HMRC stakeholders involved in the design and roll out of the TAS reforms and 3 members of the core HMRC research team.

2.3.2 Interviews with experts

The 8 qualitative depth interviews with experts took place between 20 February and 9 March 2023. The discussions took place online via Zoom and lasted approximately an hour. The interviews were moderated by members of the research team. The experts consulted either had expertise in conducting longitudinal research or in tax policy and administration. Findings from the interviews fed into the development of the research materials used for the customer focus groups.

2.3.3 Customer focus groups

The customer focus groups took place between 27 April and 16 May 2023. Ten online focus groups were held with a total of 47 HMRC customers. The groups included a range of HMRC small business and individual customers. This included individuals who sold online or were involved in gig work, sole traders and micro-businesses with up to 4 employees.

Customer groups were further divided by the complexity of their tax affairs. Individuals who had multiple sources of income, paid tax by both Pay As You Earn (PAYE) and Self Assessment (SA) and/or worked or sold goods online were defined as having more complex tax affairs. Small businesses paying VAT or other business taxes were similarly defined as having more complex tax affairs.

Customers were recruited by a third-party recruitment agency, using a screener to ensure they met key research criteria. The groups were moderated by members of the Kantar Public research team.

2.4 Reading this report

This report draws together findings from all 3 phases of the study.

For comprehension purposes, focus group respondents are referred to as ‘customers’ throughout, whilst the term ‘participants’ has been used to describe those who would participate in the proposed study.

Throughout this report, findings from customers are supported by quotes, where available. Accompanying illustrative quotes are included in this report in the format:

 “Quote”. (Customer type, high or low tax complexity)

3. Scoping stage findings

Kantar Public conducted a literature review to gather learnings to inform the research design, including longitudinal approaches, high frequency data collection and considerations for conducting research with taxpayers. It covered articles, books and technical papers, as well as other HMRC customer research and longitudinal studies, including Understanding Society (USOC).

The findings and initial considerations for the research design options were reviewed in a workshop with HMRC stakeholders.

This chapter outlines the main findings, discussion points and outcomes from the workshop, as well as the implications for the later stages of the research.

3.1 Survey design

The scoping stage focussed on the longitudinal design for the proposed study. A longitudinal design involves repeatedly surveying the same participants to detect and track change over time at the individual level.

The main considerations for determining the survey design approach were the importance of tracking change at the individual level, limiting participant attrition (the number of participants who leave the study between waves) to a sustainable rate and design costs.

Given the time horizons for the TAS reforms to be fully rolled out, it was recognised the study will need to run for many years (approximately 5 to 10 years).

3.1.1 Design approaches

Adopting a longitudinal approach for measuring the impact of the TAS reforms has a range of benefits that were identified through desk research.

A longitudinal approach would allow for the measurement of individual level change. This may be most suitable to evaluate impacts that are likely to vary between customers. We know from previous research with HMRC customers that views about the ease of dealing with tax are influenced by a range of factors. These include the nature, frequency and complexity of customers’ tax liabilities, and their knowledge and confidence in dealing with tax, including whether they use an agent and their digital capabilities.

Furthermore, the TAS reforms were expected to be implemented in stages and therefore impact customers at different times. Longitudinal tracking would allow the study to examine how the benefits build over time and more specifically, whether there is a tipping point. This is where small, incremental changes start to have a significant impact on customers’ experiences of meeting their tax obligations.

The desk research also highlighted several drawbacks to longitudinal approaches.

Firstly, these approaches have lower response rates initially given the commitment required of participants. Participants are generally less willing to join a longitudinal study, knowing that they are required to complete multiple survey waves. This makes it more difficult to recruit enough participants to be representative of the target customer group, particularly for more niche and harder to reach groups.

Secondly, longitudinal approaches are negatively affected by attrition as participants drop out over time. This reduces the usable sample size and thus the robustness of the data. If the attrition rate varies between different sub-groups, this can introduce bias to the research.

Thirdly, longitudinal research tends to be expensive due to higher recruitment costs: a larger sample size is required to account for participant attrition, and incentives are needed to maintain participant engagement.
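As a purely illustrative sketch of why attrition inflates recruitment requirements, the calculation below uses assumed figures (an 80% wave-on-wave retention rate, 6 waves and a target of 1,000 usable responses at the final wave); these are not estimates from this study and would need to be established in a pilot.

  # Illustrative only: the retention rate, number of waves and target are assumed figures,
  # not estimates from this research.
  retention_per_wave = 0.80      # assumed share of participants retained from one wave to the next
  waves = 6                      # assumed number of survey waves
  target_final_responses = 1000  # assumed usable responses needed at the final wave

  share_remaining = retention_per_wave ** (waves - 1)  # 0.8 to the power 5, roughly 0.33
  initial_recruits_needed = target_final_responses / share_remaining

  print(f"Share of the wave 1 sample still responding at wave {waves}: {share_remaining:.0%}")
  print(f"Initial recruits needed for {target_final_responses} final responses: {initial_recruits_needed:.0f}")

On these assumed figures, roughly 3 times the target sample would need to be recruited at the first wave, which is why larger initial samples and higher recruitment costs are flagged above.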

Given the drawbacks of longitudinal designs, a repeat cross-sectional approach using stratified sampling was also considered. This involves repeatedly surveying different but characteristically matched samples, covering the full cross-section of the population, at different points in time. A sufficiently large sample means that each repeat sample captures the full range of experiences, ensuring they are comparable.

This approach allows for the monitoring of aggregate change in experience of different groups over time, but it cannot track individual level change. The main benefit of this approach is the lower cost. It requires a smaller sample size and the incentive costs and effort involved in maintaining the study sample would be lower.

3.1.2 Longitudinal design options

At the stakeholder workshop Kantar Public presented 3 longitudinal design options:

Firstly, a monotonic design: participants are required to complete all survey waves. If they fail to respond to one wave, they are removed from the study and not invited to later waves.

Secondly, a non-monotonic design: participants joining the study in the first wave remain in the study even if they do not respond to every survey, allowing for occasional responses.

Thirdly, a mixed design: a design with monotonic and non-monotonic elements, whereby participants who join the panel in the first wave are only removed from the study if they do not respond to a fixed number of surveys, allowing some waves to be skipped.
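To make the distinction between these options concrete, the short Python sketch below expresses each as a panel retention rule. It is illustrative only: the threshold of 2 missed waves used for the mixed design is a hypothetical example, as the fixed number was not specified at this stage.

  # Illustrative sketch of the 3 retention rules; the missed-wave threshold is a hypothetical example.
  def still_in_panel(missed_waves, design, max_missed=2):
      """missed_waves is a list of booleans, one per wave so far (True = participant missed that wave)."""
      if design == "monotonic":
          # Removed as soon as any wave is missed.
          return not any(missed_waves)
      if design == "non-monotonic":
          # Never removed; occasional responses are acceptable.
          return True
      if design == "mixed":
          # Removed only once a fixed number of waves have been missed.
          return sum(missed_waves) < max_missed
      raise ValueError(f"unknown design: {design}")

  # Example: a participant who missed wave 2 but responded at waves 1 and 3.
  history = [False, True, False]
  for design in ("monotonic", "non-monotonic", "mixed"):
      print(design, still_in_panel(history, design))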

3.1.3 Survey design conclusions

Stakeholders wanted to measure individual change and so preferred a longitudinal approach, but, understanding the drawbacks of longitudinal studies, did not rule out a cross-sectional approach.

Of the longitudinal designs presented, they preferred the mixed design. It was agreed that this would give the opportunity to measure individual change over time. It would also provide a ‘temperature check’ of customer feeling towards the reforms by including participants who dipped in and out of the study in the analysis.

Stakeholders felt a mixed design would be more cost effective than monotonic designs, as those who respond occasionally remain in the study. It would also lessen the impact of attrition on sample size over time. A mixed design was also preferred to the non-monotonic design, as there were likely to be smaller gaps between respondent survey completions.

Stakeholders agreed that other measures to help maintain sample size and research quality were required. This included recruiting a large initial sample to account for attrition and to ensure customer groups were large enough to be analysed separately. They also recognised that topping up the study with additional sample would be required to maintain its size and representativeness.

3.2 Elements of data collection

During the scoping phase Kantar Public explored which elements of data collection were suitable for the proposed study. This included survey mode, frequency of data collection, recruitment strategy and approaches to maintaining the study sample.

3.2.1 Survey mode

Potential mode options, including online surveys, mobile apps and telephone surveys, were outlined in the workshop.

Stakeholders originally favoured a mobile application or SMS survey approach, which was felt to be most suitable for gathering ‘close to real-time’ data.

Desk research highlighted that this would be an expensive approach, which was explained at the workshop. For this reason, stakeholders’ next preference was for an online approach. This was recognised to be less expensive, but still reflected the TAS aims of ‘embracing digital’.

Kantar Public also recommended online surveys as they suit the short survey length (3 to 5 minutes): they can be completed quickly and submitted instantly.

It was explained that this also allowed multiple modes of contact to recruit participants. For example, they could be invited to take part via letter, email and text. This would help to increase response rates by increasing the likelihood that potential participants would receive the invitation.

A drawback of the online approach was that it would exclude the offline population. To mitigate this, an offline approach could also be included, for example telephone or face-to-face interviews.

3.2.2 Frequency

The desk research suggested the frequency of data collection should relate to the expected frequency of change researchers are monitoring. This means researchers should avoid regularly asking participants questions on areas or topics in which they are unlikely to have experienced change. This helps to avoid participant fatigue and panel conditioning, whereby survey answers are influenced by previous involvement in the study.

The frequency with which customers interact with HMRC varies, but most interact infrequently (once or twice a year). A minority of customers, who have more complex tax dealings, interact with HMRC more frequently, but this rarely exceeds 4 times per year.

Stakeholders agreed that frequency of contact should vary by customer group to reflect the varying levels of interaction customers have with HMRC and the tax system, which was further explored in the customer focus groups.

3.2.3 Recruitment

The desk research identified best practice approaches to recruitment for longitudinal studies.

Firstly, the initial panel recruited should be sufficiently large. This will ensure the survey continues to yield robust data, despite attrition or participants failing to respond to certain waves.

Secondly, researchers should consider offering a larger incentive to recruit participants, as this encourages involvement in later waves.

Thirdly, in the recruitment survey researchers should collect participant details that are required for analysis and unlikely to change (such as tax behaviour and demographics). This enables later surveys to be shorter, reducing the burden on participants.

3.2.4 Maintaining the sample

Literature highlighted that in longitudinal studies, research teams endeavour to maintain the size of the sample. The 2 approaches to this were minimising attrition and topping up the sample with additional participants to replace those who had left the study.

Building on this information, HMRC agreed that the study sample should be topped up with fresh participants to maintain sample size and representativeness. Discussion at the workshop centred on approaches to keeping participants engaged in the study. HMRC were open to offering further incentives to encourage participants to complete multiple surveys, and to sharing findings with them.

3.3 Questionnaire design

During the workshop, Kantar Public proposed that surveys should be a combination of broader tracking questions, asked routinely to measure change, and event-specific questions, measuring customer experience of specific tax events.

Stakeholders intended to focus on broader tracking questions that would be asked routinely to determine whether the TAS reforms are meeting their objectives and measure change over time.

They would also measure whether the TAS reforms were impacting the effort expended by customers in meeting their tax obligations.

HMRC explained that the study could include measures of ease, trust and confidence. Tracking these wider perceptions would enable study findings to be triangulated with HMRC’s Individuals, Small Businesses and Agents (ISBA) survey. This annual survey measures customers’ experience of interacting with HMRC, their perceptions of compliance, and HMRC’s reputation for 3 main customer groups: individuals, small businesses and agents.

However, HMRC explained that customers’ effort and time spent on their tax obligations were the most important metrics to include in the proposed study. Customer effort would be measured by asking about the effort expended in completing tax tasks. The definition of effort to be used in the proposed study was developed in HMRC’s effortless research.

Effort was to be split into physical effort (the time and physical requirements to complete a task), mental effort (understanding what needs to be done to complete a task) and emotional effort (the worries and concerns around completing a task).

Stakeholders recognised that effort should be explored in the focus groups. Firstly, the research team would need to explore customers’ understanding of the definition. Secondly, they should explore customers’ ability to recall the effort expended on meeting their tax obligations.

3.4 Customer groups

Individual and business customer groups of interest to HMRC were laid out during the stakeholder workshop, using data from the ISBA 2021 survey. Kantar Public highlighted characteristics that may affect these groups’ recruitment to, and involvement in, the study, for example incidence rate, use of an agent and frequency of contact with HMRC.

Groups of interest included those which were likely to be impacted by the TAS reforms and/or were particularly difficult to identify and engage in tax research.

Individual customer groups of interest included:

  • PAYE and SA taxpayers
  • gig workers
  • users of online platforms

Small business customer groups of interest included:

  • sole traders (0 employees)
  • micro-businesses (1 to 4 employees)

Stakeholders emphasised the importance of breaking customers down by the complexity of their tax affairs. They hypothesised that those with high tax complexity would be more likely to be impacted by, and gain most from, the TAS reforms, and should therefore be a key target group for recruitment.

Customers with low tax complexity, such as PAYE customers, were also expected to be impacted by the TAS reforms. For example, being able to view their tax position and tell HMRC anything it needs to know using an online account.

Stakeholders felt the research would provide insight on how frequently customers engaged directly with HMRC and the types of tasks they completed to meet their tax obligations.

Aligned to this, a key discussion point among HMRC stakeholders was how frequently different HMRC customer groups should be surveyed. The consensus was that the study should not take a one-size-fits-all approach and should be adapted for different customer groups. It was agreed that this would minimise participant burden by ensuring they were only asked questions relevant to their tax affairs and at times that would capture the most valuable insight, for example, shortly after a relevant reform or following the SA deadline. Stakeholders were also interested in how customer groups may change over time.

The identification of customers of interest influenced the sampling of business and individual customers in the focus group stage of this research.

3.5 Outstanding questions for the expert interviews and customer focus groups

The following questions remained outstanding at the end of the scoping stage:

  • how important is measuring individual level change to evaluating the TAS reforms and is adopting a longitudinal approach necessary?
  • what customer groups should be targeted in the research?
  • what is the most appropriate survey mode?
  • how frequently should customer insights be gathered?
  • should frequency vary by customer group?
  • how should participants be recruited to the study?
  • what contact strategy should be used?
  • are incentives necessary and if so, what incentives should be offered?
  • should event-specific or consistent tracker questions be asked of participants?
  • how should customer experience be measured?
  • do customers understand the 3 different types of effort: physical, emotional and mental?
  • are customers able to accurately recall the effort they expended on their tax activities?
  • are customers able to accurately recall the time spent on their tax activities?
  • how should questions about effort and time be framed to elicit accurate responses?

4. Expert interview and customer focus group findings on elements of the proposed study

Following the scoping stage, Kantar Public conducted 2 stages of fieldwork to explore the questions raised in the stakeholder workshop around research design. The first stage involved in-depth interviews with experts, who either had experience conducting longitudinal research or in tax policy and administration.

During the in-depth interviews, experts were asked to give their initial thoughts on the proposed study. They were also asked to advise on elements of the design to best capture change in customer experience in line with the TAS reforms. Expert suggestions also helped to steer the topics covered in the customer focus group discussions that followed.

The customer focus groups included a range of HMRC’s individual and small business customers. They explored views on specific elements of the research design. Research design discussion points included survey design, sampling, recruitment and recontact approaches, survey mode, survey length, frequency of data collection, and how to maintain participant engagement.

Customer focus groups also looked at how to measure effort and the time customers spent on completing tax tasks.

This chapter outlines the main findings from these discussions with experts and customers.

4.1 Initial reactions

4.1.1 Experts

When first introduced to the proposed study, research design experts were sceptical as to whether a longitudinal approach was necessary to measure the impact of the TAS reforms. They felt there was minimal added value in tracking change at the individual level for most customers, who had few dealings with HMRC. Experts concluded that any benefit to capturing individual level change would be outweighed by the increased costs and risks of a longitudinal approach.

Tax experts were more concerned by the potential burden of involvement in the study. They felt this was particularly the case for small businesses, who had little time to complete surveys and were regularly contacted to take part in government research. They emphasised the need for the research team to minimise participant burden when designing the research.

4.1.2 Customers

Customers were broadly positive about the proposed study initially. Across groups, customers focussed on the benefit of the research to them as users of HMRC services. They recognised the importance of HMRC receiving feedback and felt that the study findings could be used to improve services and their experience of tax administration.

“I’m generally happy to engage with things like that, when they’re designed to help improve things.” (Individual, low tax complexity)

Customers’ willingness to take part in the study was mixed and varied by customer group. As with the expert interviews, they shared concerns around the effort of taking part in the study. Some customers were cynical about whether HMRC would act on their feedback and therefore doubted whether taking part would lead to positive change. This made them feel taking part would be pointless.

Individual customers were mostly willing to take part in the study, recognising their insights could help improve HMRC’s services for them.

However, those who only paid PAYE questioned whether it was worth them taking part as they rarely engaged with the tax system and their effort expended on tax was low. They felt any changes to the tax system would be unlikely to affect them and therefore, questioned whether their insights would be useful.

This highlights that HMRC could communicate that they are interested in hearing from all customers, even those who rarely engage with the tax system, to encourage participation from a range of customer groups.

As suggested in the expert interviews, business customers were most concerned about the level of effort required in taking part. They worried about the time required to take part in the study and felt they should be compensated for their time and efforts. Consequently, they were the least interested in taking part in the study.

Gig workers’ and online sales platform users’ reactions were mixed. Some customers felt they spent enough time dealing with HMRC and were not willing to take part. Others felt they were often underrepresented in HMRC’s decision making and appreciated the opportunity to feed back their views.

“We’re quite underrepresented when HMRC are making decisions…they’re not familiar with the way we do things.” (Gig worker or online platform seller)

Positively, once customers had heard more about the design elements of the proposed study many of their concerns were alleviated. As a result, there was more willingness to take part in the hypothetical study towards the end of the focus groups.

4.2 Survey design

Research design experts questioned whether a longitudinal approach was necessary to evaluate the impact of the TAS reforms.

Based on the description that the proposed study would offer a ‘temperature check’ of customer feeling in line with the TAS reforms, they questioned the value of tracking change at the individual level, particularly for customers who had little interaction with HMRC and the tax system. Therefore, they felt that a repeat cross-sectional survey would be equally effective.

By contrast, experts explained that a longitudinal approach was more often used when exploring changes over lifetimes, rather than over the shorter timescales required by this study.

“My initial reaction is this does not need to be a longitudinal survey, but needs a repeated cross section survey, as it is not clear why you need to follow the same people over time…really it looks like [HMRC] are mostly interested in whether things are improving at the population level.” (Research design expert)

“You’re doing this over 10 years. Why has it got to be the same people…Most people don’t really know what is happening in their pay packet around tax, [so in a 5 minute check] are you going to get what they really think?” (Research design expert)

Costs and risks of longitudinal research also influenced experts’ judgements. They emphasised that longitudinal studies have high upfront costs given the need for a large sample size. Further costs are then incurred throughout the study. For example, costs to incentivise participants and minimise attrition. Potential participant attrition also posed a risk to the study’s robustness.

Experts felt that the additional costs and study risks of longitudinal research outweighed the added value of tracking individual change afforded by the longitudinal approach.

Despite the consensus against a longitudinal approach among experts, they could see that tracking individual change may be useful for some customer groups who had more interaction with HMRC. They gave the example of customers who paid multiple taxes and were therefore more likely to be impacted by TAS reforms.

It was suggested that survey invitations could be triggered by an event, for example, completion of Self Assessment to gather close to real-time data from customers.

Similarly, to measure change, experts suggested running a comprehensive baseline survey when participants initially joined the study. This would gauge customer experiences and attitudes and later ‘snapshot’ surveys could measure changes as reforms are implemented.

4.3 Sampling

Research design experts agreed that a sample frame should be drawn from HMRC databases to recruit participants to the study. This would cover most customer groups and small businesses to be included in the study. This approach was also acceptable to customers. They expected HMRC to use this strategy in research and felt there was no alternative.

Research design experts suggested it may be necessary to run a pilot. This would assess the response rates of the various customer groups to estimate the size of the initial sample required for each group, though they recognised this would have to be large.

Experts suggested that recruiting gig workers and online platform sellers would be more challenging, given they are not identifiable in HMRC databases. Furthermore, the desk research highlighted these groups are known to be low incidence taxpayers. The representation of these groups is important to the study, given they are likely to be impacted by the TAS reforms. They recognised screening for these groups from the general population would be expensive.

Alternatively, a tax expert suggested recruiting these individuals through recruitment agencies or using other more purposive recruitment techniques.

Customers were happy with the proposed randomised approach to sampling but emphasised that this should be explained to participants when they are invited to the study. Across the focus groups, customers emphasised the need for transparency in communications from the research team. This was driven by recognition that customers would be helping HMRC by participating in the research.

Customers also suggested that building trust with customers early would be important to engaging them in the study.

“I’d be curious to know why I’ve been selected…is it because I’m in a particular tax bracket, for example.” (Individual, low tax complexity)

“Why are they asking this question to [me]…why have I been picked?” (Micro-business, low tax complexity)

4.4 Recruitment

Research design experts suggested 2 possible methods of recruiting taxpayers and small businesses to the research.

Firstly, they suggested that participants could be invited to take part in the research via email, which would include a link to the web survey.

Alternatively, they suggested a push-to-web approach, whereby customers could be invited to take part in the study via post, with details of a survey link. This would be a more inclusive approach as it would include customers for whom HMRC did not have an email address.

These 2 options, as well as recruitment via SMS text message, were proposed to customers during the focus groups. It should be noted, however, that if the proposed study is implemented, participants will likely be recruited using a postal approach. This is because HMRC databases do not always include email addresses and SMS would not allow for a detailed explanation of the research.

4.4.1 Digital recruitment approaches

Across groups, customers initially favoured the email invitation approach. They considered the level of effort required of customers taking part and recognised that this would be the easiest option for signing up to the study and completing surveys. Customers also approached the question from the researchers’ perspective, suggesting that email invitations would be the easiest to administer and most cost-effective.

While they recognised that these were also benefits of recruitment via SMS, most felt customers were less likely to take an SMS survey seriously, impacting engagement.

“[An email with a link] is preferable because then it’s sort of effortless.” (Sole trader, high tax complexity)

On reflection however, customers suggested that they, and participants like them, may believe an email invitation to be a scam and not click on the link. This was particularly relevant given the regularity with which HMRC branding is used in phishing attempts.

Others felt email invitations are likely to be ignored or missed by participants. These customers pointed out that they routinely receive invites to surveys over email and questioned how this one would look any different or be any more engaging.

All of these factors were likely to limit initial involvement in the study.

“If I received an email or a text, I would be suspicious that they weren’t genuine…I would quite likely ignore them and not click on any links.” (Individual, low tax complexity)

“In my inbox at the moment I’ve probably got umpteen surveys asking, ‘how did we do?’ and I’ve just ignored them.” (Individual, high tax complexity)

Customers also echoed the concern raised by experts, that by solely contacting participants via email, the research team would exclude the offline population. They understood that this would impact the representativeness of the research. To mitigate this, research design experts suggested also including offline contact approaches, for example, sending postal invites or contacting participants by telephone.

4.4.2 Push-to-web recruitment

Customers were split between preferring the email invite approach and the push-to-web approach. A push-to-web approach involves a customer being sent an invitation letter, with a link and log-in information to complete the survey online.

Those who worried about lack of engagement with the email approach saw push-to-web as a superior option. They felt customers were more likely to take the research seriously if invited by letter. A letter would allow HMRC the opportunity to provide extra information about the study and would likely give the study more weight. Furthermore, they expected that a letter invite would not be missed or mistaken for a scam.

“I’d prefer post because it shows that HMRC have had to pay for the postage which shows a commitment to the survey.” (Gig worker or online sales platform seller)

“An initial letter followed by an email might help, because you tend to take letters more seriously, especially if it’s got HMRC on the envelope.” (Individual, high tax complexity)

Some concerns about the postal approach were also raised. Firstly, customers pointed to the added effort of having to type in the web address and log in details to complete a survey. They felt that this did not align with the short and efficient nature of the survey. Secondly, they recognised that this approach would be expensive and more difficult to administer for the research team. Finally, customers recognised the use of paper as less environmentally friendly.

4.4.3 Mitigations

Overall, no conclusion could be drawn from the customer groups on the most suitable approach to recruitment for the proposed study. This was due to customer preferences varying depending on the weight placed on the competing factors. There are, however, design options, which could be implemented to mitigate the concerns raised.

Business tax experts suggested contacting customers through the HMRC Business Tax Account (BTA) and Personal Tax Account (PTA). To encourage engagement and assure customers that the research is not a scam, initial invitations could be sent via the BTA or PTA. This would be a new approach and, as noted by the experts, uptake of the Government Gateway is still relatively low, so this option may be more suitable in future waves.

If a push-to-web approach was used, participants could be invited to the study via post initially, with email communications then used for follow-up waves. This may help to engage participants in the first instance but limit the environmental impact of the study.

4.4.4 Communication branding

Customers had mixed views on whether survey invitations should come from HMRC or the research agency.

One view was that customers would be more willing to take part if communications were addressed from HMRC and HMRC branding was used on communication materials. To some customers, HMRC branding was more authentic: as a government department, HMRC would be more recognisable to participants than an agency administering the survey.

Other customers felt that research agency branding was favourable. Customers who were more familiar with the organisation administering the surveys were more likely to trust the research and therefore take part. These customers were confident HMRC would handle participant data securely.

“At least if things go wrong, HMRC are liable to us.” (Micro-business, low tax complexity)

Other customers had negative associations with HMRC. They felt that HMRC branded material tended to elicit negative feelings, which typically related to fines or money owed. Consequently, they thought they would be less likely to engage with the research with HMRC branding.

In comparison, research agencies did not have these negative associations and these customers felt they would be more likely to take part in a study with agency branding. They also suggested that with research agency branding, participants may provide more truthful answers without fear their answers may impact their relationship with HMRC.

To mitigate these concerns both the branding of HMRC and the research agency administering the survey could be used in participant communications.

4.4.5 Recontact approaches

The focus groups explored how customers who had initially agreed to take part should be recontacted in follow-up waves of the study.

Overall, customers favoured email contact. Having initially received contact about the study they could now be reassured that communications were not a scam, and they saw regular email communications to be less burdensome than contact through other channels.

Postal contact was seen to be more burdensome for the research team, who would need to send the letters and process responses. Participant burden would also be higher, requiring them to log in using details on the letter with a push-to-web approach or complete and post responses, if a postal survey.

Across the groups, there was agreement that the research team should ask for future communication preferences in the initial recruitment contact. This would allow customers to select their preference and by adhering to this, the research team would build trust with participants. Again, customers emphasised the importance of having a transparent relationship with the research team.

While they recognised the benefit of contacting participants via multiple channels, they suggested that the research team select one communication channel and stick to this to avoid irritating participants.

“[If contacted on multiple channels] I would probably feel like I’m being hounded, like I owe them something.” (Individuals, high tax complexity)

4.5 Data collection

During the depth interviews, research design experts were asked to consider best practice on data collection and the optimal approaches for the proposed study. Elements included survey mode, survey length and frequency of data collection. In the follow-up focus groups, customers were asked to give their preferences on each of these.

4.5.1 Survey mode

Experts agreed a digital approach would be most appropriate for the survey. This was seen to align with the short, sharp nature of the survey and limit the burden on participants. Experts also explained that this would help the research meet its objective of offering close to real time data collection. Through event-driven data collection, a survey invitation could be emailed or posted to participants as soon as they completed their tax return.

For these reasons, experts advised against a face-to-face or paper survey approach, which would be less efficient and more expensive to administer. Likewise, a telephone survey would incur more costs as it requires interviewer administration. An offline mode could be made available for the digitally excluded.

However, other experts suggested that best practice was to ensure the mode aligned with the method used by the customer to interact with HMRC. For example, if a customer interacts with HMRC online they should be asked to complete an online survey, whereas if they tended to call HMRC, they should be interviewed over the phone. This had the benefit of enabling participants to complete the survey in a way they were familiar with, maximising engagement and response rate.

In line with stakeholders, experts were interested in the mobile app suggestion, recognising that it would capture ‘close to real-time’ data most effectively. This could allow the researcher to alert participants to complete the survey after completing their tax return. Equally they felt participants would value the convenience of this approach.

However, in a later interview, an expert more familiar with the use of mobile apps in research advised against this option. They explained that a mobile app would incur large up-front development costs, with further costs needed to keep the app compatible with operating system updates.

For these reasons, the expert explained that mobile apps tend only to be used in shorter diary studies lasting a few weeks, rather than infrequent, multi-year longitudinal studies. This expert’s analysis suggests this option should be ruled out.

Customer views tended to align with the experts in preferring an online survey approach. Those who had completed market and social research previously were familiar with this approach. They also recognised this would be easiest for most participants and appropriate for a 3 to 5 minute survey.

Despite this preference, customers recognised drawbacks of an online approach. These were similar to concerns around online recruitment methods and that online approaches exclude offline populations, such as the elderly.

“[I am] most likely to close page without thinking about it, so unlikely to take part this way (online).” (Individuals, low tax complexity)

As discussed, if an online approach is chosen, HMRC could offer an offline alternative, to ensure the survey is inclusive and representative of all customers.

4.5.2 Survey length

Initially customers worried about the burden of taking part in the research, with business customers particularly concerned about the time demand. Customer concerns around effort were alleviated when the proposed 3 to 5 minute survey length was explained. Survey length was an important driver in the increased interest among customers in taking part in the survey towards the end of the groups.

“If it takes quite a long time it would be too much of an inconvenience.” (Sole trader, low tax complexity)

“3 to 5 minutes is perfect, most people would be happy with that.” (Micro-business, low tax complexity)

Despite the short survey length, customers still felt they would need to be compensated for taking part in the study. They were also sceptical whether this survey length was realistic, expecting that it would take most customers more than 3 to 5 minutes. Those with high tax complexity also questioned whether this was sufficient time to adequately capture their experiences of tax administration.

“Every survey I do turns out to be longer than what it says.” (Individuals, high tax complexity)

The positive response to survey length suggests this could be emphasised in initial recruitment communications to encourage involvement in the study. To mitigate the concern of individuals with high tax complexity, it could be explained that the short, sharp style is sufficient in offering a ‘temperature check’ of customer feeling.

HMRC could also offer an estimation of survey length to build trust with participants to maintain their engagement with the study. This estimation could be established in the pilot study.

4.6 Timing and frequency of data collection

When and how frequently data collection should occur was a key discussion point in both the expert interviews and customer focus groups.

4.6.1 Timing of data collection

Experts were aligned in the view that data collection should be timed to tie in with tax interactions. It was suggested that data should be collected shortly after tax interactions or tasks, so the experience and feelings elicited were fresh in the mind. This would maximise accuracy of responses. Given the regularity with which some taxpayers complete these tasks and their administrative nature, experts thought memory of the interaction was likely to fade quickly.

Referencing their experience of measuring customer attitudes to taxation, tax experts suggested collecting participant data at 2 separate points in the year. Data collection would occur at one point soon after completing a major tax task and at another point later in the year. They explained that this method had given them an accurate and complete picture of the tax system. They gathered spontaneous thoughts soon after the completion of tasks, when the experiences were fresh, and then later in the year when attitudes were not influenced by specific interactions.

4.6.2 Frequency of data collection

When assessing frequency of data collection, experts and customers considered 3 factors: complexity of participants’ tax affairs, frequency of interaction with HMRC and participant burden of taking part in the survey. They recognised that as these varied by customer group, frequency of data collection should vary by customer groups. The 3 factors are discussed further below.

Customers recognised that tax complexity determined the level of change they would likely experience in relation to tax reforms. To accurately measure level of change, frequency of data collection should broadly align with participants’ level of tax complexity.

Customers with high tax complexity tended to favour quarterly or 6 monthly data collection for participants like themselves. Otherwise, they felt HMRC would struggle to get valuable data to accurately track their change in experience in line with the TAS reforms.

Customers with low tax complexity, such as those only paying PAYE, favoured less frequent data collection, for example, 6 monthly or annually. This frequency was felt to be sufficient, given they rarely interacted with the tax system and experienced minimal change. They thought more regular data collection would not be worthwhile to HMRC from a cost perspective. Experts also suggested that surveying participants too often, without them experiencing change, would disengage them from the study and could lead to panel conditioning.

“They can do it every day, but if I stay in the same job for the next 5 years it wouldn’t really make any difference.” (Individual customer, low tax complexity)

Customers who regularly contacted HMRC felt more frequent data collection would enable them to give feedback on their experiences of contact and any changes they had experienced. Customer frequency of contact with HMRC tended to align with the complexity of their tax affairs.

However, some customers who only paid PAYE, reported that their employment circumstances and/or tax code changed frequently prompting regular contact. This was because they had multiple jobs or they had recently started paying tax, for example, customers who had recently migrated to the UK.

The third factor considered was participant burden. Experts felt that participants should be surveyed once or twice per year so they would not be overburdened by the research. They suggested that more frequent surveying could lead to participant attrition or deter participation in the first instance. Participant effort was felt to be less of an issue because of the short survey length. Consequently, regular data collection, for example, quarterly, was acceptable to most.

4.6.3 Summary

To aid accuracy of participant recollection, HMRC could consider linking data collection to the completion of tax tasks, with participants ideally surveyed in the days following the event. This would necessitate incorporating event-driven data collection into the research design. In a similar vein, frequency of data collection should vary by customer group. This should reflect both the complexity of participants’ tax affairs and their level of contact with HMRC.

4.7 Maintaining participant engagement

Experts and customers were asked how participants could be engaged in the first instance and how the research team could maintain engagement between survey waves. There was a consensus that financial incentives would be required.

Research design experts suggested offering financial incentives in the first instance to encourage involvement, with further incentives offered to keep participants engaged in the study.

From their experience of running longitudinal studies, they suggested that participants need to feel rewarded for their efforts. Reference was made to social exchange theory, which states participants understand that they are providing value to the researchers and feel they should be compensated.

Similarly, customers felt financial incentives would be the most effective approach to engaging participants and encouraging them to take part in repeat waves. Most customers accepted that given the short survey length the initial payment would be a maximum of £5. They felt, regardless of value, they deserved to be paid for their time. In general, customers were happy for incentives to be in voucher form.

“You’re going to get a stronger response if people are paid a fiver to do the survey.” (Gig worker or online platform seller)

“[If money is offered] I’d be the first one to take part.” (Micro-businesses, low tax complexity)

Experts and customers tended to agree that additional, smaller payments, should be offered on completion of further surveys. Customers also liked the idea of offering a larger ‘milestone’ payment on completion of a certain number of surveys. They felt this would encourage further engagement by gamifying the process.

However, research design experts did indicate an incentive may not be necessary, given the short survey style. They also added that the cost and effort for researchers of administrating them may not be worthwhile. They suggested the cost might be better spent engaging participants by improving the design of other elements of the survey.

“Micropayments can be more hassle than they’re worth…with £1 vouchers, the admin of that might cost more than the pound itself.” (Research design expert)

Experts and customers did consider other approaches to maintaining participant engagement. Experts suggested participants may be interested to hear about the research findings and HMRC’s response to these. For example, the research findings might help them understand whether their experiences of dealing with HMRC and tax administration were typical.

Similarly, customers suggested they could be encouraged to take part in the research if HMRC promoted the benefit to them as customers, sharing relevant survey findings throughout their involvement in the study. They caveated, though, that this should be backed up with evidence, for example, sharing previous findings with a clear sense of how HMRC have acted on these insights to make positive changes.

“Businesses need to know what HMRC are doing with the information they share…you said you’re going to do a temperature check, well what’s the temperature like on this particular X?” (Business tax expert)

“I think this would encourage people but what would encourage people more is actually seeing change and seeing what they’re going to get out of it. If I know they’re going to listen, if I’m not going to get money out of it then I don’t mind, if I’m being listened to.” (Individuals, low tax complexity)

Customers were also asked to give their thoughts on other approaches to engaging participants, which they tended to expect would have little impact. Promoting the social good of the study was felt to be vague and unlikely to personally motivate participants to take part in the survey. Therefore, they suggested that communications should emphasise how the research could be used to improve their experience individually.

Customers indicated that the opportunity to be entered into a prize draw would also fail to engage participants. There was, however, some interest in the opportunity to provide a charitable donation on completion of surveys.

5. Using effort to measure customer experience of meeting their tax obligations

A key objective of the study will be to measure the impact of the reforms on customer experience of meeting their tax obligations and the effort involved. This research explored the types of effort customers encountered when meeting their tax obligations and how this could be used to inform the questions for the proposed study. This was explored in both the expert interviews and in the customer focus groups.

This element of the research was informed by previous research on ‘HMRC’s proposition for an increasingly effortless experience for individual customers’. This prior research explored what an increasingly effortless customer experience might look like for customers across various tax and benefits regimes. It identified 3 types of effort which would need to be addressed: physical, mental, and emotional.

This chapter covers customers’ understanding and recall of each type of effort before discussing approaches used by customers to minimise the level of effort they expended.

5.1 Physical effort

Physical effort was defined as ‘the time and physical requirements to complete a task’. When the term was introduced spontaneously, customers did not associate it with ‘time taken’. Instead, they associated it with completing manual tasks or tasks that caused physical exertion, such as carrying heavy loads. However, once the definition was explained to customers, they had no issue with comprehension and were able to easily recall activities.

Unsurprisingly, customers with high tax complexity tended to report putting in more physical effort than those with low tax complexity. For example, PAYE customers felt they did not put in much physical effort.

Broadly, customers reported 3 types of activities which required high levels of physical effort.

Firstly, preparation activities around the final submission for SA. This included gathering the necessary information, such as collecting receipts, filling in their tax return or keeping records. Physical effort was especially felt by customers who kept physical forms of evidence.

Secondly, time spent trying to communicate with HMRC, which included being on the helpline and trying to find a convenient time in the working day to call HMRC.

Thirdly, the time they spent trying to navigate HMRC systems. This included navigating the website to find an answer to a query or navigating their Government Gateway account. Of the less routine tasks, customers mentioned occasionally checking their tax code.

From these findings, using the term ‘time taken’ and focusing on the activities in which customers experienced physical effort could help to improve participants’ comprehension of the term physical effort. These activities could include preparation activities around final submission, time spent trying to communicate with HMRC and navigating HMRC systems.

“The physical effort I could see is physically keeping your receipts and documents…this can be an effort.” (Micro-business, low tax complexity)

“Physical effort is how much time it takes to get everything completed, including the back and forth involved in resolving an issue…[physical effort] involves navigating different things e.g. the app or the Excel spreadsheet or finding receipts online.” (Gig economy worker or online platform seller)

5.2 Mental effort

Mental effort was defined as ‘understanding what needs to be done’. When the term was introduced spontaneously, customers confused mental and emotional effort. As with physical effort, once the definition was explained, customers had no issues with comprehension.

However, customers still felt there was an overlap between mental and emotional effort. This was because tasks that required greater cognitive effort to understand what they needed to do also tended to induce greater worries that they had misunderstood.

The types of tasks customers associated with mental effort depended on the complexity of their tax obligations. Those with low tax complexity tended to associate mental effort with sorting out errors with their tax code as these types of tasks were unfamiliar to them.

Those with high tax complexity tended to cite tasks experienced when completing their SA. In particular, they mentioned working out what was applicable to them and determining what HMRC wanted them to do. Given that customers reported doing something new caused high mental effort, the introduction of TAS reforms may initially cause high mental effort for customers. This should be considered when interpreting survey findings from the proposed study.

Given customers’ initial confusion about the difference between emotional and mental effort, alternative terms could be explored in a further stage of testing to improve participant comprehension.

“At the beginning there’s a lot of mental effort, now I’ve done it for the last few years it’s much calmer.” (Micro-business, low tax complexity)

“[At first it’s] Trial and error, but then you get used to it and know what you’re doing.” (Micro-business, low tax complexity)

“The whole process is like going to the dentist, you just dread it. The fining for me happened a few times and this was because I didn’t understand what they wanted me to do, no matter how many times I’d read it.” (Individuals, high tax complexity)

5.3 Emotional effort

Emotional effort was defined as ‘overcoming worries about completing a task.’ Generally, customers understood and agreed with this definition. When recalling activities that induced emotional effort, customers considered tasks that had caused them anxiety, worry and stress.

Almost all SA customers explained that the final submission, and the preparation for this, was the most stressful period. This was because it required collecting large amounts of information from across the year and high attention to detail when inputting figures. Customers recognised this was an important task and mistakes could have significant implications.

The scale of the task meant some customers had to mentally ‘gear themselves up’ for it and would ‘put off’ completing their final submission.

In comparison, PAYE customers encountered less emotional effort, with their day-to-day queries around PAYE not seen as too stressful.

However, PAYE customers did report some high emotional effort around checking tax codes, especially if a tax code was wrong and they then needed to get in touch with HMRC. PAYE customers experienced uncertainty when carrying out these tasks as they did so irregularly and, due to their lack of familiarity with this type of task, also had fears around getting it wrong.

PAYE customers with multiple sources of income also found keeping track of what tax they had paid, or needed to pay, more stressful than those with a single source of income.

As customers associated emotional effort with stress, anxiety and worry, these more specific terms could be used instead of emotional effort to improve understanding. Additionally, as customers experience emotional effort at specific times, such as around final submission, this could also be used to shape survey questions. This could assist participant recall and help to collect more accurate responses.

“If I keep to my routine which 95% of the time I can, then it’s almost this weight off my shoulders and I can be a bit relieved. I try not to let it get to the point when I’m feeling that it’s so much emotional effort. But it’s the least enjoyable part of the job for me.” (Micro-business, high tax complexity)

“It’s definitely worry, because if they [HMRC] get it wrong, it doesn’t matter, you still owe it. And they can take up to half of your salary”. (Individuals, Low tax complexity)

5.4 Summary of understanding and recall of each type of effort

Overall, customers agreed effort was a good measure of customer experience. They felt they had experienced all 3 types of effort when dealing with HMRC and completing their tax obligations, although this was less pronounced for customers with low complexity tax affairs.

Once they were provided with the definitions, customers had good comprehension of all effort types and could confidently recall associated tasks and how much effort they would attribute to each.

Customers found tasks that were carried out repeatedly, or that were unusual, easier to recall. Frequent tasks included those completed as part of routine business administration, for example inputting figures into Excel spreadsheets or using software to track finances for SA customers.

Tasks completed less frequently often required higher levels of effort which also made them easier to recall. For example, PAYE customers contacting HMRC to correct an error in their tax code, or where a change in the operation of the business necessitated changes in tax information requirements, such as starting to export.

This suggests that tasks customers do periodically but which are simple would potentially be the most challenging to recall, as they are neither unusual enough to stand out nor frequent enough to be routine.

Customers also cited an overlap in the types of effort tasks required. Tasks that required high levels of one type of effort often also required high levels of other types of effort. For example, tasks that required greater working out (mental effort) tended to take longer to complete (physical effort) and induced greater worries about completing the task incorrectly (emotional effort).

“Tax is a complex subject matter and is difficult, and it changes a lot, and is not a one thing fits all. So it’s hard, it requires mental effort and emotional effort because you’re often worried that you’re going to get it wrong.” (Nano Business, high tax complexity)

5.5 Would agent use make participation in this study difficult?

From the outset of this research, it was recognised that we would need to consider whether customers who used agents could be included in the proposed study. HMRC were unsure whether these customers felt they encountered any effort around completing their tax obligations, or whether this was mitigated fully by the use of an agent.

Customers in all groups apart from individuals with low tax complexity used agents. Customers who used tax agents explained that agents considerably reduced:

  • the amount of time they spent on their tax affairs
  • the associated need to learn or keep up to date with what was required
  • worries about making an error

Despite acknowledging that agents reduced their level of effort considerably, customers still felt they retained some of the effort as they were ultimately responsible for any mistakes. This was especially felt by businesses with high tax complexity. This suggests these customers would still be able to contribute to a survey measuring the effort associated with meeting tax obligations and could be included in the proposed study.

“I have an accountant who just does things for me. If I was filing it myself for the first time, it would probably take me all 3 [types of effort].” (Nano Business, low tax complexity)

“[Using an agent] Defers part of the responsibility.” (Micro-business, low tax complexity)

6. Feasibility and recommendations for the proposed study design

This chapter considers the feasibility of a real-time, longitudinal study using a mobile app or SMS survey. It also presents recommendations for an alternative design for the proposed study and for conducting a pilot study.

6.1 Feasibility of a real-time, longitudinal study using a mobile app or SMS surveying

The overall aim of this feasibility study was to explore options for gathering close to real-time longitudinal data on the impact of HMRC’s TAS reforms with a view to establishing a robust methodology and baseline to underpin the evaluation of the strategy.

The initial brief suggested using a mobile app or SMS survey to collect a real time measurement of time and effort expended on tax interactions. However, the desk research and interviews with research design experts indicated that this approach would not be feasible.

Research design experts questioned the value of adopting a longitudinal approach. They recognised that while a longitudinal study is an effective tool for monitoring individual level changes over time, there are additional challenges with this approach.

Notably, issues such as lower response rates, attrition and sample maintenance could lead to escalated costs compared to alternative designs. As the majority of HMRC customer groups have little interaction with HMRC, it is also reasonable to assume there may be minimal individual level change over time. As such, a solely longitudinal design may not be the most suitable approach for this study.

Instead, research design experts thought that a repeated cross-sectional design could be an efficient and effective approach. By surveying the same populations repeatedly over time, it would allow HMRC to monitor the population level impact of the TAS reforms on different customer groups.

However, we recognise that monitoring individual level changes may be a crucial requirement for HMRC, especially for customers who are most likely to experience change due to the TAS reforms.

6.2 Recommendations for an alternative design

We have developed an alternative design, which HMRC could use to monitor the impact of the TAS reforms on different customer groups. This alternative design retains a longitudinal element, recognising that monitoring individual level change among some customer groups is of importance to HMRC.

However, this design does not allow for the real-time monitoring of the impact of TAS reforms as the use of mobile apps has been discounted. Instead, as recommended by research design experts, the design will collect data close to when HMRC customers interact, or complete tasks associated with tax. This will allow an element of ‘close to real time’ data collection for some HMRC customers.

We recommend utilising a mixture of longitudinal and cross-sectional approaches. This would allow HMRC to efficiently and effectively monitor the impact of TAS reforms among different customer groups.

This approach would entail using a cross-sectional design for some customer populations and longitudinal design for others.

A longitudinal design could be utilised for high complexity tax customer groups. This would involve inviting high complexity customers to complete multiple waves of the study to monitor individual level change over time. This more intensive approach would be warranted for these groups as they have more contact with HMRC and are more likely to experience change due to the TAS reforms. Low complexity customer groups could be monitored using a repeated cross-sectional survey.

This combination of designs would allow HMRC to cost effectively monitor population level changes across all customer groups. It would also enable HMRC to detect individual level changes among customers most likely to be impacted by TAS reforms.

For the longitudinal element, we recommend a design in which participants would not be required to complete every wave. Participants would only be removed from the study if they did not respond to 2 consecutive surveys. This approach offers lower attrition compared to other longitudinal designs (where participants must respond to every wave to stay in the study).
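
As an illustration only, the retention rule described above can be expressed as a simple check: a panellist stays in the study unless they fail to respond to 2 consecutive waves. The short Python sketch below is hypothetical (the function name and example response patterns are ours, not part of the proposed design) and is included solely to make the rule concrete.

    # Hypothetical sketch of the proposed retention rule: a participant is
    # removed only after failing to respond to 2 consecutive survey waves.
    def is_retained(wave_responses):
        """Return True if the participant never misses 2 consecutive waves."""
        consecutive_misses = 0
        for responded in wave_responses:
            consecutive_misses = 0 if responded else consecutive_misses + 1
            if consecutive_misses >= 2:
                return False
        return True

    # A participant can skip isolated waves and still remain in the study:
    print(is_retained([True, False, True, False, True]))   # True: no 2 misses in a row
    print(is_retained([True, False, False, True, True]))   # False: removed after 2 consecutive misses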

The cross-sectional survey could also be used for recruitment, to refresh the longitudinal sample and top up participants lost through attrition. It could also be used to introduce new customer groups into the study, if they are likely to be impacted by the TAS reforms.

6.3 Summary of the overall design

We recommend tailoring the designs to target the 3 populations of interest. The table below shows the recommended sample size, data collection mode and frequency for each group:

Table 1: Overall design summary

Group | Sample size | Mode | Frequency
Individual customers | 1,000 to 2,000 | Online (with offline alternative) | Annual (with additional ad hoc waves)
Business customers | 1,000 to 2,000 | Mixed mode (telephone or online) | Biannual or quarterly (with additional ad hoc waves)
Low incidence groups | 250 to 500 | Online | Biannual or quarterly (with additional ad hoc waves)

We would recommend varying the data collection modes and the frequency of data collection for each population.

For the Individual customers group, this would include Self Assessment customers, PAYE customers, and customers who are both Self Assessment and PAYE. For survey mode, we would recommend utilising online data collection methods. We also recommend offering an alternative offline mode for individuals who are not online. This alternative could include offering a paper questionnaire or conducting a telephone interview.

For the Business group this would include sole traders and micro-businesses. We recommend using a mixed approach of online and telephone surveying for small business customers. Online surveying would be a convenient and cost-effective means of data collection, facilitating engagement with a significant portion of the target population. Telephone follow-up would help maximise response rates, particularly for hard-to-reach customers.

The low incidence groups would include gig workers and users of online sales platforms. Online data collection approaches should be used to survey these groups. This could involve screening these individuals from HMRC Self Assessment databases and inviting them to complete an online survey.

We recommend that HMRC administers short, sharp surveys to collect customer data, which should take no longer than 5 minutes to complete. This will be vital in maximising response rates and ensuring customers in the longitudinal element continue to take part in future surveys.

The survey length will vary between the cross-sectional and longitudinal surveys. Cross-sectional surveys would need to be slightly longer as demographic information would need to be collected. Longitudinal surveys would be shorter, as this information would not need to be collected at each wave.

Data collection should align with the frequency of customers’ tax interactions with HMRC. Experts emphasised that participants’ recall of the effort and time taken around tax events was likely to fade quickly.

To mitigate this issue and collect accurate data, we suggest varying data collection time by customer group to ensure it occurs a short time after key tax events, which are likely to differ depending on taxes paid.

For the individual customer groups, we would recommend conducting data collection on an annual basis. This is because most individual customers have infrequent interactions with HMRC, so views are unlikely to change over a shorter period of time. In addition, most tax interactions for this group occur annually.

For small business customers, we would recommend conducting data collection more frequently, on a quarterly or biannual basis. Business customers are more likely to have more frequent interactions with HMRC, so their views may change over a shorter period of time.

To capture any potential impact of the TAS reforms, ad hoc data collection should also be conducted, specifically timed to coincide with the introduction of reforms. This would enable HMRC to assess the immediate impact of the reforms on individual customers’ experiences.

For all customer groups, to encourage participants to join the study we recommend an incentive at recruitment of £5 and a further £3 to maintain engagement for ongoing surveys.

6.4 Recommendations for a pilot study

We recommend conducting a pilot study to test key elements of the proposed study design. This would include data collection modes, recruitment and attrition rates and the optimal frequency and mode of contact.

The pilot surveys would replicate the proposed design as closely as possible. To do this, we would recommend conducting the following pilot surveys:

Individual customers: participants recruited using a push to web approach, with an online follow-up survey conducted 6 months later

Small business customers: participants recruited using a telephone interview, with follow-up surveys conducted 3 and 6 months after recruitment (via an online survey or telephone)

Low incidence groups: participants recruited by screening from HMRC databases using a push to web approach, with online follow-up surveys conducted 3 and 6 months after recruitment

6.4.1 Sample size options

We propose 2 options for the pilot with varying sample sizes, depending on the elements HMRC are interested in testing:

Option 1:

  • individual customers: 250 participants
  • small businesses: 250 participants
  • low incidence groups: 75 participants

Option 2:

  • individual customers: 500 participants
  • small businesses: 500 participants
  • low incidence groups: 75 participants

Option 1 would allow for testing the use of online and telephone data collection modes with individual and small business customers, as well as establishing likely response rates to the recruitment survey and attrition over time. This would help inform the required sample size for a full-scale study.

Option 2, with its larger sample sizes, would allow for more testing of different contact strategies, including the mode and frequency of contact for follow-up surveys and varying incentive amounts. Increasing the sample sizes further still may allow for additional experiments of this kind.
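
Purely as an illustrative aid, and not a projection from this research, the short calculation below shows how the Option 1 recruitment figures would translate into retained follow-up sample under a hypothetical follow-up response rate. The 60% rate used here is an assumption chosen only to make the arithmetic concrete; estimating the true response and attrition rates would itself be an aim of the pilot.

    # Hypothetical illustration only: the 60% follow-up response rate is an
    # assumption, not a finding of this research. The pilot itself would be
    # used to estimate the true rates.
    recruited = {
        "individual customers": 250,   # one follow-up wave in the pilot design
        "small businesses": 250,       # follow-ups at 3 and 6 months
        "low incidence groups": 75,    # follow-ups at 3 and 6 months
    }
    assumed_follow_up_response_rate = 0.6

    for group, n in recruited.items():
        first_follow_up = round(n * assumed_follow_up_response_rate)
        second_follow_up = round(first_follow_up * assumed_follow_up_response_rate)
        print(f"{group}: {n} recruited -> ~{first_follow_up} at first follow-up "
              f"-> ~{second_follow_up} at second follow-up")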

6.4.2 Elements to be tested in the pilot

The pilot surveys should be used to test:

Data collection modes: to ensure that the approach is suitable for each customer group and identify and address any issues or challenges of using the suggested modes.

Recruitment and response rates: to establish the likely response rates that would be achieved among different customer groups and determine the potential cost and practicality of conducting a large-scale study.

Attrition rates: to establish the likely attrition rates between waves and determine the sample size required to account for the loss of panellists over time.

Screening and recruitment of low incidence groups: to establish the screening and recruitment rates for low incidence groups and determine the cost and practicalities of recruitment.

6.4.3 Further exploratory work for low incidence groups

Alongside a pilot survey, we would recommend conducting further exploratory work to identify other suitable approaches to recruiting low incidence customers such as gig workers or users of online sales platforms.

This could involve exploring purposive recruitment approaches such as respondent driven sampling, other sources such as high-quality panels, or conducting large scale screening surveys to identify and recruit these customers.

We would recommend conducting desk research to explore these options.

6.4.4 Cognitive testing

We would also recommend conducting cognitive interviews as part of the pilot study to test customer understanding of key question concepts. These interviews could also develop measures to accurately capture concepts such as ‘time spent on tax obligations’ and the ‘different types of effort.’

This would include:

  • how to ask about effort expended, such as examining what terms best convey each dimension
  • how to ask about time spent on tax, such as examining what time period to use
  • assessment of accuracy of recall of effort and time

6.5 Developing the pilot study

We anticipate a pilot study would take one year to implement; this is to allow for follow-up surveys to be conducted 6 months after the initial recruitment. To deliver the pilot study, we recommend the following stages:

Set-up meeting: to agree the overall design and scope of the pilot study. Agree items to include in experiments (contact materials, contact strategies, incentive amounts, alternative modes).

Sampling: to agree the sampling approach for each of the populations of interest, with HMRC.

Development of questions and communications: this would include cognitive testing of the questionnaire and communication materials. We recommend conducting this with 20 participants so that questions and communication materials could be tested with all 3 groups of interest: individuals, businesses and low incidence groups. Testing would be conducted in 2 stages, so that revised questions and materials could also be tested. Participants would also complete a pre-task in preparation for the cognitive interviews.

Finalise questionnaire design with stakeholders: stakeholder workshop to review findings from cognitive testing. The outputs would be the finalised questions and communication materials to be included in the pilot survey.

Recruitment and initial fieldwork: this would include sending out communications to selected HMRC customers as well as purposive sampling techniques to recruit gig workers.

Follow-up fieldwork: follow-up surveys to be carried out with individuals, businesses and low incidence groups.

Analysis: analysis of fieldwork, response rates and experiments (testing of contact materials, strategy or incentive levels).

Reporting: reporting of findings and the feasibility of the design.