Declare your business trade and cost information alpha assessment report
Report for the ‘Declare your business trade and cost information’ alpha assessment on 17 November 2022
Service Standard assessment report
Declare your business trade and cost information
From: CDDO
Assessment date: 17/11/2022
Stage: Alpha
Result: Not met
Service provider: VOA (HMRC)
Service description
‘Declare your business trade and cost information’ will enable businesses to provide rental, lease, ownership, cost and business receipts information to the Valuation Office Agency (VOA) for the purposes of setting business rates for properties.
NB: the service was previously referred to as ‘tenure, cost and trade record’ (TCTR).
Service users
The service users span various business sectors and organisations; at a high level they can be categorised as follows:
External users:
● Business property owners
● Business property occupiers
● Business property agents
Internal users at VOA:
● Administrators
● Valuers
1. Understand users and their needs
Decision
The service did not meet point 1 of the Standard.
What the team has done well
The panel was impressed that:
- the team has engaged with users who have accessibility needs to begin understanding those needs, and has clear plans to continue to do so
- the team is working closely with Rent & Lease Details (RALD) colleagues to share learnings and help plan user research
- the team has worked together to plan and prioritise research questions and objectives
What the team needs to explore
Before the next assessment, the team needs to:
- build a complete understanding of all user touchpoints and pain points across the full end-to-end service. This could include users determining which business premises to occupy, users receiving VOA letters, and how users understand why VOA is requesting information from them
- conduct further research on why 35% of users do not currently respond to the VOA’s request for information
- conduct further research on complex or ‘unhappy’ user journeys, such as what users do if they disagree with business rate charging
- conduct further research into offline components of the service, for example whether businesses rely on local services, social media or paid professional advice as part of their journeys. Research on this point will help the team understand any barriers or pain points users face
- rewrite the user needs identified. Many of the user needs presented were written as functional or non-functional requirements. User needs should express people’s goals and not include a solution; understanding true user needs will help the team create a genuinely user-centred service
- consider alternative ways to research with users at the lower end of the digital inclusion scale. To date the team has used online recruitment methods, which are unlikely to reach users with low digital confidence and capability
- conduct research with small and medium-sized businesses; the team has researched with relatively large businesses to date
2. Solve a whole problem for users
Decision
The service did not meet point 2 of the Standard.
What the team has done well
The panel was impressed that:
- the team has worked hard to simplify and reduce the time needed to complete forms of return (FORs) compared with the existing paper forms
- “Save as draft” functionality has been introduced, showing the team has recognised that the service is potentially complex and lengthy for users
- the team has identified a need for clear and simple content to be present throughout the whole service, from initial letter to end of transaction
What the team needs to explore
Before the next assessment, the team needs to:
- show that they have considered minimising the number of times users have to provide the same information to government, for example, providing their business accounts
- consider the impact of not solving the issue of users having to type in the service URL from the letter. This point was identified from user research, and is planned in a later phase, so should be tracked closely during private beta to gauge user attitudes to this extra step
- research the content outside the service to ensure that the need for on-page help is reduced as far as possible
- ensure they share new patterns or components such as the turnover screen with both HMRC colleagues and the wider GDS and cross-gov design communities. This will help enable them to provide the most useful and usable tool for their users
- consider how to solve problems the team has identified that may not be addressed simply by digitising the form. For example, 35% of external users ignore the letter; the team should research whether digitising the form could improve this figure. The team also flagged an issue of letters being sent to incorrect addresses, which is relevant to solving a whole problem for users and should be included in research. Please refer to the recommendations for point 1 of the Standard on knowing the end-to-end journey and users’ associated pains and needs
- understand the relationships between the ‘Declare your business trade and cost information’ service and local councils, the Valuation Tribunal and any other government user groups identified as being part of the end-to-end service. A detailed service map would help to address this point
3. Provide a joined-up experience across all channels
Decision
The service met point 3 of the Standard.
What the team has done well
The panel was impressed that:
- the service team has taken learnings from researching with the letter - the existing solution - and applied them to the digitised journey
- the team has engaged with users to identify their pain points and understand how they can help by designing a service which is simpler and easier to use than the current solution
What the team needs to explore
Before the next assessment, the team needs to:
- consider pushing back on the constraints of the project if “out of scope” solutions come through from research, for example inviting users by email
- consider researching alternative methods of turnover submission
- ensure that GOV.UK guidance content is well considered, so that users can easily understand the motivations for completing this form. Whilst this is clear on the Start page, users may need more hand-holding before starting the service
- ensure that any findings around content and language are fed back to improve the PDF and paper form; this could improve the whole service for all users
- understand why 35% of users do not return their form, so that the team can begin to meet those users’ needs
- research with call centre colleagues to understand their needs and pain points and factor those into any service offerings
- understand how digitising the form impacts the call centre
- understand how they will go about updating the paper and PDF forms so that they ask the same questions in the same way as the digital service
4. Make the service simple to use
Decision
The service met point 4 of the Standard.
What the team has done well
The panel was impressed that:
- the team has used standard GOV.UK components, such as the Task list and Check answers patterns. The research and iteration cycles show the team’s commitment to providing a simple solution to a complex problem, as outlined in their mitigation of current pain points
- content design and iteration have been led by user research, actioned throughout the service, and fed back to stakeholders to create better understanding for users across the business rates services
- the team has collaborated regularly with related services, such as the Rent & Lease Details (RALD) team, to ensure the language used is simple and consistent
- the team has iterated on their initial designs based on user feedback, not just “lifted and shifted” the existing paper form questions. Good work has been done to remove unnecessary questioning dependent on the type of business
What the team needs to explore
Before the next assessment, the team needs to:
- ensure that the pattern used on the Turnover screen is device-agnostic, in line with other GOV.UK components and patterns. This is currently not the case, and so the panel would expect to see more research and design undertaken to be sure the simplest solution is utilised
- show through content that the service’s purpose is clear to users, via the letter, the Start page and the subsequent guidance (which does not yet exist)
- validate through research that the choice of service name continues to resonate with and be understood by users. The service name was settled on only recently, so further validation is required
5. Make sure everyone can use the service
Decision
The service met point 5 of the Standard.
What the team has done well
The panel was impressed that:
- the team has engaged with some users with accessibility needs already, and has clear plans to continue to do so
- the team has collaborated regularly with related services to ensure the language used is simple and consistent
- an accessibility audit is already planned for beta
- Welsh language speakers have been considered and planned for in the next phases
- on-page help by way of a helpline has been researched and implemented in the service footer
What the team needs to explore
Before the next assessment, the team needs to:
- continue to seek users of the service with access needs and low digital confidence to research with. This can give the team more robust insight into their solutions, and ensure they can be used by all users
- consider the number of users using the helpline and, as an extension of that, having VOA staff complete the form for them. This would be costly for the agency and could bring the service’s usability into question
- consider alternative ways in which to interact with users during user research sessions. Utilising online methods likely means users have a certain level of digital confidence and capability
- understand how the paper invitation letter impacts users with accessibility needs and their ability to engage with the service
- understand why 35% of users do not return forms so that they can look to meet their needs
- understand how digitising the form impacts the contact centre
6. Have a multidisciplinary team
Decision
The service met point 6 of the Standard.
What the team has done well
The panel was impressed that:
- the service team covered the essential roles for alpha, in line with Service Manual guidance
- there are permanent VOA staff in leadership roles on the service team
- the service team has also benefited from the wider user-centred design community at HMRC/VOA; for example, content crits were mentioned
- there are two user research colleagues working on the service team
What the team needs to explore
Before the next assessment, the team needs to:
- consider how the supplier component of the team can pass on knowledge of key roles - such as Delivery Management, User Research and Design
- consider whether the Service Owner role should be formally established. This role is perhaps being performed in all but name by one or more VOA colleagues, and service assessments typically benefit from having a Service Owner present to represent the service. Two SROs for the service were named in the team section of the presentation, but neither was present at the assessment
- ensure roles are aligned to the DDaT profession and the Service Manual even when roles are fulfilled by suppliers (for example, Scrum Master and Product Owner should be Delivery Manager and Product Manager; the differences are not simply semantic). Aligning job roles to the DDaT profession will help the organisation’s wider agile transformation, build VOA’s DDaT reputation across government and help attract DDaT staff to the organisation. It will also help the service team use the Service Manual in working to the Standard across beta
- delineate the project manager and Scrum Master roles in the service team in beta. This should be assisted by the DDaT job family recommendation
- communicate more clearly where ultimate product management responsibility for the service lies. Service teams with two product managers typically have differing job titles and seniority, for example a Senior Product Manager and an Associate Product Manager working to the Senior PM
7. Use agile ways of working
Decision
The service met point 7 of the Standard.
What the team has done well
The panel was impressed that:
- show and tells are being well utilised to ‘show the thing’ to stakeholders and iterate prototypes when appropriate
- iteration was demonstrated based on user insights, aligning to agile concepts such as ‘responding to change over following a plan’ and testing hypotheses
- an agile training budget will be available for staff
What the team needs to explore
Before the next assessment, the team needs to:
- book VOA staff on training to improve their understanding of agile ways of working. The GDS Academy and the cross-government self-managed-learning suite of courses are both aligned to the Service Standard; the team should consider whether either or both are suitable for their needs
- actively communicate their backlog and roadmap across interdependent service teams at VOA, and manage any potential dependencies with wider services which may block iterative releases of the service
- build an understanding of Scrum and Kanban to consider whether either would improve the agility of the service in beta. SAFe is not regarded as an agile framework by Agile Manifesto authors such as David Thomas, Ken Schwaber and Ron Jeffries, nor do the majority of DDaT leaders in government regard SAFe as agile
- ensure the service team is empowered to have ‘problems to solve’ through beta
8. Iterate and improve frequently
Decision
The service met point 8 of the Standard.
What the team has done well
The panel was impressed that:
- the service team iterated prototypes based on user testing, demonstrating an ability and willingness to improve content to meet user needs
- VOA subject matter experts and stakeholders have been included in show and tells, which has resulted in organisational and policy sign-off to reduce jargon and use plain English. This engagement with the wider organisation has enabled iteration and improvement through alpha
- for prototyping in alpha, the service team was well resourced with design and user research colleagues to support iteration
What the team needs to explore
Before the next assessment, the team needs to:
- develop a deep empathy with and understanding of all users. This should be supported by empowering the service team to have ‘problems to solve’ as opposed to a ‘plan to deliver’
- given that ‘Declare your business trade and cost information’ is part of a wider VOA transformation, ensure the empowerment of the service team to solve problems for users is protected
9. Create a secure service which protects users’ privacy
Decision
The service met point 9 of the Standard.
What the team has done well
The panel was impressed that:
- they are leveraging the Multi-channel Digital Tax Platform (MDTP) which is used across HMRC
- MDTP includes protection against distributed denial-of-service (DDoS) and SQL injection attacks
- they are storing property information in a PostgreSQL database that they manage, but no user information; user information will be stored in Dataverse, the Microsoft Dynamics database. The two are connected via a GUID
- they have retry limits and lockouts on incorrect submission of passwords
- ZAP and Penetration testing will take place in Private Beta
- all controls, protections and policies from other HMRC citizen-facing services will be applied to this service
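The data separation described above can be illustrated with a minimal Scala sketch. This is a hypothetical illustration only, not the team’s actual data model: the record types, field names and values are all invented, and the point is simply that property data (PostgreSQL) and user data (Dataverse) share nothing except a linking GUID.

```scala
import java.util.UUID

// Hypothetical sketch of the described separation: property data lives in
// the team's PostgreSQL store, user data in Dataverse, and the two are
// connected only by a shared GUID.
case class PropertyRecord(linkId: UUID, address: String, rateableValue: BigDecimal)
case class UserRecord(linkId: UUID, contactName: String, email: String)

object LinkDemo {
  def main(args: Array[String]): Unit = {
    val linkId   = UUID.randomUUID()
    val property = PropertyRecord(linkId, "1 Example Street", BigDecimal(25000))
    val user     = UserRecord(linkId, "A. Occupier", "occupier@example.com")
    // Neither record duplicates the other's fields; joining them requires
    // access to both stores plus the shared GUID.
    println(s"Records linked: ${property.linkId == user.linkId}")
  }
}
```

A compromise of either store alone therefore yields only half the picture, which is the privacy benefit of the design the team described.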
What the team needs to explore
Before the next assessment, the team needs to:
- ensure the Business Impact Assessment and the Data Protection Impact Assessment are conducted
- given the complexity of, and interdependencies with, other services, ensure there is appropriate security testing during future delivery phases
- ensure that discussions with other teams take place regarding any security limitations and risks of MDTP
10. Define what success looks like and publish performance data
Decision
The service met point 10 of the Standard.
What the team has done well
The panel was impressed that:
- the service team has a plan in place to monitor the four mandatory service KPIs
- the service team has defined success metrics beyond the four mandatory KPIs to benchmark service performance against intended business outcomes
- the service team has evaluated the best approach for reporting performance data to internal stakeholders, including which technology choice would best support this
- the service team has used the learnings from another VOA service to inform the performance measurement of the service
What the team needs to explore
Before the next assessment, the team needs to:
- ensure performance data is used alongside user research to drive iteration across beta
- ensure the CSAT survey identified includes a free-text box for users to provide qualitative feedback
- ensure performance data is at the centre of governance reporting to wider VOA and the Business Systems Transformation Team
11. Choose the right tools and technology
Decision
The service met point 11 of the Standard.
What the team has done well
The panel was impressed that:
- they have done a detailed assessment of competing approaches: the Multi-channel Digital Tax Platform (MDTP) and MS Power Apps Portal
- the team will be using Git as a code repository. Jenkins will be used for build and deployment pipelines including the running of acceptance tests on MDTP. Jenkins allows them to build once and deploy code to multiple environments
- Azure DevOps will be used for build and deployment pipelines on Microsoft Azure and the MSFT Dynamics platform
- Scala, a language widely used at HMRC, is used for front-end development
What the team needs to explore
Before the next assessment, the team needs to:
- ensure the dependencies on the larger project are clearly outlined
- to assist understanding of the complex landscape, consider creating a tech and data user story
- ensure dependencies and skills in the orchestration layer are effectively managed and risks are clearly highlighted. The orchestration layer is developed in C# on the Azure platform
12. Make new source code open
Decision
The service met point 12 of the Standard.
What the team has done well
The panel was impressed that:
- the service team is making its source code open on GitHub
What the team needs to explore
Before the next assessment, the team needs to:
- ensure the repositories are documented appropriately
- ensure new joiners can easily consume and use the code by reviewing the GitHub repositories
- ensure linkages and dependencies on other GitHub repositories are clearly highlighted and documented
- given the complexities around data in this project, assess whether further data documentation should be delivered and made available on GitHub
13. Use and contribute to open standards, common components and patterns
Decision
The service met point 13 of the Standard.
What the team has done well
The panel was impressed that:
- they are using MDTP which is extensively used by HMRC and DWP
- Scala is their primary development language which is widely used by HMRC
- they are using standard patterns from MDTP including the Orchestration Layer and the Secured Data Exchange Service (SDES) which handles file uploads
- they are following GDS front-end standards and adhering to the Technology Code of Practice
What the team needs to explore
Before the next assessment, the team needs to:
- highlight, in later assessments, the development dependencies they have on other teams
14. Operate a reliable service
Decision
The service met point 14 of the Standard.
What the team has done well
The panel was impressed that:
- they use a wide range of tools to monitor the service, including Kibana to visualise data held in logs and PagerDuty to provide alerts on service incidents; they also use Grafana and Pingdom
- they align to MSFT Azure uptime commitments of 99.9% availability
- they still have a paper-based journey if the service is unavailable
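The monitoring stack described above depends on the service emitting logs that Kibana can index and visualise, which in practice means structured (usually JSON) log lines. As a hedged illustration only, here is a minimal Scala sketch of single-line JSON log events; the field names and service name are hypothetical, not the team’s actual log schema or logging library.

```scala
// Minimal sketch of structured (JSON) log lines of the kind a log-shipping
// pipeline can index for visualisation in Kibana. Field names and the
// service name are hypothetical examples.
object StructuredLog {
  // Escape backslashes and quotes so the output stays valid JSON.
  private def esc(s: String): String =
    s.replace("\\", "\\\\").replace("\"", "\\\"")

  // Render one log event as a single JSON line.
  def line(level: String, service: String, message: String): String =
    s"""{"level":"${esc(level)}","service":"${esc(service)}","message":"${esc(message)}"}"""

  def main(args: Array[String]): Unit =
    println(line("INFO", "declare-trade-cost-info", "form submitted"))
}
```

One event per line keeps the logs trivially parseable by whichever shipper feeds Kibana; in a real service a logging library would normally produce this format rather than hand-built strings.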
What the team needs to explore
Before the next assessment, the team needs to:
- carefully manage and schedule end-to-end testing. The service has major dependencies on services being tested and built by other teams, meaning only a full end-to-end test or soft go-live might exercise the complete functioning of the service. At the next service assessment, the team should report back on how this is being managed
Next Steps
Reassessment
In order for the service to continue to the next phase of development, it must meet the Standard and get CDDO spend approvals. The service must be reassessed against the points of the Standard that are not met at this assessment.