Close a company beta service assessment report
The report from Companies House's Close a company beta assessment on 21 January 2021.
From: Central Digital and Data Office
Assessment date: 21 January 2021
Stage: Beta
Result: Met
Service provider: Companies House
Previous assessment reports
Service description
The ‘Apply to Close a Company’ service allows company directors (and those acting on behalf of directors) to apply to voluntarily close (dissolve or strike off) a private limited company (LTD) or Limited Liability Partnership (LLP).
Companies House receives approximately 250,000 applications for voluntary dissolution per year.
Most applications are made for companies with only one director (‘single director companies’); in these cases the application can be made by one person. However, where there is more than one director, company law requires a majority of directors to approve the application to close a company, so as part of the service the other directors are invited by email to approve the application (the sketch below illustrates this rule).
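The report contains no code; purely as an illustration of the majority rule above, and assuming "majority" means more than half of the directors, here is a minimal TypeScript sketch (the function name is hypothetical):

```typescript
// Illustrative only: the majority rule described above, assuming a
// majority means more than half of the directors.
function approvalsRequired(totalDirectors: number): number {
  return Math.floor(totalDirectors / 2) + 1
}

console.log(approvalsRequired(1)) // 1 - a single director applies alone
console.log(approvalsRequired(2)) // 2 - one of two is not a majority
console.log(approvalsRequired(3)) // 2 - two of three directors must approve
```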
The transaction is fee-bearing: applying to close a company using the digital service costs £8, while applying using a paper form by post costs £10.
Service users
This service is for:
- Directors of private limited companies (LTDs) and limited liability partnerships (LLPs) who are ultimately responsible for the company/LLP.
- Presenters acting on behalf of directors, such as company secretaries, accountants and solicitors.
1. Understand user needs
User research assessor
Decision
The service met point 1 of the Standard.
What the team has done well
The panel was impressed that:
- the team has continued substantial, iterative research with users; the service now enables users to dissolve a company online seamlessly, mitigating errors that were present in the previous version of the service
- the team has ensured previously unmet user needs are now mostly met, while continuing to ensure that previously met user needs are still fulfilled
- the team has good coverage of use cases and scenarios and a clear understanding of the primary and secondary users, including internal users
- the team continues to have a good understanding of the primary users of the service, i.e. individuals needing to dissolve their own company and/or agents acting on behalf of their client(s)
- the team works cohesively and has improved its collaborative ways of working, for example connecting user research findings with the pertinent design changes.
What the team needs to explore
Before their next assessment, the team needs to:
- ensure their understanding of pain points is reflected in the personas, so that the personas are rounded, well-grounded representations of users, particularly primary users
- ensure the needs of users with accessibility needs, assisted digital needs and digital inclusion needs are solidly catered for (COVID-19 has hindered testing with these types of users, and as a result the team has conducted user research with only one such user), and that the service is tested with a variety of such users to ensure their needs are met
- either further test the authentication process to ensure user needs are met, or implement the technical change proposed at the current assessment
- ensure the email address in the payment step of the service is editable, so that users who need to change it can do so themselves rather than contacting Companies House to make the change for them
- ensure email correspondence with users, such as the director confirmation email, carries a self-explanatory subject line, so that users can be confident the correspondence is relevant to them.
2. Do ongoing user research
User research assessor
Decision
The service met point 2 of the Standard.
What the team has done well
The panel was impressed that:
- the team has shown a continued, iterative user research process, testing with 31 users since the last touchpoint assessment and implementing relevant design changes.
What the team needs to explore
Before their next assessment, the team needs to:
- ensure the needs of users with accessibility needs, assisted digital needs and digital inclusion needs are catered for, and that the service is tested with a variety of such users to ensure their needs are met
- ensure all users with directorial responsibility clearly understand “what happens next” when the company being dissolved needs signatures from multiple directors
- keep working cohesively across the team and ensure that performance analytics becomes solidly integrated with user research, to make the most of understanding how live users move through the digital service
- ensure the user panel banner is implemented in the right places in the service, so that the team can make the most of recruiting users through it.
3. Have a multidisciplinary team
Lead assessor
Decision
The service met point 3 of the Standard.
What the team has done well
The panel was impressed that:
- the improvements evident at the previous beta workshop have continued, and the team remains appropriately resourced and empowered to meet its challenges and deliver the product it is responsible for
- although there remains some dependence on specialist contingent labour, particularly in the technical roles, the team has appropriately experienced permanent staff covering the core roles responsible for the control and direction of the service; capability-building work is continuing and knowledge transfer is taking place.
What the team needs to explore
Before their next assessment, the team needs to:
- continue to ensure the business remains committed to an appropriate resourcing model, and report any slippage or concerns to GDS.
4. Use agile methods
Lead assessor
Decision
The service met point 4 of the Standard.
What the team has done well
The panel was impressed that:
- the team continues to use agile methods, working in fortnightly sprints and observing all the appropriate sprint ceremonies
- the team makes good use of digital tooling to capture and prioritise the needs of service users, and to overcome the challenge of continual remote working posed by COVID-19
- the instability in personnel evident at the previous workshop has been addressed, with good continuity of roles and personnel now in place.
What the team needs to explore
Before their next assessment, the team needs to:
- continue using the appropriate agile methods to build the service in line with the needs of its users.
5. Iterate and improve frequently
Lead assessor
Decision
The service met point 5 of the Standard.
What the team has done well
The panel was impressed that:
- the service team has continued iterating the journey in beta
- the team had no further challenges to its autonomy to report, and were confident that the improvements made ahead of the previous beta workshop will ensure this continues
- the team were also able to provide clear examples of where they have iterated the product both in response to the feedback from the previous workshop and in light of user research evidence.
6. Evaluate tools and systems
Tech assessor
Decision
The service met point 6 of the Standard.
What the team has done well
The panel was impressed that:
- the team completely re-evaluated its approach from previous assessments and built an entirely new system
- the team has completely removed its dependency on non-government third-party suppliers.
What the team needs to explore
Before their next assessment, the team needs to:
- evaluate the new identity solution provided by the wider organisation, and adopt it if it meets the needs of the service.
7. Understand security and privacy issues
Tech assessor, lead assessor
Decision
The service met point 7 of the Standard.
What the team has done well
The panel was impressed that:
- the service stores data in a secure database and the form PDF in a secure S3 bucket in AWS; most of the data the service deals with is publicly available (director name, company name)
- a firewall in front of the web application and rate-limiting applied to the API greatly reduce the risk of a brute force attack on the code used for authentication (see the sketch after this list)
- only people involved in monitoring and supporting the live infrastructure have access to the production environment
- the team has conducted a penetration test.
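The report gives no implementation detail; as an illustration only, here is a minimal sketch of the kind of rate-limiting described above, using the express-rate-limit middleware for a Node/Express application. The route and limits are hypothetical, not taken from the Companies House codebase.

```typescript
// Illustrative only: not taken from the Companies House codebase.
import express from 'express'
import rateLimit from 'express-rate-limit'

const app = express()

// Hypothetical limits: at most 10 attempts per 15 minutes per client,
// which blunts brute-force guessing of an authentication code.
const authLimiter = rateLimit({
  windowMs: 15 * 60 * 1000, // 15-minute window
  max: 10,                  // reject the 11th request in the window
  standardHeaders: true,    // send RateLimit-* response headers
  legacyHeaders: false,     // omit the deprecated X-RateLimit-* headers
})

// Apply the limiter only to the code-entry route, so normal browsing
// of the service is unaffected.
app.post('/auth/code', authLimiter, (req, res) => {
  res.status(202).send('code accepted for verification')
})

app.listen(3000)
```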
What the team needs to explore
Before their next assessment, the team needs to:
- integrate with the new identification solution if the new Companies House identification process has resulted in changes; the panel understands this is already on the service roadmap.
8. Make all new source code open
Decision
The service met point 8 of the Standard.
What the team has done well
The panel was impressed that:
- the code for the web application is publicly available at https://github.com/companieshouse/dissolution-web
- the code for the API is publicly available at https://github.com/companieshouse/dissolution-api.
9. Use open standards and common platforms
Tech assessor
Decision
The service met point 9 of the Standard.
What the team has done well
The panel was impressed that:
- the team uses open tools and technologies that fit the competency of the wider organisation, can be built upon and evolved, and are well supported in the tech community
- the service follows accepted practice for building web applications, such as RESTful architecture, the OpenAPI standard and the ISO 8601 date format (see the example after this list)
- the team relies on well-established government technologies such as the GOV.UK Design System and GOV.UK Pay.
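As a small illustration of the standards named above (not the service's actual API), here is a hypothetical RESTful resource representation in TypeScript using the ISO 8601 date format:

```typescript
// Illustrative only: a hypothetical resource shape, not the service's API.
interface DissolutionApplication {
  companyNumber: string
  status: 'pending' | 'approved'
  createdAt: string // ISO 8601 timestamp, e.g. "2021-01-21T09:30:00.000Z"
}

const application: DissolutionApplication = {
  companyNumber: '12345678',
  status: 'pending',
  // Date.prototype.toISOString() always emits an ISO 8601 timestamp
  createdAt: new Date().toISOString(),
}

console.log(JSON.stringify(application, null, 2))
```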
10. Test the end-to-end service
Tech assessor
Decision
The service met point 10 of the Standard.
What the team has done well
The panel was impressed that:
- the team uses automated testing as part of the deployment pipeline
- there are dedicated quality assurance team members who run manual tests and write comprehensive automated tests
- the team has included pa11y (accessibility testing) in its deployment pipeline (see the sketch after this list)
- the team has completed stress tests for all components of the system, testing with 1,000 requests or equivalent.
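pa11y is the only testing tool the report names here; as a sketch only, this is roughly how a pipeline step might run it. The URL, standard and failure rule are assumptions, not the team's actual configuration.

```typescript
// Illustrative only: a minimal pa11y check of the kind a deployment
// pipeline might run. The URL and options are hypothetical.
import pa11y from 'pa11y'

async function checkAccessibility(url: string): Promise<void> {
  // Test the page against WCAG 2 level AA
  const results = await pa11y(url, { standard: 'WCAG2AA' })

  for (const issue of results.issues) {
    console.error(`${issue.code}: ${issue.message} (${issue.selector})`)
  }

  // Fail the pipeline step if any issues were found
  if (results.issues.length > 0) {
    process.exit(1)
  }
}

checkAccessibility('http://localhost:3000/close-a-company')
  .catch(() => process.exit(1))
```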
11. Make a plan for being offline
Lead assessor, tech assessor
Decision
The service met point 11 of the Standard.
What the team has done well
The panel was impressed that:
- as a microservice, the service slots into the existing architecture and holds its information for as long as needed in case of back-office system downtime (a sketch of this pattern follows this list)
- the service runs side by side with the existing paper service, which acts as a back-up
- the service has a 99.9999% uptime guarantee.
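The report doesn't say how the service holds its information during back-office downtime; here is a minimal sketch, assuming a hold-and-retry pattern with exponential backoff. All names (Submission, submitToBackOffice, deliverWithRetry) are hypothetical stand-ins.

```typescript
// Illustrative only: a hold-and-retry pattern of the kind described above.
interface Submission {
  id: string
  payload: unknown
}

// Simulated back-office client: fails twice, then accepts the submission.
let attempts = 0
async function submitToBackOffice(s: Submission): Promise<void> {
  attempts += 1
  if (attempts < 3) throw new Error('back-office system unavailable')
  console.log(`submission ${s.id} accepted after ${attempts} attempts`)
}

const sleep = (ms: number) =>
  new Promise<void>((resolve) => setTimeout(resolve, ms))

// Hold the submission and retry with exponential backoff until the
// back-office system accepts it (in a real service the held record
// would live in a database or queue, not in memory).
async function deliverWithRetry(submission: Submission): Promise<void> {
  let delayMs = 1000
  for (;;) {
    try {
      await submitToBackOffice(submission)
      return // delivered; safe to mark the held record as complete
    } catch {
      await sleep(delayMs)
      delayMs = Math.min(delayMs * 2, 60000) // cap backoff at one minute
    }
  }
}

deliverWithRetry({ id: 'DS123456', payload: {} })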
What the team needs to explore
Before their next assessment, the team needs to:
- consider, if digital uptake is successful, whether there is a future in which the paper system is removed completely, and what the back-up option would be then.
12. Make sure users succeed the first time
Design assessor
Decision
The service met point 12 of the Standard.
What the team has done well
The panel was impressed that:
- the service has been tested at the Digital Accessibility Centre
- the team has demonstrated how it iterated the user journey and content through user research and testing to improve the service
- in particular, it is good to see changes to the interface design and content structure of signature pages and sections in response to usability testing.
What the team needs to explore
Before their next assessment, the team needs to:
- consider how to address the accessibility issues with the payment process that the DAC report highlighted
- explore how to mitigate some of the constraints around account authentication and requesting an authentication code, to prevent a dip in completion rates, especially in multiple-director journeys and for users who need to change details
- show ongoing plans for testing and auditing future iterations of assisted digital journeys and assisted digital support
- show ongoing plans for usability testing and the use of analytics, to explore how the team might support more unhappy paths and improve links to back-office call centre functions.
13. Make the user experience consistent with GOV.UK
Design assessor
Decision
The service met point 13 of the Standard.
What the team has done well
The panel was impressed that:
- the team has clearly evolved the service since the beta workshop that preceded this beta assessment
- the team articulated well how designers, content designers, user researchers and developers iterated the designs following GOV.UK styles and patterns.
What the team needs to explore
Before their next assessment, the team needs to:
- fix components that were not fully up to date (for example, error messages), and look at the internal audit process to catch these issues earlier.
14. Encourage everyone to use the digital service
Lead assessor
Decision
The service met point 14 of the Standard.
What the team has done well
The panel was impressed that:
- the team have considered how best to encourage take-up of the service and have realistic, achievable targets for progress before completing their public beta phase.
15. Collect performance data
Beta/live - Performance analyst assessor
Decision
The service met point 15 of the Standard.
What the team has done well
The panel was impressed that:
- the team are collecting data to measure their performance
- the team is using data from both online and offline sources to support their decision making
- the team have a dedicated performance analyst.
What the team needs to explore
Before their next assessment, the team needs to:
- ensure data is collected with the users’ consent using a compliant consent mechanism
- understand the scope and availability of more detailed offline data and how it could be used to support iterations of the service.
16. Identify performance indicators
Performance analyst assessor
Decision
The service did not meet point 16 of the Standard.
What the team has done well
The panel was impressed that:
- the team have a defined set of KPIs.
What the team needs to explore
Before their next assessment, the team needs to:
- show a clear link between the defined KPIs and the overall purpose and goals of the service
- show how iterations have made improvements on the KPIs
- provide a detailed breakdown of the £8 cost and how it is calculated.
17. Report performance data on the Performance Platform
Performance analyst assessor
Decision
The service met point 17 of the Standard.
What the team has done well
The panel was impressed that:
- the team are publishing data online
- there is a wealth of data and information available beyond the mandatory KPIs.
18. Test with the minister
Lead assessor
Decision
The service met point 18 of the Standard.
What the team has done well
The panel was impressed that:
- the team had already demonstrated the service to the then relevant minister ahead of previous assessments. The current relevant minister and the lead minister for Companies House's parent department are both aware of the service, as its management information has been included in the daily coronavirus ministerial briefing packs, and the product has been demonstrated to the agency's chief executive officer.
What the team needs to explore
Before their next assessment, the team needs to:
- arrange for the service to be demonstrated to the current relevant minister.