Appoint, Update and Remove a Company Director
The report for Companies House's Appoint, Update and Remove a Company Director alpha assessment on 28th April 2022
Service Standard assessment report
Appoint, update and remove a company director
From: Central Digital & Data Office (CDDO)
Assessment date: 28/04/2022
Stage: Alpha
Result: Met
Service provider: Companies House
Service description
Directors have a responsibility to send information on behalf of the company to Companies House. The new web and API services will allow director appointments, updates and removals. This will transform, and eventually replace, existing ‘legacy’ versions of digital services for external users and internal CH teams.
The service will enable users to complete their digital journey without intervention, increasing customer satisfaction by reducing rejections and reducing manual query handling of submissions.
Our mission is to improve Companies House’s ability to maintain, upgrade and re-use our services in a cost-effective way, whilst providing the building blocks for future legislative change and other strategic projects.
Service users
- Directors of companies who are responsible for maintaining the company data
- Presenters acting on behalf of directors, such as company secretaries, accountants and solicitors
- Providers of software as a service, or company secretarial packages
- Companies House internal users
- Search customers - consumers of output data, for example credit reference agencies, banks and financial institutions
1. Understand users and their needs
Decision
The service met point 1 of the Standard.
What the team has done well
The panel was impressed that:
- the team has achieved a lot in the 8-week alpha
- the team reused, re-synthesised and built upon existing research
- the team had strong personas and user journey maps, and could evidence iterations made as a result of research findings
What the team needs to explore
Before their next assessment, the team needs to:
- show the online journey in getting to the service. Where does the service rank on organic search engine results and what does it compete with (organic and paid)?
- research with a broader range of users across the digital inclusion scale, and strive for that mix in each round of research. The team could plot participants on a scattergraph with 'digital inclusion scale' along the x axis and 'experience with Companies House' along the y axis. The next assessment would benefit from a few video clips from sessions to evidence the team's work in this area - especially with digitally excluded users
- consider screening and recruiting participants through the existing offline journey
- add assisted digital and accessibility insights to the personas
- test rigorously, face-to-face, on mobile and tablet devices
2. Solve a whole problem for users
Decision
The service met point 2 of the Standard.
What the team has done well
The panel was impressed that:
- the team knows of the cross-government 'start a business' whole user journey, has considered how this service connects to other elements of managing a limited company (for example, different from filing shares), and is aware of other work happening in Companies House to move off the current WebFiling system
- the team is aware of the constraints and technology relating to the service - for example, that some professional users use APIs that connect to their existing software and will continue to do so
What the team needs to explore
Before their next assessment, the team needs to:
- confirm and test their actual entry points from GOV.UK as proposed by the GOV.UK content team (GDS or Companies House). For example, this may mean using the current 'tell Companies House about changes to your limited company' start page and using filtering questions to guide users to either this service, WebFiling or the other services being developed to replace it. The team can then change their entry point for prototypes and, if appropriate, the private beta landing page
- monitor how the API developments will affect the rollout of the self-service digital service - for example, making sure that all guidance and comms remains up-to-date
3. Provide a joined-up experience across all channels
Decision
The service met point 3 of the Standard.
What the team has done well
The panel was impressed that:
- the team has mapped the wider journeys of users and understand how and why users choose to use a digital service, commercial software or paper forms to send changes
- the team understands the offline parts of the journey and their challenges (including the authentication code being sent by letter) and is working on changes such as improving the letter about company obligations
What the team needs to explore
Before their next assessment, the team needs to:
- continue making improvements to the wider journey where possible - private beta offers a controlled opportunity to observe how these changes affect real use. For example, the team may want to use private beta to trial improved paper forms to reflect the content improvements to the online service that can then be switched out in public beta
4. Make the service simple to use
Decision
The service met point 4 of the Standard.
What the team has done well
The panel was impressed that:
- the service uses the GOV.UK design system, and is generally following patterns such as ‘one thing per page’
- the team has been modifying content based on user research - for example, putting key information in a paragraph with a header before the 'continue' button so that users notice it, and changing the service name from 'terminate' a company director to 'remove' a company director
- the team is using content crits as well as user research to make sure that their work does the hard work to make it simple
What the team needs to explore
Before their next assessment, the team needs to:
- continue testing the key 'unhappy path' where directors wrongly apply to have their details protected (causing the application to be rejected). The team will get data about actual success rates in private beta. If users still wrongly choose this option, iterations could include a confirmation page explaining what it means, or extra information at the entry point (as agreed with the GOV.UK content team, as mentioned in point 2)
- continue to refine the end of journeys based on user needs. Reviewing the journey of emails and screens from submission to outcome may help the team identify where content can be removed. The panel was also concerned that the 'rejected' screens might cause more stress than necessary, both in the language and in the design decision to use a red panel component
- monitor other potential 'unhappy paths' for users, such as: needing to stop and come back later (ideally users should be able to choose to sign out and come back later; at the very least they need to understand that they must complete the journey in one go, as 'Apply online for a UK passport' does, for example), problems matching with the company, losing their access details to the Companies House authentication system, or having to transfer responsibility for managing directors' details to someone else. To continue to public beta, the team must have analytics to show that these journeys are infrequent and do not cause long delays to users
- continue to strive for content in Plain English. The team should be mindful of how they iterate - for example, while the team found that ‘postal address’ didn’t make sense, ‘correspondence address’ is less plain English than ‘send mail’ or ‘send letters’ (which has been used in some tax services). As well as the internal content crits that the team are already using, there are also ‘Get Feedback’ sessions available across government (which can include all elements of design) that the team can use if they find them helpful
5. Make sure everyone can use the service
Decision
The service met point 5 of the Standard.
What the team has done well
The panel was impressed that:
- the team is considering access needs in their research and has recognised that some of their users will have access needs, for example volunteers for social clubs
- the team has made changes to the service based on understanding needs such as reassurance of being able to get help right from the start of the service
- the team is using practices from other teams to make sure that their service is developed in an accessible way
What the team needs to explore
Before their next assessment, the team needs to:
- do their proposed testing with people with access needs (mentioned in point 1) - particularly the user group of volunteers for social clubs
- continue work on what users need for reassurance - for example, users may benefit from having a way to get help on every screen, not just at the start of the service
- work with the wider Companies House and cross-government design community if they wish to trial more global changes, such as making links bold or changing colour contrast, as these have to be based on needs rather than preferences - the GOV.UK Design System community meetups every second Friday are a forum to discuss these ideas
6. Have a multidisciplinary team
Decision
The service met point 6 of the Standard.
What the team has done well
The panel was impressed that:
- the team is made up of FTEs and FTAs, giving consistency and continuity through discovery, alpha and beta
- the Alpha team will be staying with the service through Beta
- the team have all of the expected roles at Alpha and already have a performance analyst on the team
- the team seem to be collaborating really well and everyone seems enabled
What the team needs to explore
Before their next assessment, the team needs to:
- ensure that the interaction designer is onboarded
7. Use agile ways of working
Decision
The service met point 7 of the Standard.
What the team has done well
The panel was impressed that:
- the team has a good process, with download sessions to ensure user research is used to iterate and make improvements
- developers are included in the download sessions, as well as stakeholders and internal teams; this should help all members of the team to be customer-centric and to understand the vision, decisions and prioritisation
- the team has all of the expected agile ceremonies in place
- the team is using collaboration tools such as Jira and documentation tools such as SharePoint
- the team is working with other service teams, internal teams and external working groups
What the team needs to explore
Before their next assessment, the team needs to:
- ensure there is coordination and support across the various programmes, particularly across service design and architecture, as Companies House is a very complex organisation with lots of moving parts
- continue to work with other teams to ensure knowledge and learnings are shared, particularly teams who are already using the technology you plan to use
- demonstrate how you are going to use data as a decision-making tool
8. Iterate and improve frequently
Decision
The service met point 8 of the Standard.
What the team has done well
The panel was impressed that:
- the team could demonstrate a wide range of iterations made based on user testing, ranging from terminology understood by users to page structure
- the team collaborated well with users for the naming convention
- the team considered the impact of terminology on users with accessibility needs, particularly around wording that might cause undue distress
- the team are now using GDS design patterns
What the team needs to explore
Before their next assessment, the team needs to:
- be mindful that when users are presented with all options during A/B testing, it becomes more like preference testing
- use data that highlights particular problems, such as 40% of rejections being caused by directors inaccurately declaring they have the right to have their information hidden, to guide your research, and add KPIs to track the success of improvements
- ensure a wider variety of users are included in user testing during beta - the team has made a good start in just 8 weeks of alpha
9. Create a secure service which protects users’ privacy
Decision
The service met point 9 of the Standard.
What the team has done well
The panel was impressed that:
- they authenticate with OAuth
- they use Microsoft's STRIDE model to categorise different kinds of security threats and areas of focus
- penetration (pen) testing will be done in private beta
- they have done a DPIA (data protection impact assessment)
- MongoDB volumes are encrypted
What the team needs to explore
Before their next assessment, the team needs to:
- ideally move to two-factor authentication
- complete a pen test and share the results with the assessors
10. Define what success looks like and publish performance data
Decision
The service met point 10 of the Standard.
What the team has done well
The panel was impressed that:
- the team have thought of KPIs in addition to the mandatory 4
- the team have already identified the tools they will have available to them to capture and monitor a range of different data types
- the team have thought about a variety of different data sources including both quantitative and qualitative
- the team plans to gain learnings from teams who are already using the technology
- the team already has a performance analyst, who is also working across multiple programmes
What the team needs to explore
Before their next assessment, the team needs to:
- consider whether the internal data that is currently shared monthly could be shared more regularly (weekly or even daily) so that faster decisions can be made based on the insight the data gives you
- think about what success looks like, both during beta and in live, and how you will measure it - for example, can the team measure whether the new service lets people successfully file changes within the required 14-day change period? If not, is there an opportunity to either increase this metric or even extend the change period?
- consider adding in some KPIs around rejections
11. Choose the right tools and technology
Decision
The service met point 11 of the Standard.
What the team has done well
The panel was impressed that:
- they are using existing Companies House components and data stores wherever possible
- they are using well-known, common technologies: MongoDB, Oracle, Java and Node.js, among others
What the team needs to explore
Before their next assessment, the team needs to:
- ensure cross-team conversations take place so that dependencies and risks are highlighted and understood - the tech stack is broad and there are a number of concurrent projects at Companies House
- improve the documentation provided at the end of beta so the overall landscape is easier to understand both for new joiners and those assessing the project
12. Make new source code open
Decision
The service met point 12 of the Standard.
What the team has done well
The panel was impressed that:
- all code for this service will be in the open; however, code for the existing services it relies on will not be in the open
What the team needs to explore
Before their next assessment, the team needs to:
- highlight key source code they rely on which is not currently in the open that should or could be made open
13. Use and contribute to open standards, common components and patterns
Decision
The service met point 13 of the Standard.
What the team has done well
The panel was impressed that:
- they are using services that already exist in Companies House
- they are using GOV.UK components such as GOV.UK Notify and GOV.UK Pay
- they are assessing whether GOV.UK Sign In will be appropriate in the future
- they are following the twelve-factor app methodology (12factor.net) for software development
- the team is using the GOV.UK design system and in some circumstances iterating on the patterns as a department (for example, the postcode lookup that includes postcode and street number or name, and the detail card)
What the team needs to explore
Before their next assessment, the team needs to:
- add their lookup pattern and findings from private beta use to the GOV.UK design system ‘address’ issue - this may be of interest to other departments
14. Operate a reliable service
Decision
The service met point 14 of the Standard.
What the team has done well
The panel was impressed that:
- they have a 99.9% uptime SLA
- Mesos, Prometheus and AWS CloudWatch provide alerts for issues with services, instances and AWS components
- support teams respond to issues all year round
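Alerting of the kind described above is commonly expressed as Prometheus alerting rules. A minimal sketch of what such a rule looks like, assuming a hypothetical job name and threshold (neither is taken from the team's actual configuration):

```yaml
# Hypothetical Prometheus alerting rule - the job name and timings
# are illustrative only, not the team's real configuration.
groups:
  - name: director-service-availability
    rules:
      - alert: ServiceInstanceDown
        # 'up' is the built-in metric Prometheus sets to 0 when a
        # scrape target is unreachable
        expr: up{job="officer-filing-web"} == 0
        for: 5m
        labels:
          severity: critical
        annotations:
          summary: "An instance of the director filing service has been unreachable for 5 minutes"
```

Rules like this would sit alongside the infrastructure-level alerts the team already gets from Mesos and AWS CloudWatch.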
What the team needs to explore
Before their next assessment, the team needs to:
- continue their path towards continuous integration
- report, during private beta, on any issues with reliability and testing they have seen