Get security clearance
The report for the Get security clearance alpha assessment on 23 March 2022
Service Standard assessment report
Get security clearance
From: Central Digital & Data Office (CDDO)
Assessment date: 23/03/2022
Stage: Alpha
Result: Not met
Service provider: UK Security Vetting (Cabinet Office)
Previous assessment reports
- Alpha assessment of Future Vetting Service, 2020 - Met
- Beta review of Accreditation Check
Service description
“Get Security Clearance” enables the requesting and provision of UK Security Clearances for HM Government Departments, HM Forces and Civilian Government-Contracted Organisations.
It consists of a suite of user-friendly, self-service capabilities provided for applicants and requestors to request and manage security clearance so that national security risk can be managed appropriately. The service will also establish capabilities which support future Level 2 and 3 clearance products.
The first product offered by the Get Security Clearance service will be the new foundational Level 1 clearance type. This clearance product is designed to replace the existing CTC clearance type currently offered by UKSV.
Level 1 is a modernised clearance product designed to make effective use of automation, introduce portability of clearances between requesting organisations and standardise risk assessment and decision-making for lower risk clearances.
Service users
Users from requesting organisations - HM Government Departments, HM Forces or Civilian Government-Contracted Organisations:
- requestors – designated HR representatives requiring new or existing applicants to be security cleared to fulfil secure or sensitive roles within their organisation
- applicants – new or existing employees or associates of the above organisations requiring security clearance to fulfil secure or sensitive roles
1. Understand users and their needs
Decision:
The service did not meet point 1 of the Standard.
What the team has done well
The panel was impressed that:
- the team knows the importance of user research and the user-centred design process
- the wider team is included in user research sessions
What the team needs to explore
Before their next assessment, the team needs to:
- gain a real understanding of user needs through their own user research with a larger number and broader range of users, including those with access and assisted digital needs
- ground their designs in what they learn about user needs and iterate those designs based on user research
- use a variety of channels to recruit users for user research, rather than relying on digital surveys alone
- ensure personas are revisited and evolved based on user research findings
- conduct user research on managing user expectations and on issues around trust (especially when looking at sharing data across government in future)
- conduct user research on a variety of devices, including mobile
2. Solve a whole problem for users
Decision:
The service did not meet point 2 of the Standard.
What the team has done well
The panel was impressed that:
- the service team showed understanding of their service and how it fits in with user journeys
- the service team has developed a high-level map of the service
What the team needs to explore
Before their next assessment, the team needs to:
- gain an understanding of the constraints that affect the service by researching with a larger number and broader range of users, including those with access and assisted digital needs
3. Provide a joined-up experience across all channels
Decision:
The service did not meet point 3 of the Standard.
What the team has done well
The panel was impressed that:
- the team is gathering user satisfaction feedback via surveys and usability testing
- the team is measuring the completion time from submission to case processing
What the team needs to explore
Before their next assessment, the team needs to:
- explore and consider how the service integrates with offline channels
4. Make the service simple to use
Decision:
The service did not meet point 4 of the Standard.
What the team has done well
The panel was impressed that:
- the team has a good understanding of the design process, iterating and improving designs
- the team has used the GOV.UK Design System
- the team has an understanding of content design
What the team needs to explore
Before their next assessment, the team needs to:
- ensure they source a content designer to work on the service
- review and audit the parts of the prototype that deviate from the GOV.UK Design System, style guide and interface guidance, and align the content with support from GDS
- use different user research methods on guidance content and service content (for example highlighter testing and cloze testing) to ensure it meets user needs
- test error messages and shutter pages (happy and unhappy paths)
5. Make sure everyone can use the service
Decision:
The service did not meet point 5 of the Standard.
What the team has done well
The panel was impressed that:
- the team has an understanding of assisted digital needs
- the team has an understanding of accessibility
- the team plans to use the Digital Accessibility Centre (DAC) to test the service
What the team needs to explore
Before their next assessment, the team needs to:
- ensure the service is tested with users with assisted digital needs
- test the service with an external provider, for example the DAC, and obtain an accessibility audit of the service
6. Have a multidisciplinary team
Decision:
The service met point 6 of the Standard.
What the team has done well
The panel was impressed that:
- the service team has the core roles expected at alpha
- the majority of the team is permanent, complemented by one partner providing the rest of the roles
What the team needs to explore
Before their next assessment, the team needs to:
- ensure the team has access to a content designer and works closely with them to iterate the content, especially as some of the communication with users is quite sensitive
- ensure the right resources are considered in the tender as the team moves through the tender process
- increase the number of user researchers so that the team has capacity to do a lot more user research and spend time with users (it would be a big benefit to have at least one user researcher as a civil servant to ensure continuity throughout the service stages)
7. Use agile ways of working
Decision:
The service did not meet point 7 of the Standard.
What the team has done well
The panel was impressed that:
- the service team is using collaborative tools such as Jira and Confluence
- the service team seems to have taken lessons on agile ways of working from the previously failed programme
- the service team seems to have an understanding of the theory of agile development
- the service team is engaging early with other working groups and digital identity groups to understand where there might be opportunities to collaborate
What the team needs to explore
Before their next assessment, the team needs to:
- demonstrate how the team is using the test-and-learn, fail-fast ethos of agile; it was difficult for the panel to see how this was being applied in practice, even though the theory seems to be understood
- demonstrate how the team is using user research to iterate and guide development and prioritisation
- demonstrate where there is flexibility and room for learning and adapting to what user research reveals, as the roadmap seems quite fixed
8. Iterate and improve frequently
Decision:
The service did not meet point 8 of the Standard.
What the team has done well
The panel was impressed that:
- the service team seems to understand the design process, which appears to have been practised on the Accreditation Check (AC) project
What the team needs to explore
Before their next assessment, the team needs to:
- demonstrate iterations: the panel struggled to see any iterations or improvements. Little or no user testing appears to have been done, and with so little user research through alpha it was very difficult to see where research-led iterations and improvements had been made
- demonstrate where the team's assumptions were challenged by user research and what changes were made as a result
9. Create a secure service which protects users’ privacy
Decision:
The service met point 9 of the Standard.
What the team has done well
The panel was impressed that:
- the team presented a sound architecture for beta and is already engaged in making it secure
- the team is working with GDS cybersecurity, NCSC and GSG
- the team has already developed a threat model which will drive some of the most important design choices during the beta phase
What the team needs to explore
Before their next assessment, the team needs to:
- develop a process to follow when cyberattacks happen, specifically data leaks
- present at the next assessment how the system will handle GDPR compliance (for example, subject access requests)
- create a decision log to accurately record architecture design choices and the reasons they were made, given that many of those choices seem to come from previous work on Future Vetting Service (FVS) and collaborations with NCSC, GSG and other partners; this will help in case of further team changes, project resets and so on (see the sketch after this list)
- create an “anti-persona” embodying malicious users and their behaviour: this is a good way to bring security considerations into the whole of the UCD process, not just the tech area, and it is often very useful in services where security is paramount, such as this one
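A decision log need not involve heavy tooling. As a minimal sketch (the structure below is illustrative, not a mandated format; ADR-style markdown files, one per decision, work equally well), each entry could capture something like this:

```typescript
// decision-record.ts - an illustrative sketch of what each decision log
// entry could capture. Field names are hypothetical, not a prescribed schema.
interface DecisionRecord {
  id: string;                       // e.g. "DR-0007"
  date: string;                     // ISO date the decision was made
  title: string;                    // e.g. "Host the user-facing service as an SPA"
  context: string;                  // constraints inherited from FVS, advice from NCSC/GSG
  decision: string;                 // what was chosen
  alternativesConsidered: string[]; // options ruled out, and why
  consequences: string;             // known trade-offs, e.g. accessibility risk
  owners: string[];                 // who can explain the decision after team changes
}
```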
10. Define what success looks like and publish performance data
Decision:
The service did not meet point 10 of the Standard.
What the team has done well
The panel was impressed that:
- the team has started to consider the 4 core KPIs (cost per transaction, user satisfaction, completion rate and digital take-up)
- the team has considered an additional metric outside of the mandatory ones
What the team needs to explore
Before their next assessment, the team needs to:
- consider getting feedback on prototypes to understand user satisfaction with user testing or surveys
- understand where they will get the data from and show how they will use it to improve the service (see the sketch after this list)
- consider how performance of the new service will be assessed against the current service
- consider if there are other measures needed as identified through user needs
- consider how they will measure performance during private beta, given that measurement tools are still undecided
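To make the measurement question concrete, here is a minimal sketch of how the 4 mandatory KPIs could be derived from per-period counts. The `PeriodCounts` shape and field names are illustrative assumptions, since the team's data sources and tools are still undecided:

```typescript
// kpis.ts - a minimal sketch of the 4 mandatory KPIs. The PeriodCounts shape
// is hypothetical; the team still needs to decide where these numbers will
// actually come from.
interface PeriodCounts {
  started: number;              // clearance applications begun in the period
  completed: number;            // applications completed end to end
  completedDigitally: number;   // completions through the digital channel
  totalCost: number;            // cost attributed to the service for the period
  satisfactionScores: number[]; // e.g. 1-5 survey responses
}

function computeKpis(c: PeriodCounts) {
  return {
    completionRate: c.completed / c.started,
    digitalTakeUp: c.completedDigitally / c.completed,
    costPerTransaction: c.totalCost / c.completed,
    userSatisfaction:
      c.satisfactionScores.reduce((sum, s) => sum + s, 0) /
      c.satisfactionScores.length,
  };
}
```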
11. Choose the right tools and technology
Decision:
The service did not meet point 11 of the Standard.
What the team has done well
The panel was impressed that:
- the service will be hosted on the public cloud
- the team is planning to use modern languages, frameworks and CI/CD tools
What the team needs to explore
Before their next assessment, the team needs to:
- re-evaluate each technology choice. As presented, the current tech stack wasn't developed by the team themselves but “inherited from Future Vetting Service (FVS)”. Most of those choices make sense (with one exception, see below), yet the team must be able to argue that each choice, whether theirs or not, is the best for the service. Again, a decision log will help document this
- revisit the decision to build the user-facing service as a single-page application (SPA). It might be the right choice here, but most often it is not, as service teams almost always underestimate the effort needed to make SPAs accessible. The argument presented was that an SPA will reduce the attack surface since it will be hosted on a serverless architecture. While this is correct, it is likely that the resulting service won't be accessible and will therefore discriminate against some users; many other government teams have tried this before and struggled. In any case, the team should engage with accessibility specialists as early as possible (see the sketch after this list)
- note that even if an SPA reduces the attack surface, it can increase the risk of supply-chain attacks, since extra third-party libraries are needed. This is aptly illustrated by the team's choice of the govuk-react package, which is not owned by GOV.UK and whose authors don't engage with the community
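To illustrate the progressively enhanced alternative, the sketch below shows a server-rendered Next.js page that uses GOV.UK Frontend CSS classes directly instead of govuk-react and submits through a plain HTML form, so the journey keeps working if client-side JavaScript never loads. The page, route and field names are hypothetical, not taken from the team's codebase:

```tsx
// pages/apply.tsx - a hypothetical sketch of a progressively enhanced page.
// It assumes the govuk-frontend stylesheet is loaded globally; no govuk-react
// and no client-side JavaScript are required for the form to work.
export default function ApplyPage() {
  return (
    <main className="govuk-main-wrapper">
      {/* A plain HTML form post: the journey still works if JS never loads */}
      <form method="post" action="/apply">
        <h1 className="govuk-heading-l">Apply for security clearance</h1>
        <div className="govuk-form-group">
          <label className="govuk-label" htmlFor="full-name">
            Full name
          </label>
          <input
            className="govuk-input"
            id="full-name"
            name="fullName"
            type="text"
            autoComplete="name"
          />
        </div>
        <button type="submit" className="govuk-button" data-module="govuk-button">
          Continue
        </button>
      </form>
    </main>
  );
}
```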
12. Make new source code open
Decision:
The service met point 12 of the Standard.
What the team has done well
The panel was impressed that:
- UKSV is already on GitHub, where the project's source code is hosted (currently privately)
- the team is planning to make their code open source
What the team needs to explore
Before their next assessment, the team needs to:
- have a proper justification for any code that isn't going to be made public. Keeping code closed should not be used as a way to prevent sensitive information from being leaked; the code should be free of any sensitive data, which is why it should be safe to publish (see the official guidance)
- write code in the open from the start, rather than only making it public at the end of the development process
- make sure the public code comes with an appropriate licence and copyright notice (see the example after this list)
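As a small illustration (the licence decision itself is the team's to confirm, though MIT with Crown copyright is a common choice for government code), a published file header might look like this:

```typescript
// Illustrative file header only - confirm the actual licence with the
// department before publishing. MIT with Crown copyright is a common choice
// for government code; a LICENCE file must also sit in the repository root.
//
// Copyright (c) 2022 Crown Copyright (UK Security Vetting)
// Licensed under the MIT Licence. See LICENCE in the repository root.
```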
13. Use and contribute to open standards, common components and patterns
Decision:
The service did not meet point 13 of the Standard.
What the team has done well
The panel was impressed that:
- the beta service will be based on the GOV.UK Design System and will use GOV.UK Notify
- the team has engaged with the GOV.UK Sign-in team, with a view to being one of their first users
What the team needs to explore
Before their next assessment, the team needs to:
- engage with the cross-government front-end community regarding the use of ReactJS, Next.js, the GOV.UK Design System and more. The team will benefit from the experience of many other teams that have evaluated similar tools before them, and will also have an opportunity to make the whole cross-government community benefit from its own findings
- engage with web accessibility specialists as early as possible in order to test “risky” assumptions (such as the choice to build an SPA), even if the service hasn't been built yet. Sharing those findings with the front-end community, or with the public at large through blogging for instance, would be very useful
- make sure any common components are legitimate and actively supported (see the point about govuk-react above)
14. Operate a reliable service
Decision:
The service did not meet point 14 of the Standard.
What the team has done well
The panel was impressed that:
- the beta service will be hosted on a popular public cloud and will benefit from the supplier’s extensive experience in platform reliability
- a team of 4 SREs is available to make sure the system works reliably
What the team needs to explore
Before their next assessment, the team needs to:
- demonstrate how the service will work 24 hours a day, 365 days a year; specifically, present a CI/CD stack that makes it possible to deploy new software (or revert to previous versions) with minimal downtime
- present a monitoring solution that will help identify problems as early as possible. Platform-level monitoring, as offered by the cloud provider, is usually not sufficient for identifying application-level problems or running smoke tests externally (as a service like Pingdom would); see the sketch after this list
- have an offline process: a fallback is only part of the solution. The team needs a plan for determining why, when and how the service goes offline, and who is involved in taking the service offline and bringing it back online
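As an illustration of application-level checking, here is a minimal external smoke test. The /health endpoint, service URL and thresholds are assumptions rather than details of the team's platform; run on a schedule from outside the hosting platform, a non-zero exit code can feed whatever alerting the team chooses:

```typescript
// smoke-test.ts - a minimal sketch of an external application-level check.
// Requires Node 18+ for the global fetch API. The /health endpoint and the
// service URL are hypothetical.
const SERVICE_URL = process.env.SERVICE_URL ?? "https://example.service.gov.uk";

async function smokeTest(): Promise<void> {
  const started = Date.now();
  const response = await fetch(`${SERVICE_URL}/health`);
  const elapsedMs = Date.now() - started;

  if (!response.ok) {
    throw new Error(`Health check failed: HTTP ${response.status}`);
  }
  if (elapsedMs > 2_000) {
    throw new Error(`Health check too slow: ${elapsedMs}ms`);
  }
  console.log(`Smoke test passed in ${elapsedMs}ms`);
}

smokeTest().catch((error) => {
  console.error(error);
  process.exit(1); // non-zero exit lets the scheduler raise an alert
});
```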