Government Skills Campus alpha assessment report
Service Standard assessment report
Government Skills Campus
From: Cabinet Office
Assessment date: 21/03/2023
Stage: Alpha
Result: Met
Service provider: Government Business Services
Service description
To give civil servants and the Civil Service the tools they need to identify, develop and use their skills and learning effectively. Civil servants can record their skills, experience and learning in one place, receive personalised recommendations and improved career opportunities. The Civil Service, including departments and functions, can understand the workforce for strategic planning, development initiatives and workforce deployment.
Service users
Through their research, the team has identified the following main users of the service:
- Employee: “I want to develop my skills and career in the Civil Service by making the most of the learning and development opportunities available to me.” Typical role: all civil servants
- Assessor: “I encourage people to make the most out of their talent and help them achieve their full potential.” Typical role: team leader, line manager, head of profession
- Data insights producer: “I need access to data on skills and learning to help identify gaps and plan for the future.” Typical role: data analyst, learning manager, workforce planner
- Resource seeker: “I need to match people with the right skills to the right roles during periods of crisis.” Typical role: emergency resource manager
1. Understand users and their needs
Decision
The service met point 1 of the Standard.
What the team has done well
The panel was impressed that:
- a wide range of users was selected to take part, ensuring the team captured a broad spread of views, experiences and the resulting user needs.
- a wide range of tools was used, including moderated and unmoderated methods, which should reduce the risk of bias in the identification of needs.
- the team used findings from discovery to identify further users and understand their needs.
- the team has identified different needs and mapped them into different quadrants on a continuum based on their experience and career path.
What the team needs to explore
Before the next assessment, the team needs to:
- explore the alternative paths users may experience, shifting the focus away from the happy path alone.
- consider researching the unhappy paths, particularly potential areas of dispute between employees and assessors. This should in particular cover how employees respond to an unfavourable assessment, and how assessors will handle unrealistic or inaccurate submissions from employees.
- work to build confidence that as many user needs as possible have been identified, with particular emphasis on the unhappy paths.
- conduct substantial research to identify specific accessibility needs, so that these needs are not overlooked.
- alongside accessibility, consider possible access needs.
2. Solve a whole problem for users
Decision
The service met point 2 of the Standard.
What the team has done well
The panel was impressed that:
- the team has shown some evidence of how user research fed into the design of the service and contributed to the decision-making process.
- synergies between the service and other current systems were identified and considered, allowing the team to learn from existing platforms and embed existing solutions and good practice into decision-making.
- a wide range of methods was used to test the system, including both moderated and unmoderated approaches.
What the team needs to explore
Before the next assessment, the team needs to:
- focus on researching and delivering a service that meets a clearly understood user-need, beyond the aspiration of holding better data on staff skills. Target real-world activities that provide benefit to your users, that will also deliver value to the organisation as a whole. This may include providing more of a ‘career development’ service than a ‘record skills’ service to better match users’ needs and behaviours.
- ensure that the identified ‘pain point’ that ‘employees do not have a central place to record skills’ is a genuine need that the majority of staff have.
- be clear and explicit about the research that has been carried out, including session and participant characteristics, and the rationale for that research, in line with the team's aims and objectives and the problem it is working to solve.
- ensure all research is reflected in the decision-making and consequent improvement and iteration of the system for confidence that the team is solving the whole problem for the users.
- be mindful of any assumptions that naturally emerge from research findings which could cloud the distinction between user needs and user wants, particularly in terms of how the system might be accepted or used.
- test the whole user journey, from launching a browser to completing a submission - how will users find the Campus if they don’t know it’s there? How will the service feature in existing departmental intranets or ERPs? What steps are available to support departmental HR teams to encourage or direct their staff to the Campus?
3. Provide a joined-up experience across all channels
Decision
The service met point 3 of the Standard.
What the team has done well
The panel was impressed that:
- efforts were made to iterate the service, improving it based on research.
- consideration has been given to how to integrate the service with current services for a joined-up experience.
What the team needs to explore
Before the next assessment, the team needs to:
- provide a clear and explicit mapping of how research has informed design and decision-making.
- consider how the service will be supported in public beta: what support will staff need, how will it be provided, and how can those needs be pre-empted by good design?
- ensure that where staff may be unable to use the service, there is a viable and effective alternative for them, to avoid creating a two-tiered skills record, with resultant limitation on opportunity for some user groups.
4. Make the service simple to use
Decision
The service met point 4 of the Standard.
What the team has done well
The panel was impressed that:
- the team has created a prototype that seems relatively easy to use and looks familiar, with logical information and guidance provided throughout.
- a considerable amount of research was conducted, and the service was iterated and improved based on feedback from users.
What the team needs to explore
Before the next assessment, the team needs to:
- consider accessibility more thoroughly when making decisions during iteration and consequent changes in the prototype.
- minimise the risk of making decisions based on assumptions that have yet to be tested, focusing instead on decisions substantiated by research.
- step back from adding imagery to the service until it can be tested at sufficient volumes to provide qualitative insight into the value. Imagery can add significant complexity in creation, maintenance and development. It should not be added unless there is a strong need identified.
- ensure that the service is designed for behaviour rather than opinion - whilst it can be valuable to gather insight into likes and dislikes, it’s vital to ensure that major decisions are informed by the way users behave in response to the service, with a focus on what they need rather than what they want.
5. Make sure everyone can use the service
Decision
The service met point 5 of the Standard.
What the team has done well
The panel was impressed that:
- many users were included in the research, showing consideration for different user characteristics.
- the team has invested significant time and effort into the technical considerations of accessibility, including planning for audit of their code and developing to a high standard.
What the team needs to explore
Before the next assessment, the team needs to:
- think beyond technical accessibility considerations and more thoroughly investigate barriers and opportunities for colleagues with access needs. Done well, this service could provide a significant asset to all staff, but there remains a risk that, if insufficient insight is gathered into the needs of disabled staff, the service will further emphasise existing inequality.
- consider the different options for accessing the service (for example desktop vs mobile and tablet) even if the large proportion of users follow a happy path, to minimise the risk of excluding significant groups.
- conduct further accessibility testing and ensure that decisions reflect findings to minimise the use of assumptions.
6. Have a multidisciplinary team
Decision
The service met point 6 of the Standard.
What the team has done well
The panel was impressed that:
- the team is multidisciplinary, showing multiple skills, professions and levels of experience.
- the team was passionate about delivering a high-quality service for their colleagues, and demonstrated an extremely high level of knowledge of the domain and their specific roles throughout the assessment.
- the team benefitted from an experienced, empowered and knowledgeable Service Owner.
What the team needs to explore
Before the next assessment, the team needs to:
- ensure the right blend of skills is retained for the move into Beta, and supplemented with appropriate expertise in configuration of the chosen technology.
- consider the long-term make-up of the team working with (most likely) a software-as-a-service technical foundation.
- ensure team members remain empowered to shape the development of the service, and are able to keep designing for their users, not solely to fit within platform constraints.
7. Use agile ways of working
Decision
The service met point 7 of the Standard.
What the team has done well
The panel was impressed that:
- the team has shown evidence of the adoption of an agile way of working represented by quick service iteration and review, regular dissemination of findings, and fast and regular feedback.
What the team needs to explore
Before the next assessment, the team needs to:
- ensure that, as they move into Beta, they’re able to continue to demonstrate the user-centred design practice and agile working that have characterised Alpha.
8. Iterate and improve frequently
Decision
The service met point 8 of the Standard.
What the team has done well
The panel was impressed that:
- a considerable number of research rounds were conducted, with consequent iterations to improve the service based on user feedback.
- the team has made good use of Figma to allow rapid prototyping of their service and enable frequent design iterations, whilst still using GOV.UK Design System components and patterns.
What the team needs to explore
Before the next assessment, the team needs to:
- continue iterating with decisions based on research and accessibility testing.
- explore and demonstrate the ability of any proposed technology solution to enable iterative delivery to continue, avoiding the risk of delivering large pieces of functionality simply because they are available “out of the box”.
- review the technical options available during their procurement process, and ensure that scope for continuous improvement of their service remains possible with their chosen platform.
9. Create a secure service which protects users’ privacy
Decision
The service met point 9 of the Standard.
What the team has done well
The panel was impressed that:
- the team has identified a clear set of security and data protection stakeholders within Cabinet Office and started engagement with them.
- a number of different data privacy risks have been identified and are being considered.
What the team needs to explore
Before the next assessment, the team needs to:
- demonstrate that the service gives users a good understanding of how their information will be used and test whether they are comfortable with each of the different intended purposes.
- understand any data protection or privacy issues arising from different technology options.
- provide a more detailed design for the access controls that will be required given the very large amount of personal data the system will hold, especially where the running and operating of the service is concerned (see the illustrative sketch after this list).
- consider the risks associated with handling or exposing data that may reveal or allow the inference of a protected characteristic.
- make a plan for how frequently the service should be vulnerability/penetration tested and how findings will be acted upon.
- consider the unhappy path in the event of dispute or contention between an individual and the validator of their skills. What route do individuals have to understand feedback recorded about them or the assessors that provided it?
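As an illustration of the level of detail a more developed access control design might include, the sketch below expresses record-level access rules in code. The roles, record shape and `canReadRecord` function are hypothetical, chosen for the example rather than taken from the team's design.

```typescript
// Hypothetical roles and record shape for illustration only; not the team's actual design.
type Role = "employee" | "assessor" | "dataAnalyst" | "serviceOperator";

interface User {
  id: string;
  roles: Role[];
}

interface SkillsRecord {
  ownerId: string;        // the civil servant the record describes
  assessorIds: string[];  // assessors named on this record
}

// Decide whether a user may read a full, identifiable skills record.
// Analysts would see aggregated or pseudonymised data through a separate route,
// so they are deliberately not granted direct record access here.
function canReadRecord(user: User, record: SkillsRecord): boolean {
  if (user.id === record.ownerId) return true;              // individuals can always see their own data
  if (record.assessorIds.includes(user.id)) return true;    // only assessors named on the record
  if (user.roles.includes("serviceOperator")) return true;  // operational access, subject to audit logging
  return false;
}
```

The point of the sketch is that analyst and operator access are kept separate from the routes available to the individual and their assessors, which is the kind of distinction the more detailed design should make explicit.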
10. Define what success looks like and publish performance data
Decision
The service met point 10 of the Standard.
What the team has done well
The panel was impressed that:
- the team provided an extremely well-considered approach to measurement and analytics for the service.
- additional metrics above the mandatory KPIs were extremely well considered, and felt appropriate and relevant to the service.
What the team needs to explore
Before the next assessment, the team needs to:
- demonstrate that the valuable analytical design work that’s been completed during Alpha can be delivered by the Beta service.
- ensure analytics are actionable, and that there is a suitable mechanism for demonstrating ongoing improvement of this service.
- consider what mechanisms should be put in place to demonstrate the wider impact of improved understanding on skills across the civil service - what scope is there to articulate the value of changing behaviours at a team, directorate or departmental level, that lead to measurable change in skill levels?
11. Choose the right tools and technology
Decision
The service met point 11 of the Standard.
What the team has done well
The panel was impressed that:
- the team has identified an appropriate set of capabilities that would be required to realise their proposed “to-be” architecture for the service.
- there has been a good analysis of the existing landscape of tools across government, to understand the various integrations that would be required.
- the team has a good understanding of how this work fits alongside other areas of work and how it could integrate with these.
- an appropriate amount of thought has been given to how the proposed solution architecture will meet the Technology Code of Practice.
What the team needs to explore
Before the next assessment, the team needs to:
- review whether there is a smaller subset of the proposed solution that would form a meaningful MVP, allowing assumptions to be tested sooner. For example, this may involve testing only the functionality needed to allow employees to record their skills with their managers/assessors with a suitable taxonomy. The MVP needs to test whether users will willingly submit usable data, without which additional functionality will be of limited value.
- demonstrate how their technology choice would allow them to implement their current designs, to ensure the findings of the existing user research can be incorporated into the service. This should ensure a comparable quality of user-experience to other parts of GOV.UK, beyond visual similarity - performance and speed should be comparable with public-facing services.
- identify how they will test the suitability of different tools they could use without making long-term commitments early on that compromise their ability to respond to changes in requirements or new user needs. This should also test the hypothesis that the level of “configuration” needed by these commercial tools is actually less than that of a “build” option, and requires less technical expertise.
- identify whether some of the capabilities or assessment criteria can be met by a combination of tools, or a mix of “buy” and “build” options instead of being constrained to a single supplier for a single tool or platform. This may include exploring separating the UI/presentation from different stores of data, to allow different parts of the service to be iterated independently more easily.
- explain how they will identify which of their market assessment criteria are the most important when comparing differing products on the market, if they do pursue a “buy” option.
- ensure their technology choices have appropriate ways to avoid getting locked into specific tools, for example by providing a suitable data export mechanism (see the sketch after this list).
- ensure their technology choices provide suitable environments for development and testing that allow user research of new features with realistic data, in parallel with the live running service.
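To make the data export recommendation above more concrete, here is a minimal, hedged sketch of a vendor-neutral export endpoint. The route, record shape and `loadAllRecords` stub are assumptions for illustration only and do not describe the proposed architecture or any chosen platform.

```typescript
// Minimal sketch of a vendor-neutral export endpoint; route and data shape are illustrative only.
import express from "express";

interface SkillEntry {
  skill: string;
  level: string;
  validatedBy?: string;
}

interface SkillsRecord {
  employeeId: string;
  skills: SkillEntry[];
}

// Stub standing in for whatever store or platform API the chosen technology provides.
function loadAllRecords(): SkillsRecord[] {
  return [];
}

const app = express();

// Exporting records in an open, documented format (here plain JSON) keeps the data
// portable if the underlying platform is ever replaced, reducing the risk of lock-in.
app.get("/export/skills-records", (_req, res) => {
  res.json({ exportedAt: new Date().toISOString(), records: loadAllRecords() });
});

app.listen(3000);
```

An open, documented export format of this kind is one way to keep a “buy” decision reversible.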
12. Make new source code open
Decision
The service met point 12 of the Standard.
What the team has done well
The panel was impressed that:
- the team is aware that choosing to customise “off the shelf” software will limit their ability to release source code under an open source licence.
What the team needs to explore
Before the next assessment, the team needs to:
- identify what customisation of “off the shelf” software they would need to undertake themselves (rather than it being done by the vendor), and how it may be possible to open source this configuration if it could be reused elsewhere.
- demonstrate what parts of their service could be open sourced, for example any components built to integrate with ERP systems, or any API development.
- ensure that newly developed patterns, particularly those that aim to meet the specific needs of internal civil servant users, are shared openly with the design community across government and, in particular, the Design System team in GDS.
- talk about the work openly, for example through blog posts, to articulate the value of user-centred, agile working for staff-facing tools.
13. Use and contribute to open standards, common components and patterns
Decision
The service met point 13 of the Standard.
What the team has done well
The panel was impressed that:
- the team recognises how important it is to get the skills taxonomy right for their service to provide value to users.
- the prototyping of the service started out using common GOV.UK Design System components and patterns.
What the team needs to explore
Before the next assessment, the team needs to:
- be able to demonstrate how they have contributed to the development of a suitable skills taxonomy, or show how they evaluated whether the one they have chosen to use is applicable across the Civil Service.
- identify how common components such as the GOV.UK Design System could be incorporated into their service to provide a user experience consistent with other services.
- understand how their service could use APIs to provide information to other systems, as well as to consume data from elsewhere (see the example after this list).
- demonstrate how they have collaborated with the GOV.UK Design System community to share their research around custom components they have had to develop.
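As a hedged example of how the service could consume data from elsewhere over an API, the sketch below fetches a shared skills taxonomy. The endpoint path, base URL and response shape are assumptions made for the example and do not describe any existing government API.

```typescript
// Illustrative consumer of a shared skills taxonomy; the URL and response shape are assumed.
interface TaxonomySkill {
  id: string;
  name: string;
  profession: string;
}

async function fetchSkillsTaxonomy(baseUrl: string): Promise<TaxonomySkill[]> {
  const response = await fetch(`${baseUrl}/v1/skills`); // hypothetical endpoint on a taxonomy service
  if (!response.ok) {
    throw new Error(`Taxonomy service returned ${response.status}`);
  }
  return (await response.json()) as TaxonomySkill[];
}

// Example usage: populate the service's own skill picker from the shared taxonomy.
fetchSkillsTaxonomy("https://taxonomy.example")
  .then((skills) => console.log(`Loaded ${skills.length} skills`))
  .catch((err) => console.error("Falling back to a cached taxonomy", err));
```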
14. Operate a reliable service
Decision
The service met point 14 of the Standard.
What the team has done well
The panel was impressed that:
- the team has identified a number of requirements around creating a reliable service, including auditability, reliability and scalability.
- the need for appropriate testing and test environments has been specifically highlighted, so that changes can be safely developed without impacting the live service. This includes an expectation that the team will review test strategies of any supplier-delivered components.
- the team is taking account of their different types of users when considering availability, for example those who may only be working outside of typical working hours.
What the team needs to explore
Before the next assessment, the team needs to:
- identify how their technology choice will work with their intended performance and monitoring tools to allow them to observe user behaviour and support users.
- understand how they will be able to test the integrations with other systems, especially across other departments who may have their own test strategies that may not be well aligned. This may differ depending on the systems being integrated with, for example different ERP clusters.
- consider how they will ensure their service is available and they are able to deploy changes regularly without significant downtime, even when dependent services may change at a different pace or may be unexpectedly unavailable (see the sketch after this list).
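One common way to address the availability point above is to time-box calls to dependent systems and degrade gracefully rather than fail outright. The sketch below illustrates the idea; the dependency URL and timeout value are placeholders, not recommendations for the service.

```typescript
// Minimal sketch: probe a dependent system but never let a slow or failed dependency
// block the service itself. The URL and timeout are placeholder values.
async function checkDependency(url: string, timeoutMs = 2000): Promise<"ok" | "degraded"> {
  const timeout = new Promise<"degraded">((resolve) =>
    setTimeout(() => resolve("degraded"), timeoutMs)
  );

  const probe = fetch(url)
    .then((res) => (res.ok ? ("ok" as const) : ("degraded" as const)))
    .catch(() => "degraded" as const);

  // Whichever settles first wins: a healthy response, or a "degraded" verdict.
  return Promise.race([probe, timeout]);
}

// Example usage in a health check: report partial availability instead of total failure.
checkDependency("https://erp-integration.example/health").then((status) =>
  console.log(`Dependency status: ${status}`)
);
```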
Next steps
This service can now move into a private beta phase, subject to implementing the recommendations outlined in the report and getting approval from the CDDO spend control team. The service must meet the standard at beta assessment before launching public beta.
To get the service ready to launch on GOV.UK the team needs to:
- get a GOV.UK service domain name
- work with the GOV.UK content team on any changes required to GOV.UK content