Public Appointments beta assessment

Service Standard assessment report

Public Appointments

From: Central Digital & Data Office (CDDO)
Assessment date: 10/05/2022
Stage: Beta
Result: Met
Service provider: Cabinet Office

Previous assessment reports

  • Alpha assessment review

Service description

Thousands of public appointments are made each year by ministers across government. The recruitment campaigns for those appointments are run by government departments. The Public Appointments System (PAS) team (previously the Public Appointments Policy Team, PAPT) is responsible for oversight of the public appointments system and the statutory basis - the Public Appointments Order in Council 2019 and the Governance Code for Public Appointments - under which appointments are made. The team is also responsible for delivery and management of the Public Appointments website, used by all government departments to advertise their public appointment vacancies.

The new service will replace the current Public Appointments website. It will provide additional functionality to improve the applicant user experience and provide a cross-departmental, end-to-end appointment and campaign management service.

The service consists of a public-facing portal and a backend workflow campaign management system, linked by an Applicant Tracking System (ATS). The public-facing portal provides information on vacancies and a secure online application portal. Applicants will need to register to apply for roles and will be able to securely enter their personal and diversity data, manage their profile and track their applications. The ATS captures the applicant data so that applicants can manage their profile and track their own applications. Departments can access the backend to manage the recruitment process as set out in the Governance Code and to use the talent pool. Departments can also access bespoke dashboard reporting to feed into departmental and ministerial reporting requirements and national statistics, including on diversity.

The website also directly supports a number of other public commitments, including improving the public appointments recruitment process, raising awareness and tapping into talent. The new website will enable public appointments to be explained and promoted more effectively. It will include supporting information for applicants and for people wanting to know more about public appointments. The application process will be smoother and more accessible, with applicants able to reuse previously entered information, change it when they choose to, and track their applications. Departments will be able to use the website to manage live recruitment campaigns more effectively, to forward plan recruitment for appointments which are ending, and to proactively manage talent through the website’s talent pool. It will also enable PAS to access the management information it needs for its oversight role and to discharge reporting requirements, avoiding resource-intensive manual collection of data from individual departments.

Service users

  • Citizen users who wish to apply for a public appointment
  • Existing appointees / incumbents who already hold one or more public appointments
  • Government department users: public appointment teams including campaign managers and campaign administrators
  • PAS team users - who own the service and manage and oversee its correct and appropriate use, as well as managing, auditing and using the data for departmental and ministerial reporting purposes

1. Understand users and their needs

Decision

The service conditionally met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • some of the points from the alpha assessment report have been well addressed, such as testing important content in the external user-facing part of the service and maintaining a close relationship between research and content design to implement improvements
  • specific impacts of user research were clearly highlighted, which was useful to see
  • a range of user panels had been created to support research and engagement activity

What the team needs to explore

Before their next assessment, the team needs to:

  • acknowledge that less research and testing has been conducted with internal users, and that this is a gap. It should be addressed through further research to support evidence-based iteration of the backend of the service
  • challenge the belief that the needs and behaviours of international users will not differ significantly from those of users in the UK. There’s an opportunity to test this assumption, and the team should engage with some users who live abroad to do this
  • ensure that any user panels reflect the needs of a broad range of users, and that any gaps are proactively addressed by adding to the panel or developing research insights via another route

2. Solve a whole problem for users

Decision

The service conditionally met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has worked towards minimising the number of times users have to provide the same information to government, for example by allowing users to create an account so they can reuse and manage their data. This is due to be part of the phase 2 release
  • the team worked closely with policy colleagues to design within the policy and regulatory constraints and were able to bring in some of the workflows and processes of internal users into the online system
  • the supplier’s backend system is connected to the front-end service developed by the service team, updating information automatically as end users progress through an application and giving internal users access to reliable information via the tracking system
  • the team have used the GOV.UK Campaigns platform to increase awareness of the service, reusing existing templates and standards

What the team needs to explore

Before their next assessment, the team needs to:

  • work with the GDS content team to join up GOV.UK content with the service
  • explore if they can reuse the GOV.UK Sign-in and account functionality that the GDS Digital identity programme is working on - not only to reuse best practice design-wise, but also as a way for users to reuse and manage their information more widely across government services. GOV.UK Sign-in plans to connect with many services across government
  • explore if the information and data could be reused with user consent to apply for other roles and jobs within government. For example, for executive or non-executive, voluntary or paid roles
  • review the end-to-end journey of a department team or internal user using this new service, including:
    • exploring alternatives to the campaign site to raise awareness with internal users, for example a comms plan to drive adoption across departments
    • designing an onboarding process
    • using the system day to day
    • designing a support model for internal users
    • leaving and returning to the service

3. Provide a joined-up experience across all channels

Decision

The service conditionally met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is working with an Outreach programme team that currently supports the existing offline application process. They plan to use this existing route for users who can’t use or complete the online process. The team plans to transfer that information from the offline route into the ATS, so that the rest of the application process can be handled online

What the team needs to explore

Before their next assessment, the team needs to:

  • test the offline and online routes throughout the user journey to ensure there isn’t a disparity between the journeys for users
  • define how the online service connects to the offline journey and where in the journey users are redirected to offline channels. It would be good to see this process designed and tested in collaboration with the Outreach team
  • define how the team will work with the Outreach team to pass information back to the online service
  • define how the Outreach team collects information from users (in a secure and safe way) and inputs it back into the online system, reducing errors from manual input
  • define how users who cannot continue online will carry on with their application once their information has been processed
  • design ways to keep the end user informed throughout the process once it moves offline
  • explore other routes and channels in which the user can apply for a public appointment and how it connects into the online system and database

4. Make the service simple to use

Decision

The service conditionally met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the team used GOV.UK patterns and components to ensure the end user facing service is consistent with GOV.UK and is following best practices and standards

What the team needs to explore

Before their next assessment, the team needs to:

  • further explore and design alternative routes to support users who can’t use or access the online service, aside from the Outreach team
  • create a support model for their service - both end user facing and internal facing. Design the processes and mechanisms to provide support for users who require it, making sure the design works for the different types of users and identifies which channels are best suited to each
  • test with participants on different devices to understand the usability of the service on mobile
  • provide evidence of design ideas being tested and iterated based on user research. We didn’t see the “sifting and scoring” interface and journey for panel members in the demo or prototype
  • design and test the journeys of a user returning to their account. What are the entry points? How do they access it? How will it work with the GOV.UK header, where users expect to see a log-in, and how does this fit with GOV.UK Design System patterns? The team has explained its plan to use comms emails as one route back to users’ accounts; this would need testing
  • review some of the design and content design elements in the end user facing journey (search and apply)
    • on the main page of “search and apply”, make it clear what public appointments are, what is required from the roles and what to expect from the process. The team mentioned plans to link to the campaign site, which explains this better. Review the content design and test journeys from GOV.UK
    • within the job description pages, the information is displayed using accordions. This means users will not be able to search for keywords on the page using the browser’s find function (Ctrl+F or Cmd+F), which could be a pain point for users who know exactly what they are looking for in a job description. It also assumes that all the information is of equal importance when it is not (for example, when would you not want to see the essential criteria?). The simplest design would be a single page with all the information under clear section headings. Recommend applying this approach, perhaps with anchor links at the top for each heading/section to enable easy access to each section and page search. The team can use the content design patterns from guidance pages as a way to come up with other ways to structure the information. Suggest speaking with the GOV.UK content teams in GDS to get guidance and support on this
    • after the “start now” page, the sections need headings, as the details components sit above the tables. This will give the information a clearer hierarchy
    • on the application completed page, suggest reusing the vertical timeline design from the job description pages rather than a horizontal one, so the design is consistent and reads better on mobile
  • test and iterate the UI, navigation and content design in the backend internal facing systems (ATS for panel members and department teams):
    • the backend is currently very reliant on JavaScript. Work with the supplier to see what adjustments can be made
    • conduct user research and usability testing with a range of types of users across the departments and stakeholders who will be using the system, especially users with access needs and those who use assistive technology
    • use that user evidence to work with the supplier team to adapt the designs of the system - configure the platform to be easy to use and map to what users actually need

5. Make sure everyone can use the service

Decision

The service did not meet point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the end user facing service (apply for a public appointment) will allow users to save some of their data with consent, avoiding duplication of effort when reapplying for other roles or resuming their application process at any time
  • in the end user facing service, the team has designed ways in which users can opt out of providing diversity information if they prefer not to. This removes a potential barrier for users who would otherwise be put off

What the team needs to explore

Before their next assessment, the team needs to:

  • conduct user research and testing of the backend systems (ATS for panel members and department teams) with people with access needs and users of assistive technology
  • recognise that the approach taken so far - asking departments to test the systems themselves, run their tools on their laptops, check that information was coming through and feed back to the team - does not replace conducting their own user research and usability testing with departments. That method relies on departments having the internal capability and assistive technology to test the systems, and on them having access to a wide range of users, which is why service teams need to do the user testing themselves
  • make sure the service is accessible during private beta, have an accessibility audit of both the end user facing and internal facing services carried out by an external agency, and publish an accessibility statement before opening to public beta
  • develop a plan and backlog for implementing the recommendations from the accessibility audit across the service (both end user facing and internal user facing systems)
  • provide evidence of how and when the ATS supplier will address any recommendations from the accessibility audit in their system
  • design and test offline processes and user support, including assisted digital channels
  • continue testing the content of the campaign site with users and ensure it meets GOV.UK content standards and clear user needs. The information hierarchy needs to be revisited: for example, there should be clear content about what public appointments are at the top of the page, rather than the various case studies

6. Have a multidisciplinary team

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • there is a full multi-disciplinary team in place which includes the large majority of the DDaT roles expected for beta
  • the team has shown how they’ve created a culture that allows policy and DDaT to work together in a complementary manner. They have embedded policy within the service team, which has helped DDaT to understand the policy aims and work closely with departmental users of the service
  • the team has clear lines of governance and escalation routes to sign-off changes, provide transparency and resolve disputes internally or with suppliers
  • the team has a clear plan for run and maintain to be taken over by a CDIO team and has already started the onboarding

What the team needs to explore

Before their next assessment, the team needs to:

  • recognise that, whilst they have done well to cover the DDaT disciplines needed in their current multi-disciplinary team, there is a capability gap in performance analysis. The team needs to consider how to implement the recommendations set out in point 10 below and provide evidence of meeting these at the phase 2 assessment
  • work with comms colleagues to establish a comms strategy that covers both internal and external stakeholders and users. Progress against this should be shown at the phase 2 assessment

7. Use agile ways of working

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is using agile ceremonies and has shown how these have helped them to pivot and be flexible when needed
  • the team uses collaborative tools to manage their work, including Slack, Jira, Google Suite, Roadmunk, Confluence and Lucidchart, and now uses these tools with the backend supplier
  • the team has used retrospectives to identify siloes within the team and iterate ways of working to improve team cohesion and collaboration across the front and backend delivery teams
  • the team is providing regular updates to the minister and has an open channel to communicate and discuss specific challenges that may need senior intervention

8. Iterate and improve frequently

Decision

The service conditionally met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • they had embedded a process of using the analytics collected to drive and measure the effectiveness of their iterations during the transition into public beta
  • they had explored using hypotheses to come up with success measures for their iterations. This will help the team understand specifically what they should be looking for in the data to judge whether an iteration has been successful

What the team needs to explore

Before their next assessment, the team needs to:

  • use real user behaviour to explore opportunities to improve the service, initially focusing on phase 1 and then subsequent phases. Evidence of this should be provided at the next assessment for phase 2
  • define a plan of activities to improve the offline journey and identify opportunities, including quick wins, that can have a high impact on the user experience for users who cannot use the online journey. Evidence of how this will be approached should be provided at the assessment for phase 2

9. Create a secure service which protects users’ privacy

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • Information Assurance colleagues were involved from the beginning in defining a robust data processing agreement (DPA) with the SaaS provider of the backend applicant tracking system (ATS)
  • the shared responsibility model of GOV.UK PaaS means that the service team are responsible for securing the web application but not any lower-level infrastructure
  • a recent external IT Health Check of the frontend system found no high or critical issues
  • there is a banner that allows users to opt in to accepting analytics cookies that track user activity, and this has been assured by the data protection officer

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to work closely with the TDA and Information Assurance colleagues to ensure data is handled securely as new functionality is added to the system
  • commission a further external IT Health Check to assure the new functionality once work on “phase 2” is complete - this is the point when detailed personal data will be collected and stored, so assurance of the system at this point is crucial. The ITHC should be scoped to include information in transit between the frontend and the ATS and the treatment of personally identifiable information (persistence, retention) during the course of an application and prior to its submission to the backend system

10. Define what success looks like and publish performance data

Decision

The service conditionally met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • they had come up with a list of metrics to measure in public beta
  • they explored Google Analytics and Avature as data sources for their success measures and the 4 mandatory KPIs
  • they had implemented a compliant cookie consent mechanism and have a plan to implement Google Analytics

What the team needs to explore

Before their next assessment, the team needs to:

  • explore getting support from a Digital Performance Analyst, either from within the Cabinet Office or by bringing in contracted resource
  • implement a Performance Framework to derive a definitive list of success measures for all aims of the new service
  • using the measures from the Performance Framework, ensure that Avature and Google Analytics will meet all data requirements and explore other data sources if necessary
  • confirm with Avature that the data and dashboard can be amended and added to, if required
  • ensure that the team can receive accurate data from other departments using the service, and not just rely on user research

11. Choose the right tools and technology

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have taken the decision to buy a SaaS product for the ATS backend rather than build this complex functionality themselves. The SaaS product has been chosen following thorough review of the market, with a view to ensuring that they are not locked in to a proprietary system indefinitely
  • the team understand what would be involved in migrating away from the chosen ATS backend in future, and know that they can extract all required data using the platform API if necessary
  • the campaign site and frontend function fully without JavaScript, which is used for progressive enhancement only
  • the ATS supplier has committed to making all core functionality of the backend system work without JavaScript
  • the system will integrate with ONS APIs for geographical data

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to work with the ATS supplier to minimise the number of pages in the backend that require JavaScript to work
  • continue to monitor the suitability of the chosen ATS to ensure that it is working well for users across departments and ALBs once it is rolled out to them in “phase 2”

12. Make new source code open

Decision

The service conditionally met point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the codebase currently resides in private GitHub repositories that are intended to be made public soon, following a code review by NCSC

What the team needs to explore

Before their next assessment, the team needs to:

  • understand the unequivocal requirement of both the Technology Code of Practice (point 3) and the Service Standard (point 12) to make all new source code developed within government open
  • make the existing source code repositories public, and continue to work in the open thereafter
  • commit to developing all new components of the Beta service in the open, except where there is a specific demonstrable reason to keep it private

13. Use and contribute to open standards, common components and patterns

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the team uses the GOV.UK Design System patterns and components consistently in the end user facing service (search and apply journey), including updating to the latest components such as accordions and tags. They have also introduced a declaration statement for most of the requirements in the application questions, which allows more accountability from applicants in terms of conflicts of interest
  • the team are leveraging Government as a Platform infrastructure components where possible - GOV.UK PaaS for hosting and GOV.UK Notify for sending email (see the illustrative sketch after this list)
  • the team are using the open source Node.js runtime environment to build the frontend web services
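
As an illustration of the Notify integration mentioned above, the sketch below shows how a Node.js service typically sends an email through GOV.UK Notify using the notifications-node-client library. The template ID, email address and personalisation fields are placeholders, not the service’s actual configuration.

```javascript
// Minimal sketch of sending an email via GOV.UK Notify from a Node.js service.
// The API key, template ID and personalisation fields below are illustrative only.
const { NotifyClient } = require('notifications-node-client');

const notifyClient = new NotifyClient(process.env.NOTIFY_API_KEY);

async function sendApplicationReceivedEmail(emailAddress, applicantName) {
  // sendEmail(templateId, emailAddress, options) - the template itself is managed in Notify
  await notifyClient.sendEmail(
    'xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx', // hypothetical template ID
    emailAddress,
    {
      personalisation: { applicant_name: applicantName },
      reference: 'application-received', // optional reference for tracing the message in Notify
    }
  );
}
```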

What the team needs to explore

Before their next assessment, the team needs to:

  • share any new design patterns and research with the wider design community and with the Design System team. You can do this via the Design System website
  • review how information is structured in the job description pages (see the detailed comment in the “make the service simple to use” section) and test this with users
  • consider publishing Public Appointment campaign information as a structured data feed, to provide a starting point for possible linking between this service and strategic recruitment platforms in future (e.g. Civil Service Careers) - a minimal sketch of what such a feed could look like follows this list
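
To make the structured data feed recommendation more concrete, the sketch below shows one possible shape for a machine-readable vacancies feed exposed by the frontend. The route, field names, example data and URL are hypothetical; the real feed would need to be designed against the needs of consuming platforms.

```javascript
// Hypothetical sketch of a structured vacancies feed - the route and field names are illustrative only.
const express = require('express');
const app = express();

// Stand-in data; in the real service this would come from the campaign management backend.
const openVacancies = [
  {
    id: 'example-role',
    title: 'Non-executive board member',
    department: 'Example Department',
    closingDate: '2022-06-30', // ISO 8601
    remuneration: '£8,000 per annum',
  },
];

app.get('/api/vacancies', (req, res) => {
  res.json({
    vacancies: openVacancies.map((vacancy) => ({
      title: vacancy.title,
      sponsorDepartment: vacancy.department,
      closingDate: vacancy.closingDate,
      remuneration: vacancy.remuneration,
      applyUrl: `https://public-appointments.example/roles/${vacancy.id}`, // placeholder URL
    })),
  });
});

app.listen(3000);
```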

14. Operate a reliable service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have established CI/CD pipelines using GitHub Actions that enable code changes to be deployed to production regularly and quickly if necessary
  • the system is monitored using GDS’s Splunk service, and the GDS cyber team are part of the support for this service
  • the service team have full visibility of the monitoring and alerts so they are aware of usage and behaviour of the system
  • there is good unit test coverage of the codebase, and tests must pass in the CI/CD pipeline before code can be merged to master and deployed
  • vacancy applications can fall back to the current manual email and document-based process if there is a problem with the service

What the team needs to explore

Before their next assessment, the team needs to:

  • develop and test a plan for when the service goes offline
  • design to maximise uptime, for example by designing components so that they fall back to minimal functions if something goes wrong, and review SLAs and contracts with suppliers to ensure they support the required levels of availability
  • ensure that application logging and monitoring of the forthcoming “phase 2” functionality gives good visibility and appropriate alerting of activity such as failed log-in attempts and other events of interest
  • continue to build out additional layers of testing to ensure that the service is working as expected, including load tests and functional end-to-end tests for new functionality that is added for “phase 2”
  • explore options for regularly backing up data from the ATS for disaster recovery - a minimal sketch of one possible approach follows this list
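
On the ATS backup point above, the sketch below shows one possible shape for a scheduled export job. The export endpoint, authentication and response format are assumptions for illustration - they are not the actual Avature API, which the team would need to confirm with the supplier.

```javascript
// Illustrative sketch of a periodic ATS export for disaster recovery.
// The endpoint, bearer token and response shape are assumptions, not the real Avature API.
const fs = require('node:fs/promises');

async function backupAtsData() {
  // fetch is available globally in Node.js 18+
  const response = await fetch('https://ats.example.com/api/export/applications', {
    headers: { Authorization: `Bearer ${process.env.ATS_API_TOKEN}` },
  });

  if (!response.ok) {
    throw new Error(`ATS export failed with status ${response.status}`);
  }

  const data = await response.json();
  const date = new Date().toISOString().slice(0, 10);

  // Write a dated snapshot; in practice this would go to secure, access-controlled storage
  await fs.mkdir('backups', { recursive: true });
  await fs.writeFile(`backups/ats-export-${date}.json`, JSON.stringify(data, null, 2));
}

backupAtsData().catch((error) => {
  console.error(error);
  process.exit(1);
});
```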

Next steps

  • The panel has taken into account the good work the team has done and the effort they have put into working within constraints. Overall, the panel feels it can only provide a conditional met for phase 1 of the service, and recommends that the team returns for an assessment at the end of phase 1 before progressing into phase 2.
  • This means that the team needs to address the recommendations made in the not met section (specifically ‘Make sure everyone can use the service’) and the conditionally met sections before the next service assessment. For any recommendations that span multiple phases, progress will need to be shown at each future service assessment.
