HMPO Online Appointment Booking (CHAMP) - Beta Assessment

The report from the beta assessment for the Home Office's HMPO Online Appointment Booking service on 3 November 2015.

Department / Agency:
HO / HMPO

Date of Assessment:
3/11/2015

Assessment Stage:
Beta

Result of Assessment:
Not passed

Lead Assessor:
D. Williams

Service Manager:
J. Potton

Digital Leader:
M. Parsons


About this service

HM Passport Office has developed an online appointment booking service that allows users to book an urgent passport counter appointment. Applicants may be eligible for a Premium appointment, a Fast Track appointment, or both, and the tool determines their eligibility through a series of straightforward questions. Users typically require this service if they need a new passport urgently (within the next 3 weeks).

Outcome of service assessment

After consideration, the assessment panel has concluded that the Online Appointment Booking service should not yet be given approval to launch on the service.gov.uk domain as a beta service.

The panel was very impressed with how well the service team has been working across two locations and organisations; the team is very obviously working well together. A lot of good progress is being made and the service is on a very positive trajectory. The work to understand both users’ demand for, and barriers to using, a digital service is good, and the approaches to encouraging digital take-up are appropriate.

However, the panel found that some aspects of the service require further work before it is ready for public beta.

Reasons

Point 1. Understand user needs. Research to develop a deep knowledge of who the service users are and what that means for the design of the service.

Low-skilled users had been recruited for some of the lab testing of the on-screen prototype, and the team will be expanding this work in public beta. However, the team said they had not carried out research with users of this service to understand their support needs. The team were asking broad questions at the end of lab testing days about how users would access the service, but not about their support needs.

Point 3. Put in place a sustainable multidisciplinary team that can design, build and operate the service, led by a suitably skilled and senior service manager with decision-making responsibility.

The panel had concerns that the team does not have a dedicated interaction designer and user researcher on board; that resource is shared across the Passport Exemplar, which may not be sustainable.

Point 12. Create a service that is simple and intuitive enough that users succeed first time.

The team demonstrated both the appetite and the ability to change the design of the service in response to user research. However, the current lack of either an interaction designer or a researcher in the team will prevent the service from continuing to iterate in this way throughout the public beta stage, which is essential as the service starts being used by real users.

There is an assumption that the department’s phone support will meet users’ support needs. However, as the team has not researched what support users need, this and any other decisions around support offerings are not well informed. The team said they are working with Timpsons on a pilot to look at face-to-face support, but before progressing this, the team should carry out research to establish whether it will meet a support need.

The on-screen service requires users to have an email address. The team had observed that all users in user testing entered an email address and managed to complete the service. However, this does not tell the service team to what extent entering an email address (and then having to access it) is a barrier or pain point for low-skilled users; whether users will struggle to access their email account even if they have one to submit; or whether some users would at this point opt for the phone service instead.

The service is able to measure and iterate the telephone support route.

Recommendations

Point 1. Understand user needs. Research to develop a deep knowledge of who the service users are and what that means for the design of the service.

The service team should:

  • do contextual research with users to understand what support they would need for this specific service
  • do user research to fully understand user needs, not wants or preferences
  • conduct face-to-face testing with people using assistive technologies
  • include users with the lowest levels of digital skills, confidence and access
  • include users who would seek support from third parties
  • include users who would get support from friends and family, to understand what alternative support they need (because friends and family support cannot be included in the model of support)

Point 3. Put in place a sustainable multidisciplinary team that can design, build and operate the service, led by a suitably skilled and senior service manager with decision-making responsibility.

The panel commends the team for all the hard work they are doing to further refine and improve the service for end users, especially in the light of resourcing challenges. The good work the team is doing around user research should be supported, and the panel wants to enable them to take it further. To do this the team needs to secure sufficient user research support; the Service Manual generally recommends 3 days a week. A dedicated researcher on the project will enable the team to carry out their ongoing research activities in a consistent way, as well as help the team to better understand the needs of their broad and diverse audience.

Point 4. Build the service using the agile, iterative and user-centred methods set out in the manual.

While the implementation of agile was generally impressive, especially the adaptations made for a team based in multiple locations, the panel was slightly concerned about the overall departmental governance. Releases needing approval by change boards is not indicative of governance that enables the team to move fast. The team has done everything in their power to minimise the impact this has on delivery, but the service manager should have the authority to make release decisions.

The panel recommends that the governance process be defined around “trust but verify”: allow the service manager to make decisions around deployments, but verify that those decisions are being made appropriately.

Furthermore, it was worrying that general oversight was described as a PRINCE2 wrapper around the agile project. This is not an approach that GDS recommends, as it tends to result in reduced buy-in and governance issues. Senior engagement with show and tell sessions was impressive; the panel encourages the team to continue to adopt agile governance practices to improve the overall service governance.

Point 5. Build a service that can be iterated and improved on a frequent basis and make sure that you have the capacity, resources and technical flexibility to do so.

It is concerning that deployments to production still require manual steps rather than a fully automated deployment system. Good progress has been made on an automated build system and on automating the creation of installer files, but manual deployment processes remain prone to error. The panel recommends the team move to fully automated deployments, ensuring that deployments are repeatable, reliable and auditable.
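
As an illustration only, the sketch below shows the kind of scripted, logged deployment step the panel has in mind: every action is recorded to an audit trail and any failure aborts the deploy. The artefact path, hosts and service name are hypothetical, not details of the CHAMP service.

```python
#!/usr/bin/env python3
"""Illustrative sketch of a scripted, auditable deployment step.

All names here (artefact path, hosts, service name) are hypothetical;
they are not taken from the CHAMP service.
"""
import hashlib
import logging
import subprocess
import sys

logging.basicConfig(
    filename="deploy.log",  # every step is recorded, giving an audit trail
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(message)s",
)

ARTEFACT = "build/booking-service.tar.gz"                     # hypothetical build output
TARGETS = ["app1.example.internal", "app2.example.internal"]  # hypothetical hosts


def run(cmd):
    """Run one deployment step; log it and abort the whole deploy on failure."""
    logging.info("running: %s", " ".join(cmd))
    result = subprocess.run(cmd, capture_output=True, text=True)
    if result.returncode != 0:
        logging.error("step failed: %s", result.stderr.strip())
        sys.exit(1)


def main():
    # Record the artefact checksum so the exact release can be audited later.
    with open(ARTEFACT, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    logging.info("deploying %s (sha256=%s)", ARTEFACT, digest)

    for host in TARGETS:
        # The same scripted steps run on every host, every time: no manual
        # copying and no environment-specific hand edits, so the deployment
        # is repeatable and far less prone to error.
        run(["scp", ARTEFACT, f"{host}:/srv/releases/"])
        run(["ssh", host, "sudo", "systemctl", "restart", "booking-service"])

    logging.info("deployment complete")


if __name__ == "__main__":
    main()
```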

Furthermore, deployments should be done without any disruption to users. Requiring deployments to be done out of hours, or by resetting users’ sessions, will actively prevent the team from deploying more frequently and from reducing the cycle time appropriately.

The service should invest in zero-downtime deployment mechanisms that allow live deployment of the new system without terminating users’ sessions or shutting down the service.
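
A minimal sketch of one common zero-downtime approach, a blue-green switch behind a reverse proxy, is shown below: the new release is health-checked before any traffic reaches it, and the old release drains gracefully rather than having sessions terminated. The port, health endpoint and reload command are assumptions for the example.

```python
#!/usr/bin/env python3
"""Illustrative sketch of a blue-green switch for zero-downtime releases.

The port, health-check URL and reverse-proxy reload command are
assumptions for the example, not details of the actual service.
"""
import subprocess
import sys
import time
import urllib.error
import urllib.request

# The new build is started alongside the old one on a separate port.
HEALTH_URL = "http://127.0.0.1:8081/health"  # assumed health endpoint


def wait_until_healthy(url, attempts=30):
    """Poll the new release's health endpoint before it receives any traffic."""
    for _ in range(attempts):
        try:
            with urllib.request.urlopen(url, timeout=2) as resp:
                if resp.status == 200:
                    return True
        except (urllib.error.URLError, OSError):
            pass
        time.sleep(1)
    return False


def main():
    if not wait_until_healthy(HEALTH_URL):
        # The old release keeps serving users; nothing has been interrupted.
        sys.exit("new release never became healthy; switch aborted")

    # Repoint the reverse proxy at the new release (this assumes its upstream
    # config has already been rewritten to target the new port). A graceful
    # reload lets in-flight requests and sessions on the old release drain
    # naturally instead of being terminated.
    subprocess.run(["sudo", "nginx", "-s", "reload"], check=True)
    print("traffic switched; the old release can be retired once idle")


if __name__ == "__main__":
    main()
```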

Point 6. Evaluate what tools and systems will be used to build, host, operate and measure the service, and how to procure them.

It is concerning that the system is being built on the ‘Outreach’ platform when options to migrate away from this commercial platform are not available. It was unclear how much of the intellectual property (IP) the service owns, especially with regard to the platform itself and the custom work built on it. Not owning the full IP would prevent migration to a different supplier.

The panel recommends that the migration options are fully understood so that the decision to choose ‘Outreach’ as a platform can be clearly articulated.

Point 12. Create a service that is simple and intuitive enough that users succeed first time.

The service should:

  • Work with Home Office Digital’s Head of Profession for User Research and Design to secure a researcher and an interaction designer for the public beta stage; the service will be high volume so should be resourced accordingly.
  • Iterate the design of their email notifications to match the GDS design patterns for email.
  • Put together a model of support that is appropriate to users’ support needs and includes:

    • routes provided by third parties (if required);
    • face-to-face support (if required);
    • alternatives for users who would get support from friends and family.
  • Ensure all routes of support are iterable, measurable and sustainable.
  • Be ready to test all support routes (from all providers) during the public beta, to be able to evidence in a live assessment that they are meeting users’ needs, as per the Service Standard.
  • Test the above on the ‘unhappy’ path too, such as amending or cancelling appointments.

Summary

The outcome of this reassessment will no doubt be disappointing. However, the team should know that the panel was very impressed with the enthusiastic adoption of agile practices, and in particular the rapid iteration of the design following user testing sessions.

The panel was pleased to see that improvements have been made to the service based on the lab and pop-up testing that has taken place, together with the regular feedback from the Telephony Preference team.

The panel was also encouraged to hear that most of the team are actively involved in user testing, for example by observing sessions and analysing the insights collectively.

Digital Service Standard criteria

Criteria   Passed     Criteria   Passed
1          No         2          Yes
3          No         4          Yes
5          Yes        6          No
7          Yes        8          Yes
9          Yes        10         Yes
11         Yes        12         No
13         Yes        14         Yes
15         Yes        16         Yes
17         Yes        18         Yes

Updates to this page

Published 27 January 2017