Express An Interest in A Repatriation Flight - Beta Assessment Report
The assessment report for FCDO's Express An Interest in A Repatriation Flight on 18 November 2020
Service Standard assessment report
Express An Interest in A Repatriation Flight
From: Central Digital and Data Office
Assessment date: 18/11/20
Stage: Beta
Result: Not met
Service provider: FCDO
Previous assessment reports
N/A
Service description
The service provides a method for British nationals travelling abroad to provide basic details to register an interest in a repatriation flight. It is to be used when country borders close and there are no other possible routes back to the UK, and where FCDO may be considering a repatriation flight as part of a crisis response. Users can provide their own details and the number and details of the party they are travelling with. These details are then processed by staff in the relevant country to assess demand and communicate options before a flight booking can be arranged.
Service users
- British nationals travelling abroad (short-term travellers on holiday)
- consular staff in country (internal FCDO, Consular Directorate)
- repatriation and crisis management staff (internal FCDO, Consular Directorate)
1. Understand users and their needs
Decision
The service did not meet point 1 of the Standard.
What the team has done well
The panel was impressed that:
- the service team seems to have engaged well with internal users in delivering the internal side of the service
- the whole team was engaged in user research activities
- a variety of user research methods was used
What the team needs to explore
Before their next assessment, the team needs to:
- further validate and refine the user needs (as recorded in the user manual) based on evidence gathered from user research sessions, in particular for British nationals travelling abroad
- continue usability testing with actual or potential users of the service (British nationals travelling abroad), using appropriate user research methodology and protocols. The team should be engaging with a range of users based on the user segments identified for the service
2. Solve a whole problem for users
Decision
The service did not meet point 2 of the Standard.
What the team has done well
The panel was impressed that:
- consideration has been given to how a user will be routed to the service from existing services, including the Travel advice pages
- consideration has been given to the needs of FCDO staff in terms of information gathering and management
- this product is being considered as part of a suite of products to help British nationals when they are abroad
What the team needs to explore
Before their next assessment, the team needs to:
- review their assumptions with real users of the service to test whether they are accurate, and demonstrate that they are solving the whole problem for the (primary) user. The user’s problem is that they want to get home and cannot, due to an emergency. More specifically, the user wants to know whether and how the FCDO could help them get home, whether they are eligible for repatriation, and how the process works. The service demonstrated meets the business’s need to solve this problem, but does not solve the problem for the user in and of itself. The panel’s view is that the solution focuses on gathering what the organisation needs to make decisions and arrangements, but not enough on solving something for the user; further user research to fully understand travellers’ needs would help with this. This point of the Standard is about making sure the service is designed around the actual goal the user is trying to achieve, and the panel is not assured that this is the case
3. Provide a joined-up experience across all channels
Decision
The service did not meet point 3 of the Standard.
What the team has done well
The panel was impressed that:
- thought has been given to an offline solution, with contact centre staff equipped to promote the online offer as well as help customers if needed
- there are strategic plans to join up communications channels to enable the best customer experience
- the team is reacting to a period of unsustainable demand on the telephony channel by creating an online alternative
What the team needs to explore
Before their next assessment, the team needs to:
- make sure that they are testing with real users, with varying socio-economic backgrounds; different levels of digital capability; and different experiences of travel
- review the error messages for email addresses and phone numbers within the service itself. Whilst the thinking has been done to allow an offline route, the journey between the two channels is not yet joined up
- make it clearer to users how they will be contacted and how they should follow up any queries
4. Make the service simple to use
Decision
The service did not meet point 4 of the Standard.
What the team has done well
The panel was impressed that:
- consideration has been given to how a user will be routed to the service from existing services, including the Travel advice pages
- some existing GDS design patterns have been used, ensuring consistency with other GOV.UK services users may be encountering along this user journey
- the service has been designed to work online with a range of devices that reflects users’ behaviour, including mobile devices
What the team needs to explore
Before their next assessment, the team needs to:
- ensure that the calls-to-action used across pages within the service are consistent and accurately reflect the scope of the service, so that users are not left confused about what the form is for. For example, ‘request for a flight’ appears on the form timeout page and ‘flight reference’ in the confirmation email, but the scope of the service is an ‘expression of interest’ in a flight
- ensure they are following GDS design patterns for all form field inputs, particularly email address confirmation. Any deviation from GDS design patterns should be based on strong user evidence, and we’d encourage the team to share and discuss this research on GitHub
- test the service with actual users (a good place to start would be with people who’ve used the service in the past) and potential users, using appropriate research techniques. Make use of telephone and video calls to conduct research remotely
5. Make sure everyone can use the service
Decision
The service did not meet point 5 of the Standard.
What the team has done well
The panel was impressed that:
- the service intends to prioritise vulnerable travellers
- consideration has been given to how users who cannot access the Internet may still use the service
- the site is accessible to keyboard users
What the team needs to explore
Before their next assessment, the team needs to:
- ensure that the service is not excluding any groups within the audience they are intending to serve. For example, the service intends to prioritise repatriation of vulnerable groups, but there is no question identifying whether or not the user has a disability
- ensure that full accessibility and assisted digital testing of the complete service (including offline channels) is carried out with real users who have a range of access needs (including mental health, stress, etc.), are from a range of socioeconomic backgrounds, and have varying degrees of digital literacy
- ensure that a full accessibility audit (which uses manual as well as automated accessibility testing) is carried out and publish a complete and findable accessibility statement
- reconsider, with reference to Plain English, the accessibility of language and jargon within the service, including terms such as ‘express an interest’ and ‘repatriation.’ Consider testing the name of the service with actual or potential users of the service. Check the readability of all service pages using a reputable online readability checker
6. Have a multidisciplinary team
Decision
The service did not meet point 6 of the Standard.
What the team has done well
The panel was impressed that:
- the FCDO and Kainos team have well-established ways of working, and design and research skills were procured at an early stage of the project. However, there were gaps in the research undertaken that cannot be overlooked
What the team needs to explore
Before their next assessment, the team needs to:
- reconsider the use of a professional user researcher and content/interaction designer
- consider the longevity of the service and how to deal with content approval for any changes to the question set. Whilst the service team saw this as unlikely, there is a clear risk that untested, unclear content could end up in the public domain. It may be worth working up a suite of content-designed and tested options for the repatriation team to select from as needed
- consider access to a dedicated performance analyst to assess whole service impact
7. Use agile ways of working
Decision
The service met point 7 of the Standard.
What the team has done well
The panel was impressed that:
- the team has a well established set of sprint ceremonies and a working cadence
- the team gets together to review ways of working and iterate on them
- senior stakeholders are visibly engaged: the panel was impressed that the Service Owner came to the assessment and had a clear understanding of the team’s work
What the team needs to explore
Before their next assessment, the team needs to:
- be clear on which team will be there to support this service if it has not been needed for an extended period of time
8. Iterate and improve frequently
Decision
The service did not meet point 8 of the Standard.
What the team has done well
The panel was impressed that:
- improvements had been made to the service following testing, and as a result of the way the rapid response (alpha) phase worked. However, the limitations of the research and the lack of real people using the service mean that there is more to explore
What the team needs to explore
Before their next assessment, the team needs to:
- ensure that iteration is based on wide ranging user research, as detailed above
9. Create a secure service which protects users’ privacy
Decision
The service did not meet point 9 of the Standard.
What the team has done well
The panel was impressed that:
- the service team have built the cloud infrastructure with security in mind, including using AWS’s security and privacy-preserving tools, and plan to run a penetration test in the near future
What the team needs to explore
Before their next assessment, the team needs to:
- thoroughly establish what the threat model is: who would want to retrieve user data, for what purposes, how they would achieve it, and how the team would mitigate a data breach. This information should then dictate what decisions are made to protect user data
- create a privacy policy page
- establish who the service’s SIRO (Senior Information Risk Owner) is and get sign-off from them, as well as guidance from the department about what the service needs to do to support GDPR compliance
- recognise that even if the data is secure in the service database, it will be downloaded as CSV files and sent around the FCDO. The team should be aware of what happens to that data and what risks they may be able to reduce upstream (one possible mitigation is sketched after this list)
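The report leaves the mitigation open; purely as an illustration, the sketch below shows one way a CSV export could be encrypted before it is circulated, using the Python cryptography library’s Fernet recipe. The file names are hypothetical, and key management (generation, storage, and distribution through an approved channel) is deliberately out of scope here.

```python
# Illustrative sketch only: encrypt a CSV export before it is shared,
# so the data stays protected while it moves around the organisation.
# File paths are hypothetical; key management is the hard part and is
# not addressed here.
from cryptography.fernet import Fernet


def encrypt_export(csv_path: str, key: bytes) -> str:
    """Write an encrypted copy of a CSV file and return its path."""
    fernet = Fernet(key)
    with open(csv_path, "rb") as f:
        token = fernet.encrypt(f.read())
    encrypted_path = csv_path + ".enc"
    with open(encrypted_path, "wb") as f:
        f.write(token)
    return encrypted_path


if __name__ == "__main__":
    # In practice the key would be loaded from a key store, not generated ad hoc.
    key = Fernet.generate_key()
    print(encrypt_export("expressions_of_interest.csv", key))
```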
10. Define what success looks like and publish performance data
Decision
The service did not meet point 10 of the Standard.
What the team has done well
The panel was impressed that:
- the team is considering the performance of the end-to-end service to include performance metrics from non-digital channels. Reducing call volumes to the contact centre during a crisis is a key outcome for the product so is central to their framework
- the team is collaborating with the department’s business intelligence and analysis teams to make their service data and insights available in the future through internal dashboards to support wider organisation performance
- there is a plan to introduce web analytics tools (such as Google Analytics) into the service to monitor and investigate user journeys to identify issues and improve the service
What the team needs to explore
Before their next assessment, the team needs to:
- ensure their performance framework, analytics tooling and evaluation processes are in place before launching the service into Public Beta. The team should also consider how it will source the necessary analytical capability for this. This would ideally be a dedicated performance analyst, but one can be shared with related services where that is not possible
- deliver the planned analytics features (including web analytics) that will allow them to measure the service and make improvements. They should also implement the planned feed for the FCDO internal dashboards so that the critical operational data is made available
- work with the wider service team, in particular the contact centre, to ensure the contact tagging and user segmentation are in place to monitor trends resulting from launching the service. They should consider monitoring for avoidable contact, follow-up contact (after completing the digital journey), and contact related to their service. Insights from this should influence the service roadmap after launch
- liaise with the GDS Performance Platform team on the 4 mandatory key performance indicators and future plans for their dashboard on the Performance Platform. They should further consider how they will evaluate user satisfaction accurately as the sensitive circumstances of their users and context of the service will make this more challenging
11. Choose the right tools and technology
Decision
The service met point 11 of the Standard.
What the team has done well
The panel was impressed that:
- the team has chosen a satisfactory technology stack, based on open solutions
- the team has designed a sound technical architecture, based on modern AWS components
- the team is using Git to manage the application code, and is hosting it on GitHub
What the team needs to explore
Before their next assessment, the team needs to:
- change HTTP authentication to something that will allow users to choose and reset their password if needed, or, ideally, integrate the service’s login with other FCDO identity assurance and management systems
- make sure the infrastructure isn’t dependent on AWS, so that porting it to another cloud provider is achievable should the FCDO change suppliers (one common approach is sketched after this list)
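The report does not prescribe how to achieve this; one common approach is to keep provider-specific SDK calls behind a thin interface, so that only an adapter changes if the supplier does. The sketch below assumes the service stores submissions as objects; the class and bucket names are hypothetical and the real codebase may be structured quite differently.

```python
# Sketch of isolating provider-specific code behind an interface so the
# application is not hard-wired to AWS. All names here are hypothetical.
from abc import ABC, abstractmethod

import boto3  # AWS SDK; ideally only the adapter below imports it


class ObjectStore(ABC):
    """What the application needs from storage, independent of any provider."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...


class S3ObjectStore(ObjectStore):
    """AWS adapter; an Azure or GCP adapter would implement the same interface."""

    def __init__(self, bucket: str):
        self._s3 = boto3.client("s3")
        self._bucket = bucket

    def put(self, key: str, data: bytes) -> None:
        self._s3.put_object(Bucket=self._bucket, Key=key, Body=data)

    def get(self, key: str) -> bytes:
        return self._s3.get_object(Bucket=self._bucket, Key=key)["Body"].read()
```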
12. Make new source code open
Decision
The service did not meet point 12 of the Standard.
What the team has done well
The panel was impressed that:
- the team use GitHub, which will make publishing the code easy once it’s ready
What the team needs to explore
Before their next assessment, the team needs to:
- make the application’s source code public. If there are sensitive portions of the code, it should be refactored to adhere to the principle that code should be public while sensitive parameters or data needed to run it are kept hidden. As much of the code as possible should be open, unless the team requests an exception
- add documentation to its GitHub repositories. Even though it’s unlikely that many people will want to reuse the application, it’s important to show the world that we produce quality software, that we are fully transparent about how we build services, and that we encourage software reuse. Right now, 3 out of the 4 repos don’t have any description and none of them includes a software licence
13. Use and contribute to open standards, common components and patterns
Decision
The service met point 13 of the Standard.
What the team has done well
The panel was impressed that:
- the team used 3 government components: the GOV.UK Design System, GOV.UK Notify and the (Defra) form builder (an illustration of using Notify follows this list)
- the team have adhered to the server-side rendering approach used on GOV.UK and avoided using browser frameworks where unnecessary
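For readers unfamiliar with these components, the sketch below shows how a service typically sends a confirmation email through GOV.UK Notify using its Python client. The API key, template ID and personalisation fields are placeholders; the report does not state which client library or template fields the team actually uses.

```python
# Illustrative use of GOV.UK Notify (a common government component).
# The API key, template ID and personalisation values are placeholders.
from notifications_python_client.notifications import NotificationsAPIClient

client = NotificationsAPIClient("your-notify-api-key")  # placeholder key

client.send_email_notification(
    email_address="traveller@example.com",                # placeholder address
    template_id="xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",   # placeholder template
    personalisation={"reference_number": "ABC123"},       # hypothetical field
)
```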
What the team needs to explore
Before their next assessment, the team needs to:
- make sure upgrading to new versions of common components is straightforward. For instance, the GOV.UK Design System is often updated to improve user experience. Similarly, form building is a very active area in government, and many changes and improvements (for example, to accessibility) are expected
- contribute back to the communities that create those components. It’s very likely that the team have made improvements or found issues, and these should always be communicated back
14. Operate a reliable service
Decision
The service did not meet point 14 of the Standard.
What the team has done well
The panel was impressed that:
- a reliable cloud provider was selected to host the service
- the team have built a solid architecture, which follows reliability standards and includes monitoring
What the team needs to explore
Before their next assessment, the team needs to:
- improve alerting: infrastructure-level monitoring is essential, but the application might malfunction nonetheless. Smoke tests should be in place to check that the service works correctly even if it doesn’t crash, for instance by performing a sample transaction and checking that the application responds the right way (a sketch of one such check follows this list)
- establish and document who gets alerted in case something fails (at any hour) and who will be available to repair and restart the application. The team should be able to provide stakeholders with a time estimate of how quickly the service can be restarted in case of an incident
- given that the service will probably lie dormant for a long time before having to be activated at very short notice, it is likely that a different team may have to perform that activation. It is therefore essential that a runbook be created that provides a thorough understanding of the architecture of the system, as well as the processes to maintain, deploy, run, scale up or down, and fix it in case of an incident
- similarly, the process to quickly create or remove multiple links to the service from the relevant GOV.UK pages should be documented
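As an illustration of the first point in the list above, a minimal application-level smoke test might look like the sketch below: rather than only checking that the infrastructure is up, it performs a representative request and verifies that the response looks right. The URL, expected text and alerting mechanism are all placeholders; the real service’s routes would need to be substituted.

```python
# Sketch of an application-level smoke test: perform a representative
# request and check the response, rather than only monitoring servers.
# The URL and expected text below are hypothetical placeholders.
import sys

import requests

SERVICE_URL = "https://example-service.gov.uk/start"  # placeholder
EXPECTED_TEXT = "Express an interest"  # placeholder text the start page should contain


def smoke_test() -> bool:
    try:
        response = requests.get(SERVICE_URL, timeout=10)
    except requests.RequestException as exc:
        print(f"Smoke test failed: could not reach the service ({exc})")
        return False
    if response.status_code != 200:
        print(f"Smoke test failed: unexpected status {response.status_code}")
        return False
    if EXPECTED_TEXT not in response.text:
        print("Smoke test failed: start page content changed unexpectedly")
        return False
    return True


if __name__ == "__main__":
    # In production this would run on a schedule and raise an alert on failure.
    sys.exit(0 if smoke_test() else 1)
```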