Submit a General Aviation Report beta assessment
From: Central Digital and Data Office
Assessment date: 27 February 2019
Stage: Beta
Result: Met
Service provider: Home Office and Border Force
Previous assessment reports
Service description
General Aviation Report (GAR) data is currently sent in a variety of formats, including fax, scanned forms, manual paper submission and email. Because of these numerous outdated and manual submission methods, the public cannot be assured that their data has been submitted to Border Force correctly and on time. The service will provide a more efficient, secure and easy way for the public to submit data, giving the public assurance while enabling efficiencies in Border Force processes.
Service users
This service is for anyone operating in the General Aviation industry who needs to complete and submit a General Aviation Report (GAR). The GAR tells Border Force who is entering the country by means of General Aviation (any aircraft not operating to a commercial schedule).
Users range from singleton pilots who fly for leisure, to handling agents, large-scale commercial outfits known as Fixed Base Operators (FBOs), and administration staff: you do not need to be a pilot to submit a GAR.
1. Understand user needs
Decision
The team met point 1 of the Standard.
What the team has done well
The panel were impressed that:
- the team have conducted extensive research with their main user groups in the aviation community, from alpha continuing into private beta. It was particularly good to see that the whole team could speak confidently about the needs of their users and the problems with the existing service
- the team have included a range of users in private beta, 35 in total, from ‘singleton’ leisure pilots carrying only 1 or 2 people in the aircraft to larger commercial organisations with administration teams who could be submitting reports for flights carrying over 200 people. Research methods included interviews, onsite visits and testing of the end-to-end service. The team have also conducted remote research with some users outside the UK, as inbound flights are required to submit the same report. They did not find different needs for these users and have not yet identified unique problems such as language, as English is the primary language in aviation. The team have also included new users in their testing to make sure users without prior experience or knowledge could understand the form and the process
- the team has done well to recruit users with accessibility needs to test the service with. Where they were not able to find users with visual impairments, they found internal users to test the service with. The service has users with lower digital skills who currently submit aviation reports by fax and post, as well as users in rural areas with poor internet access, so it was good to see that the team had tested their assisted digital support model.
What the team needs to explore
Before their next assessment, the team needs to:
- in private beta the team investigated the need for an API for large commercial organisations that could be submitting large amounts of data. Although the team found no need for this, they should continue to explore the best solution for these users. The team could measure the number of users who complete and upload a spreadsheet versus those who complete the form digitally, to better understand why users choose to upload an Excel spreadsheet and whether there is a need for an API
- the team should continue their research with users with accessibility needs to make sure they can use the digital service once it is public
- the team mentioned conducting some research with users outside the UK in private beta. The team should continue to include this user group in their research to make sure they can use the digital service and are aware of changes to it, particularly as there are upcoming changes to submission deadlines.
2. Do ongoing user research
Decision
The team met point 2 of the Standard.
What the team has done well
The panel were impressed that:
- the team have a plan for collecting user feedback in a central location once in public beta. They have a research and design backlog of upcoming things to test, prioritised by the user feedback they’ll receive in public beta. This includes testing how a user might edit a submitted report, as at the moment a user would have to cancel and resubmit if there were any changes
- the team also mentioned plans for a campaign to raise awareness of the service and new policy changes, such as the change to the submission deadline and the methods by which users can submit reports. By doing this the team hope to reduce the proportion of users not submitting reports, currently 7%.
What the team needs to explore
Before their next assessment, the team needs to:
- ensure the team has capacity to react to feedback quickly and to monitor user behaviour using their analytics
- the team should continue to test with new users of the service and users outside the UK to ensure the service continues to meet their needs
- the team mentioned conducting some internal research in private beta, although their primary focus was on external users. The team should also speak to internal users, for example contact centre staff, to see what improvements could be made to their processes
- alongside testing the live service, the team should consider continuing with prototype testing when making larger design decisions, to ensure a change will not cause problems in the live environment.
3. Have a multidisciplinary team
Decision
The team met point 3 of the Standard.
What the team has done well
The panel were impressed that:
- throughout the private beta development phase there has been a co-located multidisciplinary team in place
- since the alpha assessment, a dedicated content designer has been in place
- the team is empowered and reports into a project board via a Service/Product Manager.
What the team needs to explore
Before their next assessment, the team needs to:
- a full-time user researcher must be in place, particularly as the service is currently a minimum viable product (MVP) and, as more users use the service, more needs are likely to be identified
- the team must show that they will continue to have a multidisciplinary team in place throughout public beta to ensure that the service can continue to be iterated in response to user needs
- a plan must be put in place to ensure that permanent team members are able to learn from suppliers so that digital skills can be built and the service can continue to run effectively
- going into public beta, the team should have access to a performance analyst to provide specialist input, building on the work the team has done so far, to help analyse consultation feedback and to produce actionable reporting that helps the team further iterate the service.
4. Use agile methods
Decision
The team met point 4 of the Standard.
What the team has done well
The panel were impressed that:
- agile practices have been followed well during the private beta development phase, with fortnightly sprints, daily stand-ups, show and tells and backlog prioritisation sessions.
What the team needs to explore
Before their next assessment, the team needs to:
- the team must continue to use the agile methods put into practice throughout public beta. The service is currently an MVP and will see high digital take-up after the consultation period.
5. Iterate and improve frequently
Decision
The team met point 5 of the Standard.
What the team has done well
The panel were impressed that:
- the team has continually iterated and improved the service in response to extensive user research.
What the team needs to explore
Before their next assessment, the team needs to:
- the team have a plan for moving the service into business as usual (BAU). The team must ensure that this does not happen too early: the service will go into public beta as an MVP and must continue to be iterated throughout public beta, particularly in light of the consultation and the increasingly rich data and insights the team will get from users after the consultation
- continue with 2-week sprints during public beta to ensure that the service is iterated in response to user feedback.
6. Evaluate tools and systems
Decision
The team met point 6 of the Standard.
What the team has done well
The panel were impressed that:
- the service team are using the Home Office’s Application Container Platform (https://ukhomeoffice.github.io/application-container-platform/), which gives them a highly available and scalable environment and lets them focus on building their service. They are using Node.js for the user interfaces and Python for the APIs. The service is decoupled from the legacy system so that, if the legacy system is unavailable, messages are queued rather than lost (see the sketch after this list)
- the API can be extended to the public if a need is found.
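As a hedged illustration of that decoupling, the sketch below queues submissions and retries them once the legacy system is reachable again. The queue choice, the health check and the names (`submit_gar`, `legacy_is_up`, `drain_once`) are assumptions for this sketch; the team’s actual queueing technology and retry policy were not described at the assessment.

```python
import queue

# Sketch of the decoupling pattern: GAR submissions land on a queue and a
# worker forwards them to the legacy system; anything that cannot be
# delivered is re-queued rather than lost. All names are illustrative.

pending: queue.Queue = queue.Queue()

def submit_gar(gar: dict) -> None:
    """Accept a GAR from the front end and queue it for forwarding."""
    pending.put(gar)

def legacy_is_up() -> bool:
    """Stand-in health check; a real service would probe the legacy API."""
    return True

def drain_once() -> None:
    """Forward everything currently queued, re-queueing on failure."""
    for _ in range(pending.qsize()):
        gar = pending.get()
        if legacy_is_up():
            print(f"forwarded GAR {gar['reference']}")  # placeholder delivery
        else:
            pending.put(gar)  # kept for a later retry, not lost
        pending.task_done()

submit_gar({"reference": "GAR-0001", "flight": "G-ABCD"})
drain_once()
```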
What the team needs to explore
Before their next assessment, the team needs to:
- it is recommended that the service team make the reference number more human readable (one possible approach is sketched below).
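As a hedged illustration, the sketch below assumes a ‘GAR-XXXX-XXXX’ format and draws characters from an alphabet that omits easily confused glyphs (0/O, 1/I/L); both the format and the alphabet are assumptions, not the service’s actual scheme.

```python
import secrets

ALPHABET = "23456789ABCDEFGHJKMNPQRSTUVWXYZ"  # omits 0/O and 1/I/L

def make_reference(groups: int = 2, group_len: int = 4) -> str:
    """Generate a short, grouped, human-readable reference (illustrative)."""
    parts = ("".join(secrets.choice(ALPHABET) for _ in range(group_len))
             for _ in range(groups))
    return "GAR-" + "-".join(parts)

print(make_reference())  # for example: GAR-7KQ2-XV9M
```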
7. Understand security and privacy issues
Decision
The team met point 7 of the Standard.
What the team has done well
The panel were impressed that:
- the service team evaluated the various options for authentication, selected passwordless authentication (‘magic links’) and followed good practice in its implementation (a minimal sketch of the pattern follows this list)
- the service team has been forward-thinking in ensuring that all the mechanisms are available to make roles more granular.
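The report does not describe the team’s implementation, so the following is only a minimal sketch of the magic-link pattern: a signed, time-limited token is emailed to the user and exchanged for a session. The 15-minute lifetime and the function names are assumptions, and a production implementation would also store tokens server-side so each link can be used only once.

```python
import hashlib
import hmac
import secrets
import time
from typing import Optional

SECRET = secrets.token_bytes(32)  # per-deployment signing key (assumption)
TOKEN_TTL = 15 * 60               # 15-minute token lifetime (assumption)

def make_token(email: str) -> str:
    """Build the signed, expiring token embedded in the emailed link."""
    expires = int(time.time()) + TOKEN_TTL
    payload = f"{email}|{expires}"
    sig = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return f"{payload}|{sig}"

def verify_token(token: str) -> Optional[str]:
    """Return the email if the token is genuine and unexpired, else None."""
    email, expires, sig = token.rsplit("|", 2)
    expected = hmac.new(SECRET, f"{email}|{expires}".encode(),
                        hashlib.sha256).hexdigest()
    if hmac.compare_digest(sig, expected) and time.time() < int(expires):
        return email
    return None

token = make_token("pilot@example.com")
assert verify_token(token) == "pilot@example.com"
```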
What the team needs to explore
Before their next assessment, the team needs to:
- given the data that is available in the service, it is recommended that the service team do further analysis during public beta to see how to minimise risks and whether more granular roles would be beneficial.
8. Make all new source code open
Decision
The team met point 8 of the Standard.
What the team has done well
The panel were impressed that:
- the service team have made all front-end code open.
What the team needs to explore
Before their next assessment, the team needs to:
- due to the nature of the service and the systems involved, the backend code could not be made open. In future, the service team should consider ways of opening more of the code by abstracting away the sensitive connection information where possible.
9. Use open standards and common platforms
Decision
The team met point 9 of the Standard.
What the team has done well
The panel were impressed that:
- the team are using the GOV.UK design kit for all public-facing forms, using GOV.UK Notify for messaging, and using the common Home Office platform, ACP.
What the team needs to explore
Before their next assessment, the team needs to:
- the team should select one of the various port code lists to populate the user interface, instead of a free-form field. Further research should allow the service team to refine the choice of reference data (a minimal sketch of the approach follows).
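As a hedged sketch of the server side of such a field, the hypothetical `suggest` helper below filters a reference list for a typeahead. The three entries are examples only; the real data source would be whichever published port code list the team selects.

```python
# Example reference data only; the real list would come from the
# published port code source the team selects.
PORTS = {
    "EGLL": "London Heathrow",
    "EGKB": "Biggin Hill",
    "EGPH": "Edinburgh",
}

def suggest(prefix: str, limit: int = 5) -> list:
    """Return (code, name) pairs whose code or name starts with the prefix."""
    p = prefix.strip().upper()
    hits = [(code, name) for code, name in PORTS.items()
            if code.startswith(p) or name.upper().startswith(p)]
    return hits[:limit]

print(suggest("biggin"))  # [('EGKB', 'Biggin Hill')]
```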
10. Test the end-to-end service
Decision
The team met point 10 of the Standard.
What the team has done well
The panel were impressed that:
- the team have taken care to do negative testing and validate what would happen if various parts of the service were to fail.
11. Make a plan for being offline
Decision
The team met point 11 of the Standard.
What the team has done well
The panel were impressed that:
- the team have considered the various channels that could be used in the event that the service goes offline.
What the team needs to explore
Before their next assessment, the team needs to:
- during the consultation, it is recommended the service team look carefully at what will happen if the service goes offline, so that GARs can still be submitted.
12. Make sure users succeed the first time
Decision
The team met point 12 of the Standard.
What the team has done well
The panel were impressed that:
- the team has run a private beta with a mix of types of users of the service
- the team has run accessibility testing with Aerobility to ensure users with access needs can use the service
- the team has backed up the private beta research with 1-to-1 user research and pop-up user research at aviation events.
What the team needs to explore
Before their next assessment, the team needs to:
- continue to look at analytics to find areas of the service to improve
- use feedback from the support channels to improve the service
- improve the solution for large operators and those using third-party software for flight planning (for example, an API and improvements to the spreadsheet upload route)
- monitor how users are using the organisation functionality and make sure it fulfils all their needs, including the modelling of roles
- make sure the UI keeps working for users over time as more General Aviation Reports are filed and used
- improve search and filtering on large tables of information
- continue to improve and iterate the support channels for users
- improve and iterate the emails (and, in future, text messages) that the service sends.
13. Make the user experience consistent with GOV.UK
Decision
The team met point 13 of the Standard.
What the team has done well
The panel were impressed that:
- the user interface has been simplified and made clearer through user research
- some terms have been simplified throughout the service and are used consistently
- the service is using the Design System patterns and components.
What the team needs to explore
Before their next assessment, the team needs to:
- ensure interaction and service designers are part of the iteration process
- share novel patterns with other government design teams and incorporate them into the GOV.UK Design System (for example, passwordless login, spreadsheet upload, large table manipulation)
- keep the design up to date with improvements to the Design System and frontend
- incorporate typeahead and pick lists where possible in the flow
- share work and understanding with the Home Office design community
- keep a glossary of terms used in the service
- make sure the supporting information is consistent, and is understood by all users.
14. Encourage everyone to use the digital service
Decision
The team met point 14 of the Standard.
What the team has done well
The panel were impressed that:
- the ambition is for the GAR service to be digital only, subject to a consultation once the service is in public beta
- the team have worked with users with low digital skills, those with a lack of inclination to change from paper and those in areas of low internet coverage. They have also worked with Aerobility to engage with users with dyslexia, dyspraxia and cerebral palsy
- a communications plan is in place to encourage digital uptake during public beta.
15. Collect performance data
Decision
The team met point 15 of the Standard.
What the team has done well
The panel were impressed that:
- the team has a strong understanding of the current service’s performance, its inefficiencies and its poor user experience, in terms of lack of confirmation of submission, manual inputting and sub-optimal deployment of Border Force personnel
- the team has used Google Analytics to instrument the new digital service to collect data, for example on users’ interactions and the devices used (an illustrative sketch follows this list).
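For illustration only, the sketch below records a custom event from the server side with the 2019-era Universal Analytics Measurement Protocol; the property ID, client ID and event names are placeholders, and the team’s actual instrumentation (typically the standard client-side snippet) may differ.

```python
from urllib.parse import urlencode
from urllib.request import urlopen

def track_event(category: str, action: str, client_id: str = "anon-1") -> None:
    """Send one event hit to the Universal Analytics Measurement Protocol."""
    params = urlencode({
        "v": "1",               # Measurement Protocol version
        "tid": "UA-XXXXXXX-Y",  # placeholder property ID
        "cid": client_id,       # anonymous client identifier
        "t": "event",
        "ec": category,         # event category
        "ea": action,           # event action
    })
    urlopen("https://www.google-analytics.com/collect", data=params.encode())

track_event("gar", "submitted")
```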
What the team needs to explore
Before their next assessment, the team needs to:
- ensure that digital analytics data ‘hygiene’ is appropriate - as advised in the Service Manual on GOV.UK
- as mentioned above, having a performance analyst will allow the team to explore the rich digital data that will be available, enhancing understanding of users’ interactions: not only whether users succeed first time, but also, for example, type of device, time of day, location and how far in advance of a flight users submit reports
- investigate access to a paid version of Google Analytics.
16. Identify performance indicators
Decision
The team met point 16 of the Standard.
What the team has done well
The panel were impressed that:
- the team has identified performance indicators in addition to the mandatory ones:
- advance notification, so Border Force can manage staff better
- improved data quality and reduced re-keying
- reduced time officers spend on manual checks, and more time on value-added activity such as searching aircraft.
17. Report performance data on the Performance Platform
Decision
The team met point 17 of the Standard.
What the team has done well
The panel were impressed that:
- once in public beta, the team should progress with publishing a performance dashboard on the GOV.UK Performance Platform.
18. Test with the minister
Decision
The team met point 18 of the Standard.
What the team needs to explore
Before their next assessment, the team needs to:
- the team have been unable to test with the minister but have sought to mitigate this by liaising with Private Office and putting a submission to Ministers to ensure they are aware. The Minister is also aware of the planned consultation
- the team must test with the Minister.