Record your catches beta assessment
The report from the Record your catches beta assessment on 24 July 2019.
From: Government Digital Service
Assessment date: 24 July 2019
Stage: Beta
Result: Met
Service provider: Marine Management Organisation
Service description
The Marine Management Organisation (MMO) has built a digital service that enables owners and skippers of fishing vessels under 10m in length (U10m vessels) to record their live catch in electronic logbooks and submit it to the MMO and other UK Fishing Authorities (UKFAs), thereby complying with a licence condition. It enables the English, Welsh and Crown Dependency fisheries authorities to better manage the fishery and assure environmental sustainability by measuring and controlling U10m vessel catch in compliance with the Common Fisheries Policy.
Fishers record the species caught, their weight, where they were caught and with what gear, and where and when the vessels set off and returned. These users currently do not keep records at all, unlike 10m to 12m vessels (which record fishing logbooks on paper) and over-12m vessels (which record fishing logbooks via PC software on board). U10m catch was previously measured from the “sales notes” generated by merchants who buy the catch.
Service users
There are 4,000 U10m vessels in the UK, with 2,200 in England and 380 in Wales; there are around 2,400 licensed vessel owners (some vessels have multiple owners).
Around 500 English vessel owners catch 70% of U10m catch by volume and value. These are the most active (76 to 200+ days a year). A further 600 vessel owners are less active (26 to 76 days a year). There is then a long tail of 700+ fishers who are licensed but much less active (fewer than 26 days a year).
1. Understand user needs
The service met point 1 of the Standard.
What the team has done well
The panel was impressed that:
- the team were able to talk through the different research methods used to uncover user needs (face-to-face interviews, usability testing and a diary study, speaking to over 100 users)
- assisted digital users have been at the centre of the research, including users who rely on assistive technology to go online
- the team are aware there is a need for assisted digital support and have devised a triage process for that support via their call centre
- the team were able to give examples of design changes made as a result of user research feedback, and of the challenges and barriers users faced
- content and language changes were made where possible based on feedback
- testing was done on various devices, and the team knew from research which devices users most often use to access online services
What the team needs to explore
Before their next assessment, the team needs to:
- continue iterating based on user research findings
- ensure analytics are in place to get live feedback
- ensure the right measures are in place so that assisted digital support goes to users who genuinely need it; otherwise digital take-up could become an issue, as all users may revert to the call centre route
2. Do ongoing user research
The service met point 2 of the Standard.
What the team has done well
The panel was impressed that:
- the team are aware of the need for continuous improvement and that ongoing research is key to making iterations
- research will incorporate feedback from users via the call centre and Google Analytics
What the team needs to explore
Before their next assessment, the team needs to:
- continue iterating the service based on good user research
- ensure the plan for ongoing research is followed
3. Have a multidisciplinary team
The service met point 3 of the Standard.
What the team has done well
The panel was impressed that:
- the project uses a full multidisciplinary team
- the team is co-located across London and Newcastle
- the SRO attends show and tells and is fully engaged with the project
What the team needs to explore
Before their next assessment, the team needs to:
- continue to transfer knowledge and document developments in the project
4. Use agile methods
The service met point 4 of the Standard.
What the team has done well
The panel was impressed that:
- the team follows Scrum principles, is fully ‘agile’ and runs 2-week sprints
- the team runs daily stand-ups to identify blockers and issues
- communication tools include Slack, Confluence and Jira
What the team needs to explore
Before their next assessment, the team needs to:
- continue to use agile tools and methods in their digital delivery
5. Iterate and improve frequently
The service met point 5 of the Standard.
What the team has done well
The panel was impressed that:
- the team uses semi-automated and fully automated testing processes to speed up development and deployment
- continuous improvement linked to this testing allows fast development, based on feedback and results
- analytics and regular feedback will drive continuous improvement
What the team needs to explore
Before their next assessment, the team needs to:
- continue to iterate based on user feedback and usability testing
- continue to inform service improvement based on their performance analytics and feedback
6. Evaluate tools and systems
The service met point 6 of the Standard.
What the team has done well
The panel was impressed that:
- the chosen identity management system, Azure Business to Consumer (B2C), did not support non-JavaScript browsers, so bespoke front screens were built to overcome this
- different technologies were explored for building the service’s mobile app before the team decided on a native approach
What the team needs to explore
Before their next assessment, the team needs to:
- evaluate the pros and cons of moving from Cosmos DB to a MongoDB database (the service already uses the MongoDB API facility available in Cosmos DB)
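The point above turns on the fact that Cosmos DB exposes a MongoDB-compatible API, so code written against a MongoDB driver can point at either backend. A minimal sketch of that idea, assuming Python with pymongo and entirely hypothetical connection strings, database and field names (this is not the team's code):

```python
from pymongo import MongoClient

# Cosmos DB exposes a MongoDB-compatible endpoint, so the same driver code
# works against either backend; only the connection string differs.
COSMOS_URI = "mongodb://<account>.mongo.cosmos.azure.com:10255/?ssl=true"  # placeholder
MONGO_URI = "mongodb://localhost:27017/"                                   # placeholder

client = MongoClient(MONGO_URI)  # swap in COSMOS_URI with no other code changes
catches = client["catch_recording"]["catches"]  # hypothetical database/collection

# Identical CRUD calls run against either database.
catches.insert_one({"vessel": "EX123", "species": "COD", "weight_kg": 12.5})
for doc in catches.find({"vessel": "EX123"}).limit(10):
    print(doc["species"], doc["weight_kg"])
```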
7. Understand security and privacy issues
The service met point 7 of the Standard.
What the team has done well
The panel was impressed that:
- the service uses an extensive test suite which follows a ‘secure by design’ approach and uses a code change peer review process
- the system minimises the personal data stored to the essentials (vessel, owner and skipper names, and commercially sensitive territories)
- the team has successfully completed a Data Protection Impact Assessment
What the team needs to explore
Before their next assessment, the team needs to:
- undertake an external security assessment (to be conducted by BSI)
8. Make all new source code open
The service met point 8 of the Standard.
What the team has done well
The panel was impressed that:
- code is currently stored in a private GitHub repository, with the aspiration to publish it on public GitHub as soon as the commercial environment permits departmental approval to do this
What the team needs to explore
Before their next assessment, the team needs to:
- move the code to Defra’s open GitHub once the Principal Developer has given the green light for this
9. Use open standards and common platforms
The service met point 9 of the Standard.
What the team has done well
The panel was impressed that:
- apart from Microsoft Azure, the service is built using open source technologies
What the team needs to explore
Before their next assessment, the team needs to:
- as per point 6, evaluate the pros and cons of moving from Cosmos DB to a MongoDB database (the service already uses the MongoDB API facility available in Cosmos DB)
10. Test the end-to-end service
The service met point 10 of the Standard.
What the team has done well
The panel was impressed that:
- the team has a continuous integration pipeline, with development and test systems used to build and deploy the service
What the team needs to explore
Before their next assessment, the team needs to:
- carry out performance testing as part of preparing for public beta
11. Make a plan for being offline
The service met point 11 of the Standard.
What the team has done well
The panel was impressed that:
- the service desk run by the MMO in conjunction with the Environment Agency will provide support for the service
- the backstop, should the service be unavailable for any length of time, is that the mobile app retains data (see the sketch after this list)
- there is 24-hour contingency planning in place, although the service is unlikely to be offline for this length of time in practice
- there is a second, back-up cluster in place, providing automatic recovery
- clusters have been switched off at random to further test the robustness of the system
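The mobile-app backstop described above is a store-and-forward pattern: submissions are queued locally and replayed once the service is reachable again. A minimal sketch of that pattern, assuming Python with a hypothetical submission endpoint and record shape (not the team's implementation, which would persist the queue to device storage):

```python
import json
import urllib.request
from collections import deque

SUBMIT_URL = "https://example.service.gov.uk/api/catches"  # hypothetical endpoint
queue: deque = deque()  # a real app would persist this to device storage

def record_catch(record: dict) -> None:
    """Always store locally first, so nothing is lost while offline."""
    queue.append(record)

def flush() -> None:
    """Submit queued records in order; stop at the first failure and retry later."""
    while queue:
        body = json.dumps(queue[0]).encode()
        req = urllib.request.Request(
            SUBMIT_URL, data=body, headers={"Content-Type": "application/json"}
        )
        try:
            urllib.request.urlopen(req, timeout=10)
            queue.popleft()  # discard only after a confirmed submission
        except OSError:
            break  # service unreachable: keep the data, retry on next flush

record_catch({"vessel": "EX123", "species": "COD", "weight_kg": 12.5})
flush()  # called periodically, or when connectivity returns
```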
What the team needs to explore
Before their next assessment, the team needs to:
- continue the good work. This was one of the best responses to point 11 that the panel has assessed, and the Technical Architect explained the system very well during the technical part of the assessment day at 100PS
12. Make sure users succeed first time
The service met point 12 of the Standard, with the conditions below.
What the team has done well
The panel was impressed that:
- the team had tested the service with users in contextual situations, including those using assisted digital support, and found that the vast majority of users were able to use the service first time
- assisted digital support needs are being considered front and centre: the team wrote their user research recruitment briefing specifically to focus on users with low digital capability, and 30% of the users they researched with fell into this bracket. The offline channels they’ve put in place are sustainable and can be iterated, with feedback loops and measurements on which the team act
- the team were able to explain improvements made to the service in response to research, testing and analytics, for example reducing the scope of the service, and designing the call centre route to allow previous callers to bypass the triage process to access support more quickly
- the team had an accessibility audit carried out by a third party and are acting on the recommendations
What the team needs to explore
Before their next assessment, the team needs to:
- consider ways of identifying users other than email address (eg name, vessel ID or phone number), because some users will never have had an email address before and will find other identifiers easier to remember
- consider SMS, instead of email and letters, as a way to communicate with users, and explore the use of GOV.UK Notify (see the sketch after this list)
- complete the start page, and reorder the pages of the online service so that the registration and preferences sections come after the start page (bypassable by returning users, for example by offering a ‘Sign in or register’ option). This must happen before public beta commences, unless the team can provide compelling evidence as to why this isn’t good practice
- consider creating a service community and a step-by-step guide
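GOV.UK Notify provides client libraries for sending SMS from pre-approved templates. A minimal sketch, assuming the official notifications-python-client and placeholder credentials, template ID and phone number (a real template must first be created in Notify):

```python
from notifications_python_client.notifications import NotificationsAPIClient

client = NotificationsAPIClient("<notify-api-key>")  # placeholder API key

# Send a templated SMS, for example confirming that a catch record was received.
response = client.send_sms_notification(
    phone_number="+447700900123",         # placeholder number
    template_id="<template-uuid>",        # placeholder Notify template ID
    personalisation={"vessel": "EX123"},  # fills ((vessel)) in the template
)
print(response["id"])  # Notify returns an ID for tracking delivery status
```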
13. Make the user experience consistent with GOV.UK
The service met point 13 of the Standard.
What the team has done well
The panel was impressed that:
- the service was easy to use, responsive and worked on mobile devices
- the service team includes designers, content designers and front-end developers, working in a fully integrated and holistic way. This extends to the call centre and the people at the ports who will be helping users to complete the service if necessary
- the team has made good use of the GOV.UK frontend toolkit and elements, and of the GOV.UK Design System, to which they plan to propose new patterns. They have adhered closely to the GOV.UK style guide, and the user experience is consistent with GOV.UK
14. Encourage everyone to use the digital service
The service met point 14 of the Standard.
What the team has done well
The panel was impressed that:
- the team have considered various methods of engaging with their users (via physical letter and/or email), and use of the service is a required part of vessel licensing policy
- through their extensive user research the team have already engaged with users, so many are expecting the delivery of the new digital service
- they intend to investigate rolling the service out to 10m+ vessels (which currently use a paper-based method)
15. Collect performance data
The service met point 15 of the Standard.
What the team has done well
The panel was impressed that:
- the service team has an in-depth understanding of their users, demonstrating comprehensive knowledge of the size and shape of their service. They have used this information to produce a data-driven, phased roll-out plan following a risk-based approach. This includes a plan for assisted digital, and targeting users based on the size and activity of their vessels
- the team has made full use of the quantitative and qualitative data available from a number of sources, to improve the service and gain buy-in from the Fisher community
- the service team plan to use online and offline data (call centre, Marine Enforcement Officers, admin database) to monitor and improve the full end-to-end journey
What the team needs to explore
Before their next assessment, the team needs to:
- continue to use the various data sets to monitor take-up, pain points and feedback, using this to target any groups or individuals who need encouragement or support in registering for and using the service
- explore and implement the digital analytics tools available for this service and app, including, as mentioned by the team, Google Tag Manager, to give greater flexibility in what they monitor
- the service is storing commercially sensitive data, so the team should look at paying for Google Analytics so that the analytics data remains MMO’s property (carried over from the alpha assessment)
16. Identify performance indicators
The service met point 16 of the Standard.
What the team has done well
The panel was impressed that:
- the team have a clear idea of what the service needs to achieve to be a success; this includes the service performance indicators and wider metrics that have further-reaching benefits, such as:
  - meeting EU requirements (removing the risk of infraction of EU regulations and subsequent significant fines)
  - improved compliance with licensing agreements
  - scientific research that will inform conservation of fish stocks
- the team have taken a holistic view of the service and have engaged with a wide range of stakeholders to continually monitor the impact of this offering (on fishers, the call centre, Marine Enforcement Officers, other parts of the business and the wider fishing community)
What the team needs to explore
Before their next assessment, the team needs to:
- use the data collected from the phased roll-out to develop targets for their service, especially as this is a new digital service and baseline data was limited
- continue to work with their stats team (in the absence of a performance analyst) to collect, visualise and monitor the data needed to ensure the service is operating successfully
17. Report performance data on the Performance Platform
The service met point 17 of the Standard.
What the team has done well
The panel was impressed that:
- the team are currently working to get their data into a format that can be shared with the Performance Platform
What the team needs to explore
Before their next assessment, the team needs to:
- register their service with the Performance Platform
- continue to work with the Performance Platform team to check that it can support the metrics they want to present on their dashboard
18. Test with the minister
The service met point 18 of the Standard.
What the team has done well
The panel was impressed that:
- the minister has been kept closely sighted on the project, with 3 ministerial submissions since August 2018; the team have also sighted the Welsh minister