Export from Africa to the UK alpha assessment
Report for the Export from Africa to the UK alpha assessment on 2 November 2021
From: Central Digital & Data Office (CDDO)
Assessment date: 2 November 2021
Stage: Alpha
Result: Not Met
Service provider: FCDO
Service description
This service helps African businesses understand: export opportunities to the UK, programmes provided by the UK government and other organisations that support trade with and from developing nations, and the technicalities of exporting from Africa to the UK, such as preferential tariffs and regulations. If African businesses can’t find what they need in existing GOV.UK pages and services, this service enables them to apply for tailored help through the Growth Gateway’s UK-Africa trade team.
Service users
This service is for African businesses seeking information and support to export to the UK.
1. Understand users and their needs
Decision
The service did not meet point 1 of the Standard.
What the team has done well
The panel was impressed that:
- the team has responded to the needs of content creators by producing a content management model, to ensure content is managed sustainably
- the team made content iterations based on pain points and user needs which were gathered in Discovery and Alpha research
- the team has outlined a plan for research in Private Beta, taking a mixed quantitative and qualitative data-based approach to fill specific research gaps from Alpha
- the team’s Service Owner expressed an interest in the team conducting fuller user research
What the team needs to explore
Before their alpha reassessment, the team needs to:
- better define their target audience: either expand understanding of the diversity of users across Africa, or better target the needs of users within specific countries or contexts
- increase the scale of research to be confident that findings are accurate, robust and reliable, going beyond sample sizes based on the Rule of Five. The panel felt the Rule of Five was not appropriate for this research, given the size of the target audience and the number of unknowns at this stage of the project
- demonstrate understanding of the as-is journeys to exporting to the UK from Africa
- demonstrate understanding of the context of typical user journeys, and the needs and pain points of a greater diversity of user groups and types within the target audience of exporters, beyond the role of business owners. The team should consider that not all users may be the business owners themselves (for example, administrative staff). This may include mapping the digital context of the service from a user’s perspective (identifying competitors), or reviewing their personas and fleshing out the journeys different user groups may take
Before beta assessment, the team needs to:
- explore translation and language requirements, as research has only been conducted in locations where English is either an official language or a de facto official language
- demonstrate understanding of all users in the end to end process, including internal users delivering in-person advice to exporters
- demonstrate their understanding of the offline journeys relating to this service, and the contexts which would lead users to the service, including points of entry to the service
- build empathy for users within the team by having team members from disciplines other than User Research and Content Design observe or participate in user research
2. Solve a whole problem for users
Decision
The service did not meet point 2 of the Standard.
What the team has done well
The panel was impressed that:
- the service team has explored the problem from the perspective of some users and also considered the wider context of existing services, which they demonstrated in mapping out the current service landscape and overlaying their key user needs on top to find the gap
- the team has been working in the open with other government organisations with a similar remit
- the service is scoped according to how users think, in that user research has determined that people want to see information before reaching out for help
What the team needs to explore
Before their alpha reassessment, the team needs to:
- define what geographies the service will serve, to enable a fuller understanding of users’ contexts in different African countries, and how the service is solving the whole problem within those contexts
- account for the existing major services and contexts beyond UK government services (for example, national or third-sector initiatives) in the target geographies, and understand their impact on the problem space and the end-to-end journey
- evidence why signposting users to in-person advice services is preferable to a fully digital offer, in the context of sustainable working and current policy, given the seemingly high digital capabilities of service users, and why this approach solves the whole problem better than a digital offer (for both internal and external service users)
- understand traffic to existing comparable services, to explore the risky assumption that this target user group will transition from comparable services to this service
- review where the service will be hosted, in order to align with other offers relating to exporting to the UK, and ensure that the service is likely to reach the target audience
Before their beta assessment, the team needs to:
- review the whole problem: if the problem is that export-to-UK information is unclear and disjointed, this service solves it by collating relevant content and also providing a person to help. However, it does not address the root problem, that the exporting process itself is complex, so anything the team can do to influence that space should also be considered
- explore how the service would be scaled, and better understand the risks of high/low take-up of the service
3. Provide a joined-up experience across all channels
Decision
The service met point 3 of the Standard.
What the team has done well
The panel was impressed that:
- the team engaged in pair writing with policy to ensure documentation was clear to users and factually accurate, avoiding jargon where possible
- the project team is working closely with the in-person support team to understand the questions being asked and to create content to support them; the in-person support team has been able to contribute to the development of the digital service
- there are longer term plans to use digital tooling to track questions and resolutions as the service scales
- the email address for the in-person support team is not kept secret, and should the service go down, the email address will be displayed so that users do not lose access to the help they need
What the team needs to explore
Before their beta assessment, the team needs to:
- consider, and document, the limits of the service relative to the in-person service: for example, what they cannot answer, when they are not able to help, how they might offboard a user, and what happens with a dissatisfied user who does not feel their question was answered
- decide whether the tracking of user satisfaction KPIs ends at the handover when the service signposts users to other relevant services from other organisations, or whether the service will be aware of what happens to the user after the signposting
- document and standardise the signposting process: for example, whether they will provide contact details, make an introduction, or effectively transfer the user elsewhere, and whether that mechanism will evolve over time
4. Make the service simple to use
Decision
The service met point 4 of the Standard.
What the team has done well
The panel was impressed that:
- the service team is using GOV.UK patterns
- the team has collated various links and information that are useful to the user, and given users the option to reach a person at the earliest point. The team surfaced the contact button, and it is clear they want people to contact them
- the service team has tested and iterated the service collaboratively, based on user feedback
- the content designers have provided guidance text to support the user journey
What the team needs to explore
Before their beta assessment, the team needs to:
- test the service from end to end from how the users would land on the page through to the end of the interaction with the in-person team
- ensure they test the service on feature phones and with users who use feature phones
- capture whether English is a first language, and prioritise speaking to those who are less comfortable operating in English
5. Make sure everyone can use the service
Decision
The service did not meet point 5 of the Standard.
What the team has done well
The panel was impressed that:
- the Service Owner is committed to ensuring the system is accessible and available to all users as part of its remit
- the team has conducted some accessibility testing with UK-based proxy users with a range of accessibility needs
- the team has responded to initial feedback on accessibility from proxy users
- the team has used GOV.UK design patterns, to ensure they are compliant with AA accessibility standards
What the team needs to explore
Before their alpha reassessment, the team needs to:
- decide whether the project should hold itself to the same digital inclusion requirements as a project with a UK citizen-based audience. If the team decides to promote digital inclusion as part of the service, they should explore the specific digital inclusion requirements the target audience will face, and seek to understand the specific accessibility and digital inclusion challenges faced in different countries across Africa, which may differ from those in the UK
- review how users’ digital skills are reported, to account for the Dunning-Kruger effect. The current research sample has included an unusually high number of ‘expert’ users, and further research may be needed to understand if this is truly reflective of the target audience. A greater diversity of users should be sought going forward if possible
Before their beta assessment, the team needs to:
- workshop how they might reach truly cold users: those not known to any of the existing channels and established groups
- find and test with independent users: those who will find the service on their own, with no signposting, no existing relationship, and no explanation or introduction from an organisation
- identify what assisted digital support service users may require, and explore how this can be delivered
- use channels other than online surveys to access assisted digital users and/or users with accessibility requirements
6. Have a multidisciplinary team
Decision
The service met point 6 of the Standard.
What the team has done well
The panel was impressed that:
- the team worked hand in hand with the in-person service team, seeking out opportunities to learn from one another and reflect on how best to solve the whole problem for users
- the Service Owner provides permanent, continuous leadership of a team that is currently outsourced and will be brought in-house at the beta stage
- the team, permanent and interim, digital and in-person service, had a shared plan of when to hand over and how. Some discussions had already taken place with the DDaT leadership of FCDO to make that possible
- the team responded to the recommendations during their mock alpha assessment and reconstituted a team with appropriate representation of professions at this stage. Combined with close collaboration with the in-person service team, this makes for a very strong alpha team
What the team needs to explore
Before their beta assessment, the team needs to:
- ensure the Service Owner gets involved in user research and makes a point of attending some research sessions firsthand, to strengthen their understanding of users and their needs across the programme and both the digital and in-person services
- further engage with practitioners in FCDO. If the team that will take on the service is not yet clear, the team should keep the heads of profession aware of their learnings and thinking as work progresses. This is already being done with the technology profession, but at assessment the team did not evidence doing this with the user-centred design professions. This poses a risk for the future of the service, and is a missed opportunity for FCDO to learn about its users and avoid repeating similar work in the future
- to a similar end, participate further in communities of practice, so as to share their knowledge with the practitioner DDaT team in FCDO beyond just handing over documentation
7. Use agile ways of working
Decision
The service met point 7 of the Standard.
What the team has done well
The panel was impressed that:
- the team held the ceremonies expected to enable the team to work in an agile way
What the team needs to explore
Before their beta assessment, the team needs to:
- embed user-testing through all of their work. As identified in Points 1 and 2, user engagement has been limited to date despite a very large and varied set of users identified for the service
- engage with wider stakeholders at assessment: from wider DDaT colleagues, and FCDO teams. There was limited evidence at assessment that this engagement is happening beyond the immediate digital and in-person teams
- test whether the approach taken at alpha is delivering the expected benefits. While the team considered a range of approaches, including using mobile phones as a journey start point, they homed in on one approach at alpha. This relates to the recommendations made under Point 5
8. Iterate and improve frequently
Decision
The service met point 8 of the Standard.
What the team has done well
The panel was impressed that:
- the team showed strong iteration of their content and interaction design, on the back of user-testing feedback loops
What the team needs to explore
Before their beta assessment, the team needs to:
- test iterations of the service with a more representative sample of users
9. Create a secure service which protects users’ privacy
Decision
The service met point 9 of the Standard.
What the team has done well
The panel was impressed that:
- the team are minimising both the amount of data held and the length of time it is held for
- the team are minimising the amount of data to be collected
- the team have already completed a Data Protection Impact Assessment (DPIA) on this basis
What the team needs to explore
Before their beta assessment, the team needs to:
- ensure that data from partially completed forms is reliably removed from persistent storage
10. Define what success looks like and publish performance data
Decision
The service met point 10 of the Standard.
What the team has done well
The panel was impressed that:
- the team identified a number of metrics to assess the performance of the service
- the team considered the ways they would learn from both the digital and in-person services to improve their ability to solve the whole problem for users
What the team needs to explore
Before their beta assessment, the team needs to:
- have a clear plan to collect, monitor and publish all required performance management metrics outlined in the Service Standard at minimum, and possibly additional metrics too
- use the intelligence from this data to inform iterations and improvements to the multi-channel service (whether digital or in-person)
11. Choose the right tools and technology
Decision
The service met point 11 of the Standard.
What the team has done well
The panel was impressed that:
- the team has done more thinking than expected on how they will approach beta with respect to technology and tools. The choices are technically sound and make sense in the organisational context
- the team has identified that a simple architecture is all that is required for this service
- the choice of tools has minimised the amount of custom work required
- the team has chosen to rely on common platforms to minimise the maintenance of the service over time
What the team needs to explore
Before their beta assessment, the team needs to:
- investigate whether any content delivery network used has a geographically suitable point of presence, should one become necessary
- investigate whether the chosen approach to observability is adequate to satisfy the needs of anybody providing support
12. Make new source code open
Decision
The service met point 12 of the Standard.
What the team has done well
The panel was impressed that:
- the team was very keen to ensure that the digital assets for the service were released as open source once they are available in beta
- the team has already considered which licence they intend to use (MIT)
- the team has already thought about how they will separate configuration to ensure it is kept private
13. Use and contribute to open standards, common components and patterns
Decision
The service met point 13 of the Standard.
What the team has done well
The panel was impressed that:
- the team is keen to use open standards where the opportunity arises
- the team is using standard GOV.UK patterns
What the team needs to explore
Before their beta assessment, the team needs to:
- contribute back to the component library if the team identifies anything new while creating the service, particularly around their wizard
14. Operate a reliable service
Decision
The service met point 14 of the Standard.
What the team has done well
The panel was impressed that:
- the team has designed a simple architecture, minimising the points of failure
- the team has a plan should the service become unavailable
What the team needs to explore
Before their beta assessment, the team needs to:
- ensure that they find a way to avoid losing tacit domain knowledge as the team changes
- integrate civil servants into the service team earlier than initially planned