DfE Developer Hub Alpha Reassessment Report

From: Central Digital and Data Office
Assessment date: 17/12/2020
Stage: Alpha Reassessment
Result: Met
Service provider: Department for Education

Previous assessment reports

Alpha Assessment August 2020

Service description

The Department for Education is adopting an API-led approach to application integration. As such, there is a widespread need to establish an enterprise-wide approach to managing and governing APIs in order to:

  • promote innovation and support a more integrated approach for suppliers and third parties
  • provide managed access to the department's APIs for third-party suppliers
  • increase re-use of APIs
  • simplify integration and enable data to be used more effectively across the department

The theme that runs throughout this assessment is the panel's clear view that the team is at, or close to, the minimum required to meet the Service Standard. The panel's concerns and questions relate to the broader context of the end-to-end service, which they feel is only weakly defined and owned by the team.

The team's practice generally meets the standard, but the scope of consideration needs to be broadened to solve a whole problem for users, rather than focusing on the core registration transaction alone.

The panel understands that this means the team will have to work outside of its direct management domain to influence and lead the contributions of others, and would look to see significant improvements in service design and leadership to be fully confident in the direction of travel.

Service users

This service is for:

  • third-party suppliers
  • developers

1. Understand users and their needs

Decision

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have learnt from other API services in government, e.g. HMRC and GDS
  • the team demonstrated a better understanding of why third-party suppliers might need to access these APIs, e.g. to create more business opportunities
  • the team better illustrated the range of external users and explained why they had focused on the third-party developers
  • the team gave a good description of the user research undertaken so far including the range of participants and methods used
  • the team clearly explained their riskiest assumptions, how they had tested them and what they had learned
  • the team have improved their personas and demonstrated how they are using them in the design
  • the team provided good examples of how the functionality of the tool had changed based on usability testing e.g. create an account
  • it was clear the whole team have engaged with the user research and acted upon the evidence

What the team needs to explore

Before their next assessment, the team needs to:

  • prioritise the work required to understand the needs of what they referred to as secondary or tertiary users. The design and build are at an advanced stage and not knowing these people presents a risk. These are the people who will help the team achieve the strategic project aims. At present, the design is biased towards people who have to use it as part of their role.
  • reframe the user needs, which were still presented in terms of the tool's functions and contained some jargon. The team have some insights into the problems third-party developers are trying to solve, and the user needs should reflect this, e.g. meeting the expectations of schools and colleges

  • continue to review and update the personas with new evidence. As they are each based on the experiences of one person, they should be referred to as case studies or pen pictures to avoid confusion over what they represent. The team should be mindful of designing based on one person's experience. The range and subtleties of developers' behaviours and motivations could be lost
  • prioritise the research with accessibility users. The panel was disappointed this had still not begun. This reinforces the importance of building accessibility into every phase of research and not seeing it as a job that needs doing from time to time.

  • fully assess their beta user research plan against the people required to deliver it. The team are carrying over a large backlog of feature development and are yet to start accessibility work. Add this to the normal beta workload and it is far in excess of what one person can do well.

In conclusion, the user research to date has focused on the functionality of the tool that has been built. The panel would like to see research and evidence that reflects the end-to-end service driving the team. For example, quicker response times than those proposed came across as a key user need but are yet to be addressed; it is assumed this reflects the pressures third-party developers face. Focusing on this is what will actually make a difference to schools, teachers and students.

2. Solve a whole problem for users

Decision

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have a vision of how their service will work for users, and the kinds of outputs and outcomes their products will enable
  • the team are taking clear responsibility for the interface between the ultimate user and the API owner which supports getting access to the required authorisations
  • the team has made an effort to engage internal and external users

What the team needs to explore

Before their next assessment, the team needs to:

  • support the development of the recently recruited service owner, who understands the scope and takes responsibility for solving the whole problem: supporting the implementation of the API to enable an outcome, rather than simply delivering an application process
  • continue to focus on maintaining a shared understanding of how the whole user journey will work with the teams responsible for different parts of it, rather than simply addressing one element of it

3. Provide a joined-up experience across all channels

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has engaged broadly with user research and investigation, so there is a good understanding of the user perspective
  • the team is testing and making changes to users’ experience based on the feedback they receive and their growing understanding of the audience
  • the team is beginning to develop consistent internal design standards and governance for the presentation and management of APIs
  • the team are working with the BAU team and have visibility of the mailbox, which enables them to learn from problems that are identified and use this to improve the service

What the team needs to explore

Before their next assessment, the team needs to:

  • understand that part of their role in leading the service is to set quality standards for the documentation that supports the API implementation, and to actively support API providers

4. Make the service simple to use

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has worked hard with GDS to address previous comments, and the Developer Hub is a significant and exemplary improvement on the previous iteration
  • the team has demonstrated a clear process in place for content publishing and governance for their federated model
  • the team have provided endpoint documentation

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to follow the GOV.UK Design System: https://design-system.service.gov.uk/

5. Make sure everyone can use the service

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • as mentioned in point 4, the team has worked hard with GDS to address previous comments, and the Developer Hub is a significant and exemplary improvement on the previous iteration

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure the issues identified in the accessibility testing report are addressed
  • convert Word-based guidance and documentation into HTML
  • ensure documentation and error messages continue to be user tested and conform to GDS design patterns and interface guidelines
  • consider documentation such as a “service guide” that takes the user through the whole journey - example service guides can be found on the HMRC portal
  • consider providing a high-level road map of APIs for functionality that will be made available to external developers
  • continue gathering feedback through the satisfaction survey and pop-up surveys
  • test the endpoint documentation with users as part of ongoing user research

6. Have a multidisciplinary team

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the team benefits from most of the key disciplines, such as a user researcher, business analyst and delivery manager
  • the team has identified a sustainable home and leadership for the service going forward
  • the team is transitioning from a managed service to a sustainable team of civil servants

What the team needs to explore

Before their next assessment, the team needs to:

  • work further to clarify the separation of responsibilities and leadership between the product manager and the service owner

7. Use agile ways of working

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the team demonstrated activities that enable them to work in an agile way, such as working in sprints, daily stand-ups with the product team, demoing the work regularly and using retrospectives to learn and iterate within the team
  • the team are using tools such as Slack and MS Teams to keep in touch and collaborate both when in the office and when not working in the same place

8. Iterate and improve frequently

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has stepped up to the challenges of previous assessments and been able to demonstrate iterative improvements in the work

What the team needs to explore

Before their next assessment, the team needs to:

  • continue, as far as COVID-19 restrictions allow, to engage with end users as the broader service elements and transactions are finalised

9. Create a secure service which protects users’ privacy

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • DfE security advisors have been involved throughout the development of the service, advising on key decisions and performing security testing. Threats to the system have been shared and discussed and potential fraud vectors considered

  • the service has been penetration tested by internal security teams and an external pen test is planned before the private beta begins
  • a Data Protection Impact Assessment (DPIA) has been conducted at this early stage
  • the team is working with API owners to put in place an approval process for APIs that may contain sensitive information
  • possible future access management needs including two-factor authentication and identity federation have been considered when selecting Identity and Access Management tooling
  • the team have already integrated logs from the website into existing Security Information and Event Management tooling
  • the DfE Developer Hub will be the public face of the much larger Enterprise API Management Platform (EAPIM). The EAPIM platform ensures that sensible policies around the visibility of the API, API access mechanisms and rate-limiting can be applied by API owners

What the team needs to explore

Before their next assessment, the team needs to:

  • complete the “NCSC Web Check” assessment as recommended by DfE security advisors
  • follow up with an IT Health Check, a threat modelling session and further penetration tests if recommended by DfE security advisors
  • look into options to enforce two-factor authentication for Developer Hub user accounts with subscriptions to APIs containing sensitive information, as sketched below
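
As an illustration of what such an enforcement rule could look like, a minimal sketch in Python follows. The names and data model (Api, Account, must_enforce_two_factor) are hypothetical and are not taken from the team's codebase; in practice the control would more likely be applied through the Identity and Access Management tooling (for example, policies in Azure B2C) rather than in application code.

    # Hypothetical sketch only; names and data model are invented for illustration.
    from dataclasses import dataclass, field

    @dataclass
    class Api:
        name: str
        contains_sensitive_data: bool = False

    @dataclass
    class Account:
        email: str
        two_factor_enabled: bool = False
        subscriptions: list = field(default_factory=list)

    def must_enforce_two_factor(account):
        """True if the account subscribes to any sensitive API but has not enabled 2FA."""
        has_sensitive = any(api.contains_sensitive_data for api in account.subscriptions)
        return has_sensitive and not account.two_factor_enabled

    # Example: this account would be required to set up 2FA before continuing.
    account = Account(
        email="dev@example.com",
        subscriptions=[Api("learner-records", contains_sensitive_data=True)],
    )
    print(must_enforce_two_factor(account))  # True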

10. Define what success looks like and publish performance data

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have identified some key metrics to measure success, such as requests raised and response time, and have taken a broad view of diagnostics

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to iterate and develop the metrics in order to have a more complete, service-wide view on performance as the scope and desired outcomes are refined
  • consider how some of the softer intended outcomes, such as “driving innovation”, will be captured

11. Choose the right tools and technology

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the DfE Developer Hub will be the public face of the much larger Enterprise API Management Platform (EAPIM). The team has selected technology that aligns with the technology already in use as part of the EAPIM platform and that aligns with DfE strategic choices (Azure). The technology choices are working well for the team in alpha and they plan to carry them through into beta
  • existing common tooling will be used for log storage and security information and event management (SIEM)
  • the team has selected test tooling and methodologies that align with DfE strategic choices
  • the team has carefully considered their choice of Identity and Access Management tooling working with DfE architects and the architecture review board to make a decision. Azure B2C was selected after considering alternatives, lock-in factors and the costs and risks of developing a custom solution

12. Make new source code open

Decision

The service met point 12 of the Standard.

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure repositories reflect modern development practices (pull requests, commit messages, tags, releases)
  • add content to README files so it is easy for other services to reuse the code

13. Use and contribute to open standards, common components and patterns

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the service uses the GOV.UK Design System to ensure a consistent GOV.UK experience
  • the team are considering the use of GOV.UK Notify to send emails as part of a developer subscribing to, and gaining approval to use, an API
  • there is a commitment to following GDS API technical and data standards where possible. The Enterprise API Management Platform contains a variety of APIs of various ages and only some of them meet the standards. These APIs are being prioritised for inclusion on the Developer Hub and the service team are working with API owners to bring older APIs in line with the standards where possible
  • as part of the above commitment, APIs will be documented and described using the OpenAPI 3 specification (a minimal illustration follows this list)
  • the team is working with DfE to consider use cases for open access to some data sets
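
For readers unfamiliar with the format, the sketch below shows the shape of a minimal OpenAPI 3 description, built here as a Python dictionary and printed as JSON. The endpoint, fields and titles are invented for illustration and do not describe a real DfE API.

    # Illustrative only: an invented API described with the OpenAPI 3 structure.
    import json

    openapi_document = {
        "openapi": "3.0.3",
        "info": {
            "title": "Example establishments API (illustrative only)",
            "version": "1.0.0",
        },
        "paths": {
            "/establishments/{urn}": {
                "get": {
                    "summary": "Retrieve an establishment by its unique reference number",
                    "parameters": [
                        {"name": "urn", "in": "path", "required": True,
                         "schema": {"type": "integer"}}
                    ],
                    "responses": {
                        "200": {
                            "description": "The establishment record",
                            "content": {
                                "application/json": {
                                    "schema": {
                                        "type": "object",
                                        "properties": {
                                            "urn": {"type": "integer"},
                                            "name": {"type": "string"},
                                        },
                                    }
                                }
                            },
                        }
                    },
                }
            }
        },
    }

    print(json.dumps(openapi_document, indent=2))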

14. Operate a reliable service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • a comprehensive testing strategy prioritising automation wherever possible is in place. This includes practising Test-Driven and Behaviour-Driven Development (TDD and BDD) to drive collaboration and develop test cases (a small illustrative sketch follows this list)
  • an automated deployment pipeline is being developed to help ensure the team can deploy with confidence
  • the team is confident that the technology choices they have made mean the service will be able to handle the capacity needs at beta. Performance testing is planned
  • the service already has some monitoring (and associated alerting) in place using existing tooling that is part of the Enterprise API Management Platform solution
  • a data backup process is already in place
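
As a flavour of the test-first, behaviour-driven style described above, a minimal sketch follows. The behaviour under test and all names are invented for illustration and are not drawn from the team's test suite.

    # Hypothetical example of a BDD-flavoured unit test, written test-first.
    def is_subscription_approved(status):
        """Example unit under test: only approved subscriptions may call an API."""
        return status.strip().lower() == "approved"

    def test_pending_subscription_cannot_call_api():
        # Given a developer whose subscription is still awaiting approval
        status = "pending"
        # When the hub checks whether the subscription is approved
        allowed = is_subscription_approved(status)
        # Then access to the API should not be permitted
        assert allowed is False

    if __name__ == "__main__":
        test_pending_subscription_cannot_call_api()
        print("ok")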

Published 23 January 2021