Digital Assessment Service Platform

The report for the Digital Assessment Service Platform alpha assessment on 06 June 2019

Digital Service Standard assessment report


From: Government Digital Service
Assessment date: 06 June 2019
Stage: Alpha
Result: Met
Service provider: Standards and Testing Agency

Service description

The aim of the assessment service is to provide one place for both internal users and external users to develop, deliver, administer, conduct and report on primary national curriculum assessments.

The aim of the alpha phase is to scope the requirements that enable item validation trials (IVT) for the first assessment that will be delivered through the platform, Digital Reception Baseline Assessment (dRBA). Development of the required functionality will take place during beta, to facilitate dRBA undergoing IVT in September 2020.

Service users

Internal:

  • STA Test Development Researcher
  • STA Project Staff
  • Psychometricians
  • STA Admin
  • STA Support

External:

  • Expert Reviewers
  • Schools - Admin
  • Schools - Assessor
  • Schools - Pupil

Some user research on these groups was collected during the AS/RBA/MTA alphas, with more research or validation needed in beta; user research with the remaining groups has not yet started and is planned for beta.

1. Understand user needs

Decision

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team identified key internal and external user groups
  • the focus on internal user needs has been based on a clear rationale relating to addressing the largest knowledge gaps, rather than on convenience
  • the team demonstrated a good awareness of how the insights on internal users could be refined further. For example, by considering the distinct needs and experiences of newer members of staff
  • the team has made good use of previous research, using it to develop user needs for external user groups that they have not yet directly engaged with
  • the aim of meeting user needs and providing a good experience is reflected in planned performance analytics on user satisfaction, alongside more traditional and transactional measures, such as completion and take-up

What the team needs to explore

Before their next assessment, the team needs to:

  • engage directly with external user groups, such as school administrators and headteachers to better understand their needs and to validate previous findings
  • develop a deeper understanding of how internal user needs may vary. For example, for staff with access needs or for new starters unfamiliar with process or terminology related to test development
  • the end-to-end service was described as a “portal”; things called portals often become collections of disjointed services. It makes sense to have one platform to develop, deliver, conduct and report on primary national curriculum assessments, but be mindful that users may see these as different services and so may have different needs when it comes to onboarding. For example, you might need multiple start pages on GOV.UK

2. Do ongoing user research

Decision

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team used an iterative process of refining design based on research findings
  • research findings were regularly shared with the team to prompt discussions about design implications
  • insights have been shared back to other teams via wider show and tells
  • although working within a clearly defined scope focused on back-end administration, there have been efforts to contribute to wider issues that do not fall directly within their remit. For example, helping the service as a whole to ensure that digital tests are reliable and valid, and that they meet the needs of all users
  • there is an awareness of current limitations of the research, and a clear plan to address these during beta

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that research with external user groups during beta is geographically diverse, including schools with lower Ofsted ratings and schools in poorer areas
  • conduct fieldwork where possible to ensure a good understanding of these potentially diverse contexts and to provide ecological validity
  • given the larger population and potentially more diverse needs of external users, the team should ensure that they access a good mix in terms of digital skills and confidence, access needs, and assisted digital needs
  • continue to ensure that findings that might be out of scope are shared with relevant teams (particularly dRBA), to help improve the design of the wider service

3. Have a multidisciplinary team

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the delivery team was well structured and employed people in all the relevant roles. Nobody was required to perform multiple roles and there was appropriate focus on, and resources assigned to, user research
  • the service owner had the appropriate level of knowledge and autonomy to make day-to-day decisions to improve the service and challenge interference/un-evidenced commissions where required
  • the delivery team were able to scale up and down according to the needs of the service, so that gaps in the team structure or resource profile could be filled quickly before they impacted on the overall delivery of the service

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to ensure they are not over-reliant on contractors by increasing the availability of permanent staff and conducting effective knowledge and skills transfer

4. Use agile methods

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the delivery team were able to explain how they are working in an agile way and were utilising appropriate tools and techniques to collaborate and deliver the service
  • the delivery team were able to provide clear examples of where they have responded to user research to identify issues within their initial design and then iterated accordingly to fix the problem
  • the overall governance of the service is also working in an agile way and has a clear plan for delivering the service in measurable, iterative stages whilst being aware of the risk of changes in policy and having plans to manage that risk

What the team needs to explore

Before their next assessment, the team needs to:

  • work with their estates team to ensure they are co-located within a distinct area which provides the team with the resources and facilities required to support an agile delivery project
  • remain fully joined up with the parent department’s digital team to ensure consistency in approach and effective use of resources, and to maximise opportunities for collaboration, knowledge sharing and upskilling

5. Iterate and improve frequently

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the delivery team were able to provide a clear explanation of what has been built in the Alpha phase along with a sound rationale for why those elements had been prioritised
  • the delivery team were able to demonstrate how the outcomes of user research are fed into the life cycle of a user story to ensure the service is being built to meet user needs
  • the delivery team were able to provide a clear demonstration of their chosen technology stack and were able to confirm it is compliant with their Department’s IT strategy and technical standards
  • the team understands where their service touches other services and is in communication with those teams; they are working closely with MTA and are part of a new wider digital team
  • the team built clickable prototypes for key journeys, iterating them, and throwing some away
  • the team encouraged a multidisciplinary approach to designing and developing features

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure the design remains flexible enough to accommodate their Department’s evolving data strategy and has a clear plan for working with any dependencies within the wider DfE infrastructure
  • ensure the design makes direct use of the appropriate sources of truth for any data it consumes
  • the panel would like to have seen a broader range of ideas and assumptions tested during the alpha phase; it seems as though the outcome was decided early on, although the end design is a reasonable solution to users’ needs

6. Evaluate tools and systems

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the evaluation of the buy vs build decision seemed fair and comprehensive. It is very welcome to see that the team has considered off-the-shelf, commodity solutions, but also recognised the advantages of an internally built open source service
  • the team explained the languages, frameworks and other technical choices made in alpha, and how these will affect the decisions for beta
  • we welcome the technology priorities stated by the team: understand the buy vs build trade-offs, create needed infrastructure, make the solution flexible. These are good priorities to have

What the team needs to explore

Before their next assessment, the team needs to:

  • test that their platform is usable by the downstream development teams who will depend on their API (see the sketch after this list). Using the DfE common Azure platform for deployment and alignment with the DfE API Strategy is welcome, but the introduction of the GraphQL approach may test the boundaries of this
  • demonstrate how they monitor the status of the service, and include alerting as part of this strategy. The use of serverless, small components should make it fairly straightforward to develop a baseline of “likely benign” traffic
  • describe the tech stack changes and development toolchain changes made during beta and why. The team should note that toolsets are expected to continue to evolve, and the early decisions around the common Department for Education platforms and tools are open for revisiting as beta development progresses
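
To illustrate the kind of contract check a downstream team could run against the platform’s GraphQL endpoint, here is a minimal sketch in Python. The endpoint URL, query shape and field names (assessmentItems, id, title, tags) are illustrative assumptions, not the team’s actual schema, and the requests library is assumed to be available.

```python
# Hypothetical contract test a downstream team might run against the
# platform's GraphQL endpoint. URL and field names are assumptions.
import requests

GRAPHQL_URL = "https://assessment-platform.example.education.gov.uk/graphql"

QUERY = """
query ItemsForTrial($limit: Int!) {
  assessmentItems(first: $limit) {
    id
    title
    tags
  }
}
"""

def fetch_items(limit=10):
    """POST a GraphQL query and return the list of items, raising on errors."""
    response = requests.post(
        GRAPHQL_URL,
        json={"query": QUERY, "variables": {"limit": limit}},
        timeout=10,
    )
    response.raise_for_status()
    payload = response.json()
    if payload.get("errors"):
        raise RuntimeError(f"GraphQL errors: {payload['errors']}")
    return payload["data"]["assessmentItems"]

if __name__ == "__main__":
    items = fetch_items(limit=5)
    # Check the fields the downstream team depends on are present.
    assert all({"id", "title", "tags"} <= item.keys() for item in items)
    print(f"Fetched {len(items)} items")
```

Running a small suite of checks like this from each consuming team’s pipeline would surface breaking schema changes early.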

7. Understand security and privacy issues

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has conducted a security posture review, and benefits from regular access to a security-experienced team member. The threat model captured in the anti-personas developed by the team is a good way of thinking about the adversarial problem
  • the team is aware of the potential sensitivity of data being stored within the service and has a plan to partition sensitive from non-sensitive data
  • the appropriate privacy officers are aware of the project and are considering the data involved. We note that the service collects nationality and other data about pupils which could be considered sensitive, and that there are sections of the General Data Protection Regulation which apply specifically to systems used by children

What the team needs to explore

Before their next assessment, the team needs to:

  • implement multi-factor authentication for all agency users, and users in schools as well if at all possible, given the sensitivity of the data. We also suggest that the team review and implement the current NCSC guidance on passwords, noting the sections on complexity requirements, length, multi-factor and expiry
  • do more thinking about the monitoring solutions that will be in place, and specifically how data flows will be instrumented to detect exfiltration of data (see the sketch after this list). The team should have a plan in place to understand any anomalous exchanges of information between technical service components and a way to identify if data leaves the trusted virtual network
  • describe how they are managing, logging and auditing the process of information flowing in and out of the system. As the system becomes more integrated with other data feeds from DfE and elsewhere, keeping track of the provenance of data additions and updates will become more important
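
As an illustration of the baseline-and-alert idea above, the sketch below flags components whose outbound data volume jumps well above their recent history. The component names, sample data and threshold are hypothetical; in practice this logic would more likely live in the platform’s monitoring tooling (for example Azure Monitor alerts) than in application code.

```python
# Illustrative sketch: baseline each component's outbound data volume and
# flag anomalies. Names, numbers and the 3-sigma threshold are assumptions.
from statistics import mean, stdev

def find_anomalies(history, current, sigma=3.0, min_samples=7):
    """Return components whose current egress is far above their baseline.

    history: dict of component name -> list of recent daily byte counts
    current: dict of component name -> today's byte count
    """
    anomalies = []
    for component, samples in history.items():
        if len(samples) < min_samples:
            continue  # not enough data to form a baseline yet
        baseline, spread = mean(samples), stdev(samples)
        if current.get(component, 0) > baseline + sigma * max(spread, 1.0):
            anomalies.append(component)
    return anomalies

history = {
    "results-api": [118_000, 121_000, 119_500, 120_000, 122_000, 118_500, 121_500, 120_500],
    "item-bank":   [80_000, 82_000, 79_500, 81_000, 80_500, 78_000, 81_500, 80_000],
}
today = {"results-api": 121_000, "item-bank": 5_000_000}  # item-bank spikes suspiciously
print(find_anomalies(history, today))  # ['item-bank']
```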

8. Make all new source code open

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has begun discussions with the appropriate security assurance authorities and asked for permission to work openly. The team is aware of the requirement to work in the open and the limited number of exceptions to open code, which meets the requirement for an Alpha-stage assessment
  • the requirement to work in the open and retain intellectual property rights was a factor in the decision to build rather than buy a packaged solution, which is welcome
  • the team expects to be able to work from an open GitHub repository once they receive clearance to do so, and will be able to separate credentials and other configuration code from the business logic of the service

What the team needs to explore

Before their next assessment, the team needs to:

  • gain the appropriate permissions and workflow affordances needed to open their code and work from an open repository
  • show evidence of continuous working in the open, including ongoing releases with commit comments, rather than periodic exports or flattened releases
  • publish the data schemas used to model the assessments and items and create API documentation available to developers making use of the data from the service
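
As an example of what a published item schema might look like, the sketch below defines a machine-readable JSON Schema for an assessment item and validates a sample record against it using the third-party jsonschema library. The field names (itemId, questionText, tags, maxMarks) are assumptions for illustration only, not the team’s actual data model.

```python
# Illustrative published schema for an assessment item, validated with the
# jsonschema library. Field names are hypothetical.
from jsonschema import validate

ASSESSMENT_ITEM_SCHEMA = {
    "$schema": "http://json-schema.org/draft-07/schema#",
    "title": "AssessmentItem",
    "type": "object",
    "required": ["itemId", "questionText", "maxMarks"],
    "properties": {
        "itemId": {"type": "string"},
        "questionText": {"type": "string"},
        "tags": {"type": "array", "items": {"type": "string"}},
        "maxMarks": {"type": "integer", "minimum": 0},
    },
    "additionalProperties": False,
}

sample_item = {
    "itemId": "RBA-001",
    "questionText": "How many apples are in the picture?",
    "tags": ["mathematics", "counting"],
    "maxMarks": 1,
}

# Raises jsonschema.ValidationError if the item does not match the schema.
validate(instance=sample_item, schema=ASSESSMENT_ITEM_SCHEMA)
print("sample item conforms to the published schema")
```

Publishing schemas like this alongside the API documentation gives third-party developers something concrete to validate against.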

9. Use open standards and common platforms

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the team intends to use the common Department for Education platform for deployment of the service, which is very welcome and should be an efficient approach to hosting and operating the digital service
  • the solution will be built using standard open source components, which will help to avoid contract lock-in
  • the team has considered the use of common platforms for the system, and has adapted the common GOV.UK Design Patterns. The team is also prepared to contribute design patterns relating to administrative interfaces, which is welcome
  • it is welcome that the team has agreed to consider the publication of open data feeds using this assessment service platform wherever possible

What the team needs to explore

Before their next assessment, the team needs to:

  • consider whether the project could either consume or create Registers, given the authoritative lists and non-proprietary IDs being worked with by the system. The team should consider connecting directly with the EduBase/Schools Register projects at DfE to look at whether lists of schools can be reconciled and maintained in a common way
  • document the shape of the data, where it’s stored, how it moves, how it’s logged, other places it comes from and how it’s transformed. This could have an effect on a number of design decisions
  • consider advancing the thinking around the long-term data strategy, as the underlying data questions can lead to complex integration and feasibility issues. It is not too early to look at the characteristics of data moving in and out of the systems including batch, transformation, integration, messaging integration, changes to master learner records and Pupil Identification Numbers, etc. This should include analysis of legacy systems within the Department for Education ecosystem
  • consider using platforms such as GOV.UK Notify to help with the implementation of multi-factor authentication (see the sketch after this list)
  • propose the new Assessment Standard as a formal challenge to the government Open Standards Board, and then put forward your solution and the governance you intend for the evolution of the standard over time. This will help to establish a trusted ecosystem for the third-party developers that you expect to be working with over time
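
On the GOV.UK Notify suggestion, below is a minimal sketch of sending a one-time code by SMS using the notifications-python-client library. The template ID, API key handling and code storage are illustrative assumptions; a real implementation would also need rate limiting, code expiry and secure storage of pending codes.

```python
# Minimal sketch: send a one-time code by SMS via GOV.UK Notify.
# Template ID and key handling are illustrative assumptions.
import os
import secrets

from notifications_python_client.notifications import NotificationsAPIClient

NOTIFY_TEMPLATE_ID = "00000000-0000-0000-0000-000000000000"  # hypothetical SMS template

def send_login_code(phone_number: str) -> str:
    """Generate a 6-digit one-time code and send it to the user by SMS."""
    code = f"{secrets.randbelow(1_000_000):06d}"
    client = NotificationsAPIClient(os.environ["NOTIFY_API_KEY"])
    client.send_sms_notification(
        phone_number=phone_number,
        template_id=NOTIFY_TEMPLATE_ID,
        personalisation={"code": code},
    )
    return code  # the caller should store a hash of this with a short expiry
```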

10. Test the end-to-end service

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the alpha service has been tested with users and there is a plan in place for automated and frequent testing of the new platform
  • the team has good relationships with their user community and has recruited representatives of many of the potential user personas
  • the interface for managing the assessment items themselves has evolved based on user testing, and the “preview” and other features seem to be well considered
  • the choice of public cloud environment seems to form the foundation of an effective deployment environment, with the possibility to create new environments quickly
  • the team had performed both performance analysis and load testing to validate performance against the expected service level

What the team needs to explore

Before their next assessment, the team needs to:

  • move on from thinking about a “minimum viable product” as the minimal set of features deliverable within the timeframe, and focus instead on creating the system that creates value for users. For example, given the number of items that will be managed by the system, exploring tag-based rather than keyword-based patterns for searching and managing items, and for managing their metadata, will have a significant effect on usability and will also affect the design of the beta solution (see the sketch after this list). The team should take a broad and deep approach to delivering value for users, using the expected volume of production data for both test configuration and results
  • test assumptions early on for the end-to-end service, including the complexities of deploying to end-user devices with limited connectivity. Despite this being a novel approach representing a significant risk, the “dual device” nature of the service did not yet appear to have been tested on the wide variety of potential end-user tablets and other devices likely to be found in use, nor to have been validated in the limited connectivity environments that formed part of the design constraint. These are significant risks, even for a service at an Alpha phase
  • conduct tests in an environment as similar to live deployment as possible, and if there are specific minimal specifications for these end-user devices, their browsers, or for broadband connectivity, these constraints should be validated and documented. This testing on devices, especially offline, is critical. The team should look at any fraud vectors arising from synching, re-connecting and restoring session progress
  • test with users requiring assisted digital support, including the assessment facilitator and various school administration personas
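
To make the tag-based versus keyword-based point above concrete, here is a toy sketch contrasting the two approaches over a small, invented item bank. The item structure and tag vocabulary are purely illustrative.

```python
# Toy contrast between keyword search and tag-based filtering of assessment
# items. The Item structure and tags are hypothetical, to show why curated
# metadata scales better than free-text search as the item bank grows.
from dataclasses import dataclass, field

@dataclass
class Item:
    item_id: str
    title: str
    tags: set[str] = field(default_factory=set)

ITEMS = [
    Item("RBA-001", "Counting apples", {"mathematics", "counting", "early-years"}),
    Item("RBA-002", "Matching shapes", {"mathematics", "shape"}),
    Item("RBA-003", "Repeating simple words", {"language", "communication"}),
]

def keyword_search(items, term):
    """Free-text search: only matches if the word happens to appear in the title."""
    return [i for i in items if term.lower() in i.title.lower()]

def tag_search(items, required_tags):
    """Structured search: matches on curated metadata regardless of title wording."""
    return [i for i in items if required_tags <= i.tags]

print([i.item_id for i in keyword_search(ITEMS, "maths")])      # [] - misses both maths items
print([i.item_id for i in tag_search(ITEMS, {"mathematics"})])   # ['RBA-001', 'RBA-002']
```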

11. Make a plan for being offline

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • because of the expected disconnected nature of the service, clients are expected to cache data continuously and the team is exploring how offline sync will work
  • the team explained how users would be affected if the service was unavailable for any length of time

What the team needs to explore

Before their next assessment, the team needs to:

  • test the connection, disconnection and reconnection of clients from the central service
  • perform disaster recovery drills exercising the multi-zone Azure strategy for disaster failover
  • verify the user experience in configuring, downloading and returning results from assessments in periods of reduced connectivity and other service interruptions
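
As an illustration of the reconnection concern, the sketch below shows an “outbox” pattern: responses are queued locally and flushed when connectivity returns. The payload shape and submit callable are hypothetical, and in the real service this logic would run on the assessment device (for example in the browser) rather than as server-side Python.

```python
# Minimal sketch of an "outbox" pattern for offline resilience: completed
# assessment responses are queued locally and flushed when connectivity
# returns. The submit() callable and payload shape are hypothetical.
import json
from collections import deque

class ResponseOutbox:
    def __init__(self, submit):
        self._submit = submit          # callable that sends one payload, may raise
        self._pending = deque()        # responses captured while offline

    def record(self, payload):
        """Always queue locally first, so nothing is lost if the network drops."""
        self._pending.append(json.dumps(payload))
        self.flush()

    def flush(self):
        """Try to send everything queued; stop (and keep the rest) on first failure."""
        while self._pending:
            head = self._pending[0]
            try:
                self._submit(json.loads(head))
            except ConnectionError:
                return False           # still offline - retry on the next flush()
            self._pending.popleft()    # only drop the item once it was accepted
        return True
```

A consequence of this pattern is that the same response can be re-sent if an acknowledgement is lost, so the central service needs to deduplicate submissions; this ties into the fraud vectors around syncing, re-connecting and restoring session progress noted under point 10.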

12. Make sure users succeed first time

Decision

The service met point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • during alpha the team have had a small number of users to test with, and these users rate themselves highly for digital skills. It’s good to see that the team have a plan for research with people with access needs by August 2019

What the team needs to explore

Before their next assessment, the team needs to:

  • when designing for external users, the team must test the service on multiple devices and screen resolutions
  • when designing for external users, consider what support channels will be available and test these journeys alongside the digital service

13. Make the user experience consistent with GOV.UK

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the team are developing new design patterns based on the GOV.UK design patterns, and we hope to see these shared with the wider design community for feedback

What the team needs to explore

Before their next assessment, the team needs to:

  • share design patterns with the community on the service design mailing list or through Slack
  • ensure new patterns are accessible, including checking colour contrast and keyboard navigation
  • in beta we’d like to see the name of the service developed to reflect the pattern of verb-led names. As the platform may contain multiple services you could explore names for each one, for example, “Design national curriculum assessments”, “Run a national curriculum assessment”, “View national curriculum assessment performance” etc

14. Encourage everyone to use the digital service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the delivery team have conducted research to understand the majority of users’ digital inclusion scores and have identified the potential edge cases where there may be reluctance or an inability to make use of the service
  • the design will facilitate the capture of more valuable data than the current non-digital approach
  • the design will make it easier for users with accessibility needs to make use of the main service than the current non-digital approach

15. Collect performance data

Decision

The service met point 15 of the Standard.

What the team has done well

The panel was impressed that:

  • the delivery team were able to explain the rationale for the data they intend the service to capture, and have selected an appropriate suite of data capture and analytical tools for this purpose
  • the delivery team have given appropriate consideration to the performance requirements to ensure the service performs as expected when under high usage
  • the delivery team have a clear understanding of the key user journeys that will enable the service to monitor the progress of user groups when the service is being used at scale

16. Identify performance indicators

Decision

The service met point 16 of the Standard.

What the team has done well

The panel was impressed that:

  • the delivery team were able to articulate the baselines for the key performance indicators of the existing service, for comparison with the new service
  • the delivery team were able to articulate a plan for measuring the four mandatory key performance indicators

17. Report performance data on the Performance Platform

Decision

The service met point 17 of the Standard.

What the team has done well

The panel was impressed that:

  • the delivery team were able to confirm they have registered the service with the Performance Platform and have checked it can support the metrics they wish to present

What the team needs to explore

Before their next assessment, the team needs to:

  • complete the integration with the Performance Platform and demonstrate it to the assessment panel

Published 21 October 2022