Digital Reception Baseline
The report for the Digital Reception Baseline alpha assessment on 03 April 2019
From: Government Digital Service
Assessment date: 03/04/2019
Stage: Alpha
Result: Met
Service provider: Standards and Testing Agency (STA)
Service description
The service is an online assessment of a pupil’s ability in communication, language and literacy, and early maths skills.
Service users
Reception pupils, school teachers/classroom assistants, and the Standards and Testing Agency
1. Understand user needs
Decision
The team met point 1 of the Standard.
What the team has done well
The panel was impressed that the team:
- identified some key user groups: four-year-old pupils who have just joined reception, assessment administrators (teachers), and head teachers/school administrators
- understood the needs of the pupils interacting with a digital assessment and have iterated the digital service based on this evidence
- acknowledged that they will need to do research with the two device approach once they are technically able to
- is working with educational psychologists and test development researchers to develop and then test elements of the service
- collaborated with other government departments and agencies, such as the Skills Funding Agency, the Cabinet Office Civil Service Competencies Team and EdTech, and with organisations such as CBeebies and BBC Education - learning from their approach to working with four-year-olds
- took a solid approach to research during alpha, using a range of methodologies to understand context and user needs; the use of heat maps on the prototype was shown to be effective and innovative
- is exploring options for schools that don’t have reliable wifi connections
What the team needs to explore
Before their next assessment, the team needs to:
- consider diversity and inclusion across all user groups much more widely, covering a wider range of digital inclusion levels for both adults and children
- include the access and assisted digital needs of all user groups; although the assessment is technically assisted digital, the needs of all groups across the end-to-end service need to be explored, including how the standardisation of the assistance provided by test administrators will be handled
- understand whether the lack of a standardised device and screen size for the assessment will affect its outcome. As part of private beta, it is recommended that the team explores further whether all schools have the infrastructure and capability to undertake the assessment
- ensure that a digital-only assessment with no standardisation of devices will not disadvantage any pupils, particularly those who have not had access to devices prior to joining reception
- diversify the kinds of testing being researched, and collaborate with academics to understand whether this should be a digital-only assessment. Explore questions such as: should there be a combination of digital and physical elements, such as a physical test for pupils with digital data capture and entry for test administrators? How would this affect the administration and outcome of the assessment?
- as part of beta, we recommend that you continue to explore whether there is an enterprise-level scalable and affordable option for schools with unreliable wifi, particularly for poorly performing schools with budget difficulties
2. Do ongoing user research
Decision
The team met point 2 of the Standard.
What the team has done well
The panel was impressed that:
- the team clearly have used, and will continue to use, research evidence to iterate design and development and to plan sprints and longer-term work, including through weekly research review sessions
- the team have good plans for the challenging circumstances of only being able to do contextual research with intended users in a six-week window each year around September
- the team has devised a thorough and clear plan for researching user needs in more depth in beta
What the team needs to explore
Before their next assessment, the team needs to:
- explore the setting up of the assessment with teachers in more depth, across a range of capabilities and school settings (e.g. poor wifi connection, limited access to devices), to understand whether they can set it up as well as administer the test, whether it is workable to use two devices, and what assessors’ training needs are. The team acknowledged that this has not been possible as yet
- explore how learnings and insight from the interim solution will be incorporated into the development of the service
- do research in schools with lower Ofsted ratings, in poorer areas, and across diverse geographic areas. Diversity in schools will be key to learning how to implement the service and administer the assessment
- factor in the scale and type of research they want to do in September - testing two devices will need at least two researchers
- continue to collaborate with academics, such as developmental psychologists, education researchers and early years experts, who are doing research in this space
- work with the Standards and Testing platform team to understand how all other user journeys, particularly those for other key groups such as head teachers, school administrators and other senior stakeholders, relate to the reception baseline test
3. Have a multidisciplinary team
Decision
The team met point 3 of the Standard.
What the team has done well
The panel was impressed that the team:
- worked well as a multidisciplinary team, with the product team collaborating closely with STA test developers, educational psychologists and policymakers to drive the service and policy development for the Reception Baseline Assessment
- has a mix of approximately 60% permanent civil servants and 40% contractors working on the project, ensuring that knowledge and skills can be retained and transferred within the team
- are hiring a civil servant business analyst for beta to support engagement and analysis with the STA
- are bringing in a performance analyst in beta to help them to define KPIs and collect and analyse appropriate performance data
- collaborated with external teams, such as CBeebies and the psychometric testing team at the Cabinet Office, to draw on their expertise and knowledge of working with young children, and conducting digital assessments
- are utilising test developers to help write appropriate written/verbal content for four-year-old users
What the team needs to explore
Before their next assessment, the team needs to:
- continue collaborating with test developers and educational psychologists to ensure that any written and verbal content in the Reception Baseline Assessment is clear and comprehensible for all users, including pupils, test administrators and secondary users (e.g. school administrators)
- continue collaborating with external teams, such as CBeebies, to bring industry best practice and knowledge into the team. The team should also look to join up more closely with the academic community (e.g. researchers with interests in child development) to ensure that the assessment is informed by a robust evidence base and open to expert review
- consider hiring another Full Time Equivalent user researcher to help support user research activities in beta.
4. Use agile methods
Decision
The team met point 4 of the Standard.
What the team has done well
The panel was impressed that the team:
- have used agile methods throughout the project, are working effectively in two-week sprint cycles, and conduct appropriate ceremonies on a day-to-day basis
- at kick-off, established roles and responsibilities with a series of workshops
- have set up a scrum of scrums across the digital projects in the STA, which has helped to drive and develop a roadmap for all STA digital projects, including the Assessment Platform
- have held weekly key stakeholder meetings, and attended monthly board meetings to ensure buy-in and input from the business
What the team needs to explore
Before their next assessment, the team needs to:
- continue collaborating with the Assessment Platform team closely, and other related Department for Education (DfE) teams such as Sign-in, to ensure that all teams have a shared vision and are working towards developing an end-to-end assessment service that works for all users
5. Iterate and improve frequently
Decision
The team met point 5 of the Standard.
What the team has done well
The panel was impressed that:
- the designer and researcher are collaborating closely and working in regular sprint cycles to regularly iterate and improve their design. For example, the team demonstrated how they have used insights from user research to develop and iterate aspects of the interaction design: they have designed and tested feedback interactions (e.g. highlighting the target area with a different colour on touch to give visual feedback to the child), and experimented with drag-and-drop functionality, all driven and informed by user research
- the team is also working with policy to help iteratively develop the associated policy in this new assessments space
What the team needs to explore
Before their next assessment, the team needs to:
- continue to experiment with a range of possible digital and non-digital Minimum Viable Products (MVPs) with users in beta, rather than narrowing their design and iteration scope upfront with the current prototype
- continue to explore a range of other simpler digital (and possibly non-digital) solutions in beta - which could meet both user needs and the identified business aims - before committing to the current two-device and digital-only solution. The team are aware that the currently proposed two-device solution may not be the right approach and intend to test this in beta to understand whether it meets user needs
6. Evaluate tools and systems
Decision
The team met point 6 of the Standard.
What the team has done well
The panel was impressed that:
- the systems architecture plan has been well thought out, and adopting serverless architecture is commendable; this service is an ideal scenario for adopting that architecture
- the adoption of progressive web apps is also commendable; although the test will need to be downloaded initially, it will then be available even offline
- the flows/journeys for this service are captured well. There is even a platform function for ‘re-attempt’ - rightly identified as a good possibility, especially since four-year-old children may want a break during the assessment (a minimal sketch of this pattern follows this list)
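As a rough illustration of the serverless ‘re-attempt’ pattern the panel describes, the sketch below shows a Lambda-style function that persists a pupil’s partial responses so the assessment can pause and resume. This is a minimal sketch under assumptions: a Node/TypeScript runtime, and hypothetical names (AttemptRecord, saveAttempt) that are not the team’s actual code.

```typescript
// Hypothetical sketch of a serverless 're-attempt' function: a pupil's
// partial responses are persisted so the assessment can pause and resume.
// AttemptRecord and saveAttempt are illustrative names, not the team's code.
import { APIGatewayProxyEvent, APIGatewayProxyResult } from "aws-lambda";

interface AttemptRecord {
  pupilId: string;
  attempt: number;      // incremented each time the pupil re-attempts
  responses: string[];  // answers captured so far
}

export async function handler(
  event: APIGatewayProxyEvent
): Promise<APIGatewayProxyResult> {
  const record: AttemptRecord = JSON.parse(event.body ?? "{}");
  await saveAttempt(record); // persist so the pupil can take a break
  return { statusCode: 200, body: JSON.stringify({ saved: true }) };
}

// Stubbed persistence; in practice this would write to a managed datastore.
async function saveAttempt(record: AttemptRecord): Promise<void> {
  console.log(`saved attempt ${record.attempt} for pupil ${record.pupilId}`);
}
```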
What the team needs to explore
Before their next assessment, the team needs to:
- explore the reality of the Information and Communications Technology (ICT) provisioning in schools needed for the tablets, apps and connectivity to all work together. There is an assumption that the ICT will be in place in schools by 2020; we would recommend capturing this as an important milestone for the service to work
- put plans in place for upskilling or training users to use the service - teaching staff, office managers and head teachers. We would recommend training users on the entire system, as well as on interpreting assessment results, for consistency in test results
- we would recommend React.js for the required functionality (for example, the drag-and-drop feature); however, do consider fallback options
- explore simpler input solutions (e.g. asking children to point at, or verbalise, an answer rather than manipulate it on screen) more fully before committing to a JavaScript solution. This would be a simpler and more future-proof technical solution, and could also increase the validity of the test (i.e. so that the test measures what it purports to measure, such as numeracy skills, rather than fine motor skills or proficiency with touchscreen devices)
- explore a solution to the following problem: as per the service plans, JavaScript is a mandatory requirement because of the dependency between the apps on the two tablets the team foresee being required. However, if JavaScript fails or is switched off, users may not know what has happened (a progressive-enhancement sketch follows this list)
- explore the challenges around older browser versions and different operating systems across schools, and ensure plans are in place for the patching/upgrade of the relevant systems
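To illustrate the kind of fallback the panel has in mind, here is a minimal progressive-enhancement sketch: the page is assumed to be built on ordinary form controls that work without JavaScript, and drag-and-drop is layered on only when the script actually runs. The selectors and class names are hypothetical, not the team’s real markup.

```typescript
// Hypothetical progressive-enhancement sketch. The underlying markup is
// assumed to be plain radio buttons inside a form, so if this script never
// runs (JavaScript off, failed to load, or an unsupported browser), the
// pupil can still answer and submit normally.

function enhanceToDragAndDrop(): void {
  const options = document.querySelectorAll<HTMLElement>(".answer-option");
  options.forEach((option) => {
    option.draggable = true;
    option.addEventListener("dragstart", (event: DragEvent) => {
      event.dataTransfer?.setData("text/plain", option.dataset.answerId ?? "");
    });
  });
  // Signal to the stylesheet that the richer interaction is available,
  // so drag targets are only shown when they will actually work.
  document.documentElement.classList.add("js-enhanced");
}

document.addEventListener("DOMContentLoaded", enhanceToDragAndDrop);
```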
7. Understand security and privacy issues
Decision
The team met point 7 of the Standard.
What the team has done well
The panel was impressed that:
- all possible fraud vectors are considered and mitigation steps are put in place
- authentication is via OpenID, and all data is encrypted at rest
- due consideration has also been given to physical device theft
What the team needs to explore
Before their next assessment, the team needs to:
- action the plans to integrate OWASP security testing into deployments
8. Make all new source code open
Decision
The team met point 8 of the Standard.
What the team has done well
The panel was impressed that:
- all source code is open
9. Use open standards and common platforms
Decision
The team met point 9 of the Standard.
What the team has done well
The panel was impressed that:
- W3C HTML standards are followed
- the existing DfE Sign-in and GOV.UK Notify components are used
- confirmed plans for STA to own the intellectual property rights to avoid re-procurement every 3-5 years
What the team needs to explore
Before their next assessment, the team needs to:
- integrate OWASP security testing later, as planned
- review the progress of the DfE Sign-in component and plan for any dependencies
10. Test the end-to-end service
Decision
The team met point 10 of the Standard.
What the team has done well
The panel was impressed that:
- test-driven development (TDD) is followed, ensuring the testing processes are robust, as the assessments will need to be baselined for all pupils over a year
- the entire testing process is automated, and every change is unit tested, functionality tested and integration tested (an illustrative unit test in this style follows)
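To give a flavour of the unit-testing style described, here is a minimal Jest-style sketch; scoreResponse and its one-mark rule are hypothetical stand-ins rather than the team’s actual marking code.

```typescript
// Illustrative unit test in the TDD style described above (Jest syntax).
// scoreResponse and its one-mark rule are hypothetical stand-ins.

export function scoreResponse(expected: string, given: string): number {
  return expected === given ? 1 : 0;
}

describe("scoreResponse", () => {
  it("awards one mark for a correct answer", () => {
    expect(scoreResponse("7", "7")).toBe(1);
  });

  it("awards no mark for an incorrect answer", () => {
    expect(scoreResponse("7", "5")).toBe(0);
  });
});
```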
11. Make a plan for being offline
Decision
The team met point 11 of the Standard.
What the team has done well
The panel was impressed that:
- the progressive web app will remain available even when the service is offline (a minimal service worker sketch of this pattern follows)
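As a minimal sketch of that offline pattern, a service worker along the following lines pre-caches the assessment assets on the first (online) visit and then serves them from the cache. The asset paths are illustrative, not the service’s actual files, and the sketch assumes TypeScript’s "webworker" lib.

```typescript
// Minimal service-worker sketch of the PWA offline pattern described above.
// Asset paths are illustrative; the real assessment bundle would differ.
declare const self: ServiceWorkerGlobalScope;

const CACHE_NAME = "assessment-v1";
const ASSETS = ["/", "/assessment.js", "/assessment.css", "/questions.json"];

self.addEventListener("install", (event) => {
  // Download everything the assessment needs while the school is online.
  event.waitUntil(
    caches.open(CACHE_NAME).then((cache) => cache.addAll(ASSETS))
  );
});

self.addEventListener("fetch", (event) => {
  // Serve from the cache first, so the test keeps working offline;
  // fall back to the network for anything not pre-cached.
  event.respondWith(
    caches.match(event.request).then((cached) => cached ?? fetch(event.request))
  );
});
```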
What the team needs to explore
Before their next assessment, the team needs to:
- consider the ICT and internet provisioning in primary schools, and the connectivity challenges around the initial download of the assessments, so that they are available in the event of systems being offline
12. Make sure users succeed first time
Decision
The team met point 12 of the Standard.
What the team has done well
The panel was impressed that:
- the team has prototyped eight iterations of the service during alpha, including the service for test assessors and the test itself for pupils
- design decisions have been based on evidence from user research
What the team needs to explore
Before their next assessment, the team needs to:
- prototype and test the entire end-to-end user journey, including things like how assessors will discover and navigate to the service, what the service will be called and how pupils will be registered prior to taking the test. This has been hard to test so far because parts of the journey will be handled by the STA’s Assessment Platform, which is still in development
13. Make the user experience consistent with GOV.UK
Decision
The team met point 13 of the Standard.
What the team has done well
The panel was impressed that:
- the styles and patterns used in the service for assessors are largely consistent with GOV.UK
- the team demonstrated good evidence for not using GOV.UK styles and patterns in the test itself. For example, they changed from using the New Transport typeface to using a pre-cursive handwriting one, because children are taught to read this way and find it easier to understand
What the team needs to explore
Before their next assessment, the team needs to:
- explore and address potential accessibility issues in the test - colour contrast, size of touch target areas and how the test works for visually impaired users. The team is aware of this and intends to address it in beta
- re-visit the combined radio button/ table component on the ‘Choose a pupil’ page. There needs to be a clear reason to create new components like this and it’s not clear why the existing radio buttons, a table with links for actions, or some form of task list are not appropriate. If the team is sure that existing patterns don’t meet their needs, its findings should be contributed back to the GOV.UK Design System.
14. Encourage everyone to use the digital service
Decision
The team met point 14 of the Standard.
What the team has done well
The panel was impressed that:
- the team were aware that comprehensive training and support would have to be developed for test administrators and secondary users, such as head teachers, school administrators and ICT support staff, before rolling out any digital service, and intend to explore this in beta
- although the reception baseline assessment is a statutory requirement, the team has identified users’ needs and the preferred training and communication strategies to educate users and encourage uptake of the service
What the team needs to explore
Before their next assessment, the team needs to:
- prove that a digital-only approach is the right one to take for all of their users, including robustly demonstrating that:
  - pupils who have not had exposure to digital technology (including touchscreen tablets) prior to joining school are not unfairly disadvantaged in the assessment, compared with pupils who have had previous exposure to digital technology and have better digital skills
  - pupils with SEND and/or access needs would not be unfairly disadvantaged by, or excluded from, a digital-only assessment
  - users with assisted digital needs could be supported to use the service, or could be offered an alternative channel to administer and complete the assessment
- conduct research with primary users (pupils and test administrators) with low digital skills and develop and test an assisted digital support model
- think about other channels the service may be offered in, such as an offline or paper format, for those who cannot use or access the digital service
- think about how they will provide adequate training and support for test administrators and other users
- incorporate findings from the non-digital interim Reception Baseline Assessment solution currently being tested in schools
- engage with the teaching lobby and unions to get feedback on the potential impact and feasibility of a digital-only Reception Baseline Assessment
15. Collect performance data
Decision
The team met point 15 of the Standard.
What the team has done well
The panel was impressed that:
- the team has started to think about the performance data they might want to collect in beta, including metrics around user satisfaction and journey completion
- in alpha, the team captured some analytics data, including heat mapping data, which they triangulated with observations from user research in order to better understand how users interact with the prototype
- in beta, the team plans to focus on understanding incomplete journeys through a combination of analytics and user research, to gain more insight into how and why a user may be unable to complete an assessment
What the team needs to explore
Before their next assessment, the team needs to:
- develop and put a plan in place with their performance analyst to start collecting appropriate performance data in beta
16. Identify performance indicators
Decision
The team met point 16 of the Standard.
What the team has done well
The panel was impressed that:
- the team has hired a performance analyst to work with in beta, and has begun to think about some possible performance indicators
What the team needs to explore
Before their next assessment, the team needs to:
- work with the new performance analyst to clearly define their measures of success for the service, and start to collect and analyse performance data to help them understand this. For example, a key business goal is that the test does not place a significant burden on schools and should take no longer than about 20 minutes to complete per child. The team should therefore think about how they will measure how long the test takes per pupil, along with other associated activities such as enrolling a child on an assessment. Other example metrics the team could consider include teachers’ confidence levels in administering the test to the STA’s standards, and pupils’
- develop a cost per transaction estimate to understand whether the currently proposed solution would be both feasible and cost-effective for the STA and for schools to implement. The panel appreciates that the project is at an early stage with many uncertain variables, but more work needs to be done on this. The currently proposed solution would involve a significant financial cost to schools (for example, substantial infrastructure upgrades and the purchase of devices to administer the assessments) and a significant financial cost to the STA (for example, design, development and maintenance of the assessments and assessment platform, plus staff, training and support costs). A worked sketch of the estimate follows this list
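To make the cost per transaction idea concrete, here is a simple hedged sketch of the calculation. Every figure below is a placeholder; a real estimate would use the team’s own costings and pupil numbers.

```typescript
// Hypothetical cost-per-transaction sketch. All figures are placeholders;
// real numbers would come from the team's own costings.

interface CostInputs {
  staAnnualCost: number;     // design, development, maintenance, support
  schoolsAnnualCost: number; // devices, infrastructure, training
  assessmentsPerYear: number;
}

function costPerTransaction(c: CostInputs): number {
  return (c.staAnnualCost + c.schoolsAnnualCost) / c.assessmentsPerYear;
}

// e.g. (£3,000,000 + £2,000,000) / 600,000 assessments ≈ £8.33 each
console.log(
  costPerTransaction({
    staAnnualCost: 3_000_000,
    schoolsAnnualCost: 2_000_000,
    assessmentsPerYear: 600_000,
  }).toFixed(2)
);
```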