Schools Financial Benchmarking Website Live Assessment Report
The service enables schools to compare their spending in various categories with that of similar schools.
Schools Financial Benchmarking Website
From: Central Digital and Data Office
Assessment date: 18/02/2021
Stage: Live
Result: Met
Service provider: Department for Education
Previous assessment reports
Service description
The purpose of the Schools Financial Benchmarking tool is to enable schools to compare their spending in various categories with that of similar schools. This information should be used by school leaders to inform their spending decisions and to identify if, where and how spending changes could be made to support pupil progress within the available level of resource. The data hosted online is also designed to provide transparency to the public.
The tool is a key element of the Department’s excellent School Resource Management (SRM) programme.
The site can be accessed here.
Service users
- School Business Professionals, headteachers, finance officers
- Governors
- Parents
- School Resource Management Advisors and other DfE users
1. Understand user needs
Decision
The service met point 1 of the Standard.
What the team has done well
The panel was impressed that:
- iterations were designed in response to user needs, for example the different benchmarking needs of special schools and the RAG rating tables on the self-assessment dashboard
- the team worked together on the analysis of research findings
- the team had a deep understanding of the needs and goals of the different personas
- the team developed and refined the personas throughout the project as their understanding deepened
- the team worked to build an understanding of accessibility needs, including a visit to the empathy lab and the use of the GDS accessibility personas
- improvements have been made to the accessibility of the service, including the work on tables and charts
- the benchmarking report cards support users with assisted digital needs
What the team needs to explore
Before their next assessment, the team needs to:
- ensure they always start from user needs when further developing this service, despite external pressures.
- make sure user research is supported and fully embedded in the team; from what was evidenced in the assessment there is only one user researcher
- ensure that all of the service is accessible in line with WCAG 2.1 guidelines
- continue developing user-friendly content and labelling
2. Do ongoing user research
Decision
The service met point 2 of the Standard.
What the team has done well
The panel was impressed that:
- there was ample evidence that continuous research was harnessed for making improvements to the service
- the team worked to address the areas for improvement raised in their last assessment
- despite COVID-19 restrictions, the team carried out research with users representing the different user groups
- research was carried out with users with different levels of digital confidence, alongside the initial work on the digital inclusion scale
- testing was carried out with users with accessibility needs
What the team needs to explore
Before their next assessment, the team needs to:
- involve the whole team in user research, for example by having them observe sessions first-hand and take notes during research sessions
- be clearer about the exact methods used for research
- make sure there is not too much reliance on surveys for post-launch feedback and insights, and work closely with analytics as well
- further develop the digital inclusion scale and support for those with assisted digital needs
- work with analytics and performance data to develop their research plan going forward
- consider other ways to understand user satisfaction and user journeys, since some of the performance metrics depend on users’ engagement with elements of the service that may not reflect satisfaction or behaviour
- continue to find creative ways to recruit participants from a broad range of demographics to ensure that the service is easily used by all
3. Have a multidisciplinary team
Decision
The service met point 3 of the Standard.
What the team has done well
The panel was impressed that:
- the team is well organised and clearly multi-disciplinary
- if there is a need to transition to a new supplier team, current plans include handover notes and a crossover period between the two suppliers during which paired working could be implemented
What the team needs to explore
Before their next assessment, the team needs to:
- be wary that GitHub data suggests a single developer is doing the vast majority of the technical work (1,500 commits compared with 13 from the next highest contributor). This risks affecting code quality and building up technical debt; having a second developer so that pair programming or pull request reviews take place would reduce this risk
- explore and determine exactly how the team could transition into one made up of a better mix of civil servants and suppliers
- determine exactly how any handover from the existing supplier team to a new team is going to be managed to ensure all knowledge is transferred appropriately
- speak to and receive more support from DfE Digital in bringing in civil servants to maintain this service for the rest of its lifecycle, and ensure all staff working on this service are linked into relevant departmental and cross-government communities of practice
4. Use agile methods
Decision
The service met point 4 of the Standard.
What the team has done well
The panel was impressed that:
- even prior to Covid-19 lockdowns, the team were not co-located and yet still made excellent use of digital tools and technology to enable them to operate successfully together from multiple locations
- the Product Manager is empowered, with a full mix of roles across the team
- the team are using a Scrum methodology
What the team needs to explore
Before their next assessment, the team needs to:
- N/A
5. Iterate and improve frequently
Decision
The service met point 5 of the Standard.
What the team has done well
The panel was impressed that:
- the team were able to clearly demonstrate, in a number of ways, how they have continuously iterated the service based on usability testing, feedback and performance data from the beta service
What the team needs to explore
Before their next assessment, the team needs to:
- demonstrate how they have improved the service based on both the recommendations in this report and their recent content review and accessibility audit
6. Evaluate tools and systems
Decision
The service met point 6 of the Standard.
What the team has done well
The panel was impressed that:
- the team had made good use of the public cloud (Azure) and architected the platform to ensure zero downtime deployments. The range of environments and automation was appropriate for the project
- data cleansing tools have been developed and add robustness to the process by ensuring a failed step can be rerun
- appropriate attention had been paid to environment management
What the team needs to explore
Before their next assessment, the team needs to:
- review the role of JavaScript in frontend development and see whether the application can benefit from more attention to progressive enhancement, ensuring that all functionality works when JavaScript is not present or is blocked (see the sketch after this list)
- add an accessibility statement to the site
- look at upgrading to the latest version of GOV.UK Frontend and incorporate current GDS design components
- plan for upgrading versions of .NET
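To illustrate the progressive enhancement recommendation above, here is a minimal sketch of one common pattern: the server renders benchmarking figures as a plain HTML table, and a script upgrades that table to a chart only when JavaScript runs. This is not the service’s actual code; the element IDs, the data shape and the omitted charting call are all assumptions.

```typescript
// Minimal progressive enhancement sketch (hypothetical element IDs, not taken
// from the service's codebase). The server renders spending data as a plain
// HTML table; this script only upgrades it to a chart when JavaScript is present.
function enhanceBenchmarkingTable(tableId: string): void {
  const table = document.getElementById(tableId);
  if (!(table instanceof HTMLTableElement)) {
    return; // No table found: nothing to enhance, the page still works as-is.
  }

  // Read the values out of the rows that are already rendered on the page.
  const rows = Array.from(table.querySelectorAll('tbody tr'));
  const data = rows.map(row => {
    const cells = row.querySelectorAll('td');
    return {
      label: cells[0]?.textContent?.trim() ?? '',
      value: Number(cells[1]?.textContent?.replace(/[£,\s]/g, '') ?? 0),
    };
  });

  // Build the chart container only now, so users without JavaScript
  // (or with it blocked) still see the table and lose no information.
  const chart = document.createElement('div');
  chart.setAttribute('role', 'img');
  chart.setAttribute('aria-label', 'Chart of the spending figures in the table above');
  chart.textContent = `Chart placeholder for ${data.length} spending categories`;
  // A real implementation would call the existing charting code here, e.g.
  // drawChart(chart, data) - omitted because it is not part of this sketch.
  table.insertAdjacentElement('afterend', chart);
}

document.addEventListener('DOMContentLoaded', () => {
  enhanceBenchmarkingTable('spending-table'); // hypothetical table ID
});
```

The important property is that the table remains the canonical, accessible representation of the data, so nothing is lost when scripts fail to load or are blocked.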
7. Understand security and privacy issues
Decision
The service met point 7 of the Standard.
What the team has done well
The panel was impressed that:
- the service has considered privacy issues proportionately; the service does not store any personal data and the data published is considered to be in the public domain, so there is limited risk
- the team are routinely undertaking security scanning using OWASP tools
What the team needs to explore
Before their next assessment, the team needs to:
- ensure the cookie compliance code removes the initial Google Analytics cookie when ‘off’ is chosen; it does not do this in Chrome at present (one possible approach is sketched after this list)
- address security pull requests on the repositories
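On the cookie compliance point above, one possible approach, sketched below, is for the consent script to expire the Google Analytics cookies explicitly when the user turns analytics cookies off. The cookie names, domain handling and control ID are assumptions and would need checking against the Google Analytics configuration the service actually uses.

```typescript
// Rough sketch only: cookie names, domains and the control ID are assumptions,
// not taken from the service's codebase.
function expireCookie(name: string, domain?: string): void {
  const domainPart = domain ? `; domain=${domain}` : '';
  // Setting an expiry date in the past removes the cookie for that path/domain.
  document.cookie = `${name}=; expires=Thu, 01 Jan 1970 00:00:00 GMT; path=/${domainPart}`;
}

function removeAnalyticsCookies(): void {
  const gaCookies = ['_ga', '_gid', '_gat'];
  // Google Analytics usually sets its cookies on a parent domain, so both the
  // host and the dotted variant are attempted; non-matching ones are ignored.
  const domains = [window.location.hostname, `.${window.location.hostname}`];
  for (const name of gaCookies) {
    for (const domain of domains) {
      expireCookie(name, domain);
    }
  }
}

// Wire this to the 'off' choice in the cookie banner or settings page, e.g.:
document.getElementById('cookies-analytics-off') // hypothetical control ID
  ?.addEventListener('change', removeAnalyticsCookies);
```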
8. Make all new source code open
Decision
The service met point 8 of the Standard.
What the team has done well
The panel was impressed that:
- all source code, apart from data handling code, was published on GitHub and a coding-in-the-open approach was taken: https://github.com/DFEAGILEDEVOPS/Schools.Financial.Benchmarking
What the team needs to explore
Before their next assessment, the team needs to:
- investigate making use of pull requests and GitHub Actions to demonstrate that the code has been reviewed and checked against unit test coverage and linting
- address open security pull requests on the repositories
9. Use open standards and common platforms
Decision
The service met point 9 of the Standard.
What the team has done well
The panel was impressed that:
- the team has made some use of GOV.UK Notify
- the team has organised code to provide reuse within the platform
What the team needs to explore
Before their next assessment, the team needs to:
- N/A
10. Test the end-to-end service
Decision
The service met point 10 of the Standard.
What the team has done well
The panel was impressed that:
- there was automated and manual testing in place, and the pipelines had been configured well to provide quality gates during release
What the team needs to explore
Before their next assessment, the team needs to:
- enhance unit test coverage in the web application core component
11. Make a plan for being offline
Decision
The service met point 11 of the Standard.
What the team has done well
The panel was impressed that:
- the service architecture minimises the likelihood of downtime as far as possible
- acceptable plain English pages exist for the common service outage types
What the team needs to explore
Before their next assessment, the team needs to:
- N/A
12. Make sure users succeed first time
Decision
The service met point 12 of the Standard.
What the team needs to explore
Before their next assessment, the team needs to:
- ensure URL indexing is removed from Google and users can start the service from GOV.UK
- explore the user journey to the self-assessment dashboard and the help context around it
- continue to ensure the service is accessible to users
13. Make the user experience consistent with GOV.UK
Decision
The service met point 13 of the Standard.
What the team needs to explore
Before their next assessment, the team needs to:
- ensure URL indexing is removed from Google and users can start the service from GOV.UK (previously mentioned in point 12)
- continue to ensure content and language aligns with the GDS style guide and design system and is user tested
- continue to ensure error messages align with the design system
- consider a meaningful service name as recommended
14. Encourage everyone to use the digital service
Decision
The service met point 14 of the Standard.
What the team has done well
The panel was impressed that:
- digital take-up of the service is incredibly strong and efforts have been made to ensure users without quality internet access can still use the service
- a non-digital route into the service exists through the creation of report cards that are mailed out to schools/users
What the team needs to explore
Before their next assessment, the team needs to:
- explore viable ways in which they might measure the impact the physical report cards are having, and identify where there are pockets of users who only use these rather than the digital service, so that a true digital take-up score can be calculated
15. Collect performance data
Decision
The service met point 15 of the Standard.
What the team has done well
The panel was impressed that:
- the team demonstrated how data from Google Analytics, satisfaction surveys and the feedback form on the service have informed their user research and service improvements
- the team look at trends in usage of the service over time and collect data and feedback from teams in the wider department as they engage with the primary users of the service
What the team needs to explore
Before their next assessment, the team needs to:
- ensure that they are gathering data on and reporting the mandatory KPIs, including cost per transaction, completion rate and digital take-up
- ensure that they are clear about what new performance data is required to evidence any new KPIs which may be introduced as a result of the current KPI review. Where a KPI relates to showing how the service is working to bring about change and whether this is happening in practice, a dependency on qualitative data may prove costly and time-consuming for a small team. How this evidence can be reported and published should also be considered.
16. Identify performance indicators
Decision
The service met point 16 of the Standard.
What the team has done well
The panel was impressed that:
- the team is improving their service and work is underway to identify new KPIs around the policy objectives, how the service is working to bring about change, and how the department can know if this change is happening in practice. The intention is to use click-through data from the benchmarking tool to the recommended next step (for example, contacting another school), as well as qualitative data from surveys.
What the team needs to explore
Before their next assessment, the team needs to:
- agree with stakeholders any new KPIs which show the value of the service in terms of saving money for schools, how the data will be collected, and where such performance data will be published.
17. Report performance data on the Performance Platform
Decision
The service met point 17 of the Standard.
In this instance, the relevant guidance was not published, so there should be no penalty on a service for something it could not reasonably have been aware of.
What the team has done well
The panel was impressed that:
- the team did reach out to the Performance Platform team in GOV.UK to ask about a dashboard
What the team needs to explore
Before their next assessment, the team needs to:
- review and implement the latest guidance on the retirement of the Performance Platform and the need to publish data and link to it from data.gov.uk
- retrieve their historical performance data from the performance platform
- follow guidance and publish on data.gov.uk once that guidance is officially published.
18. Test with the minister
Decision
The service met point 18 of the Standard.
What the team has done well
The panel was impressed that:
- the relevant Minister within DfE has been walked through the service and is broadly aware of its existence
What the team needs to explore
Before their next assessment, the team needs to:
- N/A
This report will be published on GOV.UK
If there is a factual inaccuracy in the report, contact the assessments team immediately. If the assessments team do not hear from you within 5 working days of sending this report out, it will be published on gov.uk/service-standard-reports as is.