Service Standard assessment report
IPA Benchmarking Service
Report for the IPA Benchmarking service alpha reassessment on 19 January 2022
From: Central Digital & Data Office (CDDO)
Assessment date: 19/01/2022
Stage: Alpha reassessment
Result: Met
Service provider: Cabinet Office (Infrastructure and Projects Authority)
Previous assessment reports
Service description
The benchmarking hub will collect data from completed projects and give project leaders across government access to it to support project investment decisions, helping to leverage the UK Government’s project portfolio data to shape future investment. The platform will collect data on both completed and future projects.
Service users
The following user groups and segments have been identified through user research and analysis. One of the key findings is that many of the captured needs are common across all users:
User groups
● Benchmarking Service Custodian (IPA - Head of Central Benchmarking Service)
● Benchmarking Service Contributors (Government Departments & ALBs)
● Benchmarking Service Consumers (Government Departments & ALBs)
● Benchmarking Service Contributors (External Parties)
● Benchmarking Service Consumers (External Parties)
1. Understand users and their needs
Decision
The service met point 1 of the Standard.
What the team has done well
The panel was impressed that:
- the team has done a large amount of user research, allowing them to fill gaps in their knowledge about users and test prototypes
- the team has chosen the appropriate methods to answer their research question, and could clearly describe how they elicited detailed insight from users
- the team had mapped the end-to-end service based on this insight, resulting in a clear picture of the processes and tools to be designed
- the team undertook usability testing of part of the service journey, and could clearly articulate how this deepened their understanding of users and led to changes in design
- there is a good plan in place to further this approach in Beta
What the team needs to explore
Before their next assessment, the team needs to:
- do research on the end-to-end service. In Alpha, the team had tightly focused its scope and in doing so had de-prioritised some key user journeys, such as submitting benchmarking data and creating/managing an account. Before developing these journeys the team must do more research and should create prototypes to test their ideas and assumptions with users
- ensure that any testing aims to replicate real world conditions of service use as closely as possible. For example, testing of the upload journey should involve asking users to upload real data from one or more of their projects
- ensure that behavioural assumptions are validated via research before further design and development work. The team have a working assumption that certain teams/projects may be reluctant to upload data. This needs to be validated and, if it is valid, would mean work on the wider service to ensure the incentives for uploading data (and disincentives for not doing so) lead to the outcome the IPA is looking for (sufficient datasets in the tool to allow for meaningful benchmarking)
- clarify what level of research and content testing is needed to ensure that all upload templates and displays for different project types work for all users across the large range of industries and project types that this service covers
2. Solve a whole problem for users
Decision
The service met point 2 of the Standard.
What the team has done well
The panel was impressed that:
- the team had a sound overall design approach and UX process
- the team have a clear problem statement and understanding of the problem
- the team interviewed users and stakeholders
- the team have conducted some accessibility testing
What the team needs to explore
Before their next assessment, the team needs to:
- expand the prototype to test parts of the user journey that were deprioritised or not developed
- consider how they will track and monitor the users uploading spreadsheets. This includes designing a process to support users, possibly including notifications for when users should upload data, for any errors found in the upload, and for when a user hasn’t uploaded data
- consider that the key deliverable is metrics to improve how projects are managed. The team should explore how this data will be collected and the value being delivered
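The upload-tracking recommendation above could start with a simple validation pass over each submitted template, so the service knows which contributors to notify and about what. This is an illustrative sketch only: the column names are hypothetical, and the real template would be defined by the team's data schema.

```python
# Illustrative sketch: validating rows of an uploaded benchmarking template.
# Column names below are hypothetical placeholders, not the team's schema.

REQUIRED_COLUMNS = {"project_id", "project_type", "outturn_cost_gbp"}

def validate_submission(rows):
    """Check each row of an uploaded template and collect errors,
    so the service can notify the contributor about what to fix."""
    errors = []
    for index, row in enumerate(rows, start=1):
        missing = REQUIRED_COLUMNS - row.keys()
        if missing:
            errors.append(f"row {index}: missing {sorted(missing)}")
            continue
        try:
            if float(row["outturn_cost_gbp"]) < 0:
                errors.append(f"row {index}: cost must not be negative")
        except (TypeError, ValueError):
            errors.append(f"row {index}: cost is not a number")
    return {"rows": len(rows), "errors": errors, "ok": not errors}
```

A summary like this could feed the notifications bullet above: an empty error list confirms receipt, a non-empty one drives a "fix your upload" message.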
3. Provide a joined-up experience across all channels
Decision
The service did not meet point 3 of the Standard.
What the team has done well
The panel was impressed that:
- the team have engaged a range of organisations
- the team have designed all parts of the service from a service design perspective
- the team have a good design process which considers accessibility and inclusion
What the team needs to explore
Before their next assessment, the team needs to:
- develop the prototype to test how the service will work with online and offline processes. This includes user journeys that were deprioritised for the Alpha Phase
- address points from the previous assessment as follows:
- carry out more investigation into what the offline aspects of the service involve, what the user needs and pain points are, and how you can help users complete them
- demonstrate how the offline journey will work, and show that the team is developing that journey in parallel with the online service so they’re consistent
- based on user research, create a clear path to the digital service and to its adoption and take-up
4. Make the service simple to use
Decision
The service met point 4 of the Standard.
What the team has done well
The panel was impressed that:
- the team have removed PDFs and converted their content for the service
- the team is using common components from the design system
What the team needs to explore
Before their next assessment, the team needs to:
- audit the content against the GOV.UK Design System and guidance. Seek advice from CDDO for a detailed content review. Some content in the prototype does not meet CDDO interface guidelines and a detailed review will point this out
- continue to iterate and test the journeys and the content design, especially those parts not yet prototyped
- develop and test content for error messages and shutter pages
- address points from the previous assessment as follows:
- carry out usability testing, ideally in user’s real work environments on their work devices, and reduce reliance on subject-matter expertise to make design decisions. This will make sure the team finds out what works, not just what people think or say they want
- consider what essential features need to be part of a minimum viable product, and what can be added later
5. Make sure everyone can use the service
Decision
The service met point 5 of the Standard.
What the team has done well
The panel was impressed that:
- the team have considered the use of an external agency for accessibility testing
- the team tested parts of the service with users
What the team needs to explore
Before their next assessment, the team needs to:
- conduct accessibility testing with users that have access needs as planned
- test all parts of the service and user journeys with users, in particular those parts not yet developed or prototyped
- address points from the previous assessment as follows:
- involve disabled users at every stage of their research and demonstrate how the service will meet their needs. At alpha, this involves understanding if current or potential users have access needs, and how online and offline parts of the service can accommodate them
- secure specialist accessibility advice for custom components that are not in the design system, in particular, graphs
6. Have a multidisciplinary team
Decision
The service met point 6 of the Standard.
What the team has done well
The panel was impressed that:
- the service has had a multidisciplinary team for Alpha including most of the roles set out in the Service Manual
- the plan for Private Beta includes expanding the team to include more specialisms that will be critical for the next phase. This includes a content designer, DevOps and data engineer
- there is a structured plan for knowledge transfer between supplier teams as the service transitions into Private Beta. This includes robust documentation processes and collaborative working practices
- the Service Owner has ownership of the full benchmarking service, including the digital and analysis teams
What the team needs to explore
Before their next assessment, the team needs to:
- establish a sustainable team with the knowledge and skills to deliver and maintain the service throughout its lifecycle. The team is currently very reliant on its external delivery partner to provide the digital specialists needed for delivery. The move to Private Beta is expected to include transitioning to what will be their third supplier since the beginning of Discovery. The team should ensure contractual arrangements with delivery partners are sustainable for the development, maintenance and support of the service
- work with senior leaders in the IPA and Cabinet Office to explore appointing civil servants into key digital roles, either directly or through Cabinet Office’s central digital team
- ensure they have access to a content designer as the service needs to use complex and specialised content. This is included in their planned Beta Phase team
- make sure the knowledge transfer between Alpha and Private Beta teams is complete and the lessons learnt during Alpha aren’t lost
- assess whether an individual with data architecture experience should be added to the team
7. Use agile ways of working
Decision
The service met point 7 of the Standard.
What the team has done well
The panel was impressed that:
- the team has adopted agile practices with an iterative and user-centred approach. This includes a scrum-based approach with 2 week sprints to iterate the service design. They have adopted a combination of tools to support remote collaboration and worked together effectively during Alpha
- the team do regular Show & Tell sessions that are open to their stakeholders and well attended. These have been successful at increasing awareness of their service alongside a ‘weekly note’ to share their progress
- the team have a clear governance structure and are empowered to make design decisions. The team are working effectively with their wider organisation to influence strategy and build support for their service
- the team is representing their service at existing stakeholder forums to increase awareness and prepare stakeholders for the new service
What the team needs to explore
Before their next assessment, the team needs to:
- review their ways of working as the team expands to adapt them to the changing structure, especially as the delivery partner for Private Beta joins
- consider further how they will select and invite Private Beta users to gather feedback on the service. This could include a subset of departments or projects that need to submit and access benchmarking data. They will need sufficient evidence from this to provide confidence that the service is ready to move into Public Beta
8. Iterate and improve frequently
Decision
The service met point 8 of the Standard.
What the team has done well
The panel was impressed that:
- the team iterated prototypes rapidly during the Alpha Phase based on user research. This allowed them to discount ideas and correct assumptions about user needs based on feedback
- the team is improving internal practices by using retrospective meetings to constructively review their processes and improve
What the team needs to explore
Before their next assessment, the team needs to:
- make sure they have had the time to explore and prototype key design ideas. In the latest sprint, the team identified a new user need in usability testing for comparing a project to the benchmarking data. However, the team had not had time before the assessment to properly prototype and research this
- continue to iterate design ideas throughout the Beta Phase to design the service based on user needs and feedback
- review their approach to prioritisation to make sure they are researching the full user journey and iterating the design frequently
9. Create a secure service which protects users’ privacy
Decision
The service met point 9 of the Standard.
What the team has done well
The panel was impressed that:
- the team have looked at a range of personas and the threats they may pose to the security of the system
- the team is considering a delegated security model
What the team needs to explore
Before their next assessment, the team needs to:
- provide documentation around the data model. This will help inform which areas require additional security. They should also detail what information they are holding on users
- consider what data will be anonymised and what data remains in its raw form
- clearly articulate as part of private beta if there are any risks to their delegated security model and how these will be managed
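One common approach to the anonymisation question above is field-level pseudonymisation: replacing directly identifying values with a keyed hash so records stay linkable for analysis without exposing the raw value. The sketch below assumes hypothetical field names and is not a statement of the team's data model.

```python
# Sketch of field-level pseudonymisation with a keyed hash (HMAC-SHA256).
# The field name "contributor_email" is a hypothetical example.
import hashlib
import hmac

def pseudonymise(record, secret_key, sensitive_fields=("contributor_email",)):
    """Return a copy of the record with identifying fields replaced by a
    keyed digest. The same input and key always yield the same pseudonym,
    so records can still be joined for analysis."""
    out = dict(record)
    for field in sensitive_fields:
        if field in out:
            digest = hmac.new(secret_key, out[field].encode(), hashlib.sha256)
            out[field] = digest.hexdigest()[:16]
    return out
```

Using a keyed hash rather than a plain one means the key can be held separately, so pseudonyms cannot be reversed by simply hashing guessed values.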
10. Define what success looks like and publish performance data
Decision
The service met point 10 of the Standard.
What the team has done well
The panel was impressed that:
- the team has defined a clear set of performance indicators for the service tied to the vision and problem statements. This included how they expect to measure them and collect the required performance data. They have plans to investigate these further in Private Beta and embed analytics into their design process
- there will be a data analyst embedded in the team throughout Beta with responsibilities including service performance analytics
- the team has collected data to support their problem statements and prioritisation
What the team needs to explore
Before their next assessment, the team needs to:
- use performance data to inform iterations of the service design, including identifying trends in user behaviour and fixing problems
- establish baselines for each performance indicator to allow the future service performance to be compared against the current benchmarking processes
- set up performance measures for the end-to-end user journey, including steps completed before accessing the digital tools, such as the accuracy of completed data submission templates
- investigate options for implementing web analytics into their service to collect, analyse and apply data on user behaviours. This will underpin some of the key indicators identified in Alpha, including completion rates and uptake. This data should also support their ongoing user research and design activities
- ensure the embedded data analyst gets the support needed to take responsibility for performance analytics, such as mentoring from an experienced performance analyst
- ensure that they agree data quality metrics and controls as part of private beta
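One of the indicators named above, completion rate, reduces to a simple ratio once analytics events are captured. As a minimal sketch (the event names are hypothetical, not the team's analytics schema):

```python
# Sketch: deriving a journey completion rate from a stream of analytics
# events. Event names are hypothetical placeholders.
def completion_rate(events, start_event="upload_started", end_event="upload_completed"):
    """Ratio of completed journeys to started ones; 0.0 if none started."""
    starts = sum(1 for e in events if e == start_event)
    ends = sum(1 for e in events if e == end_event)
    return ends / starts if starts else 0.0
```

Computing the same ratio against the current offline process would give the baseline the panel asks for, so Private Beta performance can be compared against it.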
11. Choose the right tools and technology
Decision
The service met point 11 of the Standard.
What the team has done well
The panel was impressed that:
- the team is using a range of common technologies that are widely used across government
- the team have assessed whether a single sign-on solution is appropriate for their application, by reaching out to a number of tech leads in government
- the team will base their identity and access management solution on OpenID Connect, which allows integration with existing solutions if a department requires it
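For context on the OpenID Connect point above: a relying party's first step in the standard authorization code flow is building the authorization request. The sketch below uses placeholder endpoints and client details, not the team's actual configuration.

```python
# Sketch of an OpenID Connect relying party building the initial
# authorization request (authorization code flow). All URLs and client
# details are placeholders.
from urllib.parse import urlencode

def build_authorization_url(authorize_endpoint, client_id, redirect_uri, state):
    params = {
        "response_type": "code",    # authorization code flow
        "scope": "openid profile",  # the 'openid' scope marks this as OIDC
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "state": state,             # CSRF protection, echoed back by the IdP
    }
    return f"{authorize_endpoint}?{urlencode(params)}"
```

The identity provider authenticates the user and redirects back with a code, which the service exchanges for tokens; because the endpoints are discovered per issuer, a department can plug in its own provider.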
What the team needs to explore
Before their next assessment, the team needs to:
- take care with using GOV.UK PaaS and speak to the PaaS architects to understand whether it is the right platform for an application that uses data tools
- seek advice broadly, as they are using many technologies that are new to the IPA, and test with real data and users as part of Private Beta
- ensure that processes and functionality built in the commercial tools they will be using are well documented
- undertake user research on the Excel templates that users are asked to complete, and show these to the assessment team in Private Beta
12. Make new source code open
Decision
The service met point 12 of the Standard.
What the team has done well
The panel was impressed that:
- the team is, where possible, operating in the open and publishing their code on GitHub
What the team needs to explore
Before their next assessment, the team needs to:
- document the data model and the columns they are collecting, as this is critical information. Ideally this should be shared on GitHub; if it cannot be, the reasons should be explained as part of Private Beta. Open documentation will help make the tool easier to maintain and support over the long term
13. Use and contribute to open standards, common components and patterns
Decision
The service met point 13 of the Standard.
What the team has done well
The panel was impressed that:
- the team is using common components, including:
  - the GOV.UK Design System
  - the ONS design system for graphs and charts
  - the MOJ design system for tabular data and filtering patterns
  - GOV.UK Notify
- the team also plan to use OpenAPI and CSVW in Private Beta
- the team is using industry-based standards for data schemas, based on the International Cost Management Standard (ICMS) v2
What the team needs to explore
Before their next assessment, the team needs to:
- share the metadata they are using, as part of Private Beta, to provide confidence in the data
- consider the guidance on spreadsheets and CSVs published on GOV.UK
- ensure they explain the data schemas they are using as part of Private Beta
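The CSVW plan mentioned above pairs each published CSV with a small metadata document describing its columns, which is one way to share the schemas the panel asks for. A minimal sketch, with hypothetical column names:

```python
# Sketch: generating minimal CSVW (CSV on the Web) metadata for a published
# dataset so consumers can check column names and datatypes. The columns
# shown in the usage line are hypothetical.
import json

def csvw_metadata(csv_url, columns):
    """Build a minimal CSVW metadata document for a CSV file.

    `columns` is a list of (name, datatype) pairs.
    """
    return {
        "@context": "http://www.w3.org/ns/csvw",
        "url": csv_url,
        "tableSchema": {
            "columns": [
                {"name": name, "datatype": datatype}
                for name, datatype in columns
            ]
        },
    }

doc = csvw_metadata("benchmarks.csv",
                    [("project_id", "string"), ("outturn_cost_gbp", "decimal")])
metadata_json = json.dumps(doc, indent=2)
```

Publishing this file alongside the CSV lets consumers validate submissions against the declared schema rather than guessing types from the data.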
14. Operate a reliable service
Decision
The service met point 14 of the Standard.
What the team has done well
The panel was impressed that:
- the team will have a duplicate standby database that can be used if the main database fails
- the team will follow a containerisation approach, using cloud-based monitoring services, so the solution can be redeployed rapidly
- the team have identified additional staff that can support the project, including seven IPA data analysts
- the service team is a mixture of civil servants and an external supplier
What the team needs to explore
Before their next assessment, the team needs to:
- undertake extensive testing through private beta on performance, disaster recovery and worst case scenarios
- provide a long-term resource and support model into live