Policy paper

Performance Review of Digital Spend: Enabling Strategic Investment and Innovation

Updated 12 March 2025

1. Executive Summary

The Review Team has now completed the Performance Review of Digital Spend, which was commissioned by the Chief Secretary to the Treasury (CST) in August 2024. This report sets out the Review Team’s key findings and the recommended actions to take forward in response, as part of phase 2 of the Spending Review 2025 (SR25) process and beyond.

1.1 Overview of key findings

  • The review underscores the need for a significant shift in how digital initiatives are funded, managed, and tracked. Current processes are overly complex for many digital initiatives and experimental technologies, delaying decision-making and service delivery. It is essential to simplify governance for smaller projects while maintaining rigour for larger ones.

  • The review found that there is often insufficient funding for service maintenance and improvement. Financial pressures during and between Spending Reviews (SRs) often mean short-term savings are prioritised over long-term digital investments, and spending on service maintenance is often deprioritised. This results in mounting technical debt from outdated legacy systems and hampers progress. The absence of agreed metrics to measure outcomes also limits the ability to demonstrate value for money in digital spending.

  • The review emphasises the need for earlier involvement of digital experts in policymaking to consider a broader range of delivery options. Current approaches to policy making inadvertently narrow delivery choices early, limiting the range of options that can be considered during investment appraisals and preventing a full exploration of potential solutions. Furthermore, misunderstood guidance and the unsuitable application of appraisal methods by departments have often hindered digital investment. Additionally, Spending Teams require more technical support to properly assess digital, data, and technology (DDaT) bids, as current processes lack the necessary expertise.

1.2 Recommendations

The Review Team’s recommendations to address these issues are built on three key pillars: testing alternative funding mechanisms, enhanced training and guidance, and improved outcomes metrics and evaluation. The government’s aim is to test, iterate and institutionalise different approaches to both the funding and evaluation of digital spend by Spending Review 2027 (SR27), with a strong focus on demonstrating progress against outcome metrics in exchange for faster and more agile funding arrangements. To achieve this, the Review Team is recommending the following actions:

A portfolio of pathfinders to test new funding models:

A) Staged funding for innovative technologies: Introduce iterative funding for innovative technologies (e.g. AI), where funding is based on demonstrated progress rather than speculative forecasts and extensive documentation.

B) Staged funding for live services: Implement performance-based funding for live digital services by linking funding directly to outcomes through regular reviews, reducing bureaucracy and enabling faster decision-making. Change and run funding will be combined to facilitate continuous improvement efforts.

C) Portfolio Outcome-Based Funding: Initiate digital portfolios represented through a single multi-year business case, aligned to existing Green Book guidance and focusing on long-term outcomes. Work with departments to understand, leverage, and implement this approach effectively.

D) Risk reduction in technical debt and cybersecurity: Departments should establish a tech and cyber risk appetite which reflects the broader risk to the government and public from the department’s activities. Investment plans will address legacy systems and technical debt, prioritising appropriate risk reduction over short-term savings, with clear metrics to track improvement.

Training, guidance and enabling more strategic spending decisions at SR25:

E) Upskilling through DDaT training: Deliver targeted DDaT training for departments and Spending Teams, with a focus on building better evidenced bids for SR25 phase 2 and on how to use agile funding approaches.

F) A digital first approach to SR25 as part of the overall Zero-Based Review (ZBR): Departments have been asked to involve their CDIO and internal digital functions when preparing and scrutinising ZBR returns. HM Treasury (HMT) and the Government Digital Service (GDS) will also provide the Digital Interministerial Group (IMG) with consolidated advice across all DDaT ZBR returns and investment priorities, with the support of an external challenge panel to ensure key strategic judgments are informed and robust before SR25 settlements are finalised.

G) Green Book supplementary guidance: Publish supplementary guidance which clarifies how to correctly apply Green Book principles when developing DDaT business cases, with a particular emphasis on legacy tech.

Improved outcome metrics and evaluations:

H) Outcome metrics: Develop new outcome metrics for tracking the benefits being delivered from major DDaT investments, including through the portfolio of pathfinders.

I) Evaluation plans: Evaluation Taskforce to provide advice on the development of robust evaluation plans for a few high value DDaT investments funded at SR25, focusing in particular on legacy tech.

J) Strategic decisions at SR27: Enable more strategic decisions at SR27, by agreeing priorities for business case development through the Digital IMG at least six months before the SR27 process begins.

1.3 Implementation approach

  • Agree on a portfolio of pathfinders from DDaT work funded in SR25 phase 2 to test and scale new funding models that enable faster spending approvals for new initiatives and greater flexibility in budgets for existing ones.

  • Launch targeted training at the start of SR25 phase 2 and publish new Green Book supplementary guidance for DDaT work to help departments provide better evidenced proposals. Elevate the consideration of DDaT bids in SR25 phase 2 decisions.

  • Support departments between SR25 and SR27 to develop outcome metrics and evaluation plans for DDaT work funded at SR25, and strengthen business cases for emerging DDaT priorities. This aims to use outcome data for future performance tracking and build a stronger evidence base for decisions at SR27.

2. Review background and methodology

The CST initiated a government-wide performance review of digital spending, aiming to address three core objectives:

  • Strategic alignment: Ensuring that investments in digital, data, and technology (DDaT) are strategically sound and effectively deliver intended outcomes.

  • Value for Money (VfM): Developing methods to track and demonstrate the value generated by digital spending, ensuring the efficient allocation of resources.

  • Decision-making evidence: Supporting departments in creating robust evidence to substantiate spending decisions.

This review sought to identify and replicate best practices in digital spending while addressing challenges in the current processes. The review adopted a two-pronged approach to gather a comprehensive mix of quantitative and qualitative evidence, focusing on broad insights and deeper engagement:

Track 1: Broad Insight
  • What: Building a broad evidence base and a baseline of current practices and ways of working.
  • How: Tailored survey questions for stakeholder groups.
  • Who: 18 responses from Finance Directors, SROs for major DDaT projects and programmes, and HMT Spending Teams.

Track 2: Deep dives on focal themes
  • What: Deeper understanding of specific user groups on four focal themes: outcome-based funding; better business cases and delivery methods; proportionate governance and assurance; enabling continuous improvement of products and services.
  • How: User-centred, scenario-based workshops.
  • Who: 35 attendees from 12 organisations[footnote 1], ranging in seniority from product directors and service owners to finance strategists and economists, who brought diverse functional perspectives (e.g. finance, digital, analytical, operational delivery).
  • How: Interviews with internal senior leaders and external experts.
  • Who: 15 internal stakeholders (DG/DD level across digital, finance, and operational delivery functions in departments) and 6 external stakeholders (a selection of consultancies, thought leaders, think tanks, and partners).

A Steering Group of senior officials was established to oversee the progress of the review, which included representatives from HMT, DSIT, GDS, the National Infrastructure and Service Transformation Authority[footnote 2] (NISTA), the Government Security Group (GSG), and departmental finance and digital representatives from MoD, DWP, and Land Registry.

3. The case for change

The Review has emphasised that a paradigm shift is needed in how HMG allocates and funds digital programmes, so that it can make better strategic investments in areas where long-term benefits are uncertain and speculative but potentially large. This would enable departments to drive efficiency and productivity through digital service transformation.

3.1 How to ensure HMG’s spending on DDaT is strategically sound and enables intended outcomes

Business case realism for larger-scale change programmes: The Review Team was presented with a range of opinions on current business case processes. Many SROs and Finance Directors told us that they find the existing governance models suitable for large-scale DDaT programmes, but overly complex for smaller enhancements and tactical projects (see below). There was recognition that, whilst the Green Book remains an applicable tool for DDaT investment proposals, incremental changes to guidance would make an appreciable short-term impact. Digital and senior stakeholders raised concerns that the level of certainty required by the business case process is often not available or realistic, and that business case creation is lengthy and cumbersome, often slowing down service delivery. This can mean legacy platforms have to be extended whilst the process is worked through. Concern was also raised over the duplication of activities within departments at different stages of the process, which many viewed as adding little value.

Approving more innovative DDaT spending: In contrast to large-scale programmes, there are significant concerns that the level of detail required in business cases at the start of more innovative DDaT initiatives (e.g. those involving generative AI) can often be too onerous or, in the worst cases, completely prohibitive. In particular, future milestones and benefits can be too speculative to provide the robust evidence required to permit funding under the existing guidance and appraisal methods.

Processes for modifying existing programmes: There are significant concerns that the level of detail needed in business cases for smaller changes to existing DDaT programmes is too high, and that this can lead to inefficient resource allocation and significant delays in delivery. The evidence from interviewees suggests much of this is driven by departmental internal assurance processes, which respondents often thought were excessive, rather than by those mandated by HMT or GDS. This lack of proportionality in governance creates bottlenecks, leading to inefficiencies in resource allocation and project execution. A common recommendation is to simplify governance for smaller projects while maintaining rigour for major initiatives.

Impact of financial pressures: Financial pressures both at SRs and between SRs have led to strategic DDaT investments being deprioritised, particularly in cases where benefits are less quantifiable. The tendency to prioritise short-term savings at the expense of long-term investments is seen as a key issue. Since SR21, some departments have also cut agreed projects to cover financial pressures.

Lack of funding for the running and continuous improvement of digital services: Funding for running services is not always reflected in business cases. Even when it is, it is often seen by departments as something that can be cut in-year for efficiency and short-term savings. Continuous improvement funding is also frequently deprioritised which accelerates the accumulation of technical debt. There are also some concerns that tying continuous improvement funding to specific programmes leads to a loss of expertise when people move roles.

3.2 How to track and prove the value of this spending effectively to safeguard value for money

CDEL to RDEL switches: Digital spending increasingly uses off-the-shelf solutions with subscription-based business models (e.g. Software as a Service, SaaS), which are categorised as RDEL rather than CDEL. Some of this spending was assumed to be CDEL when approved, which is leading to funding challenges because of the difficulty of switching budgets from CDEL to RDEL.

Lack of agreed metrics for tracking outcomes: Most business cases for digital transformation projects do not include agreed outcome metrics for tracking benefits realisation. Barriers identified include a lack of knowledge, at the inception of a programme, of which outcome metrics the project will directly affect, and historically low demand from senior officials and Ministers for regular access to the management information and outcome data that is already available. There is some evidence of outcomes being tracked retrospectively (e.g. NHS tracking of sepsis outcomes), but data on whether delivery has impacted the target outcomes is not being systematically evaluated or used to inform future digital delivery.

Effectiveness of existing central assurance process: The previous iteration of the Quarterly Business Review assurance process was seen as being too broad in scope and insufficiently focused on service performance. A couple of respondents also thought that the National Infrastructure and Service Transformation Authority (NISTA) could play an increased role in assessing performance.

3.3 How to enable departments to generate robust, high-quality evidence to support spending decisions

Limited engagement of functional expertise in policy and options development: Both Spending Teams and departments thought that, for many DDaT programmes, the absence of digital and other functional experts early in the policy-making process leads to only one pre-preferred policy option being considered and presented. This often prevents the full range of digital delivery options from being explored and leads to lower value for money. Furthermore, business cases often focus on just one commercial supplier, driven by factors such as the cost and risk of transitioning to new suppliers, limited commercial expertise within departments, and the small number of suppliers for certain digital services. Involving digital experts from the outset and expanding the delivery options considered is crucial to improving this process. It was also suggested that NISTA could support the development of commercial cases, to ensure a wider range of solutions is explored.

Departments’ use of existing appraisal approaches: Departments are not always using the best methods for appraising costs and benefits under the current guidance. For example, many departments are not using agile funding approaches, despite these being allowed under existing HMT guidance. Additionally, some departments rely on the Benefit Cost Ratio (BCR) method to justify investments in risk mitigation, where alternative methodologies such as cost minimisation would give a better understanding of their expected value. Using the wrong method leads to poor investment choices, with some high-risk cases deprioritised.
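
By way of illustration, the sketch below compares two hypothetical legacy-replacement options on discounted whole-life cost, the kind of comparison a cost-minimisation approach involves. The figures, option profiles and time horizon are illustrative assumptions rather than departmental data; the 3.5% rate is the Green Book's standard discount rate.

```python
# Illustrative cost-minimisation comparison for a risk-mitigation investment.
# All figures are hypothetical; 3.5% is the Green Book's standard discount rate.

DISCOUNT_RATE = 0.035

def present_value(costs_by_year):
    """Discount a stream of annual costs (year 0 = today) to present value."""
    return sum(cost / (1 + DISCOUNT_RATE) ** year
               for year, cost in enumerate(costs_by_year))

# Option 1: keep extending the legacy system (rising run and support costs), £m per year.
extend_legacy = [2.0, 2.3, 2.6, 3.0, 3.4, 3.9]

# Option 2: replace the system (up-front build, then lower run costs), £m per year.
replace_system = [6.5, 1.2, 0.8, 0.8, 0.8, 0.8]

options = {
    "Extend legacy": present_value(extend_legacy),
    "Replace system": present_value(replace_system),
}

for name, whole_life_cost in sorted(options.items(), key=lambda item: item[1]):
    print(f"{name}: whole-life cost £{whole_life_cost:.1f}m (present value)")

# Under cost minimisation, the option that achieves the required risk position at
# the lowest whole-life cost is preferred; no speculative benefit monetisation
# (and hence no BCR) is needed to make the case.
```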

Spending Team capability: Spending teams expressed a need for more technical support to robustly assess DDaT spending bids. The challenges they face in assessing bids are compounded by the often very tight timeframes for assessing strategic Outline Business Cases during SRs, and some Spending Teams also being unaware of the wider support available to them for this from central functions.

4. International examples of funding reform

New South Wales

In recognition of similar shortcomings in funding digital and public service transformation outcomes, the Australian state of New South Wales redesigned its digital funding model in three critical ways. First, it released funding in smaller increments tied to progress toward specific outcomes, which reduced risk and encouraged agile behaviour. Second, it transitioned from funding multi-year projects to funding persistent teams that delivered end-to-end customer journeys. Third, it reformed governance to focus on outcomes.

Victor Dominello, New South Wales’ minister for customer service, explained why funding reform was necessary: “Leading digital governments are thinking about funding in different ways, and putting that at the centre of their political programmes. Good governments have to compete with the likes of Apple, Google, and Amazon. People are used to customer service and digital tools that work.”

The State of New Jersey

Similarly, New Jersey’s unemployment insurance team was making improvements every few days throughout and beyond the pandemic thanks to the application of a different funding approach. The Department of Labor celebrated a success built on “a continuous approach to IT modernization over an all-or-nothing strategy.” Dave Cole, New Jersey’s Chief Innovation Officer, said at a press conference: “Far too often, other states, large and small, have spent hundreds of millions of dollars to do one monolithic overhaul of their UI technology and applications, only for the resulting experience to remain just as confusing, just as frustrating, and just as demoralizing for claimants and state UI staff.”

United States - Office of the Administrator for the General Services Administration

The U.S. Technology Modernization Fund (TMF) is another notable example enacted at the federal level. It provides government agencies with funds to modernise legacy IT systems, and prioritises technology solutions to improve delivery of mission-critical services and projects that can serve as common solutions and/or inspire reuse. The TMF allows for flexible, outcome-driven investment, with a focus on projects that demonstrate quick wins and long-term impact. The savings generated from the modernisation efforts are used to pay back the initial funding, creating a self-sustaining cycle of continuous improvement and innovation.
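
The repayment mechanic can be illustrated with a simple schedule. The award size, savings and repayment share used below are hypothetical assumptions for illustration only, not actual TMF terms:

```python
# Hypothetical revolving-fund payback schedule, illustrating the idea behind the
# TMF described above: an up-front award is repaid from the savings the
# modernisation generates, so the fund can be recycled into further projects.
# Figures are illustrative assumptions, not actual TMF terms.

award = 10.0            # $m advanced to the project in year 0
annual_saving = 3.0     # $m of recurring savings once the modernised system is live
repayment_share = 0.8   # share of each year's savings returned to the fund

outstanding = award
year = 0
while outstanding > 0:
    year += 1
    repayment = min(annual_saving * repayment_share, outstanding)
    outstanding -= repayment
    print(f"Year {year}: repaid ${repayment:.1f}m, ${outstanding:.1f}m outstanding")

print(f"Fund replenished after {year} years; later savings stay with the agency.")
```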

United States - Department of Defense (DoD)

The DoD has implemented transformative funding reforms to address the challenges of the “color of money,” traditionally segmented into rigid categories of operational and capital expenditure. These reforms, including the U.S. Army’s indefinite use of Research, Development, Test & Evaluation (RDT&E) funding for software development and the introduction of a budget activity pilot programme, blur traditional expenditure boundaries to enable continuous integration and delivery. A new Software Acquisition Pathway further supports agile methodologies, allowing programme managers to dynamically allocate resources across development, acquisition, and sustainment activities. These initiatives enhance flexibility, reduce inefficiencies, and align funding mechanisms with the iterative and evolving nature of modern technologies.

Estonia

Estonia has implemented an innovative stage-gated funding model to optimise its digital investments, linking financial releases to demonstrated value delivery. Initiated in 2022, this approach is supported by a dedicated assurance team within the Central Digital Office, fostering accountability and value-based decision-making. The system integrates operational expenditure planning into the upfront capital expenditure process, estimating future maintenance needs at 20% of the total CapEx over a system’s five-year lifecycle. This model is underpinned by a strong partnership between the Ministry of Finance and the Central Digital Office, where financial and digital expertise converge to ensure agility and value for money. By coupling strategic assurance with flexible funding, Estonia is enhancing its capacity to sustain and evolve digital services effectively.
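
As a rough worked example of that planning rule, assuming the 20% is the total maintenance allowance spread across the five-year lifecycle (figures are hypothetical):

```python
# Worked example of the Estonian planning rule described above: future
# maintenance is estimated at 20% of total capital expenditure over a
# five-year system lifecycle. Figures are hypothetical, and the 20% is
# assumed to be the total allowance for the whole lifecycle.

capex_total = 5_000_000        # up-front build cost in euros (hypothetical)
maintenance_share = 0.20       # maintenance estimated at 20% of total CapEx
lifecycle_years = 5

maintenance_total = capex_total * maintenance_share
maintenance_per_year = maintenance_total / lifecycle_years

print(f"Operational budget to plan up front: €{maintenance_total:,.0f} over {lifecycle_years} years")
print(f"Indicative annual maintenance allowance: €{maintenance_per_year:,.0f}")
```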

Previous initiatives in the United Kingdom

The UK government has successfully implemented persistent, outcome-driven funding models, in certain limited cases. For example, the GOV.UK Notify service was funded for continuous operation, allowing it to pivot quickly during the COVID-19 pandemic and send millions of notifications for various public sector organisations. This approach proved more cost-effective than the previous project-based plan (engaging a vendor to build an emergency alerts solution). However, such successes remain exceptions rather than the norm across the public sector in the UK.

Private sector

While still exceptional in the public sector, these approaches are reflective of more standard practice in the private sector. Lloyds Banking Group transformed its IT services and funding processes by moving from a fragmented, waterfall project approach to an agile, value-stream-based model focused on customer journeys. Previously, the bank managed around 4,000 siloed projects annually, leading to inefficiencies, misalignment with strategic goals, and frequent overruns. By reorganising into 10 customer journey-focused value streams, funding teams rather than individual projects, and introducing OKRs for prioritisation and transparency, Lloyds improved oversight and adaptability. These changes resulted in significant cultural and operational shifts, including faster delivery, better alignment with strategy, and improved cost and risk visibility. The transformation helped Lloyds to become the UK’s largest digital bank, with over 13 million digitally active customers, and to achieve significant cost savings and improvements in customer satisfaction.

5. Recommendations

The government must enhance its ability to understand the value and impact of its digital investments. Currently, every digital investment is put through the same business case approval process, regardless of scale or risk, leading to inefficiencies and delays in delivery. This over-reliance on these processes stems from risk aversion, a lack of proportionate application of current guidance by departments, and a limited understanding of digital delivery methods within review and approval mechanisms. To address these challenges, the Review Team proposes a shift toward faster, smarter, and more proportionate funding processes. By reducing the time, cost, and administrative burden of current systems, and focusing on risk-value alignment, HMG can better support strategic investments and enable quicker realisation of benefits. Finally, improved outcomes metrics and evaluation will provide the data needed to track performance, more readily adapt programmes, and ensure that lessons learned inform future investments.

The Review Team’s recommendations are built on three key pillars: testing alternative funding mechanisms, enhanced training and guidance, and improved outcomes metrics and evaluation.

They are underpinned by the principles set out in the Annex, which are intended to help digital, financial, commercial and PPM professionals across government understand, easily reference, and replicate the HMT, GDS, and NISTA preferred attitude to funding for digital, as set out in this report.

A portfolio of pathfinders will establish and test new funding models that align more closely with the realities of digital delivery, reducing reliance on traditional processes and enabling more strategic, outcome-focused decision-making. Enhanced training and technical support will empower departments and Spending Teams to prepare stronger, leaner, and data-driven business cases and apply Green Book guidance correctly. Together, these efforts will streamline processes, build capability, and lay the foundation for a more effective and impactful approach to digital investment.

5.1 A portfolio of pathfinders to test new funding models

To drive more effective and sustainable investment in digital and technology initiatives, HMT and GDS are introducing a strategically targeted portfolio of pathfinders designed to reshape funding practices. This approach moves away from traditional, waterfall-based governance models, embracing a lean, agile framework that aligns funding decisions with demonstrable progress and measurable outcomes. Funding products, services, and user journeys directly, rather than through broader transformation programmes, will aim to break the inefficient boom-and-bust funding cycle. These pathfinders will focus on reducing disproportionate governance while providing pathways to support uncertain, high-potential initiatives that are much less likely to be funded through the traditional business case approval process. This model will also allow funding decisions to be iterative, responsive, and informed by ongoing delivery insights, ensuring flexibility and pace without compromising accountability.

To test and refine this innovative approach, HMT and GDS will establish four funding models as part of the portfolio, each designed to streamline decision-making and accelerate delivery:

  • Two models adopt staged funding: one for live services, where funding is based on performance against agreed metrics; and one for innovative technologies, where progress is demonstrated through regular showcases.

  • The third model focuses on outcome-based portfolios which will operate under a single, multi-year business case that defines enduring outcomes, using Lean Portfolio Management principles to streamline resource allocation across various projects to ensure optimal outcomes.

  • The final model addresses risk-reduction in technical debt and cybersecurity by establishing departmental tech and cyber risk appetite which reflects the broader risk to the government and public from the department’s activities and creating investment plans to address legacy systems and technical debt. It emphasises long-term risk mitigation, prioritising improvements in security and infrastructure, with measurable tracking of progress.

The pathfinders will address critical areas such as technical debt/legacy tech and cyber security, ensuring priority risks are managed effectively. Supported by GDS, HMT, and NISTA, this portfolio will demonstrate the feasibility of different funding models while reducing bureaucracy, improving delivery speed, and enabling leaders to better understand what works and what does not. The lessons learned from these pathfinders will inform future funding practices and establish a foundation for broader adoption across government.
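
To make the staged-funding mechanics more concrete, the sketch below shows one way a regular funding review could be expressed: reported outcome metrics are compared with agreed targets and the review recommends releasing the next tranche, holding it, or stopping and reallocating. The metric names, thresholds and tranche size are illustrative assumptions only, not an agreed HMT, GDS or NISTA process.

```python
# Illustrative staged-funding gate: a minimal sketch of how a regular review
# might turn outcome-metric evidence into a funding recommendation.
# Metric names, thresholds and tranche sizes are hypothetical assumptions.

from dataclasses import dataclass

@dataclass
class MetricResult:
    name: str
    target: float
    actual: float

    @property
    def met(self) -> bool:
        # All metrics are expressed so that higher values are better.
        return self.actual >= self.target

def gate_decision(results: list[MetricResult], next_tranche: float) -> str:
    """Recommend an action for the next funding tranche based on agreed metrics."""
    share_met = sum(r.met for r in results) / len(results)
    if share_met >= 0.75:
        return f"Release next tranche of £{next_tranche:.1f}m"
    if share_met >= 0.5:
        return "Hold tranche; agree a recovery plan and re-review at the next gate"
    return "Stop funding and reallocate to other initiatives in the portfolio"

# Example quarterly review for a hypothetical live-service pathfinder.
review = [
    MetricResult("Digital uptake (% of transactions)", target=60, actual=66),
    MetricResult("Service availability (%)", target=99.5, actual=99.7),
    MetricResult("User satisfaction score", target=80, actual=77),
    MetricResult("Critical legacy components retired", target=3, actual=3),
]

print(gate_decision(review, next_tranche=2.5))
```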
A. Staged funding for innovative technologies

Further detail including the problem(s) it would address:
  • An alternative mechanism to secure funding for trialling, testing, and potentially scaling new and innovative technologies, e.g. AI.
  • Suitable for initiatives which aim to improve outcomes and delivery efficiencies where there are high levels of uncertainty and meeting the usual standards for evidencing cost-benefit would be disproportionate.
  • A funding request would be considered based on a less comprehensive, more hypothetical analysis of the predicted outcomes. For approved projects, funding would then be released in a staged manner under an alternative governance process. Where work demonstrates progress towards the desired outcomes, further funding will be released; where it does not, funding will be stopped and reallocated.

Implementation milestones & completion date:
  • Pathfinder with one initiative launched in Q1 FY25/26.
  • Further initiatives identified after SR25 settlements are agreed.
  • GDS, HMT and NISTA support package established in time for Spending Review settlements.
  • Target to solidify as a dedicated investment route for use at scale by all organisations by SR27.

B. Staged funding for live services

Further detail including the problem(s) it would address:
  • Funding model based on robust and transparent performance management, tied directly to agreed outcomes.
  • Combines change and run funding for products, services, and/or user journeys rather than funding these from transformation programmes.
  • Utilises short-form agile business cases and regular review processes based on performance metrics and data-driven insights to unlock subsequent funding, enabling delivery at pace and demonstration of progress.
  • Introduction of an earned autonomy model for business cases and funding requests, using progression of delivery against outcomes to inform funding decisions.

Implementation milestones & completion date:
  • Pathfinder with one initiative launched in Q1 FY25/26.
  • Initial proof of concept for the Live Services pathfinder launched with NHS in Q1 FY25/26.
  • Further initiatives identified after SR25 settlements are agreed.
  • Other opportunities identified and ready for activation by April 2026.
  • Target to solidify as a dedicated investment route for use at scale by all organisations by SR27.

C. Portfolio outcome-based funding

Further detail including the problem(s) it would address:
  • A blended programme and portfolio approach to deliver and sustain enduring outcomes, driven by a thematic collection of capabilities.
  • The model maximises the flexibility allowed under existing Green Book guidance and utilises the best practice approach detailed in NISTA’s Teal Book.
  • A single multi-year business case for the whole portfolio retains the level of scrutiny that is proportionate for large programmes to ensure VfM.
  • Departments are given autonomy to shift funding between smaller projects that fall within the overall portfolio, reducing potential bottlenecks for more innovative or minor changes.
  • The model adopts Lean Portfolio Management, integrating lean and agile principles to drive value delivery and aligning a portfolio of initiatives with business strategy and customer value while ensuring efficiency and agility.

Implementation milestones & completion date:
  • Pathfinder with one initiative launched in Q1 FY25/26.
  • Immediate support provided to the implementation of MoD’s Capability Portfolio Management.
  • Further initiatives identified after SR25 settlements are agreed.
  • Other opportunities identified and ready for activation by April 2026.
  • Approach embedded and adopted at scale by SR27.

D. Risk reduction in technical debt and cyber security

Further detail including the problem(s) it would address:
  • Departments to establish a tech and cyber risk appetite which reflects the broader risk to the government and public from the department’s activities.
  • Introduce a rule for transformation programmes requiring legacy risk and tech debt to be addressed as part of the overall investment plan.
  • Provide clear guidance for departments to bid to improve their risk position, regardless of short-term financial returns or cashable savings.
  • Ensure departments prioritise risk reduction, enable a better understanding of the cost and time needed to improve HMG’s technical debt and cyber risk positions, and enable better tracking of the delivery of improvements. Value for money calculations should be risk-adjusted to reflect the broader tech and cyber risk exposure to the government and public that departmental activity can have.

Implementation milestones & completion date:
  • Approach established for investment over the SR25 period.

5.2 Training, guidance and enabling more strategic spending decisions at SR25 

To improve the quality of DDaT investment decisions, the Review Team recommends targeted actions focused on training, guidance, and enabling more strategic spending. The review revealed significant gaps in how departments develop and evidence SR bids and Business Cases for DDaT investments. Common issues include the misuse of Green Book appraisal approaches for risk-reducing investments, underutilisation of agile methods for innovative spending, and underestimation of funding for operational and maintenance activities. Spending Teams also highlighted the need for greater technical support in evaluating bids.

To address these challenges, the Review Team proposes enhanced training for departments, supplemented by technical support from GDS to Spending Teams, and clear, concise guidance that clarifies the correct application of Green Book principles to DDaT Business Cases, with a particular emphasis on legacy technology replacement. GDS will emphasise that policy teams should take a proactive approach to reflecting digital considerations from the outset, which will help ensure that the full range of delivery options is considered. This approach can reduce the risk of failure and the need to rework proposals, which often arises when digital, and other functional, expertise is engaged too late in the development of options. Utilising common digital services for approvals that are already being deployed will also help to further reduce friction, ensuring efficiency, consistency, and alignment with pan-government policies.

In addition to these measures, and in line with the overall Zero-Based Review (ZBR) approach to SR25 phase 2, HMT and DSIT are taking a digital-first approach to considering DDaT spend within the SR. Departments have been asked to involve their CDIO and internal digital functions when preparing ZBR returns. The Digital IMG will then have a crucial role to play in ensuring that digital expertise is baked into consideration of fundamental reforms, including for areas not currently digitalised. HMT and GDS will provide Digital IMG with consolidated advice across DDaT proposals, with the support of an external challenge panel to ensure key strategic judgments are informed and robust before SR25 settlements are finalised. By combining improved training, technical guidance, and strategic oversight, these recommendations aim to strengthen the overall quality of DDaT investment planning and decision-making across government.

The review also showed that the increased use of Anything-as-a-Service (XaaS) models is driving the increased use of RDEL rather than CDEL for digital projects, which is causing funding challenges for departments. The Review Team has established a dedicated workstream to further investigate these issues, and potential options to help address the impacts. Further advice on this will be provided to Ministers in March 2025.

E. Targeted DDaT training for departments and support for Spending Teams

Further detail including the problem(s) it would address:
  • Training for departments as part of SR25 phase 2 on how to evidence DDaT investment in business cases, making use of best practice examples for different types of DDaT investment (e.g. legacy, AI, security, frontline digital services).
  • Would focus in particular on upskilling departments on how to robustly demonstrate the risk reduction benefits of proposals on legacy tech, given this is a priority for SR25 phase 2 and bids in the past have often been poorly evidenced.
  • GDS will also provide an enhanced support package for HMT Spending Teams, including a formalised support route through SR25 phase 2 and future fiscal events, and regular training on DDaT assurance outside of fiscal events.
  • A targeted training module for DDaT heavy departments on how to use agile for DDaT spending following SR25 phase 2, given this review indicated that use of agile approaches is currently very low despite the fact that they would provide a route to more easily approve innovative spending, e.g. on AI.

Implementation milestones & completion date:
  • Training for departments provided after the launch of SR25 phase 2, in Q1 2025.
  • Enhanced GDS support package for Spending Teams confirmed by Q1 2025.
  • Targeted training module on how to use agile for DDaT spending developed and delivered to DDaT heavy departments by the end of 2025.

F. A digital first approach to SR25

Further detail including the problem(s) it would address:
  • Collaboration between the centre of digital government and departments, with DSIT working with all departments to coordinate and assess ZBR returns.
  • DSIT will use returns to: (i) identify the lowest value digital spending across government; and (ii) identify those areas of government that either provide the best opportunity to drive better outcomes for citizens or to save money through the application of modern digital approaches to service delivery and administration.
  • Generate innovative ideas for improvement and opportunities for wider digital capacity beyond immediate DDaT spend, ensuring HMG takes full advantage of wider digitisation opportunities to identify efficiencies that could be made using digital solutions. For example, this could include opportunities to augment very manual processes with technology and to adopt common digital solutions across multiple areas to save money for the taxpayer.
  • Consolidated advice on DDaT ZBR returns and investment priorities, with the Digital IMG playing a crucial role in providing advice and in considering the fundamental, cross-cutting reforms required to bring about digital transformation.

Implementation milestones & completion date:
  • An external challenge panel to provide support and advice to the Digital IMG on ZBR returns and cross-cutting priorities for SR25 phase 2.
  • Digital IMG meeting on the outcome of the digital ZBR to advise on the returns and potential areas for digitisation.

G. Green Book supplementary guidance

Further detail including the problem(s) it would address:
  • Updating and publishing the existing DDaT Benefit Framework Guidance to make it both more practical and more visible to departments. This guidance provides advice on how to consistently estimate and appraise the benefits and costs of DDaT investments in areas such as interoperability, data and security, but has relatively low usage due to its poor visibility within government.
  • New supplementary guidance on legacy technology to provide more detailed guidance than the above, reflecting the added challenges associated with legacy technology, e.g. the need to estimate the costs of continuing with legacy systems.
  • This would help address the problems the Review identified of departments: (i) often using the wrong appraisal methodology, or misapplying the right one, when evidencing the costs and benefits of DDaT investments; and (ii) a particular lack of robustness in legacy tech related Business Cases that focus on risk mitigation.
  • Alongside these steps, in 2025/26, HMT also plan to: publish a bitesize example of how to do a lifetime appraisal for a DDaT investment that includes operational, maintenance and decommissioning costs; and strengthen the requirements for departments to publish business cases for projects on the GMPP, including for DDaT. The publication of a higher share of business cases for DDaT projects will provide departments with more best practice examples of robust DDaT business cases than are currently publicly available.

Implementation milestones & completion date:
  • New DDaT Benefit Framework Guidance and Legacy Technology Guidance published in 2025/26.
  • Focus on new digital elements in comms and training on the Green Book over the rest of 2025/26 (with a tailored comms plan targeting digital leads and appraisal teams).

5.3 Improved outcome metrics and evaluations

To strengthen the impact and value for money of DDaT investments, the Review Team recommends a renewed focus on outcomes metrics and evaluations. The Review highlighted significant gaps in the ability to track and measure the benefits of DDaT programmes, with many lacking robust outcome metrics and HMT and central functions having limited visibility of programme performance. This absence of systematic tracking diminishes the ability of departments and central teams to adapt and improve underperforming programmes. Additionally, DDaT programmes and projects are often concluded without evaluations, preventing the lessons learned from them being effectively applied to future digital initiatives.

To address these challenges, the Review Team proposes a phased approach to improving outcomes measurement and evaluation. Following SR25, HMT and GDS will work with departments to baseline their current metric capabilities and support the development of new outcome metrics for key DDaT initiatives. One of the key ways this will be realised is through the portfolio of pathfinders, which will have a strong focus on demonstrating progress against outcome metrics in exchange for faster and more agile funding arrangements. GDS and HMT will look to institutionalise this approach, partnering with the Evaluation Taskforce to ensure we are able to better link outcomes to digital investments and improve how we track and evaluate whether those outcomes are being achieved. Looking ahead to SR27, HMT will enable more strategic and evidence-based spending decisions by agreeing, through the digital IMG, the strategic priorities for developing Business Cases at least six months before the SR27 process begins. These efforts will ensure a stronger, data-driven foundation for future DDaT investments.

H. Outcome metrics

Further detail including the problem(s) it would address:
  • Develop with departments new outcome metrics for tracking the benefits being delivered from major DDaT investments, in cases where metrics are lacking (or focused only on outputs rather than outcomes).
  • This would help address the Review finding that a large majority of digital transformation projects do not have outcome metrics (with APIs for reporting).
  • Metric development would be prioritised for high value DDaT investments linked to the Government’s Missions and/or frontline delivery, e.g. major NHS digitisation programmes. A key way this will be taken forward will be through the pathfinders, which will have a strong focus on outcome metrics.
  • In parallel, HMT and GDS will develop and agree new outcome reporting arrangements for DDaT programmes that are embedded within existing performance reporting provided to the centre, including through the NISTA and on strategic outcomes.
  • Post SR27, departments will begin to be required to report regularly on outcome metrics through a new performance reporting regime for digital delivery.

Implementation milestones & completion date:
  • Process for improving metric capability for DDaT programmes agreed by the end of SR25 phase 2 in Q2 2025, as part of new processes for tracking progress against the new Digital Centre’s objectives. This will include integration with the GDS-led performance processes.
  • Use data on funded DDaT programmes collected as part of the SR25 process to create a single list of DDaT programmes for tracking across government, together with the Missions/strategic outcomes each of these programmes links to, by Q3 2025.
  • Develop with the NISTA new outcome reporting arrangements for DDaT programmes that are embedded within existing processes by Q2 2026.
  • Departments required to report regularly on outcome metrics through a new performance reporting regime for digital delivery post SR27.

I. Evaluation plans

Further detail including the problem(s) it would address:
  • Evaluation Taskforce to provide advice on the development of robust evaluation plans for a few high value DDaT investments funded at SR25 (with no existing evaluation plans).
  • This would address the review’s finding that major DDaT programmes currently finish without completing evaluations of whether intended outcomes have been achieved.
  • HMT and GDS will prioritise developing evaluation plans for major DDaT investments funded at SR25 without a strong evidence base and/or those which cannot be robustly tracked through outcome metrics, e.g. those relating to legacy tech remediation.

Implementation milestones & completion date:
  • DDaT programmes which will receive Evaluation Taskforce support will be agreed at the conclusion of SR25 phase 2.
  • Evaluation plans for these areas developed within one year, by Q2 2026.
  • Review at Q2 2026 of whether there are further areas where the ETF should provide evaluation support ahead of SR27.

J. Stronger business cases

Further detail including the problem(s) it would address:
  • Digital IMG will agree a list of emerging DDaT strategic priorities at least six months before the launch of SR27. Departments will then be asked to prioritise developing Outline Business Cases (OBCs) for these emerging priorities ahead of SR27.
  • This will enable departments to prioritise developing more strongly evidenced business cases for those areas for use in the SR27 process, and would help ensure that SR27 bids reflect the centre’s priorities for new DDaT investment.
  • It would support the Willetts Review recommendation to use OBCs more at SRs, to improve the robustness of decisions and to speed up the delivery of agreed DDaT projects post SR.

Implementation milestones & completion date:
  • Publication of an updated GDS multi-year Digital and AI Roadmap in Summer 2025.
  • Develop with Spending Teams, from Q1 2026 (tbc, since dependent on SR27 timelines), a list of strategic priorities for OBC development ahead of SR27 launch.
  • Digital IMG to agree the list of priorities for OBC development in Q2 2026 (tbc, since dependent on SR27 timelines), six months before SR27 launch.
  • Departments start developing agreed OBCs from Q2 2026, for consideration as part of the SR27 process.

6. Implementation timeline

HMT and GDS will provide an annual update in 2025, 2026 and 2027 on progress against these actions.

6.1 Now - up to SR25 settlements

Pathfinders

  • Pathfinders with one initiative, for each of the (i) staged funding for innovative technologies model, (ii) staged funding for live services model and (iii) portfolio outcome-based model, launched in Q1 FY25/26.

  • Initial proof of concept for the Live Services pathfinder launched with NHS in Q1 FY25/26.

  • Immediate support provided to MoD’s Capability Portfolio Management as part of the Portfolio pathfinder.

  • Other volunteer projects onboarded to the pathfinders.

  • GDS, HMT, NISTA support package established in time for Spending Review settlements.

Support and guidance

  • Training for departments provided as part of SR25 phase 2, in Q1 2025.

  • Enhanced GDS support package for Spending Teams confirmed by Q1 2025.

  • New DDaT Benefit Framework Guidance and Legacy Technology Guidance published in 2025/26.

Strategic investment approach 

  • Publication of updated GDS multi-year Digital and AI Roadmap in Summer 2025.

  • An external challenge panel to provide support and advice to the digital IMG on ZBR returns and cross-cutting priorities for SR25 phase 2.

  • Digital IMG meeting on the outcome of the digital ZBR to advise on the returns and potential areas for digitisation.

6.2 Next - after SR25 settlements

Pathfinders

  • Further initiatives identified after SR25 settlements are agreed.

  • The risk reduction in technical debt and cyber security model will be established over the SR25 period.

  • Pathfinders continue, with ongoing evaluation.

  • Other pathfinder opportunities identified and ready for activation by April 2026.

Support and guidance

  • Focus on new digital elements in comms and training provided on the Green Book over the rest of 2025/26 (with a tailored comms plan targeting digital leads and appraisal teams).

  • Targeted training module on how to use agile for DDaT spending developed and delivered to DDaT heavy departments by the end of 2025 (post SR25 phase 2).

Strategic investment approach 

  • DDaT programmes which will receive Evaluation Taskforce support will be agreed at the conclusion of SR25 phase 2.

  • Process for improving metric capability for DDaT programmes agreed by the end of SR25 phase 2 in Q2 2025, as part of new processes for tracking progress against the new Digital Centre’s objectives. This will include integration with the GDS-led performance processes.

  • Use data on funded DDaT programmes collected as part of the SR25 process to create a single list of DDaT programmes for tracking across government, together with the strategic outcomes/Missions each of these programmes links to, by Q3 2025.

6.3 2026 onwards

Pathfinders

  • Pathfinders continue, with ongoing evaluation.

  • Target to solidify pathfinders as dedicated investment routes by SR27.

Strategic investment approach

  • Develop with Spending Teams, from Q1 2026 (tbc), a list of strategic priorities for OBC development ahead of SR27 launch.

  • Evaluation plans for ETF areas developed within one year, by Q2 2026.

  • Review at Q2 2026 of whether there are further areas where the ETF should provide evaluation support ahead of SR27.

  • Digital IMG to agree list of priorities for OBC development in Q2 2026 (tbc), six months before SR27 launch.

  • Develop with the NISTA new outcome tracking arrangements for DDaT programmes that are embedded within existing processes by Q2 2026.

  • Departments start developing agreed OBCs from Q2 2026, for consideration as part of the SR27 process.

  • Departments required to report regularly on outcome metrics through a new performance reporting regime for digital delivery post SR27.

7. Annex: The principles of the new funding approach

7.1 Funding approach

  • An alternative, non-waterfall governance process where continued funding decisions are informed by the demonstration of progress in delivery.

  • The approach encompasses transformation and funding for running products, services, and/or user journeys, instead of these being funded from transformation programmes.

  • It will reduce disproportionate governance and tackle boom and bust cycles of funding for digital and technology.

  • It will provide a path to funding for outcomes and delivery efficiencies where there are high levels of uncertainty.

  • The introduction, guidance and the governance for these approaches are themselves agile and lean.

  • This approach operates comfortably alongside and in conjunction with the current programmatic and project methodologies.

  • These approaches will receive a package of multi-functional and multi-organisational support led by GDS, HMT, and NISTA.

7.2 Guiding principles

  • The funding approach reduces bureaucracy instead of adding to it.

  • It is lean.

  • It reduces time to delivery.

  • The approach recognises services, products and programmes.

  • It increases understanding of what works and what does not.

  • The approach will enable stopping work early to prevent waste and poor investments.

7.3 Approach to implementation

  • Establish three new channels for funding for pathfinders, with representation from a variety of types of service/programme in each, to build the case that investment can successfully be made and stopped under this new approach.

  • Evaluate the funding approach and iterate accordingly.

  • Publicise progress and initiate a second round of pathfinders, with finance teams who are willing to allocate Departmental budgets in this way.

  • Codify this approach, use functional channels to communicate it to Finance, Operational, and Digital leaders, and begin to set a new expectation for how digital spend should be managed, which can be used in SR27.

7.4 Three channels for funding

  • Two variants of staged-funding which are decided through regular, lean data-driven reviews based on progress against agreed outcomes alone, using short-form agile business cases.

    • For live services, funding based on performance against agreed metrics.

    • For innovative technologies, funding based on regular demonstrations.

  • Portfolios are initiated and funded with a single multi-year Business Case that bounds the portfolio, articulates the enduring outcomes, and within which teams have the freedoms and parameters to deliver at pace, e.g. CDEL/RDEL allocation is defined once for the portfolio. Lean Portfolio Management ways of working are used.

  • For risk reduction work, the investment will focus on priority areas of technical debt and cyber security, regardless of short-term financial returns or cashable savings, by establishing tech and cyber risk appetite for use in investment plans and enhanced central support.


  1. Cabinet Office, GDS, Defra, DHSC, DSIT, DVLA, DVSA, DWP, GDS, MHCLG, NHS, SLC. 

  2. The Infrastructure and Projects Authority (IPA) will form part of NISTA, when NISTA becomes operational in spring 2025. For clarity, NISTA has been used to refer to both organisations in this report.