Evaluating One Big Thing 2023
Published 30 January 2025
Acknowledgements
Thank you to all the participants in the evaluation projects.
Cabinet Office Modernisation and Reform Unit led and implemented One Big Thing and took decisions about how evaluation should be embedded in its delivery, supported by the One Big Thing Senior Sponsors Network.
Evaluation data was collected in the One Big Thing platform provided pro bono by the i.AI team.
Colleagues from the Evaluation Taskforce and Government People Group led and managed the collaborative evaluation presented in this report.
Dr Jack Blumenau, ESRC Policy Fellow, peer reviewed the analysis.
Introduction
One Big Thing (OBT) is a new annual initiative which sees all civil servants taking shared action around a Civil Service reform priority. OBT 2023 focussed on data upskilling and ran from September to December 2023. OBT is sponsored by the Cabinet Secretary and is designed and implemented by the Modernisation and Reform Unit.
We carried out an evaluation to provide initial results on whether OBT 2023 met its aims. We surveyed all civil servants who took part in OBT 2023, and we also carried out a smaller case study evaluation in a single business unit.
This summary report outlines the key results and conclusions from our evaluations. A technical report containing detailed information about the evaluation’s methods and results is also published.
What was One Big Thing 2023?
The better use of data is a priority for government. Using data can improve our understanding of complex problems and help us target policies and activities to deal with them. OBT 2023 was designed to boost these efforts and help ensure we remain a modern Civil Service, able to use data effectively across all our roles.
OBT 2023 promoted data upskilling through three main activities:
- New online training materials available to the whole of the Civil Service.
- Allocating seven hours of every civil servant's time between September and December 2023 to self-directed data upskilling activities, with a supporting catalogue of resources.
- Asking all line managers to host an activity, conversation or team meeting focused on the use of data in teams’ day-to-day work.
Online training materials
OBT gave all civil servants access to a new 90-minute data course delivered on the Civil Service Learning (CSL) platform. The training was tailored to different levels of skill and experience, offering content aligned to three competency levels (awareness, working and practitioner). Participants completed a pre-course assessment to direct them to the training most suited to their competency level.
Seven hours of self-directed data upskilling
Senior leaders and line managers encouraged every civil servant to spend seven hours on self-directed data upskilling activities during the period in which OBT 2023 was live. To help civil servants access relevant materials, the online platform gave civil servants access to a catalogue of existing data training and resources, which had been checked for their quality and relevance by data and training experts. Most departments and professions also made materials and activities available which were tailored to their specific context. Data and digital skills have been a learning and development priority for several years, and previous cross-government communication campaigns have reinforced this priority and signposted materials available for upskilling on the Government Campus. Civil servants could record any data training they had done in 2023 as part of their seven hours of OBT data learning.
Line manager conversations
After teams had completed the online data training, line managers were encouraged to hold conversations and run activities within their teams to reinforce data upskilling and its relevance to day-to-day work.
Delivering OBT across the whole Civil Service
To make sure that the whole Civil Service could participate in OBT in a way which worked for them, a senior sponsors network was set up. This network met regularly, and ensured OBT 2023 met the needs of the whole Civil Service, was feasible to implement locally, and was designed in the right way to deliver against its aims. Cross-Civil Service and local communications channels were used to get the message about OBT 2023 out to all civil servants and promote participation.
What were the aims of OBT 2023?
The aims of OBT 2023 were:
- To create a practical moment of shared participation to reinforce that we are one Civil Service.
- To have a measurable uplift in data awareness, confidence, knowledge and understanding across the Civil Service.
- To have a long-term impact on participation in data and other training and initiatives.
- To contribute towards achieving better outcomes in the delivery of public services and policy through the use of data.
Our evaluation gathered data to help us assess whether OBT 2023 had met its aims.
What did we find?
Aim 1: Did OBT 2023 create a practical moment of shared participation to reinforce that we are one Civil Service?
Overall, our evaluation results suggest that OBT 2023 did generate participation across the Civil Service. 42% of the Civil Service registered for OBT 2023 on the formal, online platform, and 82% of those who registered completed the initial online training modules. This means that around one in three civil servants (34%) both signed up and completed some training. 567,000 data learning hours were recorded on the official platform. 23% of those who registered for OBT 2023 recorded completing the target seven hours of data upskilling (about 10% of civil servants). This suggests that the majority of civil servants (around nine out of ten) did not complete OBT as designed. We do not have data on civil servants who may have participated in local OBT 2023 activities, such as team discussions, without registering on the online platform, or who registered but did not log the upskilling activities they completed, so the figures above may underestimate overall participation.
We cannot say whether OBT 2023 was experienced as a “shared moment” based on our data. We asked civil servants about their identity as a civil servant and their sense of connection with other civil servants before and after taking part in OBT, but the results were inconclusive. It is likely that other factors not related to OBT would also be influencing civil servants’ sense of shared identity and connection with other civil servants, possibly to a greater extent than OBT, which may explain these results.
Aim 2: Did OBT 2023 lead to a measurable uplift in data awareness, confidence, knowledge and understanding across the Civil Service?
Across the two studies, we found some very small positive improvements in participants’ data awareness, confidence and knowledge.
Our cross-government survey assessed people’s data awareness and confidence, using a five-point scale, from strongly agree (5) to strongly disagree (1).[footnote 1]
We found very small increases in participants' awareness of the relevance and use of data in their day-to-day roles during the period in which they participated in OBT. For example, participants' self-reported awareness of how data could support their day-to-day roles increased from an average of 3.98 before starting OBT, to 4.12 afterwards. The results on other questions about the relevance of data are illustrated in Figure 1.
There were very small increases in respondents’ self-reported data awareness, confidence and knowledge
| Question theme | Pre | Post |
| --- | --- | --- |
| I know how to use data effectively in my day-to-day role | 3.83 | 4.04 |
| I am aware of how data can support my day-to-day role | 3.98 | 4.12 |
| I think data is relevant to my role | 4.12 | 4.18 |
| I feel confident about using data in my day-to-day role | 3.88 | 3.99 |
Figure 1: Mean scores for each question on a scale of 1-5, before and after participation in OBT. The scale is: 1 = Strongly disagree, 2 = Disagree, 3 = Neither agree nor disagree, 4 = Agree, 5 = Strongly agree
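To make the scoring concrete: the pre/post comparison above averages 1-5 responses and applies the interpretation thresholds described in footnote 1 (a mean of 3.1 to 5 counts as agreement, 1 to 2.9 as disagreement, and values at or very close to 3 as neutral). The sketch below illustrates that calculation with entirely made-up responses; the function name and data are illustrative, not part of the evaluation.

```python
from statistics import mean

def interpret_likert(scores):
    """Average a set of 1-5 Likert responses and classify the result
    using the report's thresholds: 3.1-5 = agreement, 1-2.9 =
    disagreement, values at or very close to 3 = neutral."""
    avg = mean(scores)
    if avg >= 3.1:
        label = "agree"
    elif avg <= 2.9:
        label = "disagree"
    else:
        label = "neutral"
    return round(avg, 2), label

# Invented responses to one question, before and after OBT
pre = [4, 4, 3, 5, 4, 3, 4, 5]
post = [4, 5, 4, 5, 4, 4, 4, 5]
print(interpret_likert(pre))
print(interpret_likert(post))
```

Because a 1-5 scale is bounded and most respondents already sat near "agree" before OBT, even genuine improvements will appear as small shifts in the mean, which is worth bearing in mind when reading Figures 1 and 2.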
We also found very small increases in participants’ confidence around data-related ideas (like ethics) and activities (like visualising data) after taking part in OBT. Average participant responses ranged from 3.55 for communicating data more confidently to 3.81 for learning about the importance of evaluating outcomes of data-informed decisions. This is illustrated in Figure 2 below.
After participation in OBT, respondents on average reported increases in confidence around data-related ideas (like ethics) and activities (like visualising data).
| Question theme | Post |
| --- | --- |
| I have a better understanding of data ethics | 3.80 |
| I have a better understanding of how to quality assure data and analysis | 3.74 |
| I understand better how to communicate data insights effectively to influence decisions | 3.76 |
| I feel more confident to use data to influence decisions | 3.61 |
| I can communicate data information more confidently to influence decisions | 3.55 |
| I have learned about the importance of evaluating outcomes of data-informed decisions | 3.81 |
| I have a better understanding of what data means | 3.79 |
| I know more about how different data analysis techniques can be used to understand data | 3.74 |
| I understand better how to critically assess data collection, analysis and the insights derived from it | 3.72 |
| I know more about visualising and presenting data in a clear and concise way | 3.75 |
| I am better at interpreting data | 3.56 |
| I understand better how to anticipate data limitations and uncertainty | 3.74 |
Figure 2: Mean scores on questions relating to self-reported confidence around data-related ideas (like ethics) and activities (like visualising data), on a scale from 1-5 after participation in OBT.
Our case study evaluation used a knowledge and behaviours assessment, taken by a small group of civil servants (288) before and after OBT 2023. We found very small increases in civil servants' ability to correctly answer some of the questions we set, which involved applying data to perform tasks (like calculating something) and which covered key data-related concepts (like averages). We also found small increases in reported use of data in writing and decision-making, but did not find changes in all the data behaviours we asked about. The size of the difference is considered very small because, on average, participants in the 'after OBT' assessment answered less than one additional question correctly (0.6) compared to the 'before' assessment. We are confident that our results represent a real change in these participants because we randomised GPG staff into two groups, with one group taking the assessment before OBT was delivered and the other taking it afterwards, to make sure the effects we were seeing were not simply the effect of having taken the assessment before. Overall, this suggests that OBT 2023 may have resulted in some very small gains in these participants' data awareness, confidence and knowledge, including their ability to apply this knowledge to day-to-day work.
Overall scores on the data literacy knowledge check were slightly higher after the delivery of OBT.
| Average score | Pre | Post |
| --- | --- | --- |
| Total | 6.3 | 6.9 |
Figure 3: Average scores on an 11-item data literacy knowledge check across two samples, the ‘pre’ sample who completed the knowledge check before OBT was delivered, and the ‘post’ sample who completed the knowledge check after OBT (whether or not they actively engaged with the training and activities).
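Because participants were randomised into a 'pre' and a 'post' sample, the case study comparison amounts to a two-independent-samples test. The report does not name the specific statistical tests used; as an illustration only, the sketch below computes Welch's two-sample t statistic (a common choice when group variances may differ) on entirely made-up knowledge check scores.

```python
from statistics import mean, variance
from math import sqrt

def welch_t(a, b):
    """Welch's two-sample t statistic and approximate degrees of
    freedom (Welch-Satterthwaite), comparing group b against group a."""
    ma, mb = mean(a), mean(b)
    va, vb = variance(a), variance(b)   # sample variances (n - 1 denominator)
    na, nb = len(a), len(b)
    se2 = va / na + vb / nb             # squared standard error of the difference
    t = (mb - ma) / sqrt(se2)
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

# Invented 11-item knowledge check scores for the two randomised samples
pre_group = [5, 6, 7, 6, 5, 8, 6, 7]
post_group = [7, 6, 8, 7, 9, 7, 8, 6]
t, df = welch_t(pre_group, post_group)
```

The t statistic would then be compared against a t distribution with `df` degrees of freedom to judge whether a 0.6-point average difference, as found in the case study, is larger than chance variation between the two samples would produce.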
Aim 3: Did OBT 2023 have a long-term impact on participation in data training and initiatives?
We analysed trends in the volume of bookings of centrally-offered data training before and after OBT, using a statistical technique called interrupted time series analysis. This technique allows us to assess whether OBT may have caused any changes in the uptake of data training. Our results were inconclusive. This was a consequence of the data available to analyse, which had high variance and covered only a subset of the data upskilling activities civil servants may have participated in before and after OBT. Therefore, we cannot say whether or not OBT had a long-term impact on participation in data training and initiatives.
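For illustration, an interrupted time series analysis of this kind is often implemented as a segmented regression, with terms for the pre-existing trend, a level change at the intervention, and a change in trend afterwards. The sketch below fits such a model to synthetic monthly booking counts; all the variable names, dates and numbers are invented, not the report's data.

```python
import numpy as np

# Synthetic monthly counts of data-training bookings (invented numbers)
rng = np.random.default_rng(0)
months = np.arange(24, dtype=float)   # 24 months of bookings data
intervention = 12                     # assume OBT starts at month 12
post = (months >= intervention).astype(float)
time_since = np.where(post == 1.0, months - intervention, 0.0)

# Simulated series: baseline trend plus a level shift at the intervention
bookings = 100 + 2 * months + 15 * post + rng.normal(0, 3, size=24)

# Segmented regression design matrix:
#   b0 = baseline level, b1 = pre-intervention trend,
#   b2 = level change at the intervention, b3 = change in trend after it
X = np.column_stack([np.ones_like(months), months, post, time_since])
b0, b1, b2, b3 = np.linalg.lstsq(X, bookings, rcond=None)[0]
```

Here `b2` and `b3` estimate the immediate level shift and the change in trend attributable to the intervention. With real bookings data, the high variance the report describes would widen the uncertainty around these estimates, which is consistent with the inconclusive result.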
We also explored how relevant OBT 2023 participants found the training. Whether people found OBT 2023 to be relevant or not could help explain why they felt their knowledge, confidence and awareness had improved, or not, after taking part in the training. It could also give an indicator of how likely it was that their use of data in day-to-day work would change after taking part in OBT. It may also have an influence on whether or not they went on to complete other data training. On average, we found that people only moderately agreed that OBT was a good use of their time and that the content was relevant to their role. Participants also moderately agreed that they were likely to apply learning from OBT 2023 to their roles, but recorded lower scores for their intention to complete specific actions such as booking further training or creating a personal development plan. [footnote 2]
Aim 4: Did OBT 2023 contribute towards achieving better outcomes in the delivery of public services and policy through the use of data?
This was not something we were able to assess in our evaluation, because it relates to more complex, long-term trends.
How did we generate these results? Our two evaluation studies and their limitations
Cross-Civil Service evaluation
We administered two surveys to all Civil Service OBT participants: one before they started OBT activities, and a second survey once they had completed the training. The surveys asked questions about participants’ data awareness, confidence and knowledge, their civil service identity and how connected they felt to other civil servants. We analysed data from these surveys by comparing the mean scores from survey participants before and after OBT, to see whether these had changed. This was an opt-in survey open to all civil servants who had registered for OBT 2023, without any sampling techniques being used. This means that our results only tell us something about whether OBT met its aims for the civil servants who chose to complete our survey, not about whether it met those aims for the wider Civil Service.
Case study evaluation
We carried out the case study evaluation in Government People Group (GPG), a large business unit in the Cabinet Office. We took an objective measure of data literacy and data behaviours in GPG before OBT 2023 launched and after it finished, using an assessment. We used statistical tests to identify whether GPG's average scores had changed (up or down) between the before and after assessments, and used specialist techniques to reduce potential biases and improve the reliability of our tests. GPG does not have the same characteristics as the Civil Service as a whole, so our findings only tell us something about whether OBT 2023 met its aims within GPG. As the assessment was opt-in, it has similar limitations to the cross-Civil Service survey.
What are the lessons learned from our findings?
1. A training initiative like OBT may be able to achieve very small increases in participant knowledge, awareness and confidence across a large number of civil servants.
Our evaluations found very small improvements in data awareness, confidence and knowledge after taking part in OBT 2023. Even very small improvements may be valuable if they are achieved across a whole organisation. Previous evidence shows that small but widespread changes may in fact offer greater value than an intervention which achieves large effects with smaller groups of colleagues.[footnote 3]
Almost 220,000 civil servants took part in OBT 2023. However, the majority of civil servants did not complete OBT as intended. As OBT matures as an annual initiative, and lessons are learned from implementation, it is likely that higher participation rates could be achieved in future.
This evaluation did not capture the views of people who did not take part in OBT, or who did not complete training hours. Therefore, we are not able to draw conclusions from these findings about what changes could be made to increase participation in OBT in future. However, there is a substantial body of existing evaluation evidence that can be drawn on when planning future OBT events, focused on different reform priorities, to increase the chance of achieving the highest possible impact with the largest possible group of people.
2. The design of future OBT events could do more to support people to apply new learning into their day-to-day roles.
OBT 2023 took evidence-based steps to support people to apply learning into their day-to-day role, by including line manager conversations and local, context-specific activities as part of the programme. This was a sensible place to start because these are relatively low-cost and simple to implement. Findings from the cross-Civil Service survey showed that there was still a gap between people’s general intentions to use learning from OBT, and their intention to take specific, practical action to do so. Not everyone found OBT relevant and focus groups are exploring the reasons for that. Some small changes in people’s reported behaviours were found in our case study, but not across all behaviours.
In planning future OBT events, further attention could be given to connecting the upskilling content to specific local work and goals, to help people apply new learning in their day-to-day roles. For example, this might include more scenario-based content in the training[footnote 4], or evidence-based templates to help line managers support their teams to embed new skills within day-to-day work, such as structured prompts and cues; action planning; self (or team) monitoring; and opportunities to continue to repeat the new skills within work[footnote 5]. Giving departments more autonomy to tailor OBT resources, components and structures to better suit their staff's local needs and contexts may support this, provided the overall design of the training remains evidence-based.
3. OBT ought to be evaluated every year, to gain even more extensive and robust evidence to support future delivery of OBT events and other upskilling initiatives.
OBT 2023 was the first of its kind in the Civil Service. Our evaluation has provided some useful lessons learned for how evaluations of future OBT events could be carried out. Overall, our evaluations show that it is feasible to evaluate OBT. The relatively light touch methods we have used to evaluate OBT 2023 (pre/post surveys and assessments) could be adapted, improved and used again to understand whether future OBT events achieve their aims. Other evaluation methods might also be considered, so the evaluation can be well-tailored to the strategic questions about OBT and Civil Service upskilling we need to answer. Planning and resourcing evaluation from the outset ensures that the widest possible range of suitable evaluation methods are available.
Looking ahead to future OBT events
One Big Thing 2024 has now been announced with the title “One Big Thing starts with One Small Change”. The overall aim is to stimulate an “innovation culture” of continuous improvement, problem solving and experimentation, and to give staff an opportunity to practise the process of innovation and be empowered to produce better outcomes for the public. Civil servants will receive e-learning (an innovation masterclass) and guidance on the process of innovating (how to spot challenges, how to find solutions, how to implement change, etc.). A cross-civil service practical exercise will then be implemented where teams will be asked to generate new ideas for furthering their departments’ organisational objectives and to pilot, implement and review them. The OBT team is applying insights from this evaluation, the lessons learned exercise and focus groups into the planning and delivery of OBT 2024.
Footnotes

1. We interpreted an average (mean) score of between 3.1 and 5 as agreement with the question, and an average score of between 1 and 2.9 as disagreement. We considered scores of 3, or very close to 3, as a neutral answer (neither agreement nor disagreement).
2. After taking part in OBT, participants were asked whether they agreed with different statements, using a scale from 1 (strongly disagree) to 5 (strongly agree). Average responses were as follows: "The OBT training was a good use of my time": 3.19; "The content was relevant to my role": 3.44; "I intend to apply learning from this training in my role": 3.55; "I will book a training course related to data": 3.13; "I will create a development plan": 3.16.
3. Lacerenza, Christina N. et al. "Leadership training design, delivery, and implementation: A meta-analysis." Journal of Applied Psychology, vol. 102, no. 12 (2017): 1686-1718. doi:10.1037/apl0000241
4. Chernikova, O., Heitzmann, N., Stadler, M., Holzberger, D., Seidel, T., & Fischer, F. (2020). Simulation-Based Learning in Higher Education: A Meta-Analysis. Review of Educational Research, 90(4), 499-541. https://doi.org/10.3102/0034654320933544
5. Sims, S., Fletcher-Wood, H., O'Mara-Eves, A., Cottingham, S., Stansfield, C., Van Herwegen, J., & Anders, J. (2021). What are the Characteristics of Teacher Professional Development that Increase Pupil Achievement? A systematic review and meta-analysis. London: Education Endowment Foundation. The report is available from: Teacher professional development characteristics