Generative artificial intelligence (AI) in education
Updated 26 October 2023
Applies to England
This document sets out the position of the Department for Education (DfE) on the use of generative artificial intelligence (AI), including large language models (LLMs) like ChatGPT or Google Bard, in the education sector.
This statement:
- is informed by the government’s white paper on a pro-innovation approach to AI regulation
- follows the government’s announcement to set up an expert Frontier AI Taskforce to help the UK adopt the next generation of safe AI
Understanding generative AI
Generative AI refers to technology that can create new content based on the large volumes of data, drawn from a variety of works and other sources, that its models have been trained on. ChatGPT and Google Bard are generative AI tools built on large language models (LLMs).
Tools such as ChatGPT and Google Bard can:
- answer questions
- complete written tasks
- respond to prompts in a human-like way
Other forms of generative AI can produce:
- audio
- code
- images
- text
- simulations
- videos
AI technology is not new, and we already use it in everyday life for:
- email spam filtering
- media recommendation systems
- navigation apps
- online chatbots
However, recent advances in technology mean that we can now use tools such as ChatGPT and Google Bard to produce AI-generated content. This creates opportunities and challenges for the education sector.
Opportunities for the education sector
Generative AI tools are good at quickly:
- analysing, structuring, and writing text
- turning prompts into audio, video and images
When used appropriately, generative AI has the potential to:
- reduce workload across the education sector
- free up teachers’ time, allowing them to focus on delivering excellent teaching
However, the content produced by generative AI could be:
- inaccurate
- inappropriate
- biased
- taken out of context and without permission
- out of date or unreliable
Using AI effectively
Teacher workload is an important issue and we are committed to helping teachers spend less time on non-pupil-facing activities.
We are working with the education sector and with experts to identify opportunities to improve education and reduce workload using generative AI.
Having access to generative AI is not a substitute for having knowledge in our long-term memory. To make the most of generative AI, we need to have the knowledge to draw on.
We can only:
- learn how to write good prompts if we can write clearly and understand the domain we are asking about
- sense-check the results if we have a schema against which to compare them
Generative AI tools can make certain written tasks quicker and easier, but cannot replace the judgement and deep subject knowledge of a human expert. It is more important than ever that our education system ensures pupils acquire knowledge, expertise and intellectual capability.
The education sector should:
- make the most of the opportunities that technology provides
- use technology safely and effectively to deliver excellent education that prepares pupils to contribute to society and the future workplace
The limitations of generative AI tools
Generative AI tools can produce unreliable information; any content produced therefore requires professional judgement to check for appropriateness and accuracy.
Generative AI:
- returns results based on the dataset it has been trained on – for example, a generative AI tool may not have been trained on the English curriculum
- may not provide results that are comparable with a human-designed resource developed in the context of our curriculum
Whatever tools or resources are used to produce plans, policies or documents, the quality and content of the final document remains the professional responsibility of the person who produced it and the organisation they belong to.
Schools and colleges may wish to review homework policies and other types of unsupervised study to account for the availability of generative AI.
Higher education institutions may wish to review the intellectual asset management guide when developing student policies on the IP students create, and on how students interact with and use the IP of others, in light of generative AI use.
Protecting data, pupils and staff
Generative AI:
- stores and learns from the data it is given – any data entered should not be identifiable
- can create believable content, including more credible scam emails requesting payment – people interact with generative AI differently and the content may seem more authoritative and believable
Schools and colleges should:
- protect personal and special category data in accordance with data protection legislation
- not allow or cause intellectual property, including pupils’ work, to be used to train generative AI models without appropriate consent or an exemption to copyright
- review and strengthen their cyber security by referring to the cyber standards – generative AI could increase the sophistication and credibility of attacks
- ensure that children and young people are not accessing or creating harmful or inappropriate content online, including through generative AI – keeping children safe in education provides schools and colleges with information on:
  - what they need to do to protect pupils and students online
  - how they can limit children’s exposure to risks from the school’s or college’s IT system
- refer to the filtering and monitoring standard to make sure they have the appropriate systems in place
Data privacy
It is important to be aware of the data privacy implications when using generative AI tools, as is the case with any new technology. Personal and special category data must be protected in accordance with data protection legislation.
If it is strictly necessary to use personal or special category data in generative AI tools within their setting, education institutions must ensure that the products and procedures comply with data protection legislation and with their existing data privacy policies to protect the data.
Education institutions should also be open and transparent, ensuring that data subjects (pupils) understand that their personal or special category data is being processed using AI tools.
Intellectual property
Most generative AI tools will use the inputs submitted by users to further train and refine their models.
However, pupils own the intellectual property (IP) rights to original content they create. Original content is likely to include anything that shows working out or goes beyond answers to multiple-choice questions. Intellectual property can only be used to train AI if there is consent from the rights holder or an exemption to copyright applies.
Some tools allow users to opt out of inputs being used to train the models.
Education institutions must not allow or cause pupils’ original work to be used to train generative AI models unless they have appropriate consent or an exemption to copyright applies. Consent would need to come from the student if they are over 18, or from their parent or legal guardian if they are under 18.
Exemptions to copyright are limited, and education institutions may wish to take legal advice to ensure they are acting within the law.
Formal assessments
Schools, colleges, universities and awarding organisations need to continue to take reasonable steps, where applicable, to prevent malpractice involving the use of generative AI.
The Joint Council for Qualifications has published guidance on AI use in assessments to support teachers and exam centres in protecting the integrity of qualifications. This guidance includes information on:
- what counts as AI misuse
- the requirements for teachers and exam centres to help prevent and detect malpractice
Knowledge and skills for the future
To harness the potential of generative AI, students will benefit from a knowledge-rich curriculum which allows them to become well-informed users of technology and understand its impact on society. Strong foundational knowledge ensures students are developing the right skills to make best use of generative AI.
The education sector needs to:
- prepare students for changing workplaces
- teach students how to use emerging technologies, such as generative AI, safely and appropriately
At different stages of education, this teaching may include:
- the limitations, reliability, and potential bias of generative AI
- how information on the internet is organised and ranked
- online safety to protect against harmful or misleading content
- understanding and protecting IP rights
- creating and using digital content safely and responsibly
- the impact of technology, including disruptive and enabling technologies
- foundational knowledge about how computers work, connect with each other, follow rules and process data
The Office for AI is currently conducting research into the skills that will be needed for future workforce training.
The education system should:
- help students, particularly young pupils, to identify and use appropriate resources for their ongoing education
- encourage effective use of age-appropriate resources (which, in some instances, may include generative AI)
- prevent over-reliance on a limited number of tools or resources
DfE will continue to work with experts to:
- consider and respond to the implications of generative AI and other emerging technologies
- support primary and secondary schools to teach a knowledge-rich computing curriculum to children up to the age of 16