Generative artificial intelligence (AI) and data protection in schools

How to address potential data protection risks of using generative AI in schools.

Generative AI refers to any type of artificial intelligence that creates new digital content, such as text, images, videos or other data. Unlike traditional AI, which relies on explicit programming to complete specific tasks, generative AI uses machine learning models trained on existing data to produce new content.

This guidance is to help schools understand:

  • how to protect personal data when using generative AI
  • data protection legislation in relation to generative AI
  • the risks and biases of AI to personal data

Watch ‘Protecting children’s privacy when using artificial intelligence’ on YouTube.

In a school, generative AI tools can be used as a starting point to develop resources, including: 

  • lesson plans or activities
  • questions and quizzes
  • revision activities
  • images to help with character descriptions or stories
  • communications for parents and carers
  • timetables

Check with your data protection officer or IT lead for further guidance on what may be acceptable use for your school.

Example

A KS1 teacher wants to teach their class about the Great Fire of London through an interactive lesson. The teacher uses an AI tool to develop a lesson plan where the class play villagers, firefighters and King Charles II and his advisors. The teacher does not put any personal data into the AI tool and fact-checks the information against a reliable source.

Use our posters to remind staff at your school about the important aspects of data protection, including the use of generative AI.

Data protection and AI tools in education

Generative AI tools could offer significant benefits in education, but they come with certain risks related to data protection.

Schools should be open and transparent about how they use generative AI tools. Staff, students, governors, parents and carers should understand how their personal data is processed.

Ofsted has published guidance on the ways it expects providers to use AI.

The Joint Council for Qualifications (JCQ) guidance on AI use in assessments sets out best practice for students and teachers, to help them understand their responsibilities where AI tools are being used as part of qualification assessments.

Open and closed generative AI

There are important differences between open and closed generative AI tools in terms of data protection.

Open generative AI tools are accessible to, and can be modified by, anyone. They may store, share or learn from the information entered into them, including personal or sensitive information. To protect personal and special category data, you should avoid including any identifiable information in what you enter into open AI tools.

Closed generative AI tools are generally more secure, as external parties cannot access the data you input. This makes them a safer option for handling personal or special category data.

It’s not always obvious whether a generative AI tool is open or closed. Check with your school’s data protection officer or IT lead to find out more about the generative AI tools available at your school.

If your school has a closed generative AI tool and you choose to put personal or special category data into it, you must include how you use this data in your school’s privacy notice.

Compliance with data protection legislation

The generative AI tools you use must comply with data protection legislation and your school’s privacy notice.

To protect data when using generative AI tools, you should:

  • seek advice from your data protection officer or IT lead
  • check whether you are using an open or closed generative AI tool
  • ensure there is no identifiable information included in what you put into open generative AI tools
  • acknowledge or reference the use of generative AI in your work
  • fact-check results to make sure the information is accurate

Example

A school administrator wants to email a parent about their child’s behaviour. They are unsure how to word the email. The administrator enters notes into an open generative AI tool and asks the tool to create the email for them.

The notes include the pupil’s name, class and details of a behavioural incident. By submitting this information, the administrator releases the pupil’s personal data to the generative AI tool. The tool may incorporate the data in future responses, making it available to other people. The organisation that owns the tool may also keep the data and be able to view it.

If you choose to enter pupils’ personal data into an AI tool, check with your data protection officer or IT lead that it is safe to do so. Observing data protection principles when handling personal data in AI tools will help you comply with data protection legislation.

The ICO has information on what to do if you want to use generative AI to profile children or make automated decisions about them.

Personal data collected by generative AI tools

Some generative AI tools process and store more information than just the text you enter into them.

Generative AI tools may collect and store additional data such as:

  • location
  • IP address
  • system information
  • browser information

The organisations that own these tools may view the data they collect or sell it to third parties. Schools must include how any data is collected, processed and stored by generative AI tools in the school’s privacy notice.