Can AI Safely Support PSHE/RSHE Teaching in Primary Schools?

AI in education is increasingly being used by primary schools to reduce workload, particularly in planning-intensive subjects such as PSHE/RSHE. The question school leaders and teachers are asking is not whether AI exists, but whether it can be used safely, compliantly and effectively within a safeguarding-critical subject.

Across primary settings, PSHE/RSHE leads are under sustained pressure. The subject carries statutory responsibility through Relationships Education and Health Education, yet is often led alongside multiple other roles. Planning must be accurate, age-appropriate, inclusive and defensible to parents, governors and inspectors. It is within this context that AI tools are being used to support lesson planning.

In practice, teachers are using AI to generate lesson ideas, discussion prompts and activities at short notice. This reflects operational reality rather than poor practice. However, the key issue is not access to AI, but whether its use aligns with safeguarding expectations and statutory guidance.

AI can support planning efficiency, but it does not understand school context, pupil need or compliance frameworks. That distinction is critical. The remainder of this article sets out how schools can use AI safely, where the risks sit, and what robust implementation looks like in a UK primary context.

Compliance First: What UK Schools Must Not Get Wrong

Any use of AI in UK primary schools must sit firmly within existing statutory and safeguarding frameworks. There is no separate compliance pathway for AI; instead, it intersects with established expectations.

The most relevant frameworks include:

  • Relationships Education and Health Education statutory guidance
  • Keeping Children Safe in Education
  • UK GDPR and data protection legislation
  • Ofsted Education Inspection Framework

From a compliance perspective, three non-negotiables apply.

First, AI-generated content must be accurate and aligned with statutory RSHE expectations. AI tools are trained on broad datasets and may produce content that is not suitable for UK primary pupils or that does not reflect current guidance.

Second, no identifiable pupil data should be entered into AI systems unless the platform has been formally approved and is fully compliant with data protection requirements. This includes safeguarding scenarios, behaviour incidents and pastoral records.

Third, accountability remains with the school. The Department for Education is clear that AI can support, but not replace, professional judgement. Any content used in the classroom must be reviewed and approved by a qualified teacher.

A simple compliance checklist for schools:

  • All AI outputs are reviewed before use
  • Content is checked against PSHE/RSHE policy and curriculum intent
  • No personal pupil data is entered into AI tools
  • Safeguarding leads are aware of how AI is being used

Without these controls, the risks are not theoretical. They are operational.

Safeguarding in Schools: Identifying AI Specific Risks

Safeguarding in schools extends beyond direct pupil interaction. It includes the systems and tools staff rely on. AI introduces a new layer of risk that must be explicitly managed.

The most significant AI safeguarding risks in schools fall into four categories:

  • Content risk: AI can generate material that is factually incorrect, developmentally inappropriate or poorly phrased for sensitive topics.
  • Context risk: AI does not understand your school community or local safeguarding concerns.
  • Data risk: Entering pupil information into AI tools can create GDPR breaches.
  • Professional reliance risk: Over-reliance on AI can reduce critical evaluation of teaching materials.

From a Designated Safeguarding Lead (DSL) perspective, these risks align directly with existing responsibilities under Keeping Children Safe in Education. Schools should treat AI as they would any external digital tool.

Practical safeguarding controls include:

  • Restricting AI use to planning rather than live pupil interaction
  • Providing staff with clear examples of unsafe prompts
  • Embedding AI within online safety training
  • Including AI in safeguarding policy updates

Where AI Adds Value to PSHE Lesson Planning

When used within clear boundaries, AI in education can support PSHE lesson planning in ways that are both efficient and low risk.

The key is to limit its role to structured support tasks rather than content authority.

Effective use cases include:

  • Differentiation support: Generating adapted questions for different ability levels
  • Scenario creation: Drafting neutral discussion prompts
  • Vocabulary adaptation: Simplifying complex concepts
  • Retrieval practice: Creating recall questions

Teacher level checklist for safe use:

  • Review and edit AI-generated content before use
  • Remove or adapt any sensitive elements
  • Cross-reference with your scheme of work
  • Avoid generating highly sensitive content without oversight

AI should be treated as a drafting assistant, not a curriculum designer.

Strategy and Leadership: Building a Safe AI Approach

For school leaders, the challenge is not whether AI is being used, but whether it is being used consistently and safely across the organisation. Four areas deserve attention:

  • Policy integration: Embed AI into safeguarding and data policies
  • Staff training: Focus on practical safe use
  • Monitoring and accountability: Include AI in existing review systems
  • Governor oversight: Ensure awareness and challenge

From an inspection perspective, Ofsted evaluates outcomes rather than tools, but AI use will influence curriculum quality, safeguarding and workload.

Implementation Framework: Moving from Use to Safe Practice

  1. Define acceptable use
  2. Start with planning only
  3. Establish review protocols
  4. Train staff using real examples
  5. Monitor and refine
  6. Scale cautiously

Whole-school implementation checklist:

  • Clear acceptable use guidance in place
  • No pupil data shared with AI tools
  • Staff trained in safe use
  • Safeguarding lead involved
  • AI use monitored

Conclusion: Safe Use Depends on Professional Oversight

AI in education can support teaching in primary schools, but only within clearly defined boundaries. It offers efficiency benefits but cannot replace professional judgement or safeguarding awareness.

PSHE/RSHE requires accuracy, sensitivity and consistency. The schools that succeed with AI are those that anchor its use in safeguarding, provide clear guidance and maintain leadership oversight.

Used thoughtfully, AI is a valuable support tool. Used without structure, it introduces avoidable risk.

FAQ: AI in PSHE and Primary School Safeguarding

1. Can AI be used for PSHE lesson planning in UK primary schools?

Yes, but only as a support tool. All content must be reviewed and aligned with statutory guidance.

2. What are the main AI safeguarding risks in schools?

Inappropriate content, lack of context, data breaches and over reliance on AI outputs.

3. Is it safe to enter pupil information into AI tools?

No. Pupil data should only be entered where the platform is fully GDPR compliant and has been formally approved by the school.

4. How should schools manage AI use in PSHE/RSHE?

By embedding it within safeguarding and data policies, training staff and reviewing outputs.

5. Will Ofsted inspect AI use in primary schools?

No, but it will evaluate safeguarding, curriculum quality and leadership.

6. What is the safest way to introduce AI in PSHE/RSHE teaching?

Start with planning, set boundaries, train staff and monitor use.