Can AI Safely Support PSHE/RSHE Teaching in Primary Schools?

AI in education is already being used by primary teachers to reduce workload, particularly in planning-intensive subjects such as PSHE and RSHE. The key question for school leaders is no longer whether AI exists, but whether it can be used safely within safeguarding and statutory expectations.

In many UK primary schools, PSHE is led by a member of staff who also holds multiple other responsibilities. Planning must be accurate, age-appropriate and defensible to parents, governors and inspectors. As a result, teachers are turning to AI tools to support lesson preparation.

However, the critical issue is not access to AI. It is whether its use aligns with safeguarding in schools, statutory RSHE guidance and professional accountability.

Where AI Is Already Being Used in PSHE

In practice, teachers in UK primary schools are using AI for:

  • Generating discussion questions
  • Adapting vocabulary for different ability levels
  • Creating starter activities and plenaries
  • Supporting retrieval practice

This reflects operational reality. Used appropriately, AI can support efficiency. Used without structure, it introduces risk.

Compliance First: What UK Schools Must Not Get Wrong

Any use of AI must sit within existing frameworks, including:

  • Relationships Education and Health Education statutory guidance
  • Keeping Children Safe in Education
  • UK GDPR and data protection legislation
  • Ofsted Education Inspection Framework

From a compliance perspective, three non-negotiables apply:

  • All AI-generated content must be reviewed and aligned with statutory expectations
  • No identifiable pupil data should be entered into AI tools
  • Accountability remains with the school, not the technology

The Department for Education is clear that AI can support professional practice but not replace it.

Safeguarding in Schools: AI-Specific Risks

AI safeguarding risks in schools fall into four key categories:

  • Content risk: inaccurate or inappropriate material
  • Context risk: lack of understanding of school community
  • Data risk: potential GDPR breaches
  • Reliance risk: reduced professional judgement

These align directly with responsibilities under Keeping Children Safe in Education. Schools should treat AI as they would any external system.

Safe vs Unsafe AI Prompts in PSHE

Unsafe prompt:

“Create a detailed relationships scenario for Year 6 pupils with realistic issues.”

Why this is risky: It may generate inappropriate or sensitive safeguarding content.

Safer prompt:

“Create general discussion questions about positive friendships for Year 6 pupils in line with UK PSHE guidance. Avoid sensitive or safeguarding-related scenarios.”

This distinction is critical when managing AI safeguarding risks in schools.

Where AI Adds Value to PSHE Lesson Planning

When used within boundaries, AI in education supports:

  • Differentiation of questions
  • Vocabulary simplification
  • Idea generation for activities
  • Creation of recall quizzes

It should not be used to generate sensitive safeguarding content without oversight.

Leadership Strategy: Building a Safe Approach

For school leaders, safe implementation requires structure.

  • Embed AI within safeguarding and data policies
  • Train staff on safe prompt use
  • Monitor how AI is used across the school
  • Ensure DSL awareness and oversight

While Ofsted does not inspect AI directly, its use affects safeguarding, curriculum quality and leadership judgement, all of which are inspected.

Implementation Framework for Schools

  • Define acceptable use of AI
  • Limit use to planning initially
  • Require review of all outputs
  • Train staff using real examples
  • Monitor and refine practice

Conclusion

AI in UK primary schools can support PSHE lesson planning, but only within clearly defined boundaries. It improves efficiency but does not replace professional judgement.

The highest risk area is the generation of sensitive content without oversight. Schools that succeed with AI are those that anchor its use in safeguarding, provide clear guidance and maintain leadership control.

Used thoughtfully, AI is a valuable support tool. Used without structure, it introduces avoidable risk.

Download PSHE Resources

Request a PSHE taster pack aligned with statutory guidance