AI in Schools: What Should Leaders Buy, Trial or Avoid in 2026?

Artificial intelligence has moved quickly from education conference talking point to a real procurement issue for schools.
A few years ago, many school leaders were asking whether AI belonged in education at all. Now the questions are more practical: which tools should we allow, which should we block, which could reduce workload, and which ones are more trouble than they are worth?
The government is actively encouraging the development of safe AI tools for education. In April 2026, the Department for Education invited EdTech companies and AI labs to help build safe, personalised AI tutoring tools for disadvantaged pupils. (GOV.UK) Earlier this year, the government said up to 450,000 disadvantaged pupils could benefit from AI tutoring tools, with safe tools co-created with teachers expected to be available by the end of 2027. (GOV.UK)
That does not mean every AI product is ready for schools. It does mean schools need a clear way to judge what is useful, what is safe and what should be avoided.
AI is not one thing
One of the problems with AI in schools is that the term is used far too broadly.
An AI marking tool is not the same as an AI tutor. A lesson planning assistant is not the same as a safeguarding system. A chatbot for pupils is not the same as a workload tool for teachers.
Before buying or trialling anything, schools should be clear about the category of tool they are looking at.
Common examples include:
- AI tutoring tools
- Lesson planning assistants
- Quiz and assessment generators
- Feedback and marking tools
- Admin automation tools
- Parent communication tools
- Translation and accessibility tools
- Behaviour or attendance analytics
- Safeguarding monitoring tools
Each comes with different levels of risk. A tool that helps a teacher draft a worksheet is very different from one that interacts directly with pupils or makes recommendations about safeguarding.
Start with the problem, not the product
The worst reason to buy AI is that it sounds innovative.
Schools should begin with a clear problem. Is the aim to reduce teacher workload? Support pupils who need extra practice? Improve communication with families? Help staff analyse attendance patterns? Make resources more accessible?
If the problem is vague, the purchase will probably be vague too.
A good AI procurement question is not “How can we use AI?” It is “Which part of school life is taking too much time, creating too much inconsistency or leaving pupils without enough support?”
Only then should schools ask whether AI is the right answer.
Teacher oversight is non-negotiable
AI tools can produce confident answers that are wrong, biased, incomplete or unsuitable for a particular pupil. That is why teacher oversight matters.
This is especially important with AI tutoring. A pupil may receive an explanation that looks polished but misses the misconception. A tool may give feedback that is technically correct but pitched at the wrong level. It may also fail to notice frustration, anxiety or disengagement in the way a skilled adult would.
AI may support teaching, but it should not quietly replace professional judgement.
Schools should be cautious about any supplier that suggests their tool can operate without meaningful staff involvement. The best tools should make teachers more effective, not remove them from the process.
Data protection has to come early
AI procurement should involve data protection from the start, not after the demo.
Schools need to know exactly what data is collected, where it is stored, how long it is kept, whether it is used to train models, and whether third parties can access it. This becomes even more important if pupils are using the tool directly.
Questions to ask suppliers include:
- What personal data does the tool collect?
- Is pupil work stored?
- Is data used to train or improve the AI model?
- Where is the data hosted?
- Can the school delete data permanently?
- Does the tool integrate with existing school systems?
- What happens if the contract ends?
- Has the supplier completed education-specific security checks?
If a supplier cannot answer these questions clearly, that is a warning sign.
Be wary of impossible claims
Some AI products are being marketed with bold promises: saving hours every week, closing attainment gaps, transforming feedback, personalising every lesson.
Some may help. But schools should be wary of claims that sound too neat.
Education is complex. A tool may reduce time in one area but create new work elsewhere. It may work well for confident teachers but less well for early career staff. It may support some pupils but confuse others. It may produce resources quickly, but still require careful checking.
A good supplier should be honest about limitations. They should be able to explain where the tool works well, where it should not be used and what evidence supports their claims.
AI and pupil skills: a real concern
There is also a wider educational concern. Teachers have warned that pupils’ increasing reliance on AI may affect writing, thinking and creativity. A recent NEU survey reported concerns from secondary teachers that pupils were losing core skills, while also noting that many teachers themselves use AI for tasks such as resource creation and lesson planning.
This is the balance schools have to strike.
AI can support learning, but it should not do the thinking for pupils. It can help a teacher create examples, but pupils still need to wrestle with ideas. It can explain a concept, but it cannot replace discussion, questioning, feedback and human relationships.
Schools need policies that distinguish between appropriate support and academic shortcuts.
What schools might trial
There are areas where cautious AI trials may make sense.
For staff, AI may help with first drafts of resources, quiz questions, summaries, translations, admin templates and planning ideas. These uses still require checking, but the risk is lower because adults remain in control.
For pupils, schools may want to start with limited, supervised use. For example, AI could help pupils practise retrieval questions, generate revision prompts, simplify a text, or support accessibility. Any pupil-facing use should be age-appropriate, monitored and clearly explained.
For leaders, AI may support analysis of attendance trends, behaviour records or workload patterns, provided data protection and accuracy are properly managed.
What schools should avoid
Schools should be much more cautious about:
- Tools that interact with pupils without adult visibility
- Products that make safeguarding judgements
- Systems that profile pupils without clear explanation
- Tools that use pupil data to train models without consent or control
- Products that cannot explain how outputs are generated
- AI marking tools used for high-stakes assessment without robust checking
- Any system that encourages pupils to bypass learning rather than deepen it
The issue is not whether AI is good or bad. The issue is whether a particular tool is safe, useful and properly governed.
Build an AI approval process
Many schools now need a simple AI approval process. This does not have to be complicated, but it should be consistent.
A useful process might include questions such as:
- What problem does the tool solve?
- Who will use it: staff, pupils or both?
- What data does it collect?
- Has the data protection lead reviewed it?
- How will staff check outputs?
- What training is needed?
- How will impact be measured?
- What are the risks for pupils?
- How easily can the school stop using it?
- Does it fit the school’s teaching and learning approach?
This gives leaders a consistent way to say yes, no or not yet.
AI should earn its place
Schools do not need to rush into every AI product. They also cannot ignore AI completely.
The sensible position is somewhere in the middle. AI should be treated like any other major school purchase: judged on need, safety, value, evidence and impact.
The tools that deserve a place in schools will be the ones that reduce unnecessary workload, support pupils without replacing teachers, protect data properly and fit into real classroom practice.
AI may become a useful part of education. But in schools, usefulness matters more than novelty.