The future is here

  • 1st July 2025

Artificial intelligence is creating risks and opportunities for independent schools, advises Mark Dendy


Artificial intelligence, generative AI, machine learning, large language models – call it what you will, AI is here to stay. Thinking of it as futuristic has been out of date for some time. Railing against it and hoping it doesn’t happen is also a bit ‘last year’: it is happening, and as school leaders you will increasingly need to understand how, why and what to do about it. Sorry to be blunt, but this ‘AI stuff’ brings all manner of risks, benefits and opportunities that absolutely will need to be carefully managed by your school.

The good stuff

So, let’s scrape the sand away, exhume our heads and appreciate some of the good things AI can do for you, your pupils and staff. In the right context, AI can answer questions, help with drafting assignments, and it can create code, images, audio and videos – all of which have a place in education. In the wider world you will have experienced online chatbots, used navigation apps, and your IT network is almost certainly being monitored by AI for threats.

Teachers can save time when creating lessons or writing pupil reports. AI is amazing at helping to produce content to enrich existing materials and bring topics to life. It can also be used in more specific circumstances to tailor teaching materials for pupils with particular learning needs. There’s a lot to like about a technology that can refresh how you deliver the curriculum or reduce the time spent on tedious admin tasks.

So, there’s clearly substantial potential for AI to be a force for good in a school setting. Like many technologies, if you understand the risks and embrace the benefits, then it can be a powerful and useful tool. But even if you don’t, it’s coming in any event, so we need to be ‘on the front foot’.

Develop a clear strategy

Given the complexities, it seems reasonable that a school should plan how it goes about adopting AI. Nothing too highfalutin, but a statement of intent with some goals, a nod to the benefits and potential risks, and a hefty serving of mitigation will go a long way. This doesn’t have to be more than a few paragraphs, but it should cover the school’s considered guidelines for ethical AI use, data privacy, security and (a crowd-pleaser this) use of AI for cheating. Yes, pupils the world over are using AI to write their homework, and educators are using AI to catch them. You can set an essay and a pupil can give AI five bullet points to write it. Equally, a teacher can ask AI to distil the essay into five bullet points to check it meets the brief. And along the way no one learned anything beyond how handy it is to have AI save you some time.

Of course, any discerning educator will make a distinction between a diligent pupil using AI as a tool to learn more about a topic, to explore the issues more deeply, and even to help with drafting original content or understanding how to answer a question. But there’s a difference between that and just using it to bang out some homework with minimal effort (and zero learning benefit in the process).

You could get a basic AI strategy by using a decent prompt in your favourite generative AI tool, such as ChatGPT or Microsoft Copilot, and then knock it about until it feels like yours, and you’ll have joined the AI game in fine style. But only do this if you actually understand what you’re playing with. There is plenty of professional guidance out there, and engaging it is an investment when all’s said and done.

Focus on data privacy and security

AI agents are great at trawling through huge amounts of data. Unfortunately, that can also expose obscure data which has been misfiled or not sufficiently secured. AI agents are only as secure as the access rights they’ve been granted over your data. So, having your IT folk set up some dummy accounts and search for interesting material on your network is well worth a couple of days’ effort. This should be repeated on a regular basis.

For example, imagine an old file server has been copied onto your new cloud platform but the security hasn’t been tightly managed. While pupils don’t know it’s there and haven’t gone looking for potentially sensitive information (‘security by obscurity’, one might call it), AI will show no such restraint. And that could return answers to prompts which put you in legal hot water (or flushed with embarrassment in the staff room). Queries such as ‘show me the recent staff appraisals’ should probably return a blank. Anything financial or HR-related leaking through AI risks reputational damage so an investment here is well worth your time and money.
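To make the dummy-account exercise concrete, here is a minimal sketch of the kind of audit script an IT team might run while logged in as a dummy pupil account. Everything in it – the keyword list, the function name, the approach of matching file names – is an illustrative assumption, not a recommendation of any particular tool, and a real audit would also need to inspect file contents and cloud-platform permissions.

```python
import os
import re

# Hypothetical keyword list -- tune this to your own school's data.
# Crude substring matching will over-flag (e.g. 'SEN' inside other
# words), which is acceptable for a first-pass sweep.
SENSITIVE_PATTERNS = re.compile(
    r"appraisal|salary|payroll|safeguarding|medical", re.IGNORECASE
)

def audit_readable_files(root):
    """Walk everything the current (dummy) account can reach under
    `root` and flag readable files whose names look sensitive."""
    findings = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            # Only flag files this account can actually open.
            if SENSITIVE_PATTERNS.search(name) and os.access(path, os.R_OK):
                findings.append(path)
    return findings
```

The point of running this under the dummy account’s credentials is that anything the script can read, an AI agent granted the same access rights can read too – and summarise on demand.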

Training and professional development

Teachers and staff must be trained to make the best use of the tools you permit. Using AI as part of the teachers’ toolkit will be essential, but having some staff embrace it and develop their own ways of deploying it while others shun it entirely is not the way forward. A considered plan rolled out to all staff will give a level of understanding and control which looking away and crossing your fingers won’t achieve.

Again, take some guidance on this from people who know. What are the best small-scale projects to try first? What are the capabilities and limitations of AI in the school setting? Teachers should be supported as they integrate AI into their teaching practices for the future.

Who and what else to include

After the obvious training of staff, you’re going to need to educate your pupils, parents and other members of your virtual hinterland on where they may or may not use AI. Questions will be asked, and you’ll need to be able to field them. There are plenty of training resources out there for this but tailoring something specific to include in your school’s rules will be necessary and should be part of your plans.

A good theme to help you roll out AI can be ethical use guidelines. Establish the principles to be adhered to and remind people often. For example, teachers might be permitted to use AI to set homework, but pupils might not be permitted to use it to write their essays. Obvious, but you’ll need to restate it every year.

Future use

All the above will simply be the start. You will have a strategy – a code of conduct; everyone’s been given some understanding of how you should (and should not) be using AI in your school. Ethics have been described on tablets, protocols carved in Word. But it won’t end there. AI isn’t a new law that’s going to be adhered to until another takes its place. It’s an organism, evolving daily to make life easier for us, sometimes, and harder for us at other times. So, your plan can’t be written, published and left for posterity. You must include regular reviews, benchmarking against best practices and continuous training for those using it.

You will also have to make choices about which AI tools you allow. There are education-specific ones, but also non-specialised ones like ChatGPT or Google Gemini. Also, be mindful that private AI tools may be subject to changes of terms and conditions which could take away your control, but don’t expect a kitemark or ethical compliance promise anytime soon.

Summary

AI is going to have a transformative impact on education. But there’s risk too, and as with any risk, owning it and controlling it is far better than just letting it happen. The benefits can be huge, but you will need an investment of time and money to prevent the future controlling you.

Sorry, but it is not a simple job with a firm conclusion – an AI strategy is an ongoing process. Start now.


Mark Dendy is a senior consultant at business management firm Adapta Consulting
