
other areas of the tech sphere, children and adults alike are finding their way through curiosity and experimentation. Yet whilst those attributes are an important part of great learning, they aren't without risk – which is why I've felt the need to implement a more structured approach, through policy.

Getting in early

I can understand why some schools may be holding back on this. As soon as you put a policy in place, you are, in a way, shining a spotlight on the topic that policy covers. You're setting clear standards and expectations, against which you then can and should be held to account. With so many unknowns surrounding the issue of AI, this may seem like an unnecessarily bold move to make right now.

But the fact is, I want our staff and learners to reap the benefits of AI as soon as possible. My predecessor at Honywood took the same approach with personal computing devices; we've been issuing iPads to all learners since 2011, and our pandemic experience was considerably eased as a result.

Ignoring, or even banning, ChatGPT, otter.ai and the like doesn't fit with our capitals-based curriculum vision. Instead, I want to ensure that we have adequate systems, training and guidance in place so that such tools can be used appropriately, responsibly and, above all, safely.

The first iteration of Honywood's AI policy was shared with the local governing body (LGB) in November 2024. I produced the original draft, thinking about what I wanted to achieve. Those goals included improving teaching and learning outcomes; ensuring an ethical and legal use of AI; protecting privacy and data; utilising AI to reduce our staff's administrative and academic workload; and remaining at the forefront of education by integrating AI to enhance and supplement the school's mission to best support young people.

I didn't ask AI to write it for me, but I did employ the kind of approach that an AI might have used – looking for examples created by others, from which I could learn. Luckily, one of our governors works for a large trust and sent me her copy of theirs to look at. I was also able to call on the expertise of another of our governors, Andy Wood, who works in the digital space, and whose 'SMART' advice (see panel below) was invaluable.

I have no doubt that we'll need to revise and update our AI policy frequently, in response to both technological developments and our own learning – but with a clear AI policy in place, however embryonic it may be, I'm pleased to report that it feels like the spectre of Skynet is just that little bit further away...

ABOUT THE AUTHOR
James Saunders is the headteacher at Honywood School, Coggeshall, Essex.

KEEP IT S.M.A.R.T.
Andy Wood shares his advice on building a sound AI policy for schools...

SUPPORT LEARNING GOALS
Ensure that any integration of AI tools supports and enhances the school's curriculum objectives. AI should be a supplemental resource that promotes personalised learning, fosters critical thinking and enriches the educational experience, while upholding the integrity of the teaching process. Consult subject leaders to define how AI tools can complement specific subjects and learning outcomes.

MANAGE RISKS AND PRIVACY
Prioritise safeguarding by addressing the potential risks associated with AI, such as deepfakes, impersonation and misuse of AI tools. Policies should also ensure compliance with GDPR and all other data protection regulations, so as to protect the personal and sensitive information of learners and staff. Collaborate with your IT and safeguarding teams when evaluating and approving AI tools, and provide regular staff training on how to identify and mitigate AI-related risks.

ACT TRANSPARENTLY
Maintain clarity about where, when and how AI tools will be used within the school, ensuring that all stakeholders, including parents and learners, are informed. Staff should take responsibility for the quality and accuracy of any AI-generated content or feedback used in teaching or assessment. Require staff to label any AI-generated materials, and document all instances of AI usage within lesson plans and other school activities.

RESPECT ETHICAL STANDARDS
Emphasise the importance of ethical AI use, including active avoidance of bias, respect for intellectual property and promotion of fairness and inclusivity. Establish protocols to ensure that AI tools align with these ethical principles before being adopted. Implement periodic reviews of AI tools to identify and address any potential biases or ethical concerns, while inviting feedback from learners and staff.

TRAIN AND MONITOR
Provide staff with the necessary training and ongoing support to use AI effectively and responsibly, in a way that complements their professional expertise. Regularly monitor AI's impact on teaching, learning and administrative tasks, and adapt practices based on outcomes and feedback. Integrate AI training into personal development reviews, and plan biannual evaluations of the policy's implementation and effectiveness.

Andy Wood provides strategic leadership for one of the UK's foremost consultancy and digital service providers, and is a parent governor at Honywood School, Coggeshall, with special responsibility for ICT.
