Technology & Innovation - Issue 12
LESSONS IN AI
Mac Bowley looks at what teachers can do to help young people safely navigate online spaces in a post-AI world

Artificial intelligence is no longer a distant concept. It's already reshaping how we live, work and learn. For today's young people, understanding how to use AI tools responsibly is crucial. As these tools become more common in schools, homes and workplaces, the need for thoughtful, well-supported education around their safe and ethical use is greater than ever.

AI literacy isn't just about understanding the technology; it's about equipping learners to critically reflect on the role of AI technologies in society: how they're used, what they're capable of and where the risks lie. In other words, it's about teaching AI safety.

Essential conversations
The UK AI Safety Institute defines 'AI safety' as: "The understanding, prevention and mitigation of harms from AI. These harms could be deliberate or accidental; caused to individuals, groups, organisations, nations or globally; and of many types, including but not limited to physical, psychological, social, or economic harms."

In response to this growing need, we created Experience AI – a free programme co-developed with Google DeepMind to support secondary school teachers in delivering high-quality AI education to students aged 11 to 14. This year we've added a new 'AI safety' module that introduces students to some of the key challenges posed by AI systems – such as the spread of misinformation, risks to data privacy and the ethics of responsible use – while equipping educators with everything they need to lead rich and relevant discussions.

Whether you're a computing specialist or simply looking to support digital resilience in your own subject area, the AI safety resources – and Experience AI more broadly – aim to make it easier for you to bring these essential conversations into the classroom.

What the resources cover
The new AI safety module addresses topics that are already familiar to educators, such as media literacy and online safety, but reframes them through the lens of artificial intelligence. The aim of the lessons is to encourage students to think critically about the systems they encounter in everyday life. Each lesson explores a specific area:

• Your data and AI – How data-driven AI systems use data differently to traditional software, and the implications this has for data privacy
• Media literacy in the age of AI – The ease with which believable AI-generated content can be created, and the importance of verifying information
• Using AI tools responsibly – Critical thinking around how AI is marketed, and students' understanding of their own personal responsibilities and those of developers

Each lesson is designed to engage young people in considering both the ethical responsibilities of AI developers and their own prior interactions with AI systems. The lesson materials include short animated videos to introduce key concepts, screen-free activities to reinforce learning and structured discussion prompts intended to spark reflection.

Above all, the resources are designed to be used flexibly. Educators can run a full one-hour lesson, deliver a shorter session during tutor time or draw on the suggested discussion questions to explore key ideas more informally. There's no pressure to explore every topic, and the content can be easily tailored.