Can crowd-sourced solutions, like X's 'Community Notes' feature, play a role in those considerations?

The inevitable response to that is 'At what point do you moderate a piece of content?' Malicious community reporting is certainly possible, by organising enough people to state that a given post is factually incorrect. You still need a system that runs things – an algorithm of some kind deciding at what point a piece of content should be marked as potentially contentious. But such solutions won't work when people on the platforms in question already have polarised or extreme positions. Community Notes and similar measures can only be effective when people are willing to accept differing points of view.

What balance should schools strike between educating students about the online content they're accessing now, and the online experiences they can expect to have in early adulthood?

Schools already have some input into child protection and general internet safety efforts – 'Don't speak to strangers online', and so on. I see far less emphasis on the importance of critical thinking, which I feel is a missed opportunity. Critical thinking can protect you from cybercrime and cybersecurity incidents, because when we get down to it, both are built on attempts at persuading people to take some form of action.

I first learned about bias and propaganda while studying Nazi Germany at school. Right now, there's a fantastic opportunity for us to move beyond the mid-20th century context and ask students what those same systems and processes might look like now. Because I think you'll get some interesting answers. Some may conclude that influencers use similar strategies to promote certain consumer brands. They'll see that there's a whole spectrum of influence operations, from 'legitimate advertising' to exposés of what genuinely bad guys are up to.

What's the common denominator in all this? It's that somebody is trying to get you to do something, or think a certain way. That's why critical thinking is so important – it helps develop that 'Spidey Sense' for detecting when content and communications don't feel quite right. For all the tech we can use to identify, say, manipulated videos or deepfakes – and for the avoidance of doubt, those measures are really important – they can't work effectively on their own. We still need people who can critically evaluate what they're being shown.

What key online knowledge and skills would you like to see taught to all 16-year-olds?

'Digital hygiene' is something that will stand anyone in good stead, whether they're 16 or 76. That includes all the obvious things – like observing good password security habits – but it's also about taking just a second or two whenever you see something in a post, email, video or game which gives you pause. That can be difficult. Technology moves fast, with people consuming and producing content at a rapid rate – but it's a form of mindfulness that can make you feel better, while also saving you from a great deal of bother.

Students should be encouraged to regularly consider 'What am I looking at here? Why have I got this? Why am I seeing this?' We should want young people to be less, 'Hey, there's this thing I need to react to right now' and more, 'Hey, there's this thing…'

A CRITICAL APPROACH

"Approaching online material critically will often involve asking a series of questions similar to those I'd use when working as an intelligence analyst: Who's doing this?
Why are they opting to present the material in this particular way? What's in it for them by doing so? Social media channels will usually have a specific reason for presenting something to you – be it 'Buy my latest merch', or 'I want you to think this way, so that you will take this action.' Taking a moment to actively consider 'Why does somebody want me to think this, or do that?' is what will ultimately save humanity from the robots."