“Out of adversity comes opportunity” – Benjamin Franklin
OK, so having access to many AI-enhanced tools as an educator or a student doesn’t sound like the worst nightmare ever. When we consider how technology can and does permeate our lives, never has it been more important for educators and students to have strong skills in digital citizenship.
From being aware of social engineering attacks designed to infiltrate our networks, to why two-factor authentication is essential, to how best to use AI-enhanced tools to improve our teaching and student learning or reduce workload, technology has seen an unprecedented rise in the last 11 months.
The problem is, when it comes to the last of those examples, it has been just as important to understand why we shouldn’t use AI-enhanced tools as it has been to learn what we should use them for.
Lots of time and interest have been spent on what we can do, but not so much on what we shouldn’t. Sure, we can use Generative AI to:
- help write policies, strategies, and lesson plans
- simplify or summarise large documents
- draft emails and reports
- create transcripts from videos
…but should we? Well, yes we should, but how many of us, and our young people who are using these tools outside of class, know about:
- bias?
- use of data?
- impact on privacy?
- ethics?
- importance of literacy?
So, Why Is Digital Citizenship So Crucial Now?
We’re in an era where AI isn’t just a sci-fi concept; it’s a classroom reality. It’s not just about knowing how to click buttons; it’s about understanding what those clicks mean. In the hands of informed students, AI becomes a tool for responsible learning, highlighting biases, ensuring privacy, and tailoring education ethically.
Ethics aren’t just a ‘nice to have’; they’re essential for students and educators alike. From bias detection to understanding the ethical dilemmas AI can present, digital citizenship is a multi-faceted responsibility. It’s not just a matter of “can we?” but “should we?”
Let’s not forget Literacy
Literacy isn’t just about reading and writing anymore; it’s also about digital literacy, and has been for a long time. The steep rise in the use of Generative AI in recent months has brought a sharper focus to the importance of this. Understanding algorithms, data usage, and potential biases in AI tools is crucial for sure, but let’s not forget the importance of actual literacy.
The ability to craft effective prompts for AI tools is a skill that combines both traditional and digital literacy. It’s important to have the right skills and informed approaches so you can be both a savvy consumer and a savvy producer of digital content, while being articulate and thoughtful in your interactions with these tools.
So what should educators do?
We’re not just teachers; we’re digital guides. From promoting accessibility through AI to turning ethical dilemmas into learning opportunities, our role is expansive. We need to be the moral compass in this digital world, teaching students to navigate the sea of information responsibly. The best way we can do this is through regular, timely, and measured professional learning opportunities and discourse.
The TDT, in conjunction with ISTE, ASCL, NAHT, and the Confederation of School Trusts, got this right in their recent report – “Understanding AI for School: tips for school leaders”:
“AI can evoke strong feelings of excitement, fear and confusion. It’s important to make a space for staff to discuss and learn.”
It is a useful, measured, and helpful document that reinforces many of the points I’m making here. It also reinforces a key point that I spend a lot of time sharing: be mindful of legislation and key government guidance such as KCSIE (2023).
Perhaps, after all this time, it’s time for a change?
Digital citizenship isn’t a buzzword; it’s a curriculum necessity. We need lessons on bias detection, privacy protection, ethical considerations, and yes, even prompt design and crafting. And let’s not forget that putting personal information into Generative AI systems is a GDPR no-no. Educators need CPD to understand these nuances, and students need education to navigate this brave new world responsibly.
As Emma Darcy of Denbigh High School and the Chiltern Learning Trust rightly pointed out on the webinar I hosted yesterday for NetSupport: it’s all about promoting strong Digital Character. But how do we go about embedding these principles into our educational systems? Here are five steps I think schools should begin to consider:
1. Curriculum Overhaul: Integrate digital citizenship and literacy into the existing curriculum. Make it as fundamental as reading, writing, and arithmetic.
2. Teacher Training: Invest in continuous professional development (CPD) focusing on the ethical and practical aspects of AI and digital tools, not just on workload reduction and quick wins – and, whilst I’m at it, on things that will actually improve learning, not things you can do just because they’re cool.
3. Student Lessons: Regularly conduct lessons that focus on real-world applications and implications of AI, including bias detection, privacy, and ethical dilemmas, and not just in Computing lessons.
4. Parental Involvement: Educate parents on the importance of digital citizenship so they can reinforce these principles at home and be part of a strong home/school digital parenting curriculum.
5. Regular Audits and Updates: Technology evolves rapidly. Regularly audit the effectiveness of digital citizenship programmes and update them to include new developments in the field, aligned with your CPD and curriculum planning.
The need for digital citizenship has never been so pressing. We’re not just preparing students for exams; we’re preparing them for life in an increasingly digital world that isn’t going to slow down. That starts with a strong Digital Character, a well-rounded curriculum, and an educational community committed to ethical and responsible technology use.