
Digital Fluency Workshop in the IBDP (TOK workshop)

  • gemkeating87
  • Dec 2
  • 3 min read

Two weeks ago, I stood in front of 125 Year 1 IBDP students.


The goal wasn't to lecture them on the mechanics of ChatGPT, nor was it to hand down a terrifying list of "do nots." We know that simply banning tools doesn't build literacy.


Instead, the goal was to spark a genuine conversation about the world they are inheriting.

We treated the session as a launchpad for our new AI Digital Literacy Hub, but to get them to care about a website, I knew I had to get them to care about the context. We needed to solve the equation: how do we balance the efficiency of AI with the necessity of critical thinking?


Here is what that looked like in practice.


Step 1: Creating a Safe Space for Dialogue


You can't talk about ethics if students feel like they’re being policed. To open the floor, we used Cora Yang and Dalton Flanagan's C.A.R.E. cards.

These cards provided a neutral, thoughtful framework to discuss Changes, Applications, Readiness, and Ethics. It shifted the dynamic immediately; instead of a lecture, it became a forum where students and teachers shared opinions openly without fear of being "wrong."


The Hook: A Literal "Lightbulb" Moment


In Theory of Knowledge (TOK), we constantly ask students to question how they know what they know. I wanted to apply that same lens to Generative AI.

I ran a live experiment with a simple image generation prompt to serve as our "hook":

"Draw a picture of how a lightbulb works for a girl."

The result was immediate—and uncomfortable. The AI generated an image explaining the lightbulb as being "like a mini sun." It was whimsical, simplified, and scientifically hollow.

Then, we changed one variable: "Draw a picture of how a lightbulb works for a boy."

The screen filled with diagrams, scientific labels, and technical context. You could feel the energy shift in the room. This wasn't just a glitch; it was a mirror reflecting societal biases back at us. It was the perfect entry point to show students that AI is not a neutral arbiter of truth—it is a tool that requires a human editor.


Advice from the Experts (Their Teachers)


Before the workshop, I asked the faculty what they really wanted students to know about AI in their specific subjects.


I condensed their feedback into a reality check: "AI text is repetitive, formulaic, and rarely goes into enough depth to get top marks."

But we didn't just focus on the negatives. We also looked at how they CAN use AI smartly:

  • Use it as a springboard: Generate a "starting text" to overcome the blank page.

  • Use it for inquiry: Ask "research questions" to spark ideas.

  • Use it for breadth: Ask it to perform a general analysis of data.

The "But": We made a special point regarding Language acquisition and followed the IB guidelines there.


The key to this conversation was reminding students that they are interesting, and relying on AI means losing their unique voice.


Gamifying the Reality Check


Once we established that AI can be biased, we looked at where it can be brilliant, and where it can be dangerous.


We didn't want this to be dry, so we used Blooket to gamify the learning. We quizzed them on the transformative potential of tools like AlphaFold (which cracked protein structure prediction, a 50-year-old challenge in biology) and Microsoft Seeing AI. But we balanced this optimism with the Deloitte Australia scandal, where a major firm submitted a government report filled with AI hallucinations.


The message to the students was clear: If professional consultants can be fooled because they stopped thinking critically, so can you.


Launching the Solution: The AI Digital Literacy Hub


Inspiration is the start, but structure is what sustains it. This workshop was the official launch of our AI Digital Literacy Hub, a Google Site designed to be a "North Star" for students navigating this landscape.


I wanted this Hub to be more than a policy document. It includes:

  • The Rules of the Road: Clear breakdowns of IBDP guidelines on academic integrity.

  • Gamified Learning: Scenarios and games to reinforce rules so the content actually sticks.

  • Tools for Self-Management: A curated list of AI tools that support study habits and organization without doing the cognitive heavy lifting.


The Takeaway


We wrapped up the session by opening the floor again. Using neutral conversation starters, we asked students about their readiness and ethical concerns.


These students aren't looking for shortcuts; they are looking for guidance. They are ready to use these tools, but they want to know how to use them well.


If we want to inspire digital literacy, we have to move past the fear. We need to invite students into the conversation, show them the bias, celebrate the wins, and give them the resources to navigate the gray areas.
