Photo by Michael Dziedzic on Unsplash

I had the opportunity to hear Dr. Bonnie Stewart speak on AI in education. Dr. Stewart, who comes from a Humanities background rather than a tech-focused one, provides a critical and historical perspective on how AI and other digital tools reshape the educational landscape. Her approach frames AI as part of a “post-digital” world—where digital technologies are so ingrained in our daily lives that they have become normalized. However, this normalization comes with risks that educators must critically examine.

Dr. Stewart emphasized that AI is not neutral. It operates within a framework of data collection and algorithmic influence that often remains invisible to users. When we integrate AI-driven tools into education, we must ask: What are these tools exposing us to? How do they shape learning experiences? Who holds the power in this digital landscape? 

Many digital tools, including AI, collect and analyze student data without clearly stating where it goes or how it is used. As Dr. Stewart states, this shift moves control away from educators and students and consolidates power in the hands of tech vendors and corporate entities, who can mine that data for patterns in people's activities and lives. This data can then be sold to buyers for purposes unknown to us.

My school district has established strict guidelines regarding AI use in classrooms. Teachers can sign up for AI tools and experiment with them, but we cannot ask students to do the same. Instead, the district has provided a website where teachers can access an AI chatbot that does not collect student data. MagicSchool AI works in a similar way: I can launch AI tools in my classroom, students access them through a link, and they only need to enter their first name or initials to gain access, eliminating the need to hand personal information to an unauthorized party.


However, this policy highlights a broader concern: Who has access to student data, and what are they doing with it? Signing into digital tools and AI-powered platforms often means surrendering control over personal information and shifting power from educators to companies that profit from data collection. As Dr. Stewart pointed out, data has become the new oil—a valuable resource that corporations monetize, sometimes at the expense of privacy and autonomy.

This issue became even more urgent with the PowerSchool data breach a couple of months ago, which affected my district and many others. PowerSchool, a widely used student information system, holds sensitive data, including grades, attendance, and personal information. The breach raises serious questions: Where is this data being exposed? Who has access to it now? What safeguards are in place to prevent this from happening again?

Data breaches like this one underscore the risks of entrusting educational records to third-party vendors without adequate protection. They also reinforce the idea that data collection is not just a matter of convenience—it is a matter of security, privacy, and power. Dr. Stewart’s discussion made me think about how data-driven systems shape our experiences. Consider how social media algorithms curate content based on our interactions. Many of us have noticed that after talking about a particular product, we suddenly see ads for it everywhere. This is not a coincidence; it is the result of data collection and predictive algorithms analyzing our behaviours.

In my classroom, I use Blooket as a way for students to create review games based on content covered before tests. However, in line with my district’s policies, I don’t require students to create accounts. Instead, I give them access to my teacher account so they can design their own review games. This practice, which I adopted from another teacher, allows students to engage with the material actively while maintaining their privacy. While this workaround helps, it doesn’t solve the larger systemic issue: educators and students should have tools that prioritize learning without exploiting user data.

Dr. Stewart’s talk left me thinking about the broader implications of digital footprints in education and society. As educators, we must remain critical and informed. AI and the digital age are here to stay—but how we integrate them into education is still a choice we can make with awareness and responsibility.

If this interests you, here is a video with Dr. Stewart as the guest speaker in a pre-recorded session on datafication.