Every Learner’s AI Right

The Illusion of Understanding

I remember the first time I held a violin. I was seven, and the instrument felt both foreign and familiar in my hands. It was a wooden vessel waiting for music. My teacher didn’t start with sheet music or scales. She had me listen. She placed my little fingers on the strings and said, “Feel where the note lives.” For months, I learned the geography of sound through vibration before I ever saw a note on paper. This was learning as discovery, as tactile truth.

Twenty-eight years later, as an educator watching AI enter classrooms, I feel that same mixture of anticipation and unease. The promise of AI is extraordinary: personalized tutors for every student, instant feedback, and knowledge organized and accessible. But I’ve also seen what happens when tools promise understanding without requiring the learner to navigate the terrain themselves. I’ve watched bright, curious students interact with AI learning assistants and emerge with polished essays and perfect quiz scores, yet unable to explain the concepts behind their answers. They’ve gained the product of learning but missed the process.

This illusion of understanding, the risk of seeming to know without truly learning, is the key challenge educators face as opaque AI tools proliferate in education.

The Geography We Cannot Feel

When I test AI learning tools, I often think of my violin students who try to learn from YouTube tutorials. They can mimic bowing techniques and produce approximate tones, but they cannot feel when a string needs tuning or sense the subtle pressure change in the bow hand that transforms a note from thin to resonant. The tutorial gives them the what but not the why.

Regrettably, most educational AI functions this way. Tools like Google’s NotebookLM generate polished mind maps, quizzes, and summaries, yet often produce only “an illusion of learning.” The organization appears logical, but the deeper understanding and cognitive connections are missing. These systems solve peripheral problems, such as gathering resources, but fall short of helping learners construct meaning.

Worse, these systems often work best for those who already know how to learn. The skilled student uses AI to augment their process; the novice risks becoming dependent on it. This creates what Tara Westover, author of Educated, identifies as educational polarization: a divide not just between those with and without access to technology, but between those who can use AI to enhance their understanding and those who are pacified by its shortcuts.

When Tools Become Barriers

The problem deepens when we consider accessibility. The activist Annie Leonard exposed how products are engineered for obsolescence: not just physical obsolescence, but social obsolescence. Having last year’s smartphone becomes embarrassing; using a learning platform without AI now seems inadequate.

In education, we’re creating a similar dynamic. AI tools are often:

  • Proprietary and expensive, locking public schools and lower-income districts out
  • Algorithmically opaque, so neither teachers nor students know why an AI suggests certain resources or grades work a certain way
  • Designed for engagement metrics rather than learning outcomes, optimizing for time-on-task rather than understanding

OpenAI, Google, and Anthropic now offer US schools free access to their AI tools, and that generosity matters. But these offers push schools toward standardized solutions that may not meet their specific security, privacy, or teaching needs, and they risk creating vendor lock-in. Once embedded in a proprietary ecosystem, schools face limited bargaining power and costly migrations if vendors change terms or discontinue services.

Just as “Designed for the Dump” describes how extraction, production, and disposal happen “outside of our field of vision,” much AI development occurs in Silicon Valley boardrooms far from the classrooms where these tools are used. The result is tools that may dazzle but don’t necessarily educate, and that risk widening the very gaps they promise to close.

The Choice Before Us

I still teach violin. Last week, a student asked me why we bother learning scales when apps can play any melody perfectly. I took her hand and placed it on my violin as I played a note. “Feel that?” I asked. She nodded. “The vibration, the resonance: that’s yours. No app can give you that. It can only show you what’s possible.”

AI in education should work the same way: not handing learners answers, but revealing possibilities and paths so they build true understanding. And to fulfill that promise, its benefits must reach every student in every classroom.


Isabelle Plante is Co-founder of Sage.Education and CAO of Startr LLC. She writes about the intersection of technology, pedagogy, and equity.

At Sage.Education, we argue that learners must actively build understanding, not passively receive information. Our maker-minded approach to AI in education counters the illusion of understanding by foregrounding four principles: agency, transparency, accessibility, and diversity.

  1. Agency as Objective
    The goal shouldn’t be to make learning effortless, but to make learners more capable. As learning science tells us, effective learning feels effortful. AI should create desirable difficulty: challenges that stretch learners without overwhelming them. And it should encourage what researchers call the generation effect, in which learners actively construct explanations rather than passively consume information. (A minimal sketch of one such interaction follows this list.)
  2. Transparency as Pedagogy
    AI shouldn’t be a black box. Students should be able to ask not just “What’s the answer?” but “How did you arrive at it?”, “What sources did you consider?”, and “What assumptions are embedded in your reasoning?” This turns AI from an oracle into a dialogue partner, one that reveals its workings as part of the learning process.
  3. Accessibility as Infrastructure
    AI educational tools should be freely available to all learners, with open standards and adaptability across contexts. Cost must not be a barrier.
  4. Diverse Representation in Training
    If AI tutors are to serve all students, they must be trained on diverse voices, experiences, and ways of knowing. This goes beyond adding datasets; it means involving educators from different backgrounds in the development process, ensuring the AI doesn’t replicate the biases of its creators.