In August 2025, a biology teacher named Dana Kimura at a public high school outside Portland, Oregon, opened Google NotebookLM for the first time. Her district had signed a Google Workspace for Education contract in 2022: 1,200 Chromebooks, one for every student, Google Classroom for every course, Gmail for every teacher. NotebookLM, Google's AI-powered research tool, had just been made available to education users of all ages. Dana uploaded her AP Biology curriculum, three textbook chapters on cellular respiration, and a set of her own lecture notes. She clicked "Generate Mind Map."
The result appeared in seconds. A branching diagram of the Krebs cycle, glycolysis pathways, and electron transport chains, with connections drawn between concepts she had spent fifteen years learning to teach. The layout was clean. The hierarchy was sensible. The labels were accurate.
Moving a node was impossible. Adding a connection between two concepts the AI had placed in separate branches produced no response from the interface. When she tried to export the map to edit it in another tool, to rearrange it, annotate it, share a version her students could build on, the only option was a PNG file. A flat image. A picture of thinking, with none of the thinking left in it.
The map could be viewed. It could not be touched.
A teacher with a Google Certified Educator badge and seventeen years of classroom experience, Kimura recognized immediately what the mind map was: a demonstration of what AI could do with her materials, presented in a format that ensured she would need to come back to Google's platform every time she wanted to do it again. The map was not a tool. It was a storefront window.
Thousands of educators are staring through the same glass.

Upper Rhenish Master, "Paradiesgärtlein" (c. 1410). Städel Museum, Frankfurt. The original walled garden: everything cultivated, nothing wild, no gate visible. Public domain.
The Free That Costs Everything
Google Workspace for Education is free. The Chromebooks are cheap, often $199 to $299 per device. Google Classroom is free. NotebookLM is free. Gemini, integrated across the suite since June 2024, is free for education users. The entire stack, from hardware to AI, costs a school district approximately nothing in licensing fees.
The 68 percent of American schools that use Google Classroom as their primary platform did not choose it on the basis of a careful evaluation of pedagogical merit. They chose it because the price was right and the alternative was Microsoft, which bundles its own AI, Copilot, into Teams for Education under a similar calculus. Audrey Watters, the education technology writer and author of Teaching Machines, put it bluntly in a 2024 lecture at the University of Edinburgh: "The history of ed-tech is a history of vendor lock-in sold as pedagogical progress. The tools change. The trap does not."[1] The choice in American public education is not between open and closed. It is between two closed ecosystems competing to see which can make the walls more comfortable.
When the platform is free and the data is priceless, the transaction is not commerce. It is extraction.
Ninety-three percent of U.S. school districts plan to purchase Chromebooks this year.[2] ChromeOS accounts for over 60 percent of the global education device market.[3] Each device is a portal to Google's services and a gate that makes leaving progressively harder. The curriculum materials live in Google Drive. The assignments flow through Google Classroom. The student interactions, every question typed into Gemini, every mind map generated by NotebookLM, every audio overview, every AI summary, flow to servers in Mountain View. The district paid nothing for the tools. The tools are not the product.
The Mind Map That Cannot Think
NotebookLM's mind map feature is a case study in the architecture of lock-in disguised as the architecture of thought.

Tony Buzan (1942–2019), the British psychologist who turned radiant thinking into a visual method used in over 100 countries. Photo: Ascotart, CC BY-SA 4.0, via Wikimedia Commons.
Tony Buzan, who popularized the mind map in his 1974 book Use Both Sides of Your Brain, described the technique as "a mirror of the brain's own thinking process."[4] The value was never in the finished diagram. The value was in the act of constructing it: choosing which concepts connect, deciding which branches matter, reorganizing as understanding deepens. A mind map you cannot edit is a contradiction in terms. It is a poster of someone else's thinking process, laminated and hung on the wall.
Google's implementation generates a map from uploaded sources using Gemini. The output is read-only within the application. There are no customization options: no ability to recolor, restructure, merge, split, or annotate. The sole export format is PNG, a raster image with no underlying data structure, no hierarchy metadata, no machine-readable connections. A PNG of a mind map has the same relationship to thinking that a photograph of a piano has to music.
This limitation is so severe that developers have built third-party Chrome extensions to extract the underlying structure. Jimmy Song, a developer in San Francisco, published NotebookLM Mindmap Extractor on GitHub, a tool that reverse-engineers the DOM to reconstruct parent-child relationships and exports them to FreeMind or OPML formats that actual mind-mapping software can read.[5] The existence of these extensions is not a testament to Google's extensibility. It is an indictment of its design. Users are building escape routes from a tool that was supposed to help them think.
Xmind, a competing mind-mapping application, published a step-by-step guide on its blog titled "How to Export NotebookLM Mind Maps and Edit Them Online." The fact that a competitor can market itself entirely on the basis of making Google's output usable tells you everything about what Google's output is designed to be: a demonstration, not a deliverable.
Compare this to what a plain-text AI workflow produces. A mind map generated as Markdown (indented bullets, wiki-links, YAML metadata) is a file the user owns. It opens in Obsidian, VS Code, or any text editor. It can be versioned in Git, transformed by a script, published by a static site generator, or handed to a student who reorganizes it as an act of learning. The map is not a picture. It is a structure. And structures can be changed.
The difference is not aesthetic. It is architectural. Google's mind map keeps you on Google's platform. A Markdown mind map keeps you in control of your thinking.
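The claim that an indented-Markdown map is a structure rather than a picture is easy to demonstrate: a few lines of code can parse the indentation into a tree and emit OPML, the outline format that mind-mapping tools such as FreeMind and Xmind import. A minimal sketch in Python; the sample map and its node labels are illustrative, not output from NotebookLM:

```python
import xml.etree.ElementTree as ET

# A mind map as indented Markdown bullets: two spaces per level.
MARKDOWN_MAP = """\
- Cellular respiration
  - Glycolysis
    - Glucose to 2 pyruvate
  - Krebs cycle
    - Acetyl-CoA entry
  - Electron transport chain
"""

def parse_bullets(text):
    """Parse indented '- ' bullets into (depth, label) pairs."""
    nodes = []
    for line in text.splitlines():
        stripped = line.lstrip(" ")
        if not stripped.startswith("- "):
            continue
        depth = (len(line) - len(stripped)) // 2
        nodes.append((depth, stripped[2:].strip()))
    return nodes

def to_opml(nodes):
    """Build an OPML tree from (depth, label) pairs using a parent stack."""
    root = ET.Element("opml", version="2.0")
    body = ET.SubElement(root, "body")
    stack = [(-1, body)]  # (depth, element) of open ancestors
    for depth, label in nodes:
        # Pop back to the nearest ancestor shallower than this node.
        while stack and stack[-1][0] >= depth:
            stack.pop()
        el = ET.SubElement(stack[-1][1], "outline", text=label)
        stack.append((depth, el))
    return ET.tostring(root, encoding="unicode")

opml = to_opml(parse_bullets(MARKDOWN_MAP))
print(opml)
```

The round trip is the point: the same file a student edits in any text editor becomes, without loss, a document that dedicated mind-mapping software can open, because the hierarchy lives in the text rather than in pixels.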
The Bundling Pattern
Google did not invent the walled garden. But it has perfected the version that matters most: the one that feels like a public park.
The pattern works in three stages. First, offer a free tool that solves a real problem. Gmail solved email. Google Docs solved collaboration. Google Classroom solved assignment distribution. Each tool is genuinely useful, genuinely free, and genuinely better than what most schools had before.
Second, bundle. Classroom connects to Drive, which connects to Gmail, which connects to Meet, which connects to Chat. Each integration makes the suite more convenient and the alternatives less viable. Switching one tool means rewiring all of them. The switching cost is not a price. It is a topology.
Third, embed AI. Gemini now sits inside Docs, Sheets, Slides, Gmail, and Classroom. NotebookLM processes the materials already stored in Drive. The AI does not work with files. It works with Google's files, on Google's infrastructure, producing outputs in Google's formats. The intelligence layer is the final lock, the feature that makes the data maximally useful inside the garden and minimally portable outside it.
Microsoft runs the same play with a different jersey. Copilot is bundled into Teams for Education, which connects to OneDrive, which connects to Outlook, which connects to OneNote. The integration is seamless. The portability is nonexistent. A teacher who builds a semester of AI-assisted lesson plans in Copilot cannot move those plans to a non-Microsoft environment without manually recreating every one.
The two companies are not competing on openness. They are competing on the comfort of confinement.
Ten Million Students, One Architecture
In 2025, Google reported that its education AI tools reached 10 million students across 180 million Workspace for Education accounts worldwide.[6] Gemini for Education is integrated into over 1,000 U.S. higher education institutions. NotebookLM saw 120 percent quarter-over-quarter growth in monthly active users in late 2024, reaching an estimated 7 million monthly users by January 2025.[7] Forty-three percent of its users are students.
These numbers represent something more significant than market share. They represent the formation of cognitive habits. A student who learns to research by uploading PDFs to NotebookLM and reading the AI-generated summary is not learning to research. A student who studies by listening to NotebookLM's audio overviews instead of constructing her own synthesis is not studying. A student whose mind maps are generated, read-only, and locked in PNG format is not mapping her mind. She is consuming a map of the model's interpretation of her teacher's materials, rendered in a format that cannot be questioned, rearranged, or built upon.
At the University of Missouri, administrators recognized this dynamic when they launched the Show-Me AI initiative in September 2025. Chris Kwak, Mizzou's chief information officer, oversaw the deployment of a system hosted entirely on the university's own infrastructure, purpose-built to keep personal and academic data within the institution's perimeter.[8] The architecture was not Google's. The architecture was theirs. Mizzou built a walled garden with the gate key on the inside.
The distinction matters. A walled garden that the institution controls is a security decision. A walled garden that Google controls is a business model.

Vilhelm Hammershøi, "White Doors" (1905). Ordrupgaard Museum, Copenhagen. Rooms opening onto rooms: the architecture of passage, not containment. Public domain.
The Export Test
There is a simple test for whether a tool respects its users: try to leave.
Export your Google Classroom course. You will get a partial archive: assignment titles, some attachments, none of the student interaction data, none of the AI-generated content. Export your NotebookLM notebooks. You will get the sources you uploaded, which you already had. The mind maps export as PNG. The audio overviews export as MP3 files with no transcript. The study guides export as text that references a notebook you can no longer edit outside the platform.
Now try the alternative. Export an Obsidian vault. You get a folder of Markdown files: every note, every link, every piece of metadata. Open them in any text editor on any operating system. The files do not know they came from Obsidian, because Obsidian was never the point. Shida Li, a co-founder of Obsidian, explained the design decision in a November 2022 interview: "We want your notes to outlive Obsidian."[9] That single sentence is the architectural philosophy that Google's education tools refuse to adopt.
Export a conversation from an open-source AI platform like Sage.is AI-UI. You get structured data: the full exchange, with timestamps, model attribution, and context. Import it elsewhere. Pipe it through a script. Feed it to a different model. The conversation is not locked to the platform that hosted it, because the platform is licensed under the AGPL-3.0, self-hostable, and designed for exactly the institutions that Google's free tier is designed to capture.
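What "structured data" buys you can be shown in a few lines: once the exchange is a machine-readable file, converting it to any other format is a script, not a migration project. A sketch in Python using a hypothetical JSON schema; the field names, timestamps, and model name below are illustrative, not Sage's documented export format:

```python
import json

# Hypothetical conversation export: role, text, timestamp, and which
# model produced each assistant turn. The schema is an assumption made
# for illustration, not a documented API.
EXPORT = json.dumps({
    "messages": [
        {"role": "student",
         "text": "Why does glycolysis need ATP to start?",
         "timestamp": "2025-09-08T14:02:11Z", "model": None},
        {"role": "assistant",
         "text": "The investment phase phosphorylates glucose...",
         "timestamp": "2025-09-08T14:02:13Z", "model": "llama-3.1-70b"},
    ]
})

def to_markdown(export_json):
    """Convert a structured conversation export to portable Markdown."""
    convo = json.loads(export_json)
    lines = []
    for msg in convo["messages"]:
        attribution = f" ({msg['model']})" if msg["model"] else ""
        lines.append(f"**{msg['role']}{attribution}** · {msg['timestamp']}")
        lines.append(msg["text"])
        lines.append("")  # blank line between turns
    return "\n".join(lines)

print(to_markdown(EXPORT))
```

The same loop could just as easily emit CSV for a gradebook, OPML for a map, or a prompt for a different model. Nothing about the data presumes the platform that produced it.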
The export test is not a technical detail. It is a philosophical position. Tools that make leaving easy are tools that believe their value comes from what they do. Tools that make leaving hard are tools that believe their value comes from what they hold.
Jeremy Bentham, "Panopticon" (1791). University College London. The architecture where every occupant is visible from the centre, and no occupant can see the watcher. Public domain.
The Data That Stays Behind
When a teacher uploads curriculum materials to NotebookLM, the documents are stored on Google's servers and processed by Gemini. Google's terms for Workspace for Education state that student data in core services is not used for advertising. This is true and it is also incomplete. The data is used to train and improve Google's AI models unless the administrator opts out, a setting buried in an admin console that most district IT directors have never seen. Leonie Haimson, executive director of Class Size Matters and co-chair of the Parent Coalition for Student Privacy in New York, has tracked Google's education data practices since 2015, filing FTC complaints in 2015 and 2021. "The terms of service change every year," Haimson told The Washington Post in January 2025.[10] "The opt-out settings move. The defaults reset. It is designed so that inaction equals consent."
The interactions themselves (the questions students ask, the summaries they request, the mind maps they generate) represent something more valuable than the curriculum documents. They represent how students think. The patterns of confusion, the sequences of inquiry, the gaps between what was assigned and what was understood. This is pedagogical gold. And it sits on infrastructure that the school does not own, in a jurisdiction the school did not choose, governed by terms of service the school board did not read.
An open-source alternative running on the institution's own servers keeps this data where it belongs: with the institution. Conversation maps (branching, visual records of every student-AI interaction) become the school's pedagogical resource, not a technology company's training corpus. The learning data stays inside the building. The insights stay with the teachers who earned them.
The Garden Gate
Back in Portland, Kimura still uses NotebookLM. The mind maps are faster than anything she can build by hand, and her district's contract with Google makes alternatives a bureaucratic impossibility. But the biology teacher has started doing something the tool was not designed to accommodate. Every Monday, she screenshots the generated maps, prints them, and hands them to her students with a red pen and a single instruction: "Fix this."
The students redraw the connections. They argue about which branches belong where. They add nodes the AI missed and cross out nodes the AI hallucinated. They produce maps that are messy, incomplete, and wrong in interesting ways. They produce maps that are theirs.
It is a workaround. It is also a curriculum. The best use Kimura has found for Google's AI-generated mind map is as a starting point for the human thinking that the tool was supposed to replace. The PNG becomes a rough draft. The red pen becomes the learning.
No teacher should need a workaround. The tool should produce editable, exportable, portable outputs that students can build on: files they own, in formats they control, on infrastructure their school district runs. The technology exists. The architecture exists. Open-source platforms that connect any AI model to any institution's data, on the institution's own terms, without sending student thinking to Mountain View, already exist.
The only thing that does not exist, yet, is the will to walk out of the garden. The gate is not locked. It was never locked. It is simply designed so that staying feels easier than leaving.
That is the wall.
The views expressed are those of the editorial board and do not necessarily reflect the positions of any institution mentioned. Sage.is AI-UI and Sage.Education are products of Startr LLC. Full disclosure and transparency is a feature, not a bug.
Audrey Watters, lecture at the University of Edinburgh, 2024. Author of Teaching Machines: The History of Personalized Learning (MIT Press, 2021). ↩︎
CoSN (Consortium for School Networking), "2025 K-12 IT Leadership Survey," Chromebook procurement data. ↩︎
IDC and Canalys reports on global education device market share. ChromeOS share exceeds 60% in education as of 2025. ↩︎
Tony Buzan, Use Both Sides of Your Brain (E.P. Dutton, 1974). Buzan coined the term "mind map" and described radiant thinking as the brain's natural associative process. ↩︎
Jimmy Song, "NotebookLM Mindmap Extractor," GitHub. Chrome extension that reverse-engineers NotebookLM's DOM to export mind maps in FreeMind and OPML formats. ↩︎
Google, "AI in Education" product announcements, 2025. 10 million students, 180 million Workspace for Education accounts. ↩︎
NotebookLM growth data: 120% quarter-over-quarter MAU increase in Q4 2024, estimated 7 million monthly users by January 2025. 43% student user base per Google reporting. ↩︎
University of Missouri, "Show-Me AI" initiative launch, September 2025. Chris Kwak, CIO, oversaw deployment on university-owned infrastructure. ↩︎
Shida Li, co-founder of Obsidian, interview, November 2022. "We want your notes to outlive Obsidian." ↩︎
Leonie Haimson, executive director of Class Size Matters and co-chair of the Parent Coalition for Student Privacy. FTC complaints filed 2015 and 2021. Quoted in The Washington Post, January 2025. ↩︎
Sage.Education