The Personalization Paradox
Weasel Word: a word or phrase aimed at creating an impression that something specific and meaningful has been said, when in fact only a vague, ambiguous, or irrelevant claim has been communicated.
“BILL MOYERS: Well, I love the vision, but what about the argument that machines, computers, de-humanize learning?
ISAAC ASIMOV: Well, as a matter of fact, it’s just the reverse. It seems to me that it’s through this machine that, for the first time, we’ll be able to have a one-to-one relationship between information source and information consumer, so to speak.
BILL MOYERS: What do you mean?
ISAAC ASIMOV: (...) Now, there’s a possibility of a one-to-one relationship for the many. Everyone can have a teacher in the form of access to the gathered knowledge of the human species.
BILL MOYERS: Through the libraries that are connected to the computer on my desk in my home.
ISAAC ASIMOV: That’s right. Right.”
I'm so glad I decided to write this article, if only for discovering this prescient conversation from yesteryear. Asimov's vision of personalized learning through technology strikes at the heart of today's educational AI debate. Let me highlight two conflicting tendencies in our modern society that I haven't seen clearly confronted with each other before: the endless promise of 'personalized learning' through technology, and the tight restrictions on student data that could make true personalization possible.
Every edtech conference, AI product launch, and marketing email seems to be shouting about "personalized learning" like it's the educational messiah we've been waiting for. This drumbeat has been so relentless for so long that the term itself has lost meaning. Isaac Asimov, in that exchange, was already talking about it - in 1988! That was well before any of us here had any clue what the word "internet" meant, or even, for most, that the word "windows" could literally mean anything other than an opening in a wall. Fast forward to 2025, and when I see "personalized learning" in a pitch, I can practically predict the substance-free claims that follow.
Since I brought up Windows: Bill Gates himself has promised personalized learning in 2015, 2016, 2017, 2019, 2024, and 2025 - and that's just from a quick search of his blog. At this point I think it's fair to say that claiming we're finally going to have true personalization of instruction is just something Bill Gates does - it's one of his shticks.
To be fair to him, he truly is putting his money where his mouth is. A quick search on his foundation's website (try "personalized learning" site:gatesfoundation.org in Google) reveals he's been pouring over 35 million dollars into various projects with "personalized learning" in their title over the past 20 or so years. It's not that clear what this means, though. What exactly is personalized learning?
Meanwhile…
Meanwhile, legitimate concerns over student data privacy have grown significantly. And here lies the paradox: deep, meaningful personalization requires precisely the kind of detailed data that privacy regulations aim to protect. Truly replicating the nuanced understanding a human teacher develops—adapting not just to academic performance but also to habits, interests, hopes, and fears—necessitates access to a vast amount of student information. Today's AI systems, with Large Language Models (LLMs) layered onto traditional Intelligent Tutoring Systems (ITS), could technically leverage such data.
But in K–12 education, especially in regions like New York with stringent laws like Ed Law 2-d, collecting and utilizing such granular data is, quite rightly, heavily restricted. This forces us to ask: what level of personalization is actually achievable within these constraints? And what are we truly optimizing for?
More importantly, is personalization even the right goal for K-12 education? In a 2016 essay titled Personalized Learning: The Conversations We're Not Having, Monica Bulger wrote that "Support for personalized learning systems often stems from intuitive or anecdotal enthusiasm without a strong base in empirical evidence." Her 29-page opus makes the case that:
Most personalized learning products lack independent evaluation of their efficacy and quality
The term "personalized learning" has become a catch-all buzzword encompassing everything from custom interfaces to adaptive content delivery
There's significant confusion between truly adaptive systems and merely responsive ones that simply follow pre-determined paths
Little empirical evidence exists demonstrating that data-driven personalization actually improves learning outcomes
Let's double-click on these last two points she's making. We're talking once again about Intelligent Tutoring Systems (ITS) like IXL or iReady. Some of you may know them as computer adaptive learning, or computer adaptive assessment. These programs adapt learning paths and assessments dynamically based on student input. In a 2021 paper for UNESCO titled AI and education: guidance for policy-makers, which I shared with you in my AI policy article, Wayne Holmes says of ITS software that:
“...although intuitively appealing, it is important to recognize that assumptions embodied in ITS, and their typical instructionist knowledge-transmission approach to teaching, ignore the possibilities of other approaches valued by the learning sciences, such as collaborative learning, guided discovery learning, and productive failure (Dean Jr. and Kuhn, 2007). In particular, the ‘personalized learning’ provided by ITS typically personalises only the pathways to prescribed content, rather than promoting student agency by personalising the learning outcomes and enabling the student to achieve their own personal ambitions. In addition, although some studies have shown that some ITSs designed by researchers compare well with whole-class teaching (e.g. du Boulay, 2016), and despite the fact that they have been bought into by many education systems around the world, there is actually limited robust evidence that commercial ITS are as effective as their developers claim (Holmes et al., 2018a).”
I thoroughly examined one such claim - from a prestigious university, no less - in my last article on the lame science of AIED. Go check it out if you haven't read it! What I found is fascinating in the way a car crash on the highway is fascinating: no one can help but stare…
In other words, I don't think personalization as it is implied in marketing campaigns is actually a goal worth pursuing. I'm not really interested in using AI to replicate tutoring or automate instruction. That's just dressing up old work with new tools. What excites me is the way AI forces us to rethink instruction and assessment—because written work is no longer a reliable indicator of learning—and how it opens up creative, ambitious projects that were previously out of reach for students.
One could object, of course, that Bulger's and Holmes' papers predate the advent of Large Language Models and the explosion of education-focused AI wrapper apps powered by ChatGPT that have the potential to democratize personalized learning and bring transformative change through intelligent systems that empower students across the globe…
See what I did? I just crammed into one sentence every single one of Holmes' "weasel words" that edtech marketers love to use: potential, learning, powered, intelligence, personalized, democratization, transformative. See how easy it is to string these words together into something that sounds impressive while saying absolutely nothing concrete? I plead guilty, by the way - I've used them too - but this is precisely the kind of language we need to see through if we want to have honest conversations about what AI can and cannot do in education. Anyway, one would have a valid point; after all, I've said myself that Large Language Models are reshuffling the cards. So let's take a closer look at these new tools which, this time, will surely personalize learning… right?
The Current Personalization Claim
Let's examine in more depth what the "personalized learning" being delivered is all about. I'll talk about MagicSchool again because it seems to be the tool with a student-facing side that most people in education settings are aware of. Most of the relevant teacher-facing functionalities will also be included in the conversation.
NB: Please do not take this as a rejection of MagicSchool, SchoolAI, and other AI tools for students. I definitely prefer a school providing MagicSchool to nothing, and the law I mentioned earlier, at this point, makes it one of the very few options available anyway. It's a great tool that allows much that was previously unimaginable, and if I don't end up recommending it when asked for my opinion, it'll probably be because I'd recommend a competing product - and those are all extremely similar. That being said, it doesn't mean I can't criticize their communication style and empty promises. So let's genuinely try to figure out what they mean by "personalized learning."
Go to their homepage, scroll down to look for information on student-facing tools, and you'll find this:
Ha! We didn't have to look too far. If you click on it, you'll find it right there in the title: "Embrace responsible AI for students to personalize learning, engage them in creative ways, and build their AI literacy skills. Sign up free." Why wouldn't you? Well, let's scroll some more before signing up to see if they explain what they mean. I'll leave the "drive real learning gains" claim and the "improve AI literacy" one out of this article, because I just wrote at length about the former in my last article and plan on covering the latter extensively soon. Otherwise, here's the exhaustive list of all the clues on that page that could explain what personalized learning looks like.
“MagicSchool for Students unlocks learning opportunities not possible without generative AI - think escape rooms, virtual field trips, and choose-your-own adventure stories. And, finally you have a tool to give feedback as unique as your students.”
Teachers can "tailor [their] instruction based on timely insights."
“Did we mention you can customize any tool to meet your exact needs?”
And also this:
Once again, having covered fantastical claims about learning gains ad nauseam, I'll leave aside the 28% improvement claim and focus on how they claim to have achieved it. I've included all the quotes you can find on their website, so unless their recipe for personalized learning is top secret, and since they clearly make it a major selling point, I have to conclude that this is it - right there in these quotes lies the personalization, the holy grail finally bestowed upon us by the awesome power of generative AI. So naturally, I tried to find out more.
Escape rooms and virtual field trips. I couldn't find anything once I logged into my account. Not in the teacher platform, not in the student platform. I really did look everywhere, I even tried to Google it: rien, nada, zilch! I'm therefore left to speculate… Escape rooms and virtual field trips, aside from not being new at all, seem to me to be more of a group thing. It would have to be darn well produced for me to be engaged on my own in one of these. VR goggles, maybe? Something like that… Could I choose anywhere I want to go and ask anything I'd like? I don't think that exists, certainly not in MagicSchool, so the personalization must be somewhere else.
Choose-your-own adventure. This one was also very hard to find; I had to Google it since I couldn't find anything on the site. I finally found a YouTube Short that explained how to get it. It turns out the person in the video asks the chatbot to give him a prompt to turn the student chatbot into a choose-your-own adventure for 5th grade science. I tried to follow the directions, but I didn't have the option to further customize the chatbot like he does in the video. Maybe it's only available in the paid version, I don't know. What I do know is that the video's example ends up being a 100% text-based interaction between a ChatGPT-like chatbot role-playing as a choose-your-own-adventure 5th grade science questionnaire and a hypothetical student who must be so, so impressed.
Tailored instruction based on timely insights. There's a presentation video if you want. If you've ever seen student data analysis software, it definitely won't rock your socks off. It's pretty standard LMS monitoring and feedback, based solely on student text input for specific activities. It is useful; it's not revolutionary. And once again, the only part that is specific to any student lies in the feedback based on whatever the student typed into the software. It doesn't know the first thing about the student, so I wouldn't call that personalized.
Customize any teacher tool. Well, if it's any, I can pick one at random, right? I went for the Math Story Word Problems, filled in the drop-down menus for grade (5th), number of questions (3), math standard/topic (fraction multiplication), and finally the story topic. At this point the tool has zero clue who the students are, nor who I am for that matter. I decided to help a little and told it that one kid was into basketball, another into Roblox, and a third into TikTok. Here's what the power of AI allowed me to assign to my imagined students:
Writing feedback. I actually love this one; it's very useful. It allows you to generate thoughtful feedback for students quickly and at scale. If it's automated, even better - I'm not sure it is yet, but it probably will be. However, I wouldn't call that personalized either. LLMs are just math: they determine probabilistically what the answer to a given input should be. So when you get feedback on a student's piece of writing, what you really have is a complicated equation with the student's writing and your directions on the left side of the equal sign (plus whatever you want to imagine the AI process to be), and the AI-generated feedback on the right side. At no point whatsoever is there any information about who the student is.
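To make that "equation" concrete, here's a minimal sketch in Python of what this kind of feedback generation amounts to. Everything here is hypothetical - call_llm is a stand-in for whatever model the product runs behind the scenes, not MagicSchool's actual code - but the shape of the function is the whole point: look at what it takes as input, and what it doesn't.

```python
# Hypothetical sketch of AI writing feedback; call_llm stands in for
# whatever model a product like this uses behind the scenes.

def call_llm(prompt: str) -> str:
    """Stub for an LLM call; a real product would call a model API here."""
    return "Placeholder feedback generated from the prompt above."

def generate_feedback(student_text: str, teacher_directions: str) -> str:
    # The output is purely a function of the submitted text and the directions.
    # Nothing about who the student is (interests, history, goals) ever
    # enters the computation.
    prompt = (
        "You are a writing coach. Follow these directions: "
        + teacher_directions
        + "\n\nGive feedback on this student writing:\n"
        + student_text
    )
    return call_llm(prompt)

print(generate_feedback("My essay draft...", "Focus on thesis clarity."))
```

Swap in whichever model you like: as long as the only inputs are the text and the rubric, the feedback is tailored to the writing, not to the writer.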
Language learning tutor. I'll just show you and won't comment; you can look for the personalized learning yourself if you want.
Translate it! As for that one, it's exactly what you would expect: a small text box for the destination language, a big text box for the text to be translated.
I’m sorry MagicSchool, please don’t take it personal.
So, after all this exploration, what have we found? Marketing that promises personalization but delivers standardization with a light AI veneer. Drop-down menus masquerading as personalized experiences. Chatbots that know what you just typed but nothing about who you are or what you care about.
But here's the irony that makes this all so frustrating: generative AI actually is the most powerful personalization technology ever created—just not in the way it's being advertised.
What Real Personalization Looks Like
I know this firsthand; you probably do too. I've built custom GPTs and Claude Projects that I've set up with specific information about my goals, preferences, and needs. The more details I provide about my specific context and objectives, the better the results. That's genuine personalization—a tool shaped to my unique requirements through ongoing interaction and refinement.
But that's worlds apart from asking Timmy to log into [insert edtech product] and click through a menu-driven chatbot that's supposedly "personalizing" his learning experience. The difference is like comparing a bespoke suit to a department store "one-size-fits-most" t-shirt with a custom name tag slapped on it.
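To show the contrast in the same terms as before, here's another hypothetical sketch: the same kind of LLM call, this time with a profile I wrote and control injected as context. The field names are invented for illustration, but this is roughly what a custom GPT's instructions or a Claude Project's context boils down to.

```python
# Hypothetical sketch of user-controlled personalization: the same kind of
# LLM call as before, but with a profile I wrote and can edit or erase.

def call_llm(prompt: str) -> str:
    """Stub for an LLM call; a real product would call a model API here."""
    return "Placeholder reply shaped by the context above."

my_profile = {
    "goals": "write a weekly newsletter about AI in education",
    "style": "plain language, concrete examples, skeptical of marketing claims",
    "remember": "my audience is K-12 teachers and school leaders",
}

def personalized_reply(request: str, profile: dict) -> str:
    # The profile is written by me, updated by me, and deletable by me.
    context = "\n".join(f"- {key}: {value}" for key, value in profile.items())
    prompt = f"What you know about me:\n{context}\n\nMy request: {request}"
    return call_llm(prompt)

print(personalized_reply("Outline next week's article on assessment.", my_profile))
```

The more accurate and detailed that profile gets, the better the output.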
When I use AI, I can decide what it remembers about me, and I can ask it to forget certain things we've talked about. I could opt out entirely and preserve my privacy, but I won't, because the results are sooo superior when the AI knows who you are and what you're trying to do. Not only that, it's genuinely more pleasant to work with, just as it's more pleasant to work with someone you appreciate than with a stranger. Very early on it seemed clear to me that, ultimately, AI companies would compete for a monopoly over your digital identity. Once performance passes a threshold beyond which differences are irrelevant for most tasks, people will gravitate to the AI they most enjoy spending time with—and changing AI will be a real drag.

I deduced this from the evolution of other consumer tech products. People are Mac or PC, iPhone or Android, and switching from one to the other is not an enjoyable endeavor. Not only because you'll have adapted your way of working to the tech you're using, but because it will know so much about you - just as your phone today holds all your photos, contacts, calendar, bookmarks, and favorite apps arranged how you prefer. Satya Nadella, the CEO of Microsoft, echoed my insight in a recent interview in which he declared that he believes AI models are getting commoditized (timestamp 30:43).
By this he means that the real value is shifting away from the models themselves toward products and applications built on top of them. When questioned about Nadella's remark, Sam Altman, the CEO of OpenAI, agreed. And when pressed to explain what he thought the differentiator would be, he answered:
“I think that should be a combination of several different key services. There’s probably three or four things on the order of ChatGPT, and you’ll want to buy one bundled subscription of all of those. You’ll want to be able to sign in with your personal AI that’s gotten to know you over your life, over your years to other services and use it there. There will be, I think, amazing new kinds of devices that are optimized for how you use an AGI.”
Think about it - it makes perfect sense: having your personal assistant with you, turned on whenever you want, knowing whatever you wish it to know about you, and able to do what GPT-5, or 6, or 7 can do… The strategic key to success for these future competing products will be perceived privacy: how convincingly they can assure you that they won't use this intimate knowledge for ends you wouldn't approve of. Just as some people post pictures of everything they do on social media while others still own a flip phone, the question for you, my friend, will be: how much of your privacy are you willing to relinquish to these big companies so that you can benefit from these models' awesome capabilities? What seems clear to me, however, is that this should be, and probably will be, reserved for adults. I don't see public schools giving access to this kind of data on our kids anytime soon.
Following Our Own Bent
Aside from being mature enough to consent to sharing my personal info with OpenAI and Anthropic, there's one more aspect of my use of AI that differs greatly from that of students—as it is imagined in marketing slogans—and which explains why my use is truly personalized and theirs isn't. It's that I'm using AI to pursue goals I set for myself, such as writing this article or coding an HTML game for my son. My wife is now a badass Python vibecoder in her job, but no one asked her to be. Schoolwork is very different, isn't it? It's not that MagicSchool's chatbot can't deliver augmented capabilities for students. It certainly can! But only if students truly want to use it, if they see the value in it, if they're inspired and given free rein to "follow their own bent," as Asimov put it in the interview I opened with.
AI does change everything—just not in the way it's being sold. What AI really forces us to do is to confront a problem that predates it: product-based assessment is broken. I've said this before in other pieces, but generative AI drives the final nail into the coffin. If a student can produce a polished essay with a well-crafted prompt, what does that essay actually show us about their learning? Not much.
That's why we need to shift to process-based assessment. The goal isn't the final product—it's the path to that product. The decisions, revisions, false starts, and reflections. That's where learning happens. And that's what we need to capture.
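What would capturing that process actually look like? Here's one hypothetical sketch in Python - the event types and field names are invented for illustration, not taken from any existing product - of a learning log that records decisions, revisions, and reflections rather than just the final file.

```python
# Hypothetical sketch of process-based evidence: a timestamped trail of
# what the student did and decided, not just the polished end product.

from dataclasses import dataclass
from datetime import datetime

@dataclass
class ProcessEvent:
    timestamp: datetime
    kind: str   # e.g. "draft", "ai_prompt", "revision", "reflection"
    note: str   # the decision or change, ideally in the student's own words

portfolio = [
    ProcessEvent(datetime(2025, 3, 3, 9, 5), "draft", "First outline: three arguments"),
    ProcessEvent(datetime(2025, 3, 3, 9, 40), "ai_prompt", "Asked for counterarguments to argument 2"),
    ProcessEvent(datetime(2025, 3, 4, 10, 12), "revision", "Dropped argument 2, rewrote the intro"),
    ProcessEvent(datetime(2025, 3, 4, 10, 30), "reflection", "The counterargument changed my thesis"),
]

# Assessment reads the trail of decisions, false starts, and revisions.
for event in portfolio:
    print(f"{event.timestamp:%b %d %H:%M} [{event.kind}] {event.note}")
```

Whether such a trail lives in a document's version history, a portfolio tool, or a paper notebook matters less than the principle: the evidence of learning is the sequence of choices, not the polished artifact.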
With AI, students can produce faster, iterate more, and explore deeper questions—if we let them. There isn't much we can predict about the years and decades to come, but it seems clear to me that the most valuable skill, already today, is being able to teach yourself new skills, and in this respect AI has indeed revolutionized the way we can learn. YouTube was, and remains, really cool, even to us book people, but AI is clearly next level. That is, when it's used to follow our own bent.
Okay, now I'm going to give the last word to Isaac Asimov; I've teased you enough as it is. If you'd rather read, you can find the exchange on Bill Moyers's website. But if you want a masterclass in quick thinking and crystal-clear visionary ideas, all of it delivered in a Brooklyn accent and with world-class mutton chops, then, ladies and gentlemen, please indulge in seven minutes of intellectual bliss. So long, and thanks for all the fish!