
Sal Khan's AI revolution hasn't happened yet

47 points | by the-mitr | today at 5:05 AM | 58 comments

Comments

JimsonYang today at 7:42 AM

I think AI has revealed one of the biggest gaps in our education system: the majority of students don’t really care about knowledge, even when tested on it.

To give an example, I have a friend who learned system design through Claude in order to get a job interview (and he got really good at system design), while I have another friend who copies and pastes ChatGPT responses in order to get a B on a reflection assignment.

This highlights that there is a legitimate use case for personalized learning and growth via AI, but these are the people who seek knowledge with or without AI. Whereas the majority of students actively try to do as little as possible on assignments, even if they get zero value out of them.

MostlyStable today at 5:55 AM

The fact that few students are self motivated enough to use it makes sense....but you are telling me that, in 4 years, _so_ few were motivated to use it that you can't report on whether or not it makes a difference for the minority that do?

I was among those who, when Khanmigo was first announced, were pretty excited about its potential. I then waited for data on the results....and kept waiting.....and kept waiting. And now, four years later, this is apparently what we are going to get. I think that this is enough for me to decide that Khanmigo, regardless of whether or not a student actually engages with it, doesn't make much learning difference. At some point, the absence of (reported) data becomes data in itself.

I still believe, in principle, that AI tutors could be massively helpful for learning. But apparently we haven't yet figured out how to take that principle and turn it into reality.

18al today at 7:35 AM

The wish for an _AI revolution_ in learning seems to have been granted by a monkey's paw. Articles like this, or [0], or browsing r/teachers [1], or even talking to close ones in college, give a rather grim view of AI use.

A paragraph from [0] makes it seem that students understand that LLM use doesn't lead to learning, but do it anyway. Do they not see the effort put into learning as worthwhile?

  A few months ago, I overheard some college students talking about their classes.
  One was complaining about an assignment they needed to do that night, and
  another incredulously asked why they wouldn’t just have ChatGPT do it. The first
  replied, “This is my major, I actually need to learn stuff in this class. I use
  AI for my other classes.”

I myself use LLMs for learning (ChatGPT's study mode, for instance, r.i.p.) and can see that there's a right way to use them—you reach for them when you hit a wall, not to avoid the friction of developing an understanding.

From what I understand, though, most LLM use for learning is just the LLM used as a tool for cheating. Even TFA mentions something of the sort:

  few of Musall’s most advanced students have taken advantage of AI to learn new
  topics. But, as far as she can tell, more students are using it to just find
  answers

The article attributes a _skill issue_ as part of the problem, but how much of that is a motivation or awareness issue? How do you make students realize that learning is worth it?

[0] https://arstechnica.com/science/2026/04/to-teach-in-the-time...

[1] https://www.reddit.com/r/Teachers/

drivebyhooting today at 6:28 AM

90% of teaching kids is actually managing energy and motivation. Of course an obsequious facile robot can’t actually help there.

tanvach today at 5:57 AM

It’s human nature, really, and we see it in our own jobs: instead of using LLMs to learn to do our jobs better, we replace the work with automation.

It’s going to be quite hard to motivate students to learn now that they know answering can be automated.

the-mitr today at 7:15 AM

Many of the ed-tech products introduced in the classroom are solutions looking for a problem. Not many of their makers have much understanding of the context of teaching and learning processes. And yet, as the history of ed-tech tells us, gung-ho business optimism will sell these to managers/administrators as THE solution. The most important stakeholders in the process, the teachers and students who are the end users of said product/technology, are left out of the decision chain.

barrenko today at 7:21 AM

I have some two million KA points, and I'm currently using it to guide me through learning basic physics. I've never used Khanmigo, not even once. If I need to ask AI-type questions I go off-site to Claude, GPT, or DeepSeek.

As other people have noted, asking, a.k.a. <i>typing</i>, questions, especially math-type ones, is fatiguing, and there's no substitute for pen and paper and thinking hard.

KA would be better off using AI on the supply side (but heavily curated) to have more assignments, or better assignments in some sections.

But it's important to recognize KA for what it is: an excellent way to get some sort of basic curriculum, especially when self-studying. All of the instructors have great teaching personalities, as far as I can deduce from their approach in the videos.

kevlened today at 7:17 AM

Derek Muller of Veritasium has a presentation on why the promises of a revolution in education are never fulfilled. [0]

His hottest take is we're already close to the optimal process for learning, so technology isn't going to improve it. Learning takes work, and no technology can do the work for you.

[0] https://www.youtube.com/watch?v=0xS68sl2D70

roncesvalles today at 6:53 AM

I don't even think Khan Academy's original teaching revolution quite panned out.

I still remember when Khan Academy first came out, there was talk that teachers would go obsolete because teaching would become centralized and delivered over video.

Khan Academy to me is still just a YouTube channel trying very hard to be something more.

utopiah today at 6:55 AM

Classic technological innovation trope:

  amazing in theory with the perfect user in the perfect use case,

  misused in practice with terrible consequences for society at large.

Sure, the one student who already excels, is motivated, understands what the concept to learn is, and knows that actually completing exercises helps them learn might, possibly, thrive. All the other students, the vast majority, will try to "game" the (terrible) evaluation system to get good grades by cheating WHILE avoiding the very challenges that make learning possible.

Who could have guessed.

10keane today at 7:18 AM

what is the point of teaching anyway when foundational knowledge is becoming obsolete?

i think what should be taught is metacognitive ability - like how to retrieve knowledge, how to ask the right questions toward a certain goal. knowledge itself is easily accessible with ai. now the difficult part is the ability to discern actual knowledge from llm hallucination bs, and the ability to retrieve the required knowledge given a scenario.

this still requires some foundational grounding — you can't detect bullshit with zero context. but the balance shifts from memorization to retrieval, iteration, verification. honestly i think it is more about critical thinking and philosophy.

LaFolle today at 6:59 AM

I think the teacher's job is not just to teach but, more importantly, to ignite curiosity in students about new fields / subjects. If genuine interest / curiosity in the subject is missing, then no amount or medium of tutoring can help.

AI is great for the curious. But it's not yet at the point where it can proactively engage with students to generate interest.

augment_me today at 6:13 AM

You can be very AI-skeptic in various ways and still think that this is a fair take. I teach and supervise students in master's-level courses, and about 15% of them have intrinsic motivation to learn. These students have set up their own AI tutors with prompts and know way more than me in certain areas of the field; they are extremely far ahead of their class.

The issue in my country is that education is equated with getting a safe job. 20 years ago, you needed a high-school degree in social science to get a government job. 10 years ago you needed a bachelor's in social sciences to get the same job. 5 years ago you needed a bachelor's in economics/engineering to get the same job. Now, because of recessions, this is stretching to master's degrees.

You can't expect people who just want a job and a comfortable life and NEED to go to uni for this to want to be curious and want to learn.

gcanyon today at 5:59 AM

This honestly seems like a flawed approach. Kids don't show up in first grade, or sixth, or ninth, or really even twelfth, as the initiators of their educational journey.

An AI-based education system should have embedded in it: "I am here to teach this person Geometry. Here is a list of the topics to cover, with a breakdown of steps for each, including an intro section, a study section, a test section, and the meta material to go along with it."

That would work.
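That embedded-curriculum idea amounts to a data structure plus a driver loop. A minimal sketch in Python, with all names hypothetical (nothing here reflects Khanmigo's actual internals):

```python
# Hypothetical sketch of a curriculum-driven tutor: the AI initiates,
# walking a fixed syllabus instead of waiting for the student to ask.
from dataclasses import dataclass, field


@dataclass
class Topic:
    name: str
    intro: str                 # short motivating introduction
    study: list                # study prompts / worked examples
    test: list                 # questions that check mastery
    meta: str = ""             # notes for the tutor (misconceptions, etc.)


@dataclass
class Course:
    subject: str
    topics: list = field(default_factory=list)

    def next_step(self, mastered):
        """The tutor, not the student, decides what comes next."""
        for t in self.topics:
            if t.name not in mastered:
                return t
        return None  # course complete


geometry = Course("Geometry", [
    Topic("angles", "Why angles matter...", ["measure some angles"], ["angle-sum quiz"]),
    Topic("triangles", "Triangles are everywhere...", ["congruence rules"], ["congruence quiz"]),
])

step = geometry.next_step(mastered={"angles"})
print(step.name)  # -> triangles
```

The key inversion is in `next_step`: the tutor consults the syllabus and the student's mastery record to decide what happens next, rather than waiting for the student to ask a question.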

uhoh-itsmaciek today at 5:51 AM

>Kristen DiCerbo, the organization’s chief learning officer, said AI can only respond to students based on what they ask. And it turns out, she said, “Students aren’t great at asking questions well.”

Ignoring whether or not this is a good idea in the first place, what about inverting the loop? Have the robot drive the interaction.

flexagoon today at 6:13 AM

There is an LLM button on every other website now. "Chat with your lesson", "chat with your food", "chat with your photos". People are not clicking them because they are just visual noise at this point.

sigmoid10 today at 6:11 AM

Weird. No mention of the technical aspects; essentially just blaming average students for not being engaged enough with their simplistic ChatGPT clone. No wonder they have not yet dared to give out actual usage metrics. If I were given the choice between an inferior product that probably lags significantly behind on all features and one of the standard offerings from OpenAI, Google, or Anthropic, I'd question why I should use this thing too. According to their website, they position Khanmigo like this:

>Unlike other AI tools such as ChatGPT, Khanmigo doesn’t just give answers. Instead, with limitless patience, it guides learners to find the answer themselves. In addition, Khanmigo is the only AI tool that is incorporated with Khan Academy’s world-class content library that covers math, humanities, coding, social studies, and more.

The first differentiation is literally just prompting (if at all). Nowadays you can tell any chatbot to behave that way. The second one may have been an edge before tool use was widely common, but with all chatbots now having access to the internet and code execution, it seems like this has also become a dud. This product was a nice idea on paper, but the fast technical evolution of the field has largely left it in the dust.
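To illustrate the first point, that the Socratic behavior is reproducible with a system prompt: here is a minimal sketch using the common chat-completion message format. The prompt wording is mine, purely illustrative, and not Khanmigo's actual prompt.

```python
# Hypothetical Socratic-tutor prompt in the common chat-completion message
# format. The wording is illustrative, not Khanmigo's actual prompt.
SOCRATIC_PROMPT = (
    "You are a patient tutor. Never give the final answer directly. "
    "Ask one guiding question at a time, point out flaws in the student's "
    "reasoning, and only confirm answers the student derives themselves."
)


def make_messages(student_question):
    """Build the message list any chat-completion endpoint accepts."""
    return [
        {"role": "system", "content": SOCRATIC_PROMPT},
        {"role": "user", "content": student_question},
    ]


msgs = make_messages("What is the derivative of x^2?")
print(msgs[0]["role"])  # -> system
```

Passing `msgs` to any mainstream chat-completion endpoint yields roughly the guide-don't-answer behavior Khan Academy describes.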

suttontom today at 6:09 AM

>Khan Academy recently announced an overhaul of its product that provides students additional academic practice. Now Khanmigo is incorporated directly as a way students can get advice as they’re working through specific problems. A spokesperson said the organization made this change because “students were not seeking out Khanmigo’s help as much as we had hoped.”

Dear Lord, how is this any different from Microsoft sticking Copilot or Google sticking Gemini in every single offering? They're literally saying that people aren't using the chat bot enough so they're going to force it on people inside the product.

ericd today at 6:38 AM

I'll probably open source and Show HN the AI tutor I've been working on for my kids at some point, but working on it has given me a little insight into the problem.

The biggest thing is motivation. First off, if Khanmigo requires them to type and read everything, that's going to get tiring fast for most kids. But I don't know how you could do voice in a school setting - mine uses STT/TTS, but with 20 kids in a room, it'd be chaos - STT accuracy and diarization with 2 is already really challenging.

Motivation is helped a bit by following their interest, but it seems like KA is having trouble guiding the kids when they prompt it that way. That was a pretty big issue with mine early on - the kids would talk to it for an hour about whatever topic they were interested in at the time, but it would never branch into something new.

The tutor I'm working on solves it by having a concept graph that covers a lot of learning, from the basics like math, dinosaurs, etc to other developmental topics like 6 year old boundary-pushing humor, and two LLM threads - one that handles the conversational turns, and another one in the background that strategizes and steers the conversational thread by looking at the concept graph connections and considering how ready they are for each, and then injecting steering notes into the conversational thread. Basically system 1 and system 2 thinking. And after sessions, it'll make a basic plan of where to start next time, and what might be interesting to offer up.
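The two-thread design described above can be sketched roughly as follows; all names are hypothetical and the LLM calls are stubbed out, so this only shows the concept-graph bookkeeping and the steering-note hand-off:

```python
# Sketch of the described architecture: a concept graph plus a background
# "system 2" strategist that writes steering notes for the conversational
# "system 1" thread. LLM calls are stubbed out; all names are hypothetical.
from dataclasses import dataclass, field


@dataclass
class Concept:
    name: str
    prereqs: list = field(default_factory=list)


class ConceptGraph:
    def __init__(self, concepts):
        self.concepts = {c.name: c for c in concepts}

    def ready(self, known):
        """Concepts not yet known whose prerequisites are all known."""
        return [n for n, c in self.concepts.items()
                if n not in known and all(p in known for p in c.prereqs)]


def strategist(graph, known, interest):
    """Background thread: pick a bridge topic, emit a steering note."""
    candidates = graph.ready(known)
    target = candidates[0] if candidates else interest
    return f"Steer the conversation from '{interest}' toward '{target}'."


graph = ConceptGraph([
    Concept("counting"),
    Concept("probability", prereqs=["counting"]),
    Concept("decision trees", prereqs=["probability"]),
])

note = strategist(graph, known={"counting"}, interest="Slay the Spire")
print(note)  # -> Steer the conversation from 'Slay the Spire' toward 'probability'.
```

In the full system, `note` would be injected into the conversational thread's context as the "system 2" guidance, while a separate LLM call handles each "system 1" turn.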

I mentioned this in another comment, but I've been really pleasantly surprised at the quality of the tutoring, especially when it bridges into new topics. One of my sons is really into Slay the Spire, and at different times it’s used that as a launching-off point into probabilities, decision trees, Python code for the algorithms he thinks about as he's facing different enemies, and general strategies on different facets. My other son was really into sharks, which it has bridged into extinct sharks like megalodon, how scientists deduce how it looked given cartilage's lower propensity to fossilize, dinosaurs and their fossils, the K-Pg extinction event, and how food scarcity filtered for smaller animals like the ancestors of birds and our small mammalian ancestors. And a whole bunch of other topics.

It's been pretty great in that way, but my biggest open question at the moment is how to get them to engage with it on their own on a more regular basis - they go to it occasionally for random questions, but to get good coverage of that huge knowledge graph would take much more. And fundamentally, I think that human engagement still just has a number of important aspects to it that it's lacking, and I'm not sure if it's possible to replace those well enough.

vasco today at 5:53 AM

On one hand, I will grow old knowing I'll always have a job, because a lot of kids will never have researched anything in their lives and won't know how to deal with anything an LLM can't solve. On the other hand, between this and most kids having had a 2-year covid gap in their learning, who the heck is going to pay my retirement and be my doctor when I'm old?

BrenBarn today at 6:44 AM

> She says there’s been more enthusiasm for the product among administrators than teachers in her school.

That is a warning sign if ever there was one.

yabutlivnWoods today at 5:56 AM

Kristen DiCerbo quote from the article:

> “Students aren’t great at asking questions well.”

In my interactions with my kids' public school and their teachers, their goal is to ram content down students' throats and test for retention, not to foster an environment open to questions.

I had a teacher claim straight up that they don't believe the system works and that they're just in teaching for the benefits and summer vacation.

IMO Sal Khan's revolution hasn't happened because the adults in charge right now are ignorant and inept but incredibly vain nonetheless

krainboltgreene today at 5:57 AM

I think it's particularly telling that the teacher dropped the product while the superintendent is saying it's going well. Maybe I'm biased from going to New Orleans public schools, but I have my doubts about how tapped into the overall strategy the superintendent is.

That said, I do think it's particularly hilarious that KA's response to students not wanting to use the product is to make the product more integral to the experience.

croes today at 5:52 AM

> It doesn’t necessarily make students motivated to learn or fill in gaps in knowledge needed to ask questions.

Who would have thought?