Hacker News

mgraczyk today at 12:58 AM (7 replies)

The sad reality is that this is probably not a solvable problem. AI will improve more rapidly than the education system can adapt. Within a few years it won't make sense for people to learn how to write actual code, and it won't be clear until then which skills are actually useful to learn.

My recommendation would be to encourage students to ask the LLM to quiz and tutor them, but ultimately I think most students will learn a lot less than, say, 5 years ago, while the top 5% or so will learn a lot more.


Replies

JumpCrisscross today at 1:53 AM

> AI will improve more rapidly than the education system can adapt

We’ll see a new class division scaffolded on the existing one around screens. (Schools in rich communities have no screens. Students turn in their phones and watches at the beginning of the day. Schools in poor ones have them everywhere, including everywhere at home.)

quantumHazer today at 11:23 AM

> it won’t make sense to learn how to code.

Sure. So we can keep paying money to your employer, Anthropic, right?

ethmarks today at 1:32 AM

> most students will learn a lot less than say 5 years ago while the top 5% or so will learn a lot more

If we assume that AI will automate many/most programming jobs (which is highly debatable and I don't believe is true, but just for the sake of argument), isn't this a good outcome? If most parts of programming are automatable and only the really tricky parts need human programmers, wouldn't it be convenient if there are fewer human programmers but the ones that do exist are really skilled?

DANmode today at 2:14 AM

For what it’s worth: OpenAI seems to be encouraging this with their “Study” mode on some ChatGPT interfaces.

gerdesj today at 1:28 AM

An LLM is a tool, and panicking over it is just as mad as panicking over slide rules, calculators and PCs (I've seen them all, although slide rules were being phased out in my youth).

Coding via prompt is simply a new form of coding.

Remember that high-level programming languages are "merely" a sop for us humans to avoid low-level languages. The idea is that you will be more productive with, say, Python than you would with ASM or twiddling electrical switches that correspond to register inputs.

A purist might note that using Python is not sufficiently close to the bare metal to be really productive.
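The abstraction gap described above can be sketched with a small, hypothetical example (not from the thread): summing a list in one expressive Python line versus spelling out the register-level bookkeeping an assembly programmer would write by hand.

```python
# Hypothetical illustration: the same "sum a list" task at two
# abstraction levels. The high-level version states intent; the
# loop below mimics the accumulator/index/compare/branch steps
# that assembly forces you to manage explicitly.

data = [3, 1, 4, 1, 5, 9]

# High-level: intent, not mechanics.
total_hl = sum(data)

# "Assembly-flavoured": explicit accumulator and index registers.
acc = 0        # accumulator register
i = 0          # index register
n = len(data)  # loop bound
while i < n:       # compare + conditional branch
    acc += data[i]  # load element, add to accumulator
    i += 1          # increment index register
total_ll = acc

print(total_hl, total_ll)  # both print 23
```

The point of the sketch is simply that each rung down the abstraction ladder trades expressiveness for manual bookkeeping; prompting an LLM is arguably one more rung up.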

My recommendation would be to encourage the tutor to ask the student how they use the LLM, and to school them in effective use strategies. That will involve problem definition and formulation, then an iterative effort to solve the problem. It will obviously involve how to spot and deal with hallucinations. They'll need to start discovering which models suit which tasks, and all sorts of things that would have looked like sci-fi to me 10 years ago.

I think we are at, for LLMs, the "calculator on digital wrist watch" stage that we had in the mid '80s before the really decent scientific calculators rocked up. Those calculators are largely still what you get nowadays too and I suspect that LLMs will settle into a similar role.

They will be great tools when used appropriately, but they will not run the world, or, if they do, not for very long. Bye!

andrei_says_ today at 1:28 AM

> Within a few years it won't make sense for people to learn how to write actual code

Why?

Because LLMs are capable of producing sometimes-working snippets of usually completely unmaintainable code?

Madmallard today at 6:45 AM

Bold claim by the Anthropic employee drinking their own Kool-Aid.
