
imiric · yesterday at 6:38 PM · 2 replies

> every AI coding bot will learn your new language as a matter of course after its next update includes the contents of your website.

How will it "learn" anything if the only available training data is on a single website?

LLMs struggle to follow instructions even when their training set is massive. The idea that they will be able to produce working software from just a language spec and a few examples is delusional, and a fundamental misunderstanding of how these tools work. They don't understand anything; they generate patterns based on probabilities and fine-tuning. Without massive amounts of data to skew the output towards a potentially correct result, they're not much more useful than a lookup table.


Replies

Zak · yesterday at 7:02 PM

They don't understand anything, but they sure can repeat a pattern.

I'm using Claude Code to work on something involving a declarative UI DSL that wraps a very imperative API. Its first pass at adding a new component required imperative management of that component's state. Without that implementation in context, I told Claude the imperative pattern "sucks" and asked for an improvement just to see how far that would get me.

A human developer familiar with the codebase would easily understand the problem and add some basic state management to the DSL's support for that component. I won't pretend Claude understood, but it matched the pattern and generated the result I wanted.

This does suggest to me that a language spec and a handful of samples is enough to get it to produce useful results.
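A minimal sketch of the pattern described above: a declarative DSL layer that owns and reconciles widget state, wrapping an imperative API whose callers would otherwise mutate that state by hand. All names here (ImperativeSlider, DSL, render) are invented for illustration, not from any real framework.

```python
class ImperativeSlider:
    """Stand-in for the underlying imperative widget API."""
    def __init__(self):
        self.value = 0

    def set_value(self, v):
        self.value = v


def imperative_usage():
    # Imperative style: the caller manages widget state directly.
    s = ImperativeSlider()
    s.set_value(5)  # every change is an explicit mutation
    return s.value


class DSL:
    """Declarative style: the DSL owns the widgets and reconciles
    their state from a plain description, so callers never touch
    the imperative API directly."""
    def __init__(self):
        self._widgets = {}

    def render(self, spec):
        for name, value in spec.items():
            w = self._widgets.setdefault(name, ImperativeSlider())
            if w.value != value:
                w.set_value(value)  # reconciliation happens here, once
        return {n: w.value for n, w in self._widgets.items()}


ui = DSL()
print(ui.render({"volume": 5}))  # state created and managed by the DSL
print(ui.render({"volume": 7}))  # existing widget updated, not re-created
```

The fix Zak describes amounts to moving the `set_value` calls from every call site into the DSL's `render` loop, which is exactly the kind of mechanical pattern transfer an LLM can do without "understanding" the design.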

dmd · yesterday at 7:28 PM

It's wild to me the disconnect between people who actually use these tools every day and people who don't.

I have done exactly the above with great success. I sometimes work with a weird proprietary esolang that I like, and the only documentation - or code - that exists for it is on my computer. I load that documentation in, and the model works just fine and writes pretty decent code in my esolang.

"But that can't possibly work [based on my misunderstanding of how LLMs work]!" you say.

Well, it does, so clearly you misunderstand how they work.
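What "loading the documentation in" amounts to is in-context learning rather than training: the spec is prepended to the prompt so the model can pattern-match against it on every request. A minimal sketch of that prompt assembly, with a hypothetical helper and a toy spec (nothing here is a real API):

```python
def build_prompt(spec_text: str, task: str) -> str:
    """Prepend the full language documentation to the task so the
    model sees the spec in its context window on every request."""
    return (
        "You are writing code in the following language. "
        "Its complete documentation is below.\n\n"
        "--- LANGUAGE SPEC ---\n"
        f"{spec_text}\n"
        "--- END SPEC ---\n\n"
        f"Task: {task}\n"
    )


# Toy stand-in for the esolang's documentation file.
spec = "PRINT <expr> writes a value; LET <name> = <expr> binds it."
prompt = build_prompt(spec, "Print the sum of 2 and 3.")
print("LANGUAGE SPEC" in prompt and "Task:" in prompt)  # → True
```

No training-time exposure to the language is needed; the entire spec rides along in the context, which is why a single private document on one computer is enough.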
