Hacker News

fpoling · 12/07/2025 · 4 replies

Pick up a programming book from the seventies or eighties that was unlikely to have been scanned and fed into an LLM. Take a task from it and ask the LLM to write a program that even a student could solve within 10 minutes. If the problem was never really published before, the LLM fails spectacularly.


Replies

crawshaw · 12/07/2025

This does not appear to be true. Six months ago I created a small programming language. I had LLMs write hundreds of small programs in the language, using the parser, the interpreter, and my spec as a guide. The vast majority of these programs were either very close to or exactly what I wanted. No prior source existed for the programming language because I had created it from whole cloth only days earlier.

handoflixue · 12/08/2025

It's telling that you can't actually provide a single concrete example - because, of course, anyone skilled with LLMs would be able to trivially solve any such example within 10 minutes.

Perhaps the occasional program that relies heavily on precise visual alignment will fail - but I dare say if we give the LLM the same grace we'd give a visually impaired designer, it can do exactly as well.

anjel · 12/07/2025

Sometimes it's generated, and many times it's not. Trivial to denote, but it's been deemed none of your business.

ahepp · 12/07/2025

You've done this? I would love to read more about it.