As a neutral observation: it’s remarkable how quickly we as humans adjust expectations.
Imagine five years ago saying that a general-purpose AI could write a C compiler capable of handling the Linux kernel, by itself, from scratch, for $20k, from a simple English prompt.
That would have been completely unbelievable! Absurd! No one would take it seriously.
And now look at where we are.
> a simple English prompt
And that’s where my suspicion stems from.
An expert-level programmer producing equivalent original work couldn't do this without all the context. By that I mean all the shared insights, discussion, and design that happened when the compiler was made.
So doing this without any of that context is likely just very elaborate copy-pasta.
Indeed, it's the Overton window that has moved. Which is why I secretly think the pro-AI side is more right than the anti-AI side. Makes me sad.
You're right. It's been pretty incredible. It's also frustrating as hell, though, when people extrapolate from this progress.
Just because we're here doesn't mean we're getting to AGI, or to software developers begging for jobs at Starbucks.
Wasn't there a fair amount of human intervention in the AI agents? My understanding is that the author didn't just write "make me a C compiler in Rust" but had to intervene at several points, even if he didn't touch the code directly.
I totally agree, but I think a lot of the push-back stems from this being presented as better than it actually is.
Now consider how much of the original C compilers' source code it was trained on, and it still managed to output a worse result.