> I didn’t write any piece of code there. There are several known issues, which I will task the agent to resolve, eventually. Meanwhile, I strongly advise against using it for anything beyond a studying exercise.
Months of effort and three separate tries to get something kind of working, but the result is buggy, untested, and not recommended for anyone to use. Unfortunately, some folks will just read the headline and proclaim that AI has solved programming. "Ubiquitous hardware support in every OS is going to be a solved problem"! Or my favourite: instead of software, we will just have the LLM output bespoke code for every single computer interaction.
Actually a great article and well worth reading; just ignore the comments, because it's clear a lot of people have read only the headline and are reading their own opinions into it.
You're validly critiquing where it is now.
The hype people are excited because they're guessing where it's going.
This is notable because it's a milestone that was not previously possible: a driver that works, from someone who spent ~zero effort learning the hardware or driver programming themselves.
It's not production ready, but neither is the first working version of anything. Do you see any reason that progress will stop abruptly here?
I don’t get this response. This is amazing! What percentage of programmers can even write a buggy FreeBSD kernel driver? If you were tasked with developing this yourself, wouldn’t it be a huge help to have something that already kind of works to get things started?
Programmers have always been in search of an additional layer of abstraction. LLM coding feeds exactly into this impulse.
> instead of software we will just have the LLM output bespoke code for every single computer interaction.
That's sort of the idea behind GPU upscaling: you increase gaming performance and visual sharpness by rendering games at a lower resolution and using algorithms to upscale to the monitor's native resolution. Somehow that's cheaper than actually rendering at high resolution: let the GPU hallucinate the difference at a lower cost.
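As a very rough sketch of the trade-off (assuming plain bilinear interpolation in NumPy, not a learned upscaler like DLSS, and with made-up resolutions; real GPU upscalers also use motion vectors and neural networks, but the principle of shading fewer pixels and filling in the rest is the same):

    import numpy as np

    def bilinear_upscale(frame, scale):
        # Upscale an (H, W, 3) frame by an integer factor with bilinear interpolation.
        h, w, _ = frame.shape
        ys = np.linspace(0, h - 1, h * scale)          # fractional source rows
        xs = np.linspace(0, w - 1, w * scale)          # fractional source cols
        y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, h - 1)
        x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, w - 1)
        wy = (ys - y0)[:, None, None]                  # row blend weights
        wx = (xs - x0)[None, :, None]                  # column blend weights
        top = frame[y0][:, x0] * (1 - wx) + frame[y0][:, x1] * wx
        bot = frame[y1][:, x0] * (1 - wx) + frame[y1][:, x1] * wx
        return top * (1 - wy) + bot * wy

    # "Render" at 960x540, upscale to 1920x1080: ~4x fewer pixels actually shaded.
    low_res = np.random.rand(540, 960, 3)  # stand-in for a rendered frame
    native = bilinear_upscale(low_res, 2)
    print(native.shape)                    # (1080, 1920, 3)

The whole point is that the upscale pass is far cheaper than shading four times as many pixels; the cost is that the extra detail is an educated guess.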
The author specifically said that they did not read the code or even test the output very thoroughly. It was intentionally just a naive toy they wanted to play around with.
This says nothing about AI, or even the capabilities of AI; the person intentionally didn't put in much effort.