Hacker News

theflyinghorse · last Saturday at 5:11 PM · 6 replies

I keep wondering whether LLMs might bring these lower-level but far-better-performing languages back into vogue. Why have Claude generate a Python service when you could write a Rust or C3 service, with the compiler doing a lot of the heavy lifting around memory bugs?


Replies

dotancohen · last Saturday at 5:37 PM

> Why have claude generate a python service when you could write a rust or C3 service with compiler doing a lot of heavy lifting around memory bugs?

The architecture of my current project is actually a Python/Qt application that is a thin wrapper around an LLM-generated Rust application. I go over almost every line of the LLM-generated Rust myself; the machine is far more skilled at generating quality Rust than I currently am, but I am using this as an opportunity to learn.
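A minimal sketch of what the Rust side of such a wrapper architecture could look like (my own illustration, not the commenter's actual project): a small CLI binary that takes a command on argv and prints a result on stdout, which a thin Python/Qt shell could then invoke as a subprocess. The command names and protocol here are assumptions.

```rust
use std::env;

// Hypothetical entry point for the Rust "engine" behind a thin GUI shell:
// command arrives on argv, result goes to stdout, errors to stderr.
fn handle(cmd: &str) -> Result<String, String> {
    match cmd {
        "ping" => Ok("pong".to_string()),
        "version" => Ok("0.1.0".to_string()),
        other => Err(format!("unknown command: {other}")),
    }
}

fn main() {
    let cmd = env::args().nth(1).unwrap_or_default();
    match handle(&cmd) {
        Ok(out) => println!("{out}"),
        Err(e) => {
            eprintln!("{e}");
            std::process::exit(1);
        }
    }
}
```

A real project might instead expose the Rust code to Python directly (e.g. via PyO3 bindings) rather than shelling out, but a subprocess boundary keeps the reviewable Rust surface small and self-contained.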
notimetorelax · last Saturday at 5:22 PM

Having worked with Rust over the past couple of years, I can say that it is, hands down, a much better fit for LLMs than Python, thanks to its explicitness and type information. This provides a lot of context for the LLM to incrementally grow the codebase. You still have to watch it, of course, but the experience is very pleasant.
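A small illustration of the explicitness being described (my own example, not the commenter's code): distinct newtypes make intent machine-checkable, so a generated call site that mixes up two IDs fails to compile instead of failing at runtime.

```rust
// Two distinct newtypes over the same underlying integer. The compiler
// rejects any call that swaps them, turning a whole class of generated-code
// slip-ups into compile errors rather than silent bugs.
#[derive(Debug, Clone, Copy, PartialEq)]
struct UserId(u64);

#[derive(Debug, Clone, Copy, PartialEq)]
struct OrderId(u64);

fn cancel_order(user: UserId, order: OrderId) -> String {
    format!("user {} cancelled order {}", user.0, order.0)
}

fn main() {
    let user = UserId(7);
    let order = OrderId(42);

    // cancel_order(order, user); // <- would not compile: mismatched types
    let msg = cancel_order(user, order);
    assert_eq!(msg, "user 7 cancelled order 42");
    println!("{msg}");
}
```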

klysm · last Saturday at 5:22 PM

Because there’s more Python on the internet to interpolate from. LLMs are not equally good at all languages.

sonnig · last Saturday at 8:01 PM

I think the same. It seems much more practical to have LLMs code in languages whose compilers provide as many compile-time guardrails as possible (Rust, Haskell?). Ironically, in some ways this applies to humans writing code as well, though there you run into the (IMO very small) problem of having to write a bit more code than with more dynamic languages.
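One concrete example of the kind of guardrail meant here (my own sketch): Rust's borrow checker rejects use-after-move at compile time, so the memory bug an LLM or human might emit never makes it into a running binary.

```rust
// The borrow checker as a guardrail: the commented-out line below is exactly
// the kind of use-after-move mistake a code generator might produce, and it
// is rejected at compile time rather than becoming a runtime memory bug.
fn consume(v: Vec<i32>) -> usize {
    v.len()
}

fn main() {
    let data = vec![1, 2, 3];
    let n = consume(data); // ownership of `data` moves into `consume`
    // println!("{:?}", data); // <- would not compile: `data` was moved above
    assert_eq!(n, 3);
    println!("len = {n}");
}
```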

HighGoldstein · last Saturday at 6:06 PM

It seems cynically fitting that the future we're getting and deserve is one where we've automated the creation of memory bugs with AI.

gitaarik · yesterday at 4:05 AM

You still want to be able to easily review the LLM-generated code. At least I want to.