Hacker News

babelfish · yesterday at 7:21 PM · 5 replies

Sure, for humans. Not sure they'll be the primary readers of code going forward.


Replies

tombert · yesterday at 7:34 PM

I'm pretty sure that will be true with AI as well.

No accounting for taste, but part of what makes code hard for me to reason about is combinatorial complexity: when the number of states the program can be in makes it difficult to enumerate all the possible good and bad states. Combinatorial complexity is objectively expensive for any kind of computer, be it a human brain or silicon. If the code is written in such a way that the set of correct and incorrect states is impossible to know, the problem becomes undecidable.

I do think there is code that is "objectively" difficult to work with.
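A minimal sketch of the state-explosion point above (the connection example and its flag names are hypothetical, not from the thread): modeling one value with n independent boolean flags gives 2^n representable combinations, most of them nonsensical, while a single enum restricts the code to the states that can actually occur.

```python
from enum import Enum
from itertools import product

# Three independent flags -> 2**3 = 8 representable combinations,
# including impossible ones like (connected=True, closing=True, errored=True).
flag_states = list(product([False, True], repeat=3))  # (connected, closing, errored)
print(len(flag_states))  # 8

class Conn(Enum):
    # The same connection modeled as one enum: only the states
    # that can actually occur are representable.
    DISCONNECTED = "disconnected"
    CONNECTED = "connected"
    CLOSING = "closing"
    ERRORED = "errored"

print(len(Conn))  # 4
```

Each extra flag doubles the combinations a reader (human or otherwise) has to rule out, which is the combinatorial cost the comment is describing.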

tracerbulletx · yesterday at 10:20 PM

Entropy and path dependence are unavoidable laws of mathematics. Not even evolution can avoid them.

pydry · yesterday at 7:51 PM

AIs struggle with tech debt as much as humans do, if not more.

I've noticed that they're often quite bad at refactoring, too.

pylua · yesterday at 9:50 PM

I think someday it will be completely unreadable to humans. AI will have its own optimized form.

nkohari · yesterday at 7:50 PM

Because LLMs are designed as emulators of actual human reasoning, it wouldn't surprise me if we discover that the things that make software easy for humans to reason about also make it easier for LLMs to reason about.