Hacker News

butz · today at 3:12 PM · 4 replies

How about we use all that AI and start doing some serious optimizations to existing software? Reduce memory requirements by half, or even more.


Replies

TeMPOraL · today at 3:29 PM

Plenty of people do.

AI is one of the few major general technological breakthroughs, comparable to the Internet and electricity. It's potentially applicable to everything, which is why right now everyone is trying to apply it to everything: developing new optimization algorithms, optimizing optimizing compilers, optimizing applications, optimizing systems, optimizing hardware, and so on.

Big AI vendors are at the forefront of this, because they're the ones actually paying for the AI revolution, so any efficiency improvement saves them money.

undersuit · today at 6:15 PM

Improving LLM memory contention will allow LLMs to use more memory.

echelon · today at 5:23 PM

We are.

I'm writing a metric ton of Rust code with Claude Code.

cyanydeez · today at 3:15 PM

LLMs are intrinsically designed for token production, which is typically inversely related to optimization and efficiency.