Hacker News

pmarreck 04/23/2025

Locally running LLMs might be good enough to do a decent job at this point... or soon will be.


Replies

nthingtohide 04/23/2025

One more line of thinking is: should each product have a mini AI that tries to capture my essence, useful only for that tool or product?

Or should there be one mega AI that is my clone and can handle all these disparate scenarios in a unified manner?

Which approach will win?

Kiro 04/23/2025

They are not necessarily cheaper. The commercial models are heavily subsidized, to the point where their prices roughly match the electricity cost of running a model locally.
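A rough back-of-envelope sketch of that comparison; every number here (GPU draw, generation speed, electricity price, API price) is an assumption for illustration, not a measurement:

```python
# Illustrative cost comparison with assumed numbers: electricity cost of
# local GPU inference vs. a typical metered API price, per million tokens.

WATTS = 300            # assumed GPU power draw while generating
TOKENS_PER_SEC = 30    # assumed local generation speed
PRICE_PER_KWH = 0.15   # assumed residential electricity price, USD

# Energy used to generate one million tokens, in kWh.
kwh_per_million_tokens = (WATTS / 1000) * (1_000_000 / TOKENS_PER_SEC) / 3600
local_cost = kwh_per_million_tokens * PRICE_PER_KWH

api_cost = 0.50        # assumed API price per million output tokens, USD

print(f"local electricity: ~${local_cost:.2f} per 1M tokens")
print(f"hosted API:        ~${api_cost:.2f} per 1M tokens")
```

With these assumed figures the local electricity works out to roughly $0.40 per million tokens, which is in the same ballpark as cheaper hosted-API pricing; the actual numbers depend heavily on your hardware, model, and rates.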

recursive 04/23/2025

The energy in my phone's battery is worth more to me than the grid spot-price of electricity.
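For scale, a tiny sketch with assumed figures (a phone battery holding on the order of 15 Wh, grid power at a typical residential rate):

```python
# Illustrative arithmetic with assumed numbers: what a full phone charge's
# energy would cost at grid prices.

BATTERY_WH = 15        # assumed phone battery capacity, watt-hours
PRICE_PER_KWH = 0.15   # assumed grid electricity price, USD

grid_value = (BATTERY_WH / 1000) * PRICE_PER_KWH
print(f"grid value of a full charge: ~${grid_value:.4f}")  # a fraction of a cent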