Hacker News

dale_glass | yesterday at 2:50 PM | 2 replies

Not really.

I can generate images or get LLM answers in under 15 seconds on mundane hardware. The image generator draws many times faster than any normal person could, and the LLM, even on my consumer hardware, still produces output faster than I can type (and I'm quite good at that), let alone faster than I can think of what to type.


Replies

butlike | yesterday at 8:34 PM

An LLM gives AN answer. Ask it for much more than that and it gets confused, but instead of flagging that the way a human would, it confidently presses forward with incorrect answers. You never quite know when the context got poisoned, only that reliability drops to zero.

There are many things to say about this. Free is worthless. Speed is not necessarily a good thing. The image generation is drivel. But...

The main nail in the coffin is accountability. I can't trust my work if I can't trust the output of the machine. (And as a bonus, the machine can't build a house; it's single-purpose.)

beepbooptheory | yesterday at 2:57 PM

Is "faster" really what we are talking about right now? It could be a lot faster to take a helicopter to work everyday too, versus riding a bike.

Also, why are people moving mountains to build huge, power-devouring datacenters if, actually, "it's fine, it's not that much"?
