Hacker News

the_arun · yesterday at 10:46 PM

[flagged]


Replies

ploum · yesterday at 10:52 PM

Well it seems that, these days, instead of SUM(expense1, expense2) you ask an LLM to "make an app that will compute the total of multiple expenses".

If I read most of the news on this very website, this is "way more efficient" and "it saves time" (and those who don't do it will lose their jobs).

Then, when it produces wrong output AND the error is obvious enough for you to notice, you blame the hardware.

bri3d · yesterday at 10:50 PM

Somewhere along the line, the tensor math that runs an LLM diverged from every other Apple device. My guess is that there's some kind of accumulation issue here (remembering that floating-point addition is not associative, so accumulation order matters), but it seems genuinely broken in an unexpected way, given that Apple's own LLM also doesn't seem to work on this device.
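A minimal Python sketch of the accumulation issue mentioned above: floating-point addition is commutative, but it is not associative, so summing the same values in a different order can produce different results. The values here are illustrative, not from the device in question.

```python
import math

# 1.0 is far below the spacing between representable doubles near 1e20,
# so it vanishes when added to 1e20 first.
vals = [1e20, 1.0, -1e20]

left_to_right = (vals[0] + vals[1]) + vals[2]  # 1.0 absorbed, then cancel -> 0.0
reordered = (vals[0] + vals[2]) + vals[1]      # cancel first, then add 1.0 -> 1.0

print(left_to_right)        # 0.0
print(reordered)            # 1.0
print(math.fsum(vals))      # 1.0 (exact compensated summation)
```

A reduction parallelized differently on a new chip (different tile sizes, different accumulation tree) can therefore legitimately produce different numerical output from the same tensors, which is one plausible way hardware-dependent divergence creeps in.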

lxgr · yesterday at 11:02 PM

LLMs are applied math, so… both?