Hacker News

energy123 | last Thursday at 11:51 PM

It is expensive. But if I'm correct about o1, it means user mistrust of LLMs is going to be a short-lived thing as costs come down and more people use o1 (or better) models as their daily driver.


Replies

sdesol | yesterday at 12:33 AM

> mistrust of LLMs is going to be a short-lived thing as costs come down and more people use o1

I think the biggest question is: is o1 scalable? I think o1 does well because it goes back and forth hundreds if not thousands of times. Somebody in a thread I was participating in mentioned that they let o1 crunch on a problem for 10 minutes. It sounded like it saved them a lot of work, so it was well worth it.

Whether or not o1 is practical for the general public is something we will have to wait and see.
