Hacker News

cmiles8 · yesterday at 5:18 PM

LLMs are an amazing advancement. The tech side of things is very impressive. Credit where credit is due.

Where the current wave all falls apart is on the financials. None of that makes any sense and there’s no obvious path forward.

Folks say handwavy things like “oh they’ll just sell ads” but even a cursory analysis shows that math doesn’t add up relative to the sums of money being invested at the moment.

Tech wise I’m bullish. Business wise, AI is setting up to be a big disaster. Those that aimlessly chased the hype are heading for a world of financial pain.


Replies

swalsh · yesterday at 5:35 PM

Hard disagree. I'm in the process of deploying several AI solutions in healthcare. We have a process a nurse usually spends about an hour on, costing $40-$70 depending on whether they're offshore and a few other factors. Our AI can match it for a few dollars, often less. A nurse still reviews the output, but it takes way less time. The economics of those tokens is great. We have another solution that just finds money: $10-$30 in tokens can find hundreds of thousands of dollars. The tech isn't perfect (that's why we still have a human in the loop), but it's more than good enough to do useful work, and the use cases are valuable.
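A rough back-of-envelope sketch of the unit economics described above. Only the $40-$70 nurse cost and "a few dollars" of AI cost come from the comment; the exact AI cost midpoints and the monthly task volume are hypothetical assumptions for illustration:

```python
# Per-task costs: nurse figures quoted above; AI range is an assumed
# interpretation of "a few dollars, often less".
nurse_cost_low, nurse_cost_high = 40.0, 70.0
ai_cost_low, ai_cost_high = 2.0, 5.0  # hypothetical

# Worst-case savings: cheapest nurse vs. most expensive AI run.
savings_low = nurse_cost_low - ai_cost_high    # 35.0 per task
# Best-case savings: most expensive nurse vs. cheapest AI run.
savings_high = nurse_cost_high - ai_cost_low   # 68.0 per task

tasks_per_month = 10_000  # hypothetical volume
monthly_savings = (savings_low * tasks_per_month,
                   savings_high * tasks_per_month)
print(monthly_savings)
```

Even before counting the (reduced) nurse review time, the per-task margin is large enough that token prices could rise several-fold and the deployment would still pay for itself.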

famouswaffles · yesterday at 5:32 PM

> Folks say handwavy things like “oh they’ll just sell ads” but even a cursory analysis shows that math doesn’t add up relative to the sums of money being invested at the moment.

OK, so I think there are two things here that people conflate.

First, inference of the current state of the art is cheap now. There are no two ways about it. Statements from Google and Altman, as well as the prices third parties charge for tokens of top-tier open-source models, paint a pretty good picture. Ads would be enough to make OpenAI a profitable company selling current SOTA LLMs to consumers.

Here's the other thing that muddies the picture. Right now, OpenAI is not just trying to be 'a profitable company'. They're not just trying to stay where they are and build a regular business off it. They are trying to build and serve 'AGI', or as they define it, 'highly autonomous systems that outperform humans at most economically valuable work'. They believe that building and serving this machine to hundreds of millions would require costs orders of magnitude greater.

That pursuit is where all the 'insane' levels of money are going. They don't need hundreds of billions of dollars in data centers to stay afloat or be profitable.

If they manage to build this machine, those costs won't matter, and if things aren't working out midway, they can just drop the quest. They'll still have an insanely useful product that is already used by hundreds of millions every week, along with the margins and unit economics to actually make money off it.

blehn · yesterday at 6:13 PM

The business trajectory will be like Uber. A few big companies (Google, OpenAI) will price their AI services at a loss until consumers find it to be indispensable and competitors run out of money, then they'll steadily ramp up the pricing to the point where they're gouging consumers (and raking in profits) but still a bit cheaper or better than alternatives (humans in this case).

gdulli · yesterday at 5:38 PM

> Folks say handwavy things like “oh they’ll just sell ads” but even a cursory analysis shows that math doesn’t add up relative to the sums of money being invested at the moment.

We should factor in that messaging that's seamless and undisclosed in conversational LLM output will be a lot more valuable than what we think of as advertising today.

xnx · yesterday at 5:36 PM

Don't confuse OpenAI financials with Google financials. OpenAI could fold and Google would be fine.

kylehotchkiss · yesterday at 6:03 PM

> Tech wise I’m bullish. Business wise, AI is setting up to be a big disaster. Those that aimlessly chased the hype are heading for a world of financial pain.

I'm not going to pretend to be on the cutting edge of news here, but isn't this where on-device models become relevant? It sounds like Apple's neural engine (or whatever it's called) in the M5 has seen noteworthy performance improvements, and maybe in a few more generations we won't need these OpenAI-sized boondoggles to benefit from the tech?

ActorNightly · yesterday at 7:05 PM

> None of that makes any sense and there’s no obvious path forward.

The top-end models with their high compute requirements probably don't, but there is value in lower-end models for sure.

After all, it's the AWS approach. Most AWS services are things you could get for cheaper if you just rented an EC2 instance and set them up yourself. But because AWS offers very simple setup, companies don't mind paying for it.

hackingonempty · yesterday at 8:48 PM

Most of the researchers outside big tech only have access to a handful of consumer GPUs at best. They are under a lot of pressure to invent efficient algorithms. The cost coming down by orders of magnitude seems like a good bet.