Hacker News

keiferski last Friday at 7:58 AM (15 replies)

We just had a realization during a demo call the other day:

The companies that are entirely AI-dependent may need to raise prices dramatically as AI prices go up. Not being dependent on LLMs for your fundamental product’s value will be a major advantage, at least in pricing.


Replies

andersmurphy last Friday at 8:19 AM

Yup. Also, regardless of price, they need to spend more and more as the project collapses under the inevitable incidental complexity of 30k lines of code a day.

It's similar to how, if you know what you're doing, you can manage a simple VPS and scale a lot more cost-effectively than with something like Vercel.

In a saturated market margins are everything. You can't necessarily afford to be giving all your margins to Anthropic and Vercel.

show 1 reply
zozbot234 last Friday at 10:31 AM

> The companies that are entirely AI-dependent may need to raise prices dramatically as AI prices go up.

It's not that clear. Sure, hardware prices are going up due to the extremely tight supply, but AI models are also improving quickly to the point where a cheap mid-level model today does what the frontier model did a year ago. For the very largest models, I think the latter effect dominates quite easily.

show 4 replies
accrual last Friday at 12:59 PM

> Not being dependent on LLMs for your fundamental product’s value

I think more specifically it's about not being dependent on someone else's LLM hardware. IMO having OSS models on dedicated hardware could still be plenty viable for many businesses, granted it'll be some time before OSS models reach the performance of today's SOTA models.

michaelbuckbee last Friday at 9:27 AM

What's weird though is the bifurcation in pricing in the market: if your app can function on a non-frontier-level AI, you can use last year's model at a fraction of the cost.

Cthulhu_ last Friday at 1:24 PM

That'll be (part of) the big market correction, but broadly speaking: as investor money dries up and said investors want to see results, many new businesses or products will realise they're not financially viable.

On a small scale that's a tragedy, but there are also plenty of analysts who predict an economic crash and recession because trillions have been invested in this technology.

muppetman last Friday at 9:46 AM

No shit. People are just figuring this out now?

This is the “Building my entire livelihood on Facebook, oh no what?” all over again.

Oh no sorry, I forgot, your laptop's LLM can draw a potato, let me invest in you.

show 2 replies
anonyfox last Friday at 11:31 AM

In fact I'm betting the opposite. Frontier models aren't getting that much better anymore, at least for common business needs, but the OSS models keep closing the gap. Which means, if trajectories hold, there will probably be a near-future moment where big-provider costs suddenly drop sharply, once the first viable local models can consistently take over normal tasks on reasonable hardware. Right now the frontier providers are probably rushing to make as much money as they possibly can before LLMs become a true commodity for the 80% of use cases outside the deep expert areas where they'll keep an edge as specialist juggernauts (i.e. a premium cybersecurity model).

So it's all a house of cards now, and the moment the bubble bursts is when local open inference has closed the gap. It looks like Chinese labs and smaller players are already going hard in this direction.

show 1 reply
michaelje last Friday at 9:37 AM

Absolutely. Pricing exposure is the quiet story under all the waves of AI hype. Build for convenience → subsidise for dependence → meter for margin is a well-worn playbook, and AI-dependent companies are about to find out what phase three feels like.

Hyperscalers are spending a fortune so we think AI = API, but renting intelligence is a business model, not a technical inevitability.

Shameless link to my post on this: https://mjeggleton.com/blog/AIs-mainframe-moment

finaard last Friday at 9:10 AM

How is that surprising? For over a year now we've designed any LLM-related tooling so that we can either drop it entirely, or switch to a self-hosted model once throwing money at hardware would pay for itself quickly.

It's just another instance of cloud dependency, and people should've learned something from that over the last two decades.
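The design finaard describes (call sites depend on an interface, and a config flag decides whether a hosted vendor or a self-hosted server actually answers) can be sketched roughly like this. All names here are hypothetical, and both backends are stubs standing in for real HTTP calls to a vendor API or to a self-hosted, OpenAI-compatible inference server:

```python
from dataclasses import dataclass
from typing import Protocol


class ChatBackend(Protocol):
    """Anything that can turn a prompt into a completion."""
    def complete(self, prompt: str) -> str: ...


@dataclass
class HostedBackend:
    """Stub for a commercial API; a real version would POST to the vendor."""
    model: str

    def complete(self, prompt: str) -> str:
        raise NotImplementedError("needs an API key and network access")


@dataclass
class LocalBackend:
    """Stub for a self-hosted, OpenAI-compatible server (e.g. vLLM, llama.cpp)."""
    base_url: str

    def complete(self, prompt: str) -> str:
        raise NotImplementedError("needs a running local inference server")


def make_backend(cfg: dict) -> ChatBackend:
    """Pick the backend from config, so swapping vendors is a config change,
    not a rewrite of every call site."""
    if cfg.get("provider") == "local":
        return LocalBackend(base_url=cfg.get("base_url", "http://localhost:8000/v1"))
    return HostedBackend(model=cfg.get("model", "some-frontier-model"))
```

Because call sites only ever see `ChatBackend`, dropping the feature or moving onto your own hardware stays a deployment decision rather than a code change.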

show 1 reply
strife25 last Friday at 10:19 AM

Marginal costs matter in this world.

onion2k last Friday at 12:46 PM

> The companies that are entirely AI-dependent may need to raise prices dramatically as AI prices go up

Or they'll price the true cost in from the start, and make massive profits until the VC subsidies end... I know which one I'd do.

show 1 reply
bjornroberg last Friday at 1:25 PM

I wonder if they won't, because the real mechanism is that AI wrappers have weak pricing power (switching costs are near zero), while state-of-the-art models make it difficult to lower prices due to their higher cost.

thih9 last Friday at 2:59 PM

Also: AI dependence could be explicit AI API usage by the product itself, but also anything else, like AI-assisted coding, AI used by humans in other surrounding workflows, etc.

show 1 reply
sevenzero last Friday at 12:11 PM

This was as clear as day when the first LLM-based businesses popped up. How are you only realizing this now?

show 2 replies
sidewndr46 last Friday at 12:21 PM

Not really. The next move is to establish standards groups requiring the use of AI in product development, a mix of industry and governmental mandates. What you are viewing as COGS instead becomes a barrier to entry.