Hacker News

picardo · today at 2:09 PM

I'm curious how the unit economics actually play out here compared to traditional search. With Google, the compute cost to serve a query is negligible, so even low-CPM ads are profitable.

With an LLM, the inference cost per query is orders of magnitude higher. Unless they have a way to command significantly higher CPMs -- perhaps by arguing that the intent signal is stronger in a conversation than in a keyword search -- it feels like a difficult margin to sustain.
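To make the margin gap concrete, here's a back-of-envelope sketch. Every number is an illustrative assumption (no published cost figures), but it shows why per-query inference cost dominates the picture:

```python
# Back-of-envelope ad margin per query.
# All figures below are illustrative assumptions, not published numbers.
SEARCH_COST = 0.0003   # assumed cost to serve one traditional search query, USD
LLM_COST = 0.01        # assumed LLM inference cost per query, USD
CPM = 30.0             # assumed ad revenue, USD per 1000 queries shown an ad

def margin_per_query(cost_per_query: float, cpm: float = CPM) -> float:
    """Ad revenue per query minus serving cost per query."""
    return cpm / 1000 - cost_per_query

def break_even_cpm(cost_per_query: float) -> float:
    """CPM at which revenue exactly covers serving cost."""
    return cost_per_query * 1000

print(f"search margin/query:    ${margin_per_query(SEARCH_COST):+.4f}")
print(f"LLM margin/query:       ${margin_per_query(LLM_COST):+.4f}")
print(f"LLM break-even CPM:     ${break_even_cpm(LLM_COST):.0f}")
```

Under these assumptions, a $30 CPM leaves a comfortable margin at search-like serving costs, but an LLM at ~$0.01/query already needs a $10 CPM just to break even, so any ad-load or conversion weakness eats the margin fast.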