Hacker News

vlovich123 · yesterday at 10:28 PM

Copying SOTA models, though, is super cheap and doesn't take long.
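
For context, "copying" here usually means distillation: training a student model to match a teacher's output distribution. A minimal sketch of the soft-label distillation loss, in plain Python with illustrative values (the logits and temperature are made up, not from any real model):

```python
import math

# Minimal distillation sketch: the student is trained to match the
# teacher's temperature-softened output distribution.

def softmax(logits, temperature=1.0):
    z = [x / temperature for x in logits]
    m = max(z)  # subtract max for numerical stability
    e = [math.exp(x - m) for x in z]
    total = sum(e)
    return [x / total for x in e]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL(teacher || student) between temperature-softened distributions."""
    t = softmax(teacher_logits, temperature)
    s = softmax(student_logits, temperature)
    return sum(p * (math.log(p) - math.log(q)) for p, q in zip(t, s))

teacher = [4.0, 1.0, 0.5]  # hypothetical teacher logits for one token
student = [3.5, 1.2, 0.4]  # student logits for the same prompt
loss = distillation_loss(teacher, student)
# The loss approaches 0 as the student's distribution matches the teacher's.
```

The cheapness argument follows from this: you only need the teacher's outputs (or even just its sampled text), not its weights or training data.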


Replies

aurareturn · yesterday at 10:43 PM

How do you distill when OpenAI and Anthropic inevitably move to tasks running in the cloud? E.g., "Go buy this extremely hard-to-get concert ticket for me."

Distilling might only be effective in the chatbot-dominant era. We are about to move to an agent era.

Furthermore, I’m guessing distilling will get harder and harder. The Claude Code leak already shows some primitive anti-distillation methods. There’s research showing that models know when they’re being benchmarked. Who’s to say Anthropic and OpenAI aren’t able to detect when their models are being distilled?