Hacker News

HDThoreaun today at 7:58 AM

Only if training new models leads to better models. If the newly trained models are just a bit cheaper but not better, most users won't switch. Then the entrenched labs can stop training so much and focus on profitable inference.


Replies

kuschku today at 8:18 AM

If they really have 40-60% gross margins, then as training costs come down, whoever trains the new models could offer the same product at half the price.
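
A back-of-envelope sketch of that arithmetic, assuming "gross margin" here means (price − inference cost) / price and that a rival has comparable inference costs but little training spend to recoup:

```python
# Hypothetical sketch: at margin m, the incumbent's unit cost is
# price * (1 - m), so a rival pricing near cost undercuts by factor (1 - m).
def rival_price_factor(gross_margin: float) -> float:
    """Fraction of the incumbent's price a near-zero-margin rival could charge."""
    return 1 - gross_margin

for m in (0.40, 0.50, 0.60):
    print(f"{m:.0%} margin -> rival can charge ~{rival_price_factor(m):.2f}x the price")
# 40% -> 0.60x, 50% -> 0.50x, 60% -> 0.40x:
# "half the price" falls right in the middle of the 40-60% range.
```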