Hacker News

neuronexmachina · today at 1:55 PM

For model cards in general, I have a suspicion that Grok's training includes a fair amount of distillation from their competitors' models. That should be disclosed in a model card, and it's likely one of the reasons they don't want to release one.


Replies

Barbing · today at 5:21 PM

Fair suspicion:

  ‘Savitt asked Musk if his artificial intelligence company, xAI, had ever “distilled” technology from OpenAI. Distillation is a way of using one A.I. technology to create another, and it is not allowed by OpenAI’s terms of service.

  “Generally A.I. companies distill other A.I. companies,” Musk answered.

  “Is that a ‘yes’?” Savitt asked. Musk answered, “Partly.”

  Distillation has become an increasingly important issue as companies like OpenAI and Anthropic have complained that Chinese companies are distilling their systems.’
https://www.nytimes.com/live/2026/04/30/technology/openai-tr...