Hacker News

javawizard · last Saturday at 8:41 PM · 1 reply

> But the other way around is not possible due to the closed nature of GPT-5.

At risk of sounding glib: have you heard of distillation?


Replies

dust42 · last Saturday at 10:27 PM

Distilling from a closed model like GPT-4 via its API is architecturally crippled.

You’re restricted to output logits only (and in practice most APIs expose just top-k logprobs per token), with no access to the attention patterns, intermediate activations, or layer-wise representations that deep knowledge transfer relies on.
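
To make that concrete, here's a minimal sketch of what black-box distillation collapses to. This is PyTorch, all tensor names are hypothetical, and it assumes the API returns top-k logprobs per position (roughly what OpenAI-style APIs expose):

```python
import torch
import torch.nn.functional as F

def blackbox_distill_loss(student_logits, teacher_topk_logprobs, teacher_topk_ids):
    """Black-box KD: the only training signal is the teacher's top-k
    logprobs per position. Illustrative shapes:
      student_logits:        [batch, seq, vocab]
      teacher_topk_logprobs: [batch, seq, k]
      teacher_topk_ids:      [batch, seq, k]
    """
    # Student log-probs gathered at the teacher's top-k token ids.
    student_logprobs = F.log_softmax(student_logits, dim=-1)
    student_at_topk = student_logprobs.gather(-1, teacher_topk_ids)
    # Renormalize the truncated teacher distribution over its top-k support.
    teacher_probs = F.softmax(teacher_topk_logprobs, dim=-1)
    # KL(teacher || student), restricted to the top-k support: everything
    # the student places outside those k tokens is simply invisible here.
    return (teacher_probs * (teacher_probs.log() - student_at_topk)).sum(-1).mean()
```

That single truncated term is the whole objective; there is nothing deeper to match against.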

Without alignment of the Q/K/V matrices or hidden-state spaces, the student model cannot learn the teacher's reasoning inductive biases, only its surface behavior, which tends to amplify hallucinations.

In contrast, open-weight teachers enable multi-level distillation: KL on logits + MSE on hidden states + attention matching.
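
A minimal sketch of what that combined objective could look like, assuming HuggingFace-style outputs (output_hidden_states=True, output_attentions=True), a chosen student-to-teacher layer_map, matching attention head counts, and a learned proj for the hidden-size mismatch; the temperature and loss weights are illustrative:

```python
import torch
import torch.nn.functional as F

def multilevel_distill_loss(student_out, teacher_out, proj, layer_map,
                            T=2.0, w_kl=1.0, w_hid=1.0, w_attn=1.0):
    """White-box KD over three signals: KL on logits, MSE on (projected)
    hidden states, and attention matching. student_out/teacher_out are
    assumed to carry .logits, .hidden_states, .attentions; layer_map is
    a list of (student_layer, teacher_layer) index pairs; proj maps the
    student hidden size to the teacher's. All names are illustrative.
    """
    # 1) KL on temperature-softened logits, over the full vocabulary
    #    (possible because the teacher's weights are open).
    kl = F.kl_div(
        F.log_softmax(student_out.logits / T, dim=-1),
        F.softmax(teacher_out.logits / T, dim=-1),
        reduction="batchmean",
    ) * T * T

    # 2) MSE on hidden states, projecting student dims onto teacher dims.
    hid = sum(
        F.mse_loss(proj(student_out.hidden_states[s]),
                   teacher_out.hidden_states[t])
        for s, t in layer_map
    ) / len(layer_map)

    # 3) Attention-map matching on the same layer pairs (assumes the
    #    head counts line up; otherwise you'd pool or average heads).
    attn = sum(
        F.mse_loss(student_out.attentions[s], teacher_out.attentions[t])
        for s, t in layer_map
    ) / len(layer_map)

    return w_kl * kl + w_hid * hid + w_attn * attn
```

The hidden-state and attention terms are exactly the signals the closed API withholds, which is the whole point.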

Does that answer your question?