Hacker News

swyx 07/31/2025

do LoRAs conflict with your distillation?


Replies

sangwulee 07/31/2025

The architecture is the same, so we found that some LoRAs work out of the box, but some don't. In those cases, I would expect people to re-run their LoRA finetuning with the trainer they've used.
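A minimal sketch of why this happens (illustration only, not the model under discussion): a LoRA is a low-rank weight update W' = W + (alpha/r)·B·A trained against a specific base checkpoint. If the distilled model keeps the same architecture, the update merges cleanly into it shape-wise, but its values were optimized relative to the original base weights, so behavior on the new base is not guaranteed. All names and dimensions below are made up for the example.

```python
import numpy as np

# Hypothetical dimensions for a single linear layer.
rng = np.random.default_rng(0)
d_out, d_in, r, alpha = 64, 32, 4, 8

W_base = rng.standard_normal((d_out, d_in))       # weights the LoRA was trained against
W_distilled = rng.standard_normal((d_out, d_in))  # distilled weights, same shape
A = rng.standard_normal((r, d_in))                # LoRA down-projection
B = rng.standard_normal((d_out, r))               # LoRA up-projection
delta = (alpha / r) * B @ A                       # low-rank update W' = W + delta

# Same architecture -> the adapter merges cleanly into either checkpoint.
merged = W_distilled + delta
assert merged.shape == W_distilled.shape

# But the adapter was optimized relative to W_base; the distilled base it is
# now applied to differs, which is why some LoRAs degrade and need retraining.
base_gap = np.linalg.norm(W_base - W_distilled)
print(f"gap between the two base checkpoints: {base_gap:.1f}")
```

The shape check is what "same architecture" buys you; the nonzero gap between checkpoints is what it doesn't.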