Hacker News

songodongo · yesterday at 11:58 AM

And you can easily prompt your way out of the typical LLM style. “Written in the style of Cormac McCarthy’s The Road”
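For anyone who wants to try this programmatically, here is a minimal sketch of style prompting with the OpenAI Python SDK; the model name and the prompt text are placeholders, not anything from this thread.

```python
# Minimal style-prompting sketch (model name and prompts are placeholders).
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder model name
    messages=[
        {
            "role": "system",
            "content": "Write in the style of Cormac McCarthy's The Road: "
                       "sparse punctuation, short declarative sentences, no quotation marks.",
        },
        {"role": "user", "content": "Describe a walk through an abandoned town."},
    ],
)
print(response.choices[0].message.content)
```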


Replies

capnrefsmmat · yesterday at 12:27 PM

No, that doesn't really work so well. A lot of the LLM style hallmarks are still present when you ask them to write in another style, so a good quantitative linguist can find them: https://hdsr.mitpress.mit.edu/pub/pyo0xs3k/release/2

That was with GPT-4, but my own work with other LLMs shows they have very distinctive styles even if you specifically prompt them with a chunk of human text to imitate. I think instruction-tuning on tasks like summarization predisposes them to certain grammatical structures, so their output is always more information-dense and formal than human writing.
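To give a feel for what "finding them" means, here is a rough sketch of the kind of surface statistics one might compare. The features below are generic stylometry I picked for illustration, not the ones used in the HDSR study.

```python
# Illustrative stylometric profile: average sentence length, lexical variety,
# and the rate of a few connectives that LLM prose tends to overuse.
import re
from collections import Counter

def style_profile(text: str) -> dict:
    sentences = [s for s in re.split(r"[.!?]+\s*", text) if s]
    words = re.findall(r"[A-Za-z']+", text.lower())
    counts = Counter(words)
    return {
        "avg_sentence_len": len(words) / max(len(sentences), 1),
        "type_token_ratio": len(counts) / max(len(words), 1),
        "connective_rate": sum(counts[w] for w in
                               ("moreover", "additionally", "furthermore", "overall"))
                           / max(len(words), 1),
    }

# Compare a human sample against model output imitating the same author.
human = ("He walked out in the gray light and stood and he saw for a "
         "brief moment the absolute truth of the world.")
model = ("Moreover, the desolate landscape conveys a profound sense of loss. "
         "Additionally, the sparse prose underscores the bleakness.")
print(style_profile(human))
print(style_profile(model))
```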

Der_Einzige · yesterday at 8:15 PM

This still doesn't remove all the slop. You need sampler or fine-tuning tricks for it. https://arxiv.org/abs/2510.15061
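For context, the generic sampler knobs people tune look like this in Hugging Face transformers. This is only an illustration of that class of trick, not the method from the paper linked above, and the model name is a placeholder.

```python
# Generic sampler settings (temperature, top_p, repetition_penalty) used to
# push generations away from stock LLM phrasing. Not the linked paper's method.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "meta-llama/Llama-3.1-8B-Instruct"  # placeholder model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

inputs = tokenizer("Write a paragraph about the sea.", return_tensors="pt")
outputs = model.generate(
    **inputs,
    do_sample=True,
    temperature=1.1,          # flatten the token distribution slightly
    top_p=0.95,               # nucleus sampling
    repetition_penalty=1.2,   # discourage the model's pet phrases
    max_new_tokens=200,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```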