Hacker News

drbig today at 8:41 AM

The most interesting part is the realization that if the LLM's input is only the output of a (human) professional, then by definition the LLM cannot mimic the process the professional applied to get from whatever input they had to that output.

In other words, an LLM can spit out a plausible "output of X", but it cannot encode the process that led X to transform their inputs into their output.


Replies

simianwords today at 9:22 AM

I don't get the point of what you're saying. I can ask it right now to explain how to solve an integral, with steps.

I can ask it to tell me how to write like a given person X right now.

Eddy_Viscosity2 today at 8:48 AM

Is it not possible for the process from input to output to be inferred by the LLM and therefore applied to new inputs to create appropriate outputs?

weird-eye-issue today at 8:51 AM

Replace "LLM" with "student" and read that again. You don't just blindly give students output; you teach them, which is what you're supposed to do with an LLM.
