
Turskarama · yesterday at 7:33 AM

Both the context and the prompt are just part of the same input. To the model there is no difference; the only distinction is how the user feeds that input to the model. You could, in theory, feed the entire context into the model as one huge prompt.
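A minimal sketch of the point being made: before inference, a "conversation" with separate system, context, and user parts is typically flattened into one string (and thus one token sequence). The role-tag format below is illustrative, not any specific model's real template (e.g. ChatML differs in detail), but the principle is the same.

```python
# Sketch: system prompt, retrieved context, and user prompt all end up
# as one concatenated input before the model tokenizes it. The <|role|>
# markers here are a hypothetical template, used only for illustration.

def flatten(messages):
    """Concatenate role-tagged messages into a single prompt string."""
    return "\n".join(f"<|{m['role']}|>\n{m['content']}" for m in messages)

conversation = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "context", "content": "Doc snippet retrieved for this query."},
    {"role": "user", "content": "Summarize the doc."},
]

prompt = flatten(conversation)
print(prompt)  # one flat string: the model never sees "parts", only tokens
```

From the model's perspective, the "system prompt" and the "context" are distinguished only by position and by whatever delimiter tokens the template inserts.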


Replies

__loam · yesterday at 11:37 AM

Sometimes I wonder if LLM proponents even understand their own bullshit.

It's all just tokens in the context window, right? Aren't system prompts just tokens prepended to the front of a conversation?

They're going to keep dressing this up six different ways to Sunday but it's always just going to be stochastic token prediction.
