> Concepts like
> finite context windows
like a human has.
> or the fact that the model is "frozen" and stateless,
much like a human adult. Models are updated less frequently than humans learn, but AI systems can fetch new information and store it for use as context.
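A rough sketch of what I mean (all names here are hypothetical, not any vendor's API): the model itself stays frozen and stateless, but the system around it can persist notes and re-inject them into each prompt.

```python
# Minimal sketch: a stateless model "remembers" only because the surrounding
# system stores notes and prepends them to every prompt. All names hypothetical.

import json
from pathlib import Path

MEMORY_FILE = Path("memory.json")  # hypothetical persistent store

def load_memory() -> list[str]:
    """Read previously stored notes, if any."""
    if MEMORY_FILE.exists():
        return json.loads(MEMORY_FILE.read_text())
    return []

def save_memory(notes: list[str]) -> None:
    """Persist notes so a later, unrelated model call can see them."""
    MEMORY_FILE.write_text(json.dumps(notes))

def call_model(prompt: str) -> str:
    """Stand-in for an LLM call; the real model sees only this prompt text."""
    return f"(model reply to: {prompt[:60]}...)"

def chat(user_message: str) -> str:
    notes = load_memory()
    # The model has no memory of past turns; "memory" is just text we
    # prepend to the prompt on every call.
    prompt = "Known facts:\n" + "\n".join(notes) + "\n\nUser: " + user_message
    reply = call_model(prompt)
    # Store anything worth keeping for later turns (here: the raw message).
    notes.append(user_message)
    save_memory(notes)
    return reply

if __name__ == "__main__":
    print(chat("I had soup for lunch."))
    print(chat("What did I have for lunch?"))  # the fact is re-injected, not remembered
```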
> or the idea that you can transfer conversations between models are trivial
because computer state is better organized than human memory.
> much like a human adult.
It doesn't sound like you really understand what these statements mean. If LLMs are like any humans, it's those with late-stage dementia, not healthy adults.
> much like a human adult.
I do hope you're able to remember what you had for lunch without incessantly repeating it to keep it in your context window.