
chpatrick · last Sunday at 12:24 PM

By that definition, n-gram Markov chain text generators also include previous state, because you always feed in the last n tokens. :) It's exactly the same situation as LLMs, just with a higher, but still fixed, n.
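A minimal sketch of the kind of generator described here, assuming whitespace tokenization; the train/generate names and the toy corpus are illustrative, not from the thread:

    # n-gram Markov chain text generator: the only "state" it ever sees
    # is the last n-1 tokens, a fixed-size window.
    import random
    from collections import defaultdict

    def train(tokens, n=2):
        # Map each (n-1)-token history to the tokens observed after it.
        model = defaultdict(list)
        for i in range(len(tokens) - n + 1):
            history = tuple(tokens[i:i + n - 1])
            model[history].append(tokens[i + n - 1])
        return model

    def generate(model, seed, length=20, n=2):
        out = list(seed)
        for _ in range(length):
            history = tuple(out[-(n - 1):])   # fixed window of history
            choices = model.get(history)
            if not choices:
                break
            out.append(random.choice(choices))
        return " ".join(out)

    corpus = "the cat sat on the mat and the cat ran".split()
    model = train(corpus, n=3)
    print(generate(model, seed=("the", "cat"), length=10, n=3))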


Replies

famouswaffles · last Sunday at 2:52 PM

We've been through this. The context of an LLM is not fixed. Context windows ≠ n-gram orders.

They don't, because n-gram orders are too small and rigid to include the history in the general case.

I think srean's comment up the thread is spot on. The current situation, where "state" can be anything you want it to be, just does not make for a productive conversation.