Hacker News

azurewraith today at 8:36 PM

the state engine is the part that can't hallucinate. even with simple steps/prompting, the review model can miss things... it's still an LLM making a judgement call at the end of the day.

the state engine doesn't judge, it enforces... with code and not transformers ^_^

if a tool (or any other guardrail) isn't valid at a given state, the tool call gets rejected before the model ever sees a result. that's the gap between "a model said this is okay" vs. "the system structurally prevents this"
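a minimal sketch of that gating idea in plain code, assuming a simple per-state allow-list (the state names and tool names here are hypothetical, not from any real system):

```python
from enum import Enum, auto

class State(Enum):
    PLANNING = auto()
    EDITING = auto()
    REVIEW = auto()

# Hypothetical allow-list: which tools are valid in each state.
ALLOWED_TOOLS = {
    State.PLANNING: {"read_file", "search"},
    State.EDITING: {"read_file", "write_file"},
    State.REVIEW: {"read_file", "run_tests"},
}

def gate_tool_call(state: State, tool: str) -> bool:
    """Return True only if `tool` is valid in the current state.

    Plain code, no model judgement: an invalid call is rejected
    before any result is ever returned to the model.
    """
    return tool in ALLOWED_TOOLS[state]

# A write attempted during review is structurally rejected;
# the same write during editing passes the gate.
print(gate_tool_call(State.REVIEW, "write_file"))   # False
print(gate_tool_call(State.EDITING, "write_file"))  # True
```

the point being: the rejection is a set-membership check, so there's nothing for a model to get wrong at enforcement time.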


Replies

esafak today at 8:59 PM

I don't understand. Let's say my state is whether we are in conformance with repo patterns. Walk me through how you don't/can't hallucinate, given that you need an LLM to determine the state. For state variables that don't need LLMs, you can simply use tests and commit hooks, no?
