I think "hallucinate" is a good term, because when an AI completely makes up facts or APIs etc., it's not a minor slip inside an otherwise correct chain of reasoning.
It's more like a conspiracy theory: when you sample a token, you're kind of putting a gun to the LLM's head and demanding, "what you got next?"
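To make that "gun to the head" point concrete, here's a minimal sketch of temperature sampling (Python/NumPy; the toy logits are made up for illustration). The sampler always has to hand back *a* token, even when the distribution is nearly flat and the model has no real preference, so it just picks something arbitrary and commits to it:

```python
import numpy as np

def sample_token(logits, temperature=1.0, rng=None):
    """Sample one token id from raw logits via temperature-scaled softmax.

    Note: this always returns *some* token, even when the distribution
    is nearly flat (i.e. the model has no idea). There is no "I don't
    know" escape hatch at the sampling step.
    """
    rng = rng or np.random.default_rng()
    scaled = np.asarray(logits, dtype=np.float64) / temperature
    scaled -= scaled.max()                       # numerical stability
    probs = np.exp(scaled) / np.exp(scaled).sum()
    return rng.choice(len(probs), p=probs)

# A confident distribution vs. a nearly flat one (hypothetical logits):
confident = [8.0, 0.1, 0.1, 0.1]   # model strongly prefers token 0
clueless  = [0.1, 0.1, 0.1, 0.1]   # model has no real preference

rng = np.random.default_rng(0)
print(sample_token(confident, rng=rng))  # almost always 0
print(sample_token(clueless, rng=rng))   # still emits a token, just an arbitrary one
```

In the "clueless" case the output looks exactly as fluent as the confident one; the forced choice is invisible downstream, which is why the fabrication reads like confident fact.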