An LLM can always output steps, but that doesn't mean they're true; LLMs are great at making up bullshit.
When the "how many 'r's in 'strawberry'" question was all the rage, you could definitely get LLMs to explain the steps of counting, too. The answer was still wrong.
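For contrast, the task itself is trivial for actual code, since it operates on characters rather than tokens. A minimal check:

```python
# Counting characters directly gives the right answer every time,
# unlike a token-based model reasoning about subword chunks.
word = "strawberry"
count = word.count("r")
print(count)  # prints 3
```

The point isn't that counting is hard; it's that a model can narrate plausible-looking counting steps while its "execution" of them is unreliable.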