
8organicbits · last Tuesday at 11:52 PM · 0 replies

If the engineer doesn't understand the proof system, they cannot validate that the proof describes their problem. The golden rule of LLMs is that they make mistakes, so you need to check their output.

> writing proof scripts is one of the best applications for LLMs. It doesn’t matter if they hallucinate nonsense, because the proof checker will reject any invalid proof

Nonsense. The checker only confirms that a proof is internally valid. If the AI hallucinated the proof script, what it proves may have no connection to the problem statement.
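
A minimal Lean sketch of that failure mode (hypothetical names, not from the thread): the checker accepts the proof, but the statement being checked isn't the property the engineer actually cares about.

```lean
-- The engineer wants: "mySort returns a sorted permutation of its input."
-- A hallucinated spec only claims the output has the same length.
def mySort (xs : List Nat) : List Nat := xs   -- does no sorting at all

-- This theorem checks, yet it says nothing about sortedness.
theorem mySort_length (xs : List Nat) : (mySort xs).length = xs.length := by
  rfl
```

Unless someone who understands the logic reads the theorem statement and confirms it matches the real requirement, "the proof checker accepted it" means very little.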