Hacker News

PunchyHamster · last Tuesday at 10:54 PM

Prediction: Fuck no.

AI is unreliable as it is. It might make formal verification a bit less work-intensive, but the last place anyone wants AI hallucinations is in verification.


Replies

bkettle · last Tuesday at 11:00 PM

The whole point, I think, is that it doesn't matter: if an LLM hallucinates a proof that passes the proof checker, it's not a hallucination. Writing and inspecting the spec is the unsolved part, but for the actual proof checking, hallucinations don't matter at all.
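To make that concrete, here is a minimal sketch in Lean 4 (theorem names are illustrative). The spec is the theorem statement, which a human still has to write and inspect; the proof term could come from anywhere, including an LLM, because the kernel checks it against the statement before accepting it.

```lean
-- Spec: the statement a human writes and inspects.
theorem add_comm' (a b : Nat) : a + b = b + a :=
  -- Candidate proof term, e.g. proposed by an LLM. Lean's kernel
  -- type-checks it against the statement; any proof that passes is
  -- correct by construction, so "hallucinated" proofs cannot slip through.
  Nat.add_comm a b

-- A wrong (hallucinated) proof is simply rejected. For example,
--   theorem add_comm'' (a b : Nat) : a + b = b + a := rfl
-- fails to check, because `a + b` and `b + a` are not definitionally
-- equal when `a` and `b` are variables.
```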
