Simple solution: run the same query against three different LLMs, each with its own search integration. If all three concur, the chance of hallucination is low.
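
Roughly, the cross-check could look like the sketch below. The provider query functions are placeholders (not any particular SDK), and the string-similarity agreement test is only a stand-in for a real comparison (extracted facts, embeddings, or an LLM judge):

```python
# Minimal sketch: ask the same question to three search-enabled LLMs and
# flag the result as suspect unless all answers roughly agree.
from difflib import SequenceMatcher
from itertools import combinations


def query_provider_a(question: str) -> str:
    # Placeholder: call your first search-enabled LLM here.
    raise NotImplementedError


def query_provider_b(question: str) -> str:
    # Placeholder: call your second search-enabled LLM here.
    raise NotImplementedError


def query_provider_c(question: str) -> str:
    # Placeholder: call your third search-enabled LLM here.
    raise NotImplementedError


def answers_concur(answers: list[str], threshold: float = 0.8) -> bool:
    """Crude agreement check: every pair of answers must be textually similar.

    String similarity is only an approximation; comparing extracted claims
    or embeddings would be more robust.
    """
    normalized = [a.strip().lower() for a in answers]
    return all(
        SequenceMatcher(None, x, y).ratio() >= threshold
        for x, y in combinations(normalized, 2)
    )


def cross_check(question: str) -> tuple[list[str], bool]:
    # Collect one answer per provider, then test pairwise agreement.
    providers = (query_provider_a, query_provider_b, query_provider_c)
    answers = [ask(question) for ask in providers]
    return answers, answers_concur(answers)
```

If the answers disagree, treat the response as unverified rather than picking one of the three.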