Hacker News

pc86 · last Thursday at 5:21 PM

This only makes sense if the rate of LLM hallucinations is much higher than the rate at which things written online are flat-out wrong (it definitely isn't).