Hacker News

cannedbread · last Wednesday at 11:24 PM

When I upload my drawing set, how often should I expect it to hallucinate? And how much of the real stuff does it flag?


Replies

aakashprasad91 · last Wednesday at 11:46 PM

Hallucinations still happen occasionally, but we bias heavily toward high-confidence findings so noise stays low. On typical projects we surface a few hundred coordination issues that are real, observable conflicts across sheets rather than speculative checks. We're actively improving precision by learning from every false positive customers flag. We also show you the underlying drawings, specs, and other documents so you can verify each finding yourself rather than just trust the AI.

shuangly · last Thursday at 12:03 AM

We do extensive preprocessing to ensure the AI receives accurate context, data, and documents for review, and we're continuously refining this, so accuracy keeps improving every day. Right now accuracy isn't fully stable across projects, but we've seen findings with >90% accuracy.