Hacker News

rtkwe · 01/21/2025

As we build and understand them now, there are pretty good structural reasons to believe that LLMs cannot be tweaked or tuned to be more than incidentally truthful.


Replies

verdverm · 01/21/2025

If you use an LLM-only solution, sure, but there are many more opportunities if you expand the system to include other components.

We could, for example, run existing copyright-detection tools on the output and refuse to send flagged responses to the client side. From that perspective it's just another moderation check.
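A minimal sketch of that post-generation gating idea, assuming hypothetical `generate_reply` and `copyright_score` helpers standing in for the LLM call and an existing copyright detector (neither is a real API):

```python
# Sketch of the moderation-check idea: generate, score, and only then
# decide whether the output ever reaches the client.

COPYRIGHT_THRESHOLD = 0.8  # assumed cutoff; would depend on the detector used


def generate_reply(prompt: str) -> str:
    """Placeholder for whatever LLM backend the system uses."""
    raise NotImplementedError


def copyright_score(text: str) -> float:
    """Placeholder for an existing copyright-detection tool, returning 0.0-1.0."""
    raise NotImplementedError


def moderated_response(prompt: str) -> str:
    """Run the LLM, then gate its output before sending it client-side."""
    draft = generate_reply(prompt)
    if copyright_score(draft) >= COPYRIGHT_THRESHOLD:
        # Refuse to send flagged output, as the comment suggests.
        return "Response withheld: possible copyrighted content."
    return draft
```

The point is only that the check sits outside the model itself, so it can be swapped or stacked like any other moderation step.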
