Hacker News

cinntaile · last Sunday at 5:36 AM

You can prompt the LLM not to just hand you the answer. You could even ask it to consider the problem from different angles, though that may not help when you don't know what you don't know.
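
As a rough sketch of what that kind of prompting might look like with the OpenAI Python client (the model name and the exact wording of the system prompt are just placeholders, not a recommendation):

    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    response = client.chat.completions.create(
        model="gpt-4o",  # placeholder model name
        messages=[
            {
                "role": "system",
                "content": (
                    "Act as a tutor. Do not give the final answer directly. "
                    "Ask guiding questions and walk through the problem from "
                    "at least two different angles before letting me attempt it."
                ),
            },
            {"role": "user", "content": "How do I find the bug in this function?"},
        ],
    )
    print(response.choices[0].message.content)

The system prompt does the work here; the rest is boilerplate. But as noted above, it only helps with angles you'd recognize once pointed out, not with the ones you can't evaluate at all.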