Hacker News

russfink · yesterday at 10:26 PM

One trick I have tried is asking the LLM to output a specification of the thing we are in the middle of building. A commenter above said humans struggle with writing good requirements; LLMs struggle with following good requirements (all of them), often forgetting important things while scrambling to address your latest concern.

Getting it to output a spec lets me correct the spec, reload the browser tab for a fresh (and faster) session, or carry the spec over to a different AI.
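A minimal sketch of that workflow, as I do it. The prompt wording and the `seed_new_session` helper are my own invention, not anything standard; the point is just that the spec becomes a portable artifact you can correct and paste into any fresh session or different model:

```python
# Hypothetical sketch of the spec-extraction trick described above.
# The prompt text and helper are illustrative, not from any library.

SPEC_PROMPT = (
    "Output a complete specification of what we are building so far: "
    "goals, requirements, constraints, and open decisions. "
    "Number every requirement so I can correct them individually."
)

def seed_new_session(spec: str) -> str:
    """Build the opening message for a fresh tab or a different AI."""
    return (
        "Here is the current specification. Continue implementing it, "
        "and do not drop any numbered requirement:\n\n" + spec
    )

# Example: a (hand-corrected) spec carried into a new session.
spec = "1. Parse the CSV input.\n2. Validate rows before insert."
print(seed_new_session(spec))
```

Numbering the requirements matters: it lets you say "requirement 4 is wrong, fix it" instead of re-explaining everything, and it makes dropped requirements obvious.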