
theletterf yesterday at 6:13 PM

Prove it.


Replies

threethirtytwo yesterday at 10:47 PM

I can do more. I told the AI to make it "better":

   The decision to stop hiring technical writers usually feels reasonable at the moment it’s made. It does not feel reckless. It feels modern. Words have become cheap, and documentation looks like words. Faced with new tools that can produce fluent text on demand, it is easy to conclude that documentation is finally solved, or at least solved well enough.

   That conclusion rests on a misunderstanding so basic it’s hard to see once you’ve stepped over it.

   Documentation is not writing. Writing is what remains after something more difficult has already happened. Documentation is the act of deciding what a system actually is, where it breaks, and what a user is allowed to rely on. It is not about describing software at its best, but about constraining the damage it can do at its worst.

   This is why generated documentation feels impressive and unsatisfying at the same time. It speaks with confidence, but never with caution. It fills gaps that should remain visible. It smooths over uncertainty instead of marking it. The result reads well and fails quietly.

   Technical writers exist to make that failure loud early rather than silent later. Their job is not to explain what engineers already know, but to notice what engineers have stopped seeing. They sit at the fault line between intention and behavior, between what the system was designed to do and what it actually does once released into the world. They ask the kinds of questions that slow teams down and prevent larger failures later.

   When that role disappears, nothing dramatic happens. The documentation still exists. In fact, it often looks better than before. But it slowly detaches from reality. Examples become promises. Workarounds become features. Caveats evaporate. Not because anyone chose to remove them, but because no one was responsible for keeping them.

   What replaces responsibility is process. Prompts are refined. Review checklists are added. Output is skimmed rather than owned. And because the text sounds finished, it stops being interrogated. Fluency becomes a substitute for truth.

   Over time, this produces something more dangerous than bad documentation: believable documentation. The kind that invites trust without earning it. The kind that teaches users how the system ought to work, not how it actually does. By the time the mismatch surfaces, it no longer looks like a documentation problem. It looks like a user problem. Or a support problem. Or a legal problem.

   There is a deeper irony here. The organizations that rely most heavily on AI are also the ones that depend most on high-quality documentation. Retrieval pipelines, curated knowledge bases, semantic structure, instruction hierarchies: these systems do not replace technical writing. They consume it. When writers are removed, the context degrades, and the AI built on top of it begins to hallucinate with confidence. This failure is often blamed on the model, but it is really a failure of stewardship.

   Responsibility, meanwhile, does not dissolve. When documentation causes harm, the model will not answer for it. The process will not stand trial. Someone will be asked why no one caught it. At that point, “the AI wrote it” will sound less like innovation and more like abdication.

   Documentation has always been where software becomes accountable. Interfaces can imply. Marketing can persuade. Documentation must commit. It must say what happens when things go wrong, not just when they go right. That commitment requires judgment, and judgment requires the ability to care about consequences.

   This is why the future that works is not one where technical writers are replaced, but one where they are amplified. AI removes the mechanical cost of drafting. It does not remove the need for someone to decide what should be said, what must be warned, and what should remain uncertain. When writers are given tools instead of ultimatums, they move faster not because they write more, but because they spend their time where it matters: deciding what users are allowed to trust.

   Technical writers are not a luxury. They are the last line of defense between a system and the stories it tells about itself. Without them, products do not fall silent. They speak freely, confidently, and incorrectly.

   Language is now abundant.
   Truth is not.

   That difference still matters.