
alexgotoi · last Thursday at 7:02 PM

The pattern here feels pretty old: every time something shows up that lets people go much faster, we use it to crash harder first. When cars showed up, people didn’t suddenly become more careful because they could now move at 50 km/h instead of 5 – they just plowed into things faster until seatbelts, traffic rules, and driver training caught up.

LLMs in coding feel similar. They don’t magically remove the need for tests, specs, and review; they just compress the time between “idea” and “running code” so much that all the missing process shows up as outages instead of slow PRs. The risk isn’t “AI writes code” – it’s orgs refusing to slow down long enough to build the equivalent of traffic lights and driver’s ed around it.