Hacker News

simonw · yesterday at 9:32 PM

I've talked to a team that's doing the dark factory pattern hinted at here. It was fascinating. The key characteristics:

- Nobody reviews AI-produced code, ever. They don't even look at it.

- The goal of the system is to prove that the system works. A huge amount of the coding agent work goes into testing and tooling and simulating related systems and running demos - roughly the loop sketched below.

- The role of the humans is to design that system - to find new patterns that can help the agents work more effectively and demonstrate that the software they are building is robust and effective.
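
Very roughly, the shape of that loop - a minimal sketch in Python, not their actual stack; the agent hook and the specific check commands and script names are placeholders I made up:

    # Sketch of a verification-gated pipeline: an agent edits the working
    # tree, and the change is accepted only if the automated suite (tests,
    # simulated dependencies, scripted demos) passes. No human reads the diff.
    # The agent hook and the check commands below are placeholders.
    import subprocess
    import sys

    def run_agent(task: str) -> None:
        # Placeholder: call whatever coding agent you use to edit the repo.
        print(f"(stub) agent working on: {task}")

    def verification_passes() -> bool:
        # The "prove the system works" half: tests, simulators, a demo run.
        # Script names here are hypothetical.
        checks = [
            ["pytest", "-q"],
            ["python", "simulate_related_systems.py"],
            ["python", "run_demo.py", "--headless"],
        ]
        return all(subprocess.run(cmd).returncode == 0 for cmd in checks)

    def main() -> int:
        run_agent("implement the next backlog item")
        if not verification_passes():
            print("verification failed: change rejected, nobody reads it")
            return 1
        subprocess.run(["git", "commit", "-am", "agent change (auto-accepted)"], check=True)
        return 0

    if __name__ == "__main__":
        sys.exit(main())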

It was a tiny team and the stuff they had built in just a few months looked very convincing to me. Some of them had 20+ years of experience as software developers working on systems with high reliability requirements, so they were not approaching this from a naive perspective.

I'm hoping they come out of stealth soon because I can't really share more details than this.


Replies

observationist · yesterday at 9:45 PM

You'd think at some point it'll be enough to tell the AI "ok, now do a thorough security audit, highlight all the potential issues, come up with a best practices design document, and fix all the vulnerabilities and bugs. Repeat until the codebase is secure and meets all the requisite protocol standards and industry best practices."
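
Mechanically that loop isn't hard to write down - something like this sketch, where the scanners (bandit, pip-audit) are real Python tools but the agent hook is a placeholder; the hard part is the agent actually fixing things, not the loop:

    # Sketch of the "repeat until secure" loop: run scanners, hand the
    # findings back to the agent, stop when they come back clean.
    import subprocess

    MAX_ROUNDS = 10  # cap so a stubborn finding can't spin forever

    def scan() -> list[str]:
        # bandit (static analysis) and pip-audit (dependency CVEs) both
        # exit non-zero when they find something.
        findings = []
        for cmd in (["bandit", "-r", ".", "-q"], ["pip-audit"]):
            result = subprocess.run(cmd, capture_output=True, text=True)
            if result.returncode != 0:
                findings.append(f"{cmd[0]}:\n{result.stdout}")
        return findings

    def ask_agent_to_fix(findings: list[str]) -> None:
        # Placeholder: feed the findings back to the coding agent as a task.
        print(f"(stub) asking agent to fix {len(findings)} finding group(s)")

    for round_no in range(MAX_ROUNDS):
        findings = scan()
        if not findings:
            print("scanners report clean")
            break
        ask_agent_to_fix(findings)
    else:
        print("still dirty after MAX_ROUNDS - a human has to look after all")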

We're not there yet, but at some point, AI is gonna be able to blitz through things like that the way it blitzes through making haikus or rewriting news articles. At some point AI will just be reliably competent.

Definitely not there yet. The dark factory pattern is terrifying, lol.
