I had an experience yesterday launching on Show HN that really threw me. The product triggered people's "privacy sense" immediately.
My first reaction was defensive. I took it personally. I thought: Do you really think I’m a scammer? I pour my soul into meticulously crafting products to delight users, not to trick them. Why would I trash all that effort and disrespect my own goals by doing something as stupid as stealing data? It felt insulting that people assumed malice when I was just trying to build something useful.
But after sitting with it, I realized those initial comments—the ones I wanted to dismiss as paranoia—were actually right. Not about me, but about the environment we operate in.
There are enough shady companies, data brokers, and bad actors out there who abuse user trust with impunity. We’ve all seen big corporations bury invasive tracking in their terms of service. As a builder, I don't operate in that world; I’m just focused on making things work. But for users, that betrayal is their baseline reality. They have been trained to expect the worst.
I realized I hadn’t factored that into the launch. I didn’t explicitly state "Your data remains yours" because to me, it was obvious. Why would I want your data? But in an industry that has systematically mined, stolen, and abused user boundaries for a decade, you can’t blame people for checking for the exits. They aren't being "ninnies"; they are being wise.
If I were using a new tool that had access to my workflow, I would want explicit assurance that my IP wasn't being siphoned off. I just forgot to view my own product through the lens of a weary stranger rather than the optimistic builder who wrote the code.
This is especially true now because the landscape has changed. There was an old PG essay about how ideas are cheap and execution is everything. That's shifting. AI has made execution cheap. That means ideas are at a premium again.
Because execution is distributed and fast, first-mover advantage, brand, and reputation matter more than ever. Your prompts and your workflow are your IP.
So, privacy isn't just a compliance box; it's a competitive requirement. I don't think we need full-NSA-level paranoia for every tool, but we do need to recognize the environment we are launching into. The "security purists" were right to push back: I didn't think about that aspect enough, and in 2025, trust is the only currency that matters.
Execution still matters. Whether AI is in the background or not, how the idea is implemented and delivered matters just as much as it always did. People use the term "AI slop" because someone thought execution didn't matter and AI could do it all for them… and it was terrible. This goes back to what you said about trying to delight the user: that is execution, AI or not.
As for the rest, I think this line is key:
> But I was thinking more like a private user who already trusts what I use by default (because I built it)
Of course you trust what you wrote, but do you trust what everyone else writes? Put yourself in the shoes of your potential customers. Most people don't know you, your values, or your intent; all they have to go by is what you tell them. Also remember: people lie. So don't just tell them, prove it.
And to your point about AI: it has never been easier for someone to build something that seems crafted with care, purely to steal data. Your takeaway was that execution is cheap, but maybe the takeaway should be that data harvesting is now cheap, and that data is more valuable than ever, so people are right to be wary of anything that accesses their data.
2026! Blasted AI copy edit to reduce spiciness also took it back in time