I feel like trust comes down to two factors: A) the person is willing to improve software quality, even if that means rewriting things later down the line if they were AI-generated in the first place, and B) the project is trying to be sustainable.
I don't know, but I feel like trust is the real bottleneck. I used to be happy about it, but nowadays there's a sense of distrust even within the HN community. I used to believe it was a more tight-knit community, but that's changed with all the political developments, the bots, and AI being used for comments on HN itself.
I think what's gonna happen is not just that we have to trust somebody, but that we have to trust our trust in them, if that makes sense.
We have to trust that we're trusting the right people in a world where trust feels like it's being eroded, and that's a decent uphill battle.
It's also a community thing, imo. People are more likely to trust the trust if others do too. We offload our judgement to others: if they liked it, then I'm more willing to like it too.
So if your project gets trusted by a community, it can snowball, but it needs that early momentum, which I feel like a lot of projects are never gonna reach since there's only so much snow (attention and trust, for the most part).
The biggest question is how to start the snowball effect in a reasonable way.