There's a nearly foolproof solution: manually verify every submission.
You can use automated systems as a first line of defense against spam, and then hire people to manually verify every submission that makes it through. You can even use that as an opportunity to enforce a certain quality bar, even for submissions made by real people.
Any legitimate submissions that get caught in the initial spam filter can go through a manual appeal process (perhaps by emailing to plead their case, which lands in a queue for manual review).
Sure, it's not necessarily easy, and submissions may take some time to appear on the site, but there would be essentially zero spam and low-quality content.
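The pipeline described above is basically a routing function: automated filter in front, human queues behind it. A minimal sketch (the spam heuristic and queue names are hypothetical placeholders, not anything from the article):

```python
# Sketch of the proposed moderation pipeline: an automated spam filter
# as the first line of defense, with every surviving submission routed
# to a human review queue, and filtered-out submissions routed to a
# manually reviewed appeal queue.

def looks_like_spam(submission: str) -> bool:
    # Placeholder heuristic; a real filter would use far more signal
    # (account age, link reputation, ML classifiers, etc.).
    spam_markers = ("free money", "click here", "limited offer")
    text = submission.lower()
    return any(marker in text for marker in spam_markers)

def route_submission(submission: str) -> str:
    """Decide which queue a new submission goes to."""
    if looks_like_spam(submission):
        # Caught by the filter. The author can appeal by email,
        # which lands in a separate manually reviewed queue.
        return "appeal_queue"
    # Passed the filter, but still held for human review
    # before it appears on the site.
    return "human_review_queue"

print(route_submission("Announcing our new open-source profiler"))
print(route_submission("FREE MONEY, click here now"))
```

Nothing is ever published automatically in this sketch; both branches end in a human-reviewed queue, which is the whole point of the proposal.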
The article never talked about bot-generated products, only bot-generated comments and upvotes. How does manual review address this, exactly?
The bots are commenting and voting, not submitting products.
> You can use automated systems as a first line of defense against spam, and then hire people to manually verify every submission that makes it through. You can even use that as an opportunity to enforce a certain quality bar, even for submissions made by real people.
The problem is, once you do manual upfront moderation, you lose a lot of the legal protections that UGC-hosting sites enjoy: manual approval means you are accepting liability for anything that gets published.