I couldn't agree more, and not just on HN but the world at large.
For the general populace, including many tech people who are not ML researchers, understanding how convolutional neural nets work is already tricky enough. For non-tech people, I'd hazard a guess that LLMs/generative AI are indistinguishable in complexity from "The YouTube/TikTok Algorithm".
And this lack of understanding, and in many cases lack of conscious acknowledgement of that lack of understanding, has made many "debates" sound almost like theocratic arguments: very little interest in grounding positions in facts, yet strongly held opinions.
Some are convinced we're going to get AGI in a couple of years; others think it's just a glorified text generator that cannot produce new content. And worse, there's seemingly little that changes their minds on it.
And there are self-contradictory positions held too. Just as an example: I've heard people say AI-produced stuff doesn't qualify as art (philosophically and in terms of output quality) but at the same time express deep concern about how tech companies will replace artists...
> Just as an example: I've heard people say AI-produced stuff doesn't qualify as art (philosophically and in terms of output quality) but at the same time express deep concern about how tech companies will replace artists...
I don't think this is self-contradictory at all.
One may believe that human-produced art has a meaning that cannot -- and shouldn't -- be replaced by AI, and at the same time believe that companies will cut costs and replace artists with AI regardless of any philosophical debates. For example, studio execs and producers are already leveraging AI as a tool to put movie industry professionals (writers, and possibly actors in the future) "in their place"; it's a power move for them, for instance against strikes.