> programs built on a simplified version of brain neural networks
Not even close. "Neural networks" in code are nothing like real neurons in real biology. "Neural networks" is a marketing term. Treating them as "doing the same thing" as real biological neurons is a huge error.
>that train on a corpus of nearly everything humans expressed in writing
It's significantly more limited than that.
>and that can pass the Turing test with flying colors, scares me
The "turing test" doesn't exist. Turing talked about a thought experiment in the very early days of "artificial minds". It is not a real experiment. The "turing test" as laypeople often refer to it is passed by IRC bots, and I don't even mean markov chain based bots. The actual concept described by Turing is more complicated than just "A human can't tell it's a robot", and has never been respected as an actual "Test" because it's so flawed and unrigorous.
>Not even close. "Neural networks" in code are nothing like real neurons in real biology
Hence the "simplified". Weights encoding learning, interconnectedness, nonlinear activation, and distributed representation of knowledge are already an approximation, even if the human architecture is different and more elaborate.
Whether the omitted parts are essential or not is debatable. "Equations of motion are nothing like real planets" either, but they capture enough to predict and model their motion.
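For concreteness, this is roughly all the "simplified version" amounts to in code: weighted sums pushed through a nonlinearity, stacked in layers, with knowledge spread across the weights rather than stored in any single unit. A toy sketch (the layer sizes and the choice of tanh are arbitrary illustration, not any particular model):

    import numpy as np

    rng = np.random.default_rng(0)

    def layer(x, w, b):
        # a "neuron layer": weighted sum of inputs, then a nonlinear activation
        return np.tanh(x @ w + b)

    # a tiny two-layer network: 3 inputs -> 4 hidden units -> 1 output
    w1, b1 = rng.normal(size=(3, 4)), np.zeros(4)
    w2, b2 = rng.normal(size=(4, 1)), np.zeros(1)

    x = np.array([0.2, -1.0, 0.5])      # some input
    hidden = layer(x, w1, b1)           # distributed representation: no single
    output = layer(hidden, w2, b2)      # unit holds the "knowledge" on its own
    print(output)

Training ("the weights encoding learning") is just nudging w1, b1, w2, b2 to reduce some error. Nothing here claims biological fidelity, only that the abstraction is useful.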
>The "turing test" doesn't exist. Turing talked about a thought experiment in the very early days of "artificial minds". It is not a real experiment.
It is not a single rigid experimental protocol, but it is a well enough defined experimental scenario that was treated for over half a century as the benchmark for recognizing artificial intelligence, and not just by laymen (lol): major figures in AI research like Minsky, McCarthy and others engaged with it.
The claim that researchers haven't done Turing-test studies (taking the setup from Turing and even calling them that) is patently false, including studies openly testing LLMs:
https://aclanthology.org/2024.naacl-long.290/
https://www.pnas.org/doi/10.1073/pnas.2313925121
https://arxiv.org/pdf/2503.23674
https://arxiv.org/pdf/2407.08853
https://arxiv.org/abs/2405.08007
https://www.sciencedirect.com/science/article/pii/S295016282...
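For anyone who hasn't read Turing's paper: the scenario is a three-party imitation game, an interrogator questioning two hidden witnesses (one human, one machine) and then saying which is which, not just "a human can't tell it's a robot". A toy sketch of that structure, with placeholder functions standing in for the real participants and the model under test (the ask_* and interrogator_guess names are mine, not from the studies above):

    import random

    def ask_human(question: str) -> str:
        return "a reply typed by a hidden human participant"   # placeholder

    def ask_machine(question: str) -> str:
        return "a reply generated by the system under test"    # placeholder

    def interrogator_guess(transcript) -> str:
        # the interrogator's verdict on which label hides the machine
        return random.choice(["A", "B"])                       # placeholder

    def run_trial(questions):
        # randomize which witness answers as "A" and which as "B"
        witnesses = {"A": ask_human, "B": ask_machine}
        if random.random() < 0.5:
            witnesses = {"A": ask_machine, "B": ask_human}

        transcript = [(q, witnesses["A"](q), witnesses["B"](q)) for q in questions]

        guess = interrogator_guess(transcript)
        truth = "A" if witnesses["A"] is ask_machine else "B"
        return guess == truth   # True = machine identified correctly

    # over many trials, a study compares the identification rate to chance (50%)

The studies above run interactive variations of exactly this setup with human interrogators.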