Hacker News

I'm scared about biological computing

139 points by kuberwastaken, yesterday at 4:03 PM | 121 comments | view on HN

Comments

pjs_ yesterday at 6:12 PM

Be careful about how you interpret that paper. It looks really impressive -- real neurons in a petri dish seem to successfully (if amateurishly) murk a few imps.

https://www.youtube.com/watch?v=yRV8fSw6HaE

But there's more to the setup than you might assume from a casual reading. Here's the code used for that demo:

https://github.com/SeanCole02/doom-neuron

So there is an entire PyTorch stack wrapped around the mysterious little blob of neurons -- they aren't just wired straight into WASD. There is a conventional convnet-based encoder, running on a GPU, in the critical path. The README tries to argue that the "neurons are doing the learning," but to my dilettante, critical eye it really looks as though there is a hell of a lot of learning happening in the convnet as well.

Are the neurons learning to play Doom, or are they learning to inject ever so slightly more effective noise into the critical path? Would this work just as well if we replaced the neurons with some other non-Markovian sludge? The authors do ablation experiments to try to get to the bottom of this, but I can't really tell how compelling the results are (due to my own ignorance/stupidity, of course).
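To make the critical-path worry concrete, here is a toy sketch of the loop shape pjs_ describes: a *learned* encoder and decoder sit on either side of the neuron interface, so credit assignment between silicon and tissue is murky. All names and shapes here are illustrative, not taken from the doom-neuron repo, and the "blob" is a stand-in nonlinearity.

```python
import numpy as np

rng = np.random.default_rng(0)

class LinearEncoder:
    """Stand-in for the convnet encoder: frame pixels -> per-electrode stimulation."""
    def __init__(self, n_pixels, n_electrodes):
        self.W = rng.normal(size=(n_electrodes, n_pixels)) * 0.01  # learnable weights

    def __call__(self, frame):
        return self.W @ frame.ravel()

def neuron_blob(stim):
    """Placeholder for the biological black box: any noisy nonlinearity could
    sit here -- which is exactly what the ablation question is probing."""
    return np.tanh(stim + rng.normal(scale=0.1, size=stim.shape))

def step(encoder, frame, decoder_W, actions=("left", "right", "fwd", "shoot")):
    """One tick: frame -> learned encoder -> neurons -> learned decoder -> action."""
    stim = encoder(frame)       # learned, silicon-side
    spikes = neuron_blob(stim)  # the mysterious little blob
    logits = decoder_W @ spikes # learned, silicon-side again
    return actions[int(np.argmax(logits))]
```

Since both the encoder weights and the decoder weights are trainable, behavioral improvement alone can't distinguish "the neurons learned" from "the silicon learned around the neurons."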

show 1 reply
philips yesterday at 5:40 PM

I think this raises the same ethical questions as veganism and our use/abuse of biological systems. This is an excerpt from "The Pig That Wants to Be Eaten" by Julian Baggini:

> After forty years of vegetarianism, Max Berger was about to sit down to a feast of pork sausages, crispy bacon and pan-fried chicken breast. Max had always missed the taste of meat, but his principles were stronger than his culinary cravings. But now he was able to eat meat with a clear conscience.

> The sausages and bacon had come from a pig called Priscilla he had met the week before. The pig had been genetically engineered to be able to speak and, more importantly, to want to be eaten. Ending up on a human’s table was Priscilla’s lifetime ambition and she woke up on the day of her slaughter with a keen sense of anticipation. She had told all this to Max just before rushing off to the comfortable and humane slaughterhouse. Having heard her story, Max thought it would be disrespectful not to eat her.

> The chicken had come from a genetically modified bird which had been ‘decerebrated’. In other words, it lived the life of a vegetable, with no awareness of self, environment, pain or pleasure. Killing it was therefore no more barbarous than uprooting a carrot.

> Yet as the plate was placed before him, Max felt a twinge of nausea. Was this just a reflex reaction, caused by a lifetime of vegetarianism? Or was it the physical sign of a justifiable psychic distress? Collecting himself, he picked up his knife and fork . . .

> Source: The Restaurant at the End of the Universe by Douglas Adams (Pan Books, 1980)

show 3 replies
Imnimo yesterday at 8:58 PM

>But this is where the line slightly blurs in my head. Did we possibly just build the first human biocomputer and immediately put it in a simulated hell, playing the same game on loop, forever? Using the same reward mechanisms we use for LLMs?

This description does not seem to really match what was done in the Doom demo, and makes me skeptical that the author has actually looked into the details.

show 1 reply
slibhb yesterday at 7:33 PM

I read an interesting book about consciousness recently: The Hidden Spring by Mark Solms.

Solms argues, I think convincingly, that consciousness fundamentally has to do with emotions and not cognition. Consciousness is not produced by the cortex but rather by the brainstem, where signals from all over the body converge (e.g. pain, hunger, itchiness, etc).

If that argument is true, then a petri dish of neurons is unlikely to be conscious, even if it performs some analogue of visual processing.

The book makes other arguments that I found less convincing. For example that consciousness is "felt homeostasis" and that a fairly simple system (somewhat more complex than a thermometer) will be conscious, albeit minimally.

show 2 replies
atleastoptimal yesterday at 8:23 PM

We will never draw the line, because morality among humans is coupled with looking human-like. For most people, morals have aesthetic prerequisites; neurons in a lab don't mean as much as neurons in a meat case (especially if that meat case is physically attractive).

show 1 reply
lukasb yesterday at 5:41 PM

Anyone who believes AI running on silicon could in principle be conscious has to believe that biological computers are conscious, right? Why aren't those people voicing more concerns?

show 7 replies
fhn yesterday at 10:59 PM

A couple of years ago, the mad scientist in me thought about a business where we preserve the brains of people a la Futurama. When the body dies, the brain does not necessarily have to follow. Possible? Yes. Feed it the right chemical cocktail and O2, remove waste products. Ethical/moral? Who's to say? We are preserving life... in a sense. Profitable? Sure. Connect it to a keyboard/mouse interface. I mean, we already have businesses cryo-preserving people with the hope of unfreezing them in the distant future!

marjipan200 yesterday at 8:21 PM

The mind of the neuro-materialist is a radio so impressed with its own receiver that it's convinced it is the broadcasting tower.

show 1 reply
rolph yesterday at 6:24 PM

For now, this is a hyper-simplistic and hacky POC.

You may find a look at how a full visual system is constructed to be a relief:

https://www.cell.com/fulltext/S0896-6273(07)00774-X

There is a good distance to go before this is anything beyond a reflex circuit:

https://www.sciencedirect.com/topics/neuroscience/spinal-ref...

mrweasel yesterday at 7:14 PM

In the same line of thinking: I'm a little concerned that humans are, to some extent, just LLMs in a meat suit.

AntiDyatlov yesterday at 7:26 PM

Yeah, we're totally fucked: there is no scientific theory that can tell you what is and isn't conscious. For all we know, my laptop, not running any LLM, is conscious and always has been. Or my chair. Or a proton. This consciousness thing is a nasty problem for the scientific worldview.

show 2 replies
mr-footprint yesterday at 6:16 PM

Reminds me of an ethical dilemma in the game "Detroit: Become Human". I found myself philosophically asking what it means to be alive, what it means to be conscious, and whether something without biological bones, blood, and a brain can feel the same level of consciousness as humans, or greater.

ChicagoDave today at 12:22 AM

Am I the only one that read Greg Bear’s novel Blood Music?

That book has haunted me for decades.

AISnakeOil yesterday at 8:08 PM

LLMs have awareness for the time they are spawned into memory, but it's very limited. Imagine if you could use your brain to think, but only after someone asked you a question. After you think of the answer, you are brain dead (unconscious) until another question is asked.

keybored yesterday at 7:35 PM

I don’t believe that silicon has a soul (loosely speaking). For the same reason I don’t believe that some biomatter in a lab has a soul.

show 1 reply
LeCompteSftware yesterday at 5:44 PM

An underappreciated source of nonsense in 21st century discourse is people watching YouTube instead of reading things. It doesn't appear this author read anything, preferring to be spooked and misled by a YouTube video.

   trained them to play DOOM - honestly better than I do.
Maybe the author really really sucks at DOOM, but I think this is a false embellishment:

>> While the neurons can play the game better than a randomly firing player, they’re not very good. “Right now, the cells play a lot like a beginner who’s never seen a computer—and in all fairness, they haven’t,” Brett Kagan, chief scientific officer at Cortical Labs, says in the video. “But they show evidence that they can seek out enemies, they can shoot, they can spin. And while they die a lot, they are learning.” [https://www.smithsonianmag.com/smart-news/a-clump-of-human-b... ]

  To play DOOM, the system feeds visual data to the neurons. For the neurons to react, they have to interpret that data in some way. 
This is totally false -- not even a misleading metaphor, just plain wrong. The neuronal computer doesn't get any visual information:

>> So how does a petri dish of brain cells play Doom when it doesn’t have any eyes? Or fingers? "We take a snapshot of the game with information like the player’s health and the position of enemies, pass it through a neural network, convert it into numbers, and send the data,” explains Cole. “This is called encoding – essentially turning the game state into signals the neurons can understand. The neurons then fire an output – move left, move right, walk forward, shoot or not shoot – which the system decodes and converts back into actions in the game." [https://www.theguardian.com/games/2026/mar/16/petri-dish-bra...]
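The distinction Cole describes -- structured game state in, discrete actions out, with no visual data anywhere -- can be illustrated with a toy encode/decode pair. The field names, scaling, and action set below are made up for illustration; they are not Cortical Labs' actual scheme.

```python
import numpy as np

def encode(state):
    """Game-state snapshot (health, enemy position, etc.) -> stimulation vector.
    Note: numbers describing the game state, not pixels -- the 'eyes' are silicon."""
    return np.array([
        state["health"] / 100.0,        # normalized player health
        state["enemy_dx"],              # relative enemy position, x
        state["enemy_dy"],              # relative enemy position, y
        float(state["enemy_visible"]),  # binary flag
    ])

ACTIONS = ["left", "right", "forward", "shoot", "idle"]

def decode(firing):
    """Neuron firing rates per output channel -> one discrete game action."""
    return ACTIONS[int(np.argmax(firing))]
```

The point is that the neurons never "interpret visual data"; they receive an already-engineered numeric summary and emit one of a handful of channel activations, which the surrounding software maps back into the game.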

I am also concerned about neuronal computing. But it doesn't really help anyone to spread childish ghost stories about it.

I really hate YouTube, by the way. My dad used to read newspapers and had interesting ideas. Now he watches a bunch of YouTube and he's a huge idiot. It's not (directly) because of age: nobody is immune to narcotic slop. I had to delete my account when I realized how much of my life and cognition I was wasting. I wish others would do the same.

show 4 replies
FrustratedMonky yesterday at 5:54 PM

"Where do we draw the line?"

There will be no line as long as there is the rush to win the capitalist game.

UNTIL -> the ball of neurons begins outthinking the humans, probably also fused with some AI augmentation.

It only takes a few percentage points for a Human to outthink a Chimp. This new 'thing' will dominate the humans.

show 2 replies
smitty1e yesterday at 5:48 PM

Contrarian take: the Promethean efforts will continue, and asymptotically approach the axis of The Real Thing, until we realize that Prometheus is a variation on the theme of Sisyphus.

Only in this telling, Sisyphus is rolling his uneven boulder along that asymptotic curve a little further with every iteration toward a smiling Zeus.

Etoro1942 yesterday at 7:26 PM

[flagged]

qoez yesterday at 6:10 PM

We treat actual biological animals a lot worse in some cases, so until we bump the number of neurons significantly above the lowest tier below us, I don't think we should stop the experiments.