
roughly · yesterday at 2:18 AM · 7 replies

I’m more optimistic about the possibility of beneficial AGI in general than most folks, I think, but something that caught me in the article was the recourse to mammalian sociality to (effectively) advocate for compassion as an emergent quality of intelligence.

A known phenomenon among sociologists is that, while people may be compassionate, when you collect them into a superorganism like a corporation, army, or nation, they will by and large behave and make decisions according to the moral and ideological landscape that superorganism finds itself in. Nobody rational would kill another person for no reason, but a soldier will bomb a village for the sake of their nation’s geostrategic position. Nobody would throw someone out of their home or deny another person lifesaving medicine, but as a bank officer or an insurance agent, they make a living doing these things and sleep untroubled at night. A CEO will lay off 30,000 people - an entire small city cast off into an uncaring market - with all the introspection of a Mongol chieftain subjugating a city (and probably less emotion). Humans may be compassionate, but employees, soldiers, and politicians are not, even though at a glance they’re made of the same stuff.

That’s all to say that to just wave generally in the direction of mammalian compassion and say “of course a superintelligence will be compassionate” is to abdicate our responsibility for raising our cognitive children in an environment that rewards the morals we want them to have, which is emphatically not what we’re currently doing for the collective intelligences we’ve already created.


Replies

dataflow · yesterday at 3:57 AM

> Nobody rational would kill another person for no reason, but a soldier will bomb a village for the sake of their nation’s geostrategic position.

I think you're forgetting to control for the fact that the former would be severely punished for doing so, and the latter would be severely punished for not doing so?

> Nobody would throw someone out of their home or deny another person lifesaving medicine, but as a bank officer or an insurance agent, they make a living doing these things and sleep untroubled at night.

Again, you're forgetting to control for other variables. What if you paid them equally to do the same things?

adamisom · yesterday at 2:30 AM

a CEO laying off 3% scales in absolute numbers as the company grows

should large companies, therefore (even ones that succeed largely in a clean way, simply by being better at delivering what their business niche exists for), be made to never grow too big, in order to avoid impacting very many people? keep in mind that people engage in voluntary business transactions because they want to be impacted (positively, of course, but not every impact can be positive in any real world)

what if its less efficient substitutes collectively lay off 4%, but those greater layoffs stay hidden (simply because no single employer is doing them, which would make them more obvious)?

to an extent, a larger population inevitably means that larger absolute numbers of people will be affected by...anything

chrisweekly · yesterday at 2:49 AM

Beautifully expressed.

parineum · yesterday at 2:28 AM

> Nobody would throw someone out of their home or deny another person lifesaving medicine

Individuals with rental properties and surgeons do this every day.

inkyoto · yesterday at 3:09 AM

I would argue that corporate actors (a state, an army or a corporation) are not true superorganisms but semi-autonomous, field-embedded systems that can exhibit superorganism properties, with their autonomy being conditional, relational and bounded by the institutional logics and resource structures of their respective organisational fields. As the history of humanity has shown multiple times, such semi-autonomous systems with superorganism properties have a finite lifespan and are incapable of evolving their own – or on their own – qualitatively new or distinct form of intelligence.

The principal deficiency in our discourse surrounding AGI lies in the profoundly myopic lens through which we insist upon defining it – that of human cognition. Such anthropocentric conceit renders our conceptual framework not only narrow but perilously misleading. We have, at best, a rudimentary grasp of non-human intelligences – biological or otherwise. The cognitive architectures of dolphins, cephalopods, corvids, and eusocial insects remain only partially deciphered, their faculties alien yet tantalisingly proximate. If we falter even in parsing the intelligences that share our biosphere, then our posturing over extra-terrestrial or synthetic cognition becomes little more than speculative hubris.

Should we entertain the hypothesis that intelligence – in forms unshackled from terrestrial evolution – has emerged elsewhere in the cosmos, the most sober assertion we can offer is this: such intelligence would not be us. Any attempt to project shared moral axioms, epistemologies or even perceptual priors onto it is little more than a comforting delusion. Indeed, hard science fiction – that last refuge of disciplined imagination – has long explored the unnerving proposition of encountering a cognitive order so radically alien that mutual comprehension would be impossible, and moral compatibility laughable.

One must then ponder – if the only mirror we possess is a cracked one, what image of intelligence do we truly see reflected in the machine? A familiar ghost, or merely our ignorance, automated?

goatlover · yesterday at 2:31 AM

Also, sociopaths are more capable of doing those things while pretending to be empathetic and moral, in order to gain positions of power or access to victims. We know a certain percentage of human mammals have sociopathic or narcissistic tendencies; the danger isn't just misaligned groups of humans, but the individuals who might take advantage of such groups by becoming a cult leader, warlord, or president.

watwut · yesterday at 8:34 AM

> soldier will bomb a village for the sake of their nation’s geostrategic position.

A soldier does that to please the captain, to look manly and tough to peers, or to feel powerful. Or to fulfill a duty, a moral mandate in itself. Or out of hate, because soldiers are often made to hate the enemy.

> Nobody would throw someone out of their home or deny another person lifesaving medicine

They totally would. Trump would do it for the pleasure of it. The Project 2025 authors would do it happily and see the rest of us as wusses. If you listen to right-wing rhetoric and look at the voters, many people would happily do just that.