Hacker News

staticshock (yesterday at 8:36 PM)

The eloquence with which this point gets (repeatedly) made continues to improve each time I read it. However, I still feel like we haven't nailed it. That is, we are not yet at the "aphorism" stage of the discourse (e.g. "the medium is the message", "you ship your org chart", "9 mothers can't make a baby in a month"), in which the most pointed version of this critique packs a punch in just a few words that resonate with the majority of people. That kind of epistemological chiseling takes years, if not decades. And AI certainly won't do it for us, because we don't know how to RL meaning-making.

Edit: 9 babies → 9 mothers


Replies

bla3 (yesterday at 9:42 PM)

> "can't make 9 babies in a month"

It's "9 women can't make a baby in one month".

ctvdev (yesterday at 9:33 PM)

> That is, we are not yet at the "aphorism" stage of the discourse

we learn by doing

nemomarx (today at 2:35 AM)

Isn't it the vehicle metaphor about bicycles for the mind? Not fully crystallized yet, but I feel like someone will get there.

embedding-shape (yesterday at 9:44 PM)

How about "Intelligence amplification, not artificial intelligence"?

Also could be shortened to "IA, not AI", and gets even more fun when you translate it to Spanish: "AI, no IA".

viccis (yesterday at 10:48 PM)

>the medium is the message

If you asked 100 Americans what this aphorism means, I strongly doubt a single one could capture McLuhan's original meaning.

alphabeta3r56 (today at 1:59 AM)

Taste/judgement cannot an AI beget

IceDane (yesterday at 10:19 PM)

Outsource manual labor, not your brain.

thomastjeffery (today at 12:45 AM)

Meaning is abstract. We can't express meaning: we can only signify it. An expression (sign) may contain the latent structure of meaning (the writer's intention), but that structure can only be felt through a relevant interpretation.

To maintain relevance, we must find common ground. There is no true objectivity, because every sign must be built up from an arbitrary ground. At the very least, there will be a conflict of aesthetics.

The problem with LLMs is that they avoid the ground entirely, leaving them ignorant of meaning. The only intention an LLM has is to preserve the familiarity of expression.

So yes, this kind of AI will not accomplish any epistemology; unless of course, it is truly able to facilitate a functional system of logic, and to ground that system near the user. I'm not going to hold my breath.

I think the great mistake of "good ole fashioned AI" was to build it from a perspective of objectivity. This constrains every grammar to the "context-free" category, and situates every expression on a single fixed ground. Nothing can be ambiguous: therefore nothing can express (or interpret) uncertainty or metaphor.

What we really need is to recreate software from a subjective perspective. That's what I've been working on for the last few years... So far, it's harder than I expected; but it feels so close.

xnx (yesterday at 8:44 PM)

This concept won't reach that point, because when you chisel too hard it crumbles. There are countless lower-level tasks that typical programmers no longer learn how to do. Our capacity for knowledge is not unlimited, so we offload everything we can in order to move to the next level of abstraction.
