Hacker News

hnthrowaway0315 · today at 4:05 PM

AI can also do alignment and pull from its vast training dataset for design and "thinking" -- because 99% of the problems in this world have already been solved, multiple times, maybe not in exactly the same format, but in a very similar one.

I also see that in the future humans will adapt to AI, instead of the other way around. Why? Because it's a lot easier for humans to adapt to AI than for AI to adapt to humans. It's already happening -- why do companies ask their employees to write complete documentation for AI to consume? This is what I call "adaption".

I can also imagine that in the near future, when employment plummets, when basic income becomes universal, when governments build massive condos for social housing -- everything new will be required to adapt to AI. The roads, the buildings, everything physical will be built with ease of navigation by AI in mind. We don't need a general AI -- that is too expensive and too long-term for the capitalist class to consider. We only need a bunch of AI agents and robots coordinated in an environment that is friendly to them.


Replies

randcraw · today at 6:11 PM

Rather than coining a new word like "adaption", I'd call this acculturation. It's reshaping not only software development but natural language too -- how we read, how we write, and how we speak.

Everyone knows that AI-written slop isn't worth actually reading. So when reading mass-media content we skim each paragraph's opening phrases rather than reading it deliberately, sentence by sentence. We do the same when writing notes: dropping determiners, acronymizing common phrases, and referencing characters and scenes from popular media. Now, with the rise of voice interfaces and ever-shorter rounds of engagement, all this abbreviating will only accelerate.