Hacker News

chemotaxis 11/03/2025 · 12 replies

> I look forward to the "personal computing" period, with small models distributed everywhere...

One could argue that this period was just a brief fluke. Personal computers really took off only in the 1990s, web 2.0 happened in the mid-2000s. Now, for the average person, 95%+ of screen time boils down to using the computer as a dumb terminal to access centralized services "in the cloud".


Replies

wolpoli 11/04/2025

The personal computing era happened partly because, while there was demand for computing, users' connectivity to the internet was poor or limited, so they couldn't just connect to a mainframe. We now have high-speed internet access everywhere - I don't know what would drive an equivalent of the personal computing era this time.

jayd16 11/04/2025

I don't know, I think you're conflating content streaming with central compute.

Also, is percentage of screen time the relevant metric? We moved TV consumption to the PC - does that take away from the PC?

Many apps moved to the web, but that's basically just streamed code run in a local VM. Is that a dumb terminal? It's not exactly independent of local compute...

JumpCrisscross 11/04/2025

> using the computer as a dumb terminal to access centralized services "in the cloud"

Our personal devices are far from thin clients.

api 11/04/2025

There are more PCs and serious home computing setups today than there were back then. There are just way way way more casual computer users.

The people who only use phones and tablets, or only use laptops as dumb terminals, are not the people who were buying PCs in the 1980s and 1990s - or, if they were, they were not serious users. They were mostly non-computer-users.

Non-computer-users have become casual, consumer-level computer users because the tech went mainstream, but there's still a massive serious-computer-user market. I know many people with home labs or even small cloud installations in their basements - about as many as there were serious PC users with top-end setups in the late 1980s.

torginus 11/04/2025

I dislike the view of individuals as passive sufferers of the preferences of big corporations.

You can self-host - and people do - the stuff that big tech wants pushed into the cloud.

You can run a NAS or a private media player, and Home Assistant has been making waves in home automation. It turns out people don't like buying overpriced devices only to have to pay a $20 subscription, discover their devices don't talk to each other, watch footage from inside their homes get uploaded to the cloud, and then have the devices bricked once the company selling them goes under and turns off the servers.

MSFT_Edging 11/04/2025

I look forward to a possibility where the dumb terminal is less centralized in the cloud and works more like it seems to in The Expanse. Everyone has a hand terminal that automatically interacts with the systems and networks of the ship/station/building they're in, linking up with local resources, likely with default permissions set to restrict weird behavior.

Not sure it could really work like that IRL, but I haven't put a ton of thought into it. It'd make our always-online devices make a little more sense.

npilk 11/04/2025

But under a broader definition of "personal computer", the number of computers we have has only continued to skyrocket - phones, watches, cars, TVs, smart speakers, toaster ovens, kids' toys...

I'm with GP - I imagine a future when capable AI models become small and cheap enough to run locally in all kinds of contexts.

https://notes.npilk.com/ten-thousand-agents

seemaze 11/04/2025

I think that speaks more to the fact that software ate the world than to the locality of compute. It's a breadth-first, depth-last game.

positron26 11/04/2025

Makes me want to unplug and go back to offline social media. That's a joke. The dominant effect was networked applications getting developed, enabling community, not a shift back to client terminals.

WhyOhWhyQ 11/04/2025

I guess we're in the KIM-1 era of local models - or is that already done?

pksebben 11/04/2025

That 'average' is doing a lot of work to obfuscate the landscape. Open source continues to grow (indicating a robust ecosystem of individuals who use their computers for local work), and more importantly, the 'average' looks the way it does not because of a reduction in local use, but because of an explosion of users who did not previously exist (mobile-first users, SaaS customers, etc.).

The thing we do need to be careful about is regulatory capture. We could very well end up with nothing but monolithic centralized systems simply because it's made illegal to distribute, use, and share open models. They hinted quite strongly that they wanted to do this with DeepSeek.

There may even be a case to be made that at some point in the future, small local models will outperform monoliths - if distributed training becomes cheap enough, or if we find an alternative to backprop that allows models to learn as they infer (such as a more developed forward-forward algorithm), we may see models that do better simply because they aren't a large centralized organism behind a walled garden. I'll grant that this is a fairly Pollyanna take and represents the best possible outcome, but it's not outlandishly fantastic - and there is good reason to believe that any system based on a robust decentralized architecture would be more resilient to problems like platform enshittification and overdeveloped censorship.
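For readers unfamiliar with the forward-forward idea mentioned above: each layer is trained with a purely local objective - push the "goodness" (sum of squared activations) above a threshold for positive data and below it for negative data - so no gradients ever flow between layers, which is what makes it interesting for learning-while-inferring. Below is a toy single-layer NumPy sketch; `FFLayer`, the hyperparameters, and the synthetic data are all illustrative, not from the thread.

```python
import numpy as np

rng = np.random.default_rng(0)

def goodness(h):
    # "Goodness" of a layer's activations: sum of squared activations.
    return (h ** 2).sum(axis=1)

class FFLayer:
    # One layer trained with a purely local objective - no backprop
    # through other layers. lr and theta are arbitrary toy values.
    def __init__(self, n_in, n_out, lr=0.03, theta=2.0):
        self.W = rng.normal(0.0, 0.1, (n_in, n_out))
        self.lr, self.theta = lr, theta

    def _norm(self, x):
        # Length-normalize inputs so goodness can't leak in from the layer below.
        return x / (np.linalg.norm(x, axis=1, keepdims=True) + 1e-8)

    def forward(self, x):
        return np.maximum(0.0, self._norm(x) @ self.W)  # ReLU

    def train_step(self, x_pos, x_neg):
        # Gradient ascent on log sigmoid(+(G - theta)) for positive data
        # and on log sigmoid(-(G - theta)) for negative data, using only
        # this layer's own activations.
        for x, sign in ((x_pos, 1.0), (x_neg, -1.0)):
            xn = self._norm(x)
            h = np.maximum(0.0, xn @ self.W)
            p = 1.0 / (1.0 + np.exp(-sign * (goodness(h) - self.theta)))
            self.W += self.lr * xn.T @ ((1.0 - p)[:, None] * sign * 2.0 * h)

# Toy data: positive samples carry signal in the first 5 dims,
# negative samples in the last 5.
x_pos = rng.normal(0.0, 0.3, (64, 10)); x_pos[:, :5] += 2.0
x_neg = rng.normal(0.0, 0.3, (64, 10)); x_neg[:, 5:] += 2.0

layer = FFLayer(10, 16)
for _ in range(200):
    layer.train_step(x_pos, x_neg)
```

After training, the layer's goodness separates positive from negative inputs without any global backward pass - the property that makes people speculate about decentralized, locally-learning models.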

At the end of the day, it's not important what the 'average' user is doing, so long as there are enough non-average users pushing the ball forward on the important stuff.

btown 11/03/2025

Even the most popular games (with few exceptions) present as relatively dumb terminals that need constant connectivity to sync every activity to a mainframe - not necessarily because it's an MMO or multiplayer game, but because it's the industry standard way to ensure fairness. And by fairness, of course, I mean the optimization of enforcing "grindiness" as a mechanism to sell lootboxes and premium subscriptions.

And AI just further normalizes the need for connectivity; cloud models are likely to improve faster than local models, for both technical and business reasons. They've got the premium-subscriptions model down. I shudder to think what happens when OpenAI begins hiring/subsuming-the-knowledge-of "revenue optimization analysts" from the AAA gaming world as a way to boost revenue.

But hey, at least you still need humans, at some level, if your paperclip optimizer is told to find ways to get humans to spend money on "a sense of pride and accomplishment." [0]

We do not live in a utopia.

[0] https://www.guinnessworldrecords.com/world-records/503152-mo... - https://www.reddit.com/r/StarWarsBattlefront/comments/7cff0b...
