Hacker News

As AI gobbles up chips, prices for devices may rise

167 points | by geox | yesterday at 10:52 PM | 224 comments

Comments

torginus | today at 7:21 AM

It's somewhat alarming to see that companies (owned by a very small slice of society) producing these AI thingies (whose current economic value is questionable and whose actual future potential is up for hot debate) can so easily price the rest of humanity out of computing goods.

show 7 replies
vee-kay | today at 1:57 AM

For the last 2 years, I've noticed a worrying trend: typical budget PCs (especially laptops) are being sold at higher prices with less RAM (just 8GB) and lower-end CPUs (and no dedicated GPUs).

The industry baseline should have become 16GB of RAM for PCs and 8GB for mobiles years ago, but instead it is as if the computing/IT industry is regressing.

New budget mobiles are being launched with lower-end specs as well (e.g., new phones with Snapdragon Gen 6 and UFS 2.2). Meanwhile, features that used to be offered in budget phones, e.g., wireless charging, NFC, and UFS 3.1, have silently been moved to the premium segment.

Meanwhile the OSes and software are becoming more and more complex, bloated, unstable (bugs), and insecure (security loopholes ready for exploits).

It is as if the industry has decided to focus on AI and nothing else.

And this will be a huge setback for humanity, especially the students and scientific communities.

show 2 replies
SunlitCat | today at 10:16 AM

I really wonder when the point will be reached at which the South Korean government steps in and starts to take a closer look at the growing long-term supply commitments that companies like OpenAI are indirectly driving with major memory manufacturers such as SK hynix and Samsung Electronics.

Allocating a very large share of advanced memory production, especially HBM and high-end DRAM, which are critical for almost all modern technology (and even many non-tech products like household appliances), to a small number of U.S.-centric AI players risks distorting the global market and limiting availability for other industries.

Even within Samsung itself, the Mobile eXperience (MX) Business (smartphones) is not guaranteed preferential access to memory from Samsung’s Device Solutions (DS) Division, which includes the Memory Business. If internal customers are forced to source DRAM elsewhere due to pricing or capacity constraints, this could eventually become economically problematic for a country that relies very heavily on semiconductor and technology exports.

show 1 reply
motbus3 | today at 10:20 AM

For me, there is a concerning red flag in all of this.

I know this is not always true, but in this case the Crucial folks say the margins on end-user products are too low and that the demand they do have is for AI.

I suppose they do not intend to spin up a new AI-focused unit because it is not worth it, or because they believe the hype might be gone before they are done. But what intrigues me is: why would they allow other competitors to step up in a segment they dominate? They could raise prices for consumers if they are not worried about competition...

There is a whole "not-exactly-AI" industry labeled as AI that received a capital-T Ton of money. Is that what they are going for?

show 2 replies
goku12 | today at 9:25 AM

I understand the issue with all the devices. But what about the rest of the things that depend on these electronics, especially DRAM? Automotive, aircraft, marine vessels, ATC, shipping coordination, traffic signalling, rail signalling, industrial control systems, public utility (power, water, sewage, etc.) control systems, transmission grid control systems, HVAC and environment control systems, weather monitoring networks, disaster alerting and management systems, ticketing systems, e-commerce backbones, scheduling and rostering systems, network backbones, entertainment media distribution systems, defense systems, and I don't know what else. Don't they all require DRAM? What will happen to all of them?

show 2 replies
Ekaros | today at 7:23 AM

Outside of, say, video and image editing and maybe lossless audio, why is this much RAM even needed in most use cases? And I mean actually thinking about how it is used. Computer code, unless you are actually building the whole Linux kernel, is just text, so a lot of projects would probably fit in cache. Maybe software companies should be billed for users' resources too...
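
As a rough back-of-the-envelope sketch of that claim (the line count, bytes-per-line, and cache size below are assumptions, not measurements):

    # back-of-the-envelope: how big is a codebase as plain text?
    lines_of_code = 1_000_000   # assume a fairly large project
    bytes_per_line = 50         # assume ~50 bytes per source line
    codebase_mb = lines_of_code * bytes_per_line / 1e6
    l3_cache_mb = 96            # assume a large desktop CPU's L3 cache
    print(f"{codebase_mb:.0f} MB of source vs {l3_cache_mb} MB of L3")
    # => "50 MB of source vs 96 MB of L3": the text itself is tiny,
    #    even if only the hot parts would ever sit in cache at once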

show 1 reply
jazzyjackson | today at 12:18 AM

Question: are SoCs with on-die memory affected by this?

Looks like the frame.work desktop with the 128GB Ryzen is shipping now at the same price it was at release, and Apple is offering 512GB Mac Studios.

Are Snapdragon chips the same way?

show 3 replies
elthor89 | today at 7:43 AM

If all manufacturers jump into serving the AI market segment, can this not be an opportunity for new entrants to start serving the other market segments?

How hard is it to start up and manufacture memory for embedded systems in cars, or for PCs?

show 3 replies
zdc1 | today at 9:01 AM

Thankfully we're at a stage where a 4-year-old second-hand iPhone is perfectly usable, as are any M-series Macs and most Linux laptops. It sucks for anyone needing something particularly beefy for work, but I feel that a lot of purchases can be delayed for at least a year or two while this plays out.

show 1 reply
l9o | today at 7:24 AM

It feels like a weird tension: we worry about AI alignment but also want everyone to have unrestricted local AI hardware. Local compute means no guardrails; you can fine-tune for whatever you want.

Maybe the market pricing people out is accidentally doing what regulation couldn't: concentrating AI where there's at least some oversight and accountability. Not sure if that's good or bad, to be honest.

show 1 reply
compounding_it | today at 5:13 AM

Software has gotten bad over the last decade. Electron apps were the start, but these days everything seems to be bloated, from the operating systems right down to the browsers.

There was a time when Apple was hesitant to add more RAM to its iPhones and app developers had to work hard to make apps efficient. The last few years have shown Apple going from 6GB to 12GB so easily for their 'AI', while I consistently see the quality of apps on the App Store deteriorating. iOS 26 and macOS 26 are so aggressive about memory swapping that loading Settings can take time on devices with 6GB of RAM (absurd). I wonder what else they have added that apps need purging so frequently. The 6GB iPhone and the 8GB M1 felt incredibly fast for a couple of years; now they apparently feel slow, as if they were really old.

Windows 11 and Chrome are a completely different story. Windows 10 ran just fine on my 8th-gen PC for years, but Windows 11 is very slow and Chrome is a bad experience. Firefox doesn't make it better.

I also find that GNOME and the COSMIC DE are not exactly great with memory. A bare-minimum desktop still takes up 1.5-1.6GB of RAM on a 1080p display, and with some tabs open plus a terminal and VS Code (again Electron) I easily hit 8GB. Sway is better in this regard; I find Alacritty, Sway, and Firefox together make for a good experience.

I wonder where we are heading with personal computer software. Processors have gotten really fast, and storage and memory even more so, but the software still feels slow and glitchy. If this is the industry's idea of justifying new hardware each year, we are probably investing in the wrong people.

show 2 replies
loudandskittish | today at 7:40 AM

Love all the variations of "8GB of RAM should be enough for anybody" in here.

show 1 reply
xbmcuser | today at 6:37 AM

It might not be the best thing for the US, but the rest of the world needs China to reach parity with TSMC on node size to crash the market.

show 1 reply
arjie | today at 5:25 AM

DRAM spot prices are something like what they were 4 years ago. Having cheap RAM is nice, but it doesn't cost an extraordinary amount now either. I recently needed some RAM and was able to pick up 16x32 GiB of DDR4 for $1600. That's about twice as expensive as it used to be, but $1600 is pretty cheap for 512 GiB of RAM.
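
Quick math on that kit, just restating the figures above:

    # price per GiB for the kit mentioned above: 16 x 32 GiB DDR4 at $1600
    sticks, gib_per_stick, price_usd = 16, 32, 1600
    total_gib = sticks * gib_per_stick             # 512 GiB
    print(f"{price_usd / total_gib:.2f} USD/GiB")  # ~3.12 USD/GiB
    # "about twice as expensive as it used to be" implies roughly 1.5 USD/GiB before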

A 16 GiB M4 Mac Mini is $400 right now. That covers any essential use case, which means this is mostly hitting hobbyists and niche users.

show 3 replies
memoriuaysj | yesterday at 11:33 PM

the first stages of the world being turned into computronium.

next stage is paving everything with solar panels.

czhu12 | today at 5:58 AM

It seems… fine? Hasn’t DRAM always been a boom and bust industry with no real inflation — in fact massive deflation — over the past 30 years?

Presumably the boom times are the main reason why investment goes into it so that years later, consumers can buy for cheap.

agilob | today at 7:32 AM

It's going to be interesting for the Google Chrome team when new laptops come equipped with 8GB of RAM by default.

Culonavirus | today at 9:49 AM

I mean yea, but this is THE wrong site to post stuff like this. Half the people here are the AI cock and the other half is riding it.

deadbabe | today at 4:16 AM

Are we finally going to be forced to use something like CollapseOS, when the supply chains can no longer deliver chips to the masses?

netbioserror | yesterday at 11:25 PM

Positive downstream effect: the way software is built will need to be rethought and improved to squeeze efficiency out of stagnating hardware. Think of how staggering the step from the start of a console generation to the end used to be. Natively compiled languages have made leaps and bounds that might be worth pursuing again.

show 2 replies
johnea | yesterday at 11:09 PM

"May rise"?

Prices are already through the roof...

https://www.tomsguide.com/news/live/ram-price-crisis-updates

show 2 replies
CTDOCodebases | today at 10:24 AM

"May"

cglan | today at 2:04 AM

At the current pace, if "the electorate" doesn't see real benefits to any of this, 2028 is going to be a referendum on AI, unfortunately.

Whether you like it or not, AI right now is mostly:

- high electricity prices
- crazy computer part prices
- the phasing out of a lot of formerly high-paying jobs

and the benefits are mostly:

- slop and ChatGPT

Unless OpenAI and co produce the machine god (which genuinely is possible), if most people's interactions with AI are the negative externalities, they'll quickly be wondering whether ChatGPT is worth the cost.

show 2 replies
shmerl | today at 12:57 AM

> She said the next new factory expected to come online is being built by Micron in Idaho. The company says it will be operational in 2027

Isn't Micron stopping all consumer RAM production? So their factories won't help anyway.

show 1 reply
vittore | yesterday at 11:46 PM

I've been ruminating on this for the past two years. Before AI, most compute stayed cheap and sat pretty much 90% idle; we are finally getting to the point of using all of it. We will probably find more algorithms to improve the efficiency of all the matrix computations, and with the AI bubble the same thing will happen that happened with the telecom bubble and all the fiber optic capacity that turned out to be drastically over-provisioned. Fascinating times!

show 2 replies
arnaudsm | today at 9:33 AM

[deleted]

show 1 reply
29athrowaway | today at 1:15 AM

AI needs data, and data comes from consumer devices.

show 1 reply
shevy-java | today at 12:35 AM

I now consider this a mafia that aims to milk us for more money. That includes all the AI companies, but also the manufacturers who happily benefit from it. It is a de facto monopoly. Governments need to stop allowing this milking scheme to happen.

show 5 replies
bschmidt97979 | today at 4:40 AM

[flagged]

bschmidt97979 | today at 5:14 AM

[flagged]

appreciatorBus | today at 2:14 AM

Oh no, a manufactured thing with long lead times has more demand than forecast and might be in short supply for a few years, surely a greater disaster has never befallen mankind!

show 1 reply
kankerlijer | yesterday at 11:17 PM

Well, thank the FSM that the article opens right up with "buy now!" No thanks, I'm kind of burnt out on mindless consumerism; I'll go pot some plants or something.

show 1 reply