At what point do the OEMs begin to realize they don’t have to follow the current mindset of attaching a GPU to a PC and instead sell what looks like a GPU with a PC built into it?
Exactly. With the Intel-Nvidia partnership signed this September, I expect to see some high-performance single-board computers released very soon. I don't think the ATX form factor will survive another 30 years.
At this point what you really need is an incredibly powerful heatsink with some relatively small chips pressed against it.
So basically going back to the old days of Amiga and Atari, in a certain sense, when PCs could only display text.
It's funny how ideas come and go. I made this very comment here on Hacker News probably 4-5 years ago and received a few downvotes for it at the time (although I was thinking of computers in general).
It would take a lot of work to make a GPU handle current CPU-type tasks, but it would be interesting to see how it changes parallelism and our approach to logic in code.
Maybe at the point where you can run Python directly on the GPU. At which point the GPU becomes the new CPU.
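For what it's worth, you can already get partway there today. A rough sketch of Python running on the GPU, assuming an NVIDIA card and the numba and numpy packages (the function and variable names are just illustrative):

    # Numba JIT-compiles this Python function into a CUDA kernel.
    import numpy as np
    from numba import cuda

    @cuda.jit
    def add_kernel(a, b, out):
        i = cuda.grid(1)              # global thread index
        if i < out.size:
            out[i] = a[i] + b[i]

    n = 1 << 20
    a = np.arange(n, dtype=np.float32)
    b = np.arange(n, dtype=np.float32)
    out = np.zeros_like(a)

    threads = 256
    blocks = (n + threads - 1) // threads
    add_kernel[blocks, threads](a, b, out)  # host arrays are copied to/from the GPU automatically
    assert out[2] == 4.0

Whether that counts as the GPU being "the new CPU" is another question, since the host CPU is still doing all the control flow and OS work around the kernel launches.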
Anyway, we're still stuck with the "G" standing for "graphics", so the name doesn't make much sense anymore; I'm actually looking for a vendor that takes its mission more seriously.
I mean, that's kind of what's going on at a certain level with the AMD Strix Halo, the NVIDIA GB10, and the newer Apple machines.
In the sense that the RAM is fully integrated, anyways.
The vast majority of computers sold today have the CPU and GPU integrated into a single chip. Most ordinary home users don't care that much about GPU or local AI performance.
In this video, Jeff is interested in GPU-accelerated tasks like AI and Jellyfin. His last video used a stack of 4 Mac Studios connected over Thunderbolt for AI stuff.
https://www.youtube.com/watch?v=x4_RsUxRjKU
The Apple chips have both powerful CPU and GPU cores and a huge amount of directly connected memory (up to 512GB), unlike most Nvidia consumer-level GPUs, which have far less memory.