Hacker News

hsbauauvhabzb · today at 9:09 AM

I don’t know how resolution maps to RAM in X11, but I assume at least one byte per pixel. Based on that assumption, there’s no chance you’d even be able to power a 4K monitor with 8MB of RAM, let alone the rest of the system.
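The arithmetic behind that assumption is easy to sketch (a back-of-the-envelope helper, not actual X11 code; the function name is made up):

```python
def framebuffer_bytes(width, height, bytes_per_pixel):
    """Raw framebuffer size: one plane of pixels, no double buffering."""
    return width * height * bytes_per_pixel

# 4K UHD (3840 x 2160) at a single byte per pixel:
size = framebuffer_bytes(3840, 2160, 1)
print(size / (1024 * 1024))  # ~7.9 MiB -- essentially the whole 8MB machine
```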


Replies

PaulRobinson · today at 11:08 AM

Correct, 4K is very modern by these standards. But then I'm old, so perhaps it's all about perspective.

Back in the days when computers had 8MB of RAM to handle all that MS-DOS and Windows 3.1 goodness, we were still in VGA [0] and SVGA [1] territory, and the graphics cards (sorry, integrated graphics on the motherboard?! You're living in the future there, that's years away!) had their own RAM to support those resolutions and colour depths.

Of course, this is all for PCs. By the mid-1990s you could get a SPARCstation 5 [2] with a 24" Sun-branded Sony Trinitron monitor that was rather more capable.

[0] Maxed out at 640 x 480 with 16 colours from an 18-bit palette

[1] The "S" is for Super: 1280 x 1024 with 256 colours!

[2] https://en.wikipedia.org/wiki/SPARCstation_5

p_l · today at 10:14 AM

This was the main driver of VGA memory size for a time: if you spent money on a 2MB card instead of a 1MB one, you could have higher resolution or bit depth.

If you had a big enough framebuffer in your display adapter, though, X11 could display more than your main RAM could support: the design, when used the "classic way", allowed the X server to draw directly on framebuffer memory (just like GDI did).
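The 1MB-versus-2MB trade-off is easy to illustrate (hypothetical numbers for a typical mid-90s card; `fits` is a made-up helper):

```python
def fits(vram_bytes, width, height, bits_per_pixel):
    """Does one frame at this resolution and depth fit in the card's VRAM?"""
    return width * height * bits_per_pixel // 8 <= vram_bytes

MB = 1024 * 1024
print(fits(1 * MB, 1024, 768, 8))   # True  -- 8-bit colour needs 768 KB
print(fits(1 * MB, 1024, 768, 16))  # False -- 16-bit colour needs 1.5 MB
print(fits(2 * MB, 1024, 768, 16))  # True  -- what the extra money bought
```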

bigfishrunning · today at 2:25 PM

Good thing 4K monitors didn't exist in 2000

direwolf20 · today at 10:55 AM

X11 was designed to support bit depths down to 1 bit per pixel.
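At 1 bit per pixel, even a modern 4K panel becomes almost tractable on an 8MB machine (a quick monochrome back-of-the-envelope, numbers only, not X11 API usage):

```python
# Monochrome (1 bit per pixel) framebuffer for 4K UHD: 8 pixels per byte.
bits = 3840 * 2160 * 1
print(bits // 8)  # 1036800 bytes -- roughly 1 MB
```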

argsnd · today at 9:21 AM

Presumably every pixel is 32 bits rather than just 8. So the count starts at 33.2MB just for the display.
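That figure checks out in decimal megabytes (a one-line verification, assuming 4 bytes for a 32-bit pixel):

```python
# 4K UHD at 32 bits (4 bytes) per pixel, in decimal megabytes:
print(3840 * 2160 * 4 / 1_000_000)  # 33.1776 -- i.e. ~33.2MB
```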
