
ntoskrnl_exe yesterday at 6:31 PM

I think you're mixing monitors and TVs together.

CRT TVs only supported vertical refresh rates of 50Hz or 60Hz, matching the regional mains frequency. They used interlacing and technically only drew half the frame's scanlines at a time, but thanks to phosphor persistence this added a feeling of fluidity to the image. If you were able to see one strobe, you must have had impressive eyesight. And even if TVs had supported higher refresh rates, it wouldn't have mattered, as the source signal would only ever be 50/60Hz.
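To make the interlacing point concrete, here's a quick sketch of the arithmetic (standard PAL/NTSC line counts, my numbers, not from the comment): each refresh draws only one field, i.e. every other scanline, so the full-frame rate is half the field rate.

```python
# Interlaced CRT TV timing: each vertical refresh draws one field
# (half the scanlines), so full frames arrive at half the field rate.

def interlaced_rates(field_rate_hz, total_lines):
    frame_rate = field_rate_hz / 2      # two fields per complete frame
    lines_per_field = total_lines / 2   # odd lines one field, even the next
    return frame_rate, lines_per_field

# PAL: 50 fields/s, 625 lines per frame
print(interlaced_rates(50, 625))   # (25.0, 312.5)
# NTSC: ~60 fields/s, 525 lines per frame
print(interlaced_rates(60, 525))   # (30.0, 262.5)
```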

CRT monitors used in PCs, on the other hand, supported a variety of refresh rates. Only monitors for specific applications used interlacing; consumer-grade ones didn't, which means you could see a strobing effect if you ran one at a low refresh rate. But even basic analog monitors from the 80s supported at least 640x480 at 60Hz, and some programs, such as the original DOOM, squeezed 70Hz out of them by running at a lower resolution while keeping the same horizontal scan rate.
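The "same horizontal scan rate, different vertical rate" trick falls out of the timing math. A sketch using the standard published VGA totals (illustrative numbers, not from the comment): the horizontal rate is the pixel clock divided by the total scanline length, and the vertical refresh is the horizontal rate divided by the total line count per frame.

```python
# VGA timing sketch: both 640x480@60 and mode 13h (320x200, DOOM's mode)
# use the same 25.175 MHz pixel clock and 800-pixel scanline total, so
# they share a ~31.5 kHz horizontal rate; mode 13h's shorter frame
# (449 total lines vs 525) is what yields ~70 Hz vertical.

def refresh_rates(pixel_clock_hz, h_total, v_total):
    h_rate = pixel_clock_hz / h_total   # horizontal scan rate (Hz)
    v_rate = h_rate / v_total           # vertical refresh rate (Hz)
    return h_rate, v_rate

h1, v1 = refresh_rates(25_175_000, 800, 525)  # 640x480 timing
h2, v2 = refresh_rates(25_175_000, 800, 449)  # mode 13h timing
print(round(h1), round(v1))  # 31469 60
print(round(h2), round(v2))  # 31469 70
```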


Replies

myself248 yesterday at 10:49 PM

For some reason I remember 83Hz being the highest refresh rate supported by my XGA CRT, but I think it was only running at SVGA (800x600) in order to pull that rate.

Some demos could throw pixels into VRAM that fast, and it was wild looking. Like the 60Hz soap-opera effect but even more so.

I still feel that way looking at >30fps content since I really don't consume much of it.