Hacker News

hedora yesterday at 7:27 PM

I wonder if an FPGA is still necessary. 4K/8K displays run well over 60 fps these days. Presumably a GPU could do a decent job of emulating the phosphor (sketched below).

In related news, Atari 2600 emulators keep 4-8 cores more than 50% busy these days. How else do you get an accurate NTSC “red blur” and the capacitor effects from blinking pixels?
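
A minimal sketch of the GPU-phosphor idea, assuming a simple per-frame exponential decay blended into a persistence buffer. The decay constant, function names, and the NumPy stand-in for what would really be a shader or compute kernel are all illustrative assumptions, not anything from the thread:

    import numpy as np

    DECAY_PER_FRAME = 0.85  # fraction of brightness surviving one output frame (made-up value)

    def next_display_frame(persistence, new_frame):
        # old light fades exponentially from frame to frame
        persistence *= DECAY_PER_FRAME
        # freshly excited pixels glow at full strength
        np.maximum(persistence, new_frame, out=persistence)
        return persistence

    # run at the panel's refresh rate (e.g. 120+ Hz), feeding in each emulated frame
    buf = np.zeros((480, 640), dtype=np.float32)
    for frame in (np.random.rand(480, 640).astype(np.float32) for _ in range(3)):
        shown = next_display_frame(buf, frame)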


Replies

ChuckMcM today at 1:09 AM

I suppose it would depend on how you wanted to simulate it. In my case I was taking the signal from an unmodified test instrument that thought it was talking to a CRT and using it to figure out what the display was supposed to show. That would be equivalent to taking the X/Y/intensity lines from the mainboard of a Vectrex and just doing what the vector scope would have done. I drilled down far enough to find the non-linear, temperature-dependent curve of phosphor decay times for the CRT used in some HP gear. It was pretty wild. If you buy third-party kits, they don't even bother simulating the phosphor. Instead they just take the signals, figure out the information content of the display, and put that on an LCD (generally monochrome).
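
A rough sketch of the kind of decay model described above, assuming a power-law ("hyperbolic") decay and a made-up temperature factor. The real curve for the HP tube would have to come from a datasheet or measurement; every name and constant here is illustrative:

    def phosphor_brightness(energy, t_since_hit_ms, temp_c):
        """Brightness of one phosphor spot t milliseconds after the beam hit it."""
        # hypothetical temperature factor: decay speeds up slightly as the tube warms
        rate = 1.0 + 0.01 * (temp_c - 25.0)
        # power-law decay, a common approximation for long-persistence phosphors
        return energy / (1.0 + rate * t_since_hit_ms) ** 1.5

    # Each X/Y/intensity sample from the instrument becomes a timestamped spot; the
    # renderer sums the surviving brightness of recent spots when it draws an LCD frame.
    spots = [(0.20, 0.50, 1.0, 0.0), (0.21, 0.50, 1.0, 0.4)]  # (x, y, energy, hit_time_ms)

    def frame_brightness(now_ms, temp_c):
        return sum(phosphor_brightness(e, now_ms - t, temp_c) for _, _, e, t in spots)

    print(frame_brightness(now_ms=2.0, temp_c=40.0))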