tjohns · 04/02/2025 · 0 replies · view on HN

The Apple Vision Pro definitely uses dynamic foveated rendering. It renders at full resolution only around the center of your vision, and it should be adjusting that region in real time based on the gaze-tracking data. (It’s easy to observe this if you’re giving a demo and watching an external display.)
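
For illustration, here is a minimal Swift sketch of the idea (the function and parameters are made up, not Apple's API): each frame, the latest gaze sample determines a per-tile rendering scale, so the full-resolution region follows the eyes instead of staying locked to the screen center. On Apple platforms this kind of map would typically feed the compositor / Metal's variable rasterization rate machinery rather than being applied by app code directly.

```swift
import simd

/// Hypothetical helper: maps a gaze point (normalized 0...1 screen coordinates)
/// to a per-tile rendering scale. Tiles near the gaze point render at full
/// resolution; peripheral tiles render at a fraction of it.
func foveationScales(gaze: SIMD2<Float>,
                     tiles: Int = 8,
                     minScale: Float = 0.25,
                     falloff: Float = 2.0) -> [[Float]] {
    var grid = [[Float]](repeating: [Float](repeating: 1.0, count: tiles),
                         count: tiles)
    for row in 0..<tiles {
        for col in 0..<tiles {
            // Center of this tile in normalized screen coordinates.
            let center = SIMD2<Float>((Float(col) + 0.5) / Float(tiles),
                                      (Float(row) + 0.5) / Float(tiles))
            // Rendering quality falls off with distance from the gaze sample.
            let d = simd_distance(center, gaze)
            grid[row][col] = max(minScale, 1.0 - falloff * d * d)
        }
    }
    return grid
}

// Re-evaluated every frame as new gaze samples arrive, which is what makes
// the foveation "dynamic" rather than fixed at the lens center.
let scales = foveationScales(gaze: SIMD2<Float>(0.7, 0.4))
```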

Especially given that gaze tracking is the primary input method in their UI, they spent a lot of time getting this right.

I feel like something might have been wrong with the OP’s demo unit?