
accrual today at 4:19 PM

It looks like these cameras are infrared and intended to see gestures from the wearer. There are some more details in the linked article:

https://www.macrumors.com/2024/06/30/new-airpods-to-feature-...

> The IR cameras can detect environmental image changes, facilitating a broader range of gestures to improve user interaction. For example, if a user watches a video using Apple Vision Pro and the new AirPods, and turns their head to look in a specific direction, the sound source in that direction can be "emphasized to enhance the spatial audio/computing experience."

I wonder if the signal could be integrated into AR glasses or a headset to give the wearer a wider FOV.
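To make the quoted idea concrete: directional "emphasis" could be as simple as boosting the gain of sources near the listener's head direction. Here's a toy Python sketch; the function name, beam width, boost amount, and cosine falloff are all my own illustrative choices, not anything Apple has described.

```python
import math

def emphasis_gain(head_yaw_deg: float, source_azimuth_deg: float,
                  beam_width_deg: float = 60.0, boost_db: float = 6.0) -> float:
    """Return an extra gain in dB for one sound source, boosting sources
    the listener is currently facing. Angles are in degrees; all
    parameters here are made-up for illustration."""
    # Smallest signed angle between the head direction and the source
    diff = (source_azimuth_deg - head_yaw_deg + 180.0) % 360.0 - 180.0
    if abs(diff) > beam_width_deg:
        return 0.0  # outside the attention beam: no emphasis
    # Cosine taper: full boost when facing the source, 0 at the beam edge
    taper = math.cos(math.pi / 2 * abs(diff) / beam_width_deg)
    return boost_db * taper
```

A source straight ahead gets the full boost, one 90 degrees off-axis gets none, and sources in between taper smoothly, so emphasis tracks head turns without hard jumps.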


Replies

ASalazarMX today at 5:06 PM

> For example, if a user watches a video using Apple Vision Pro and the new AirPods, and turns their head to look in a specific direction, the sound source in that direction can be "emphasized to enhance the spatial audio/computing experience."

Geez, if only the Apple Vision had some kind of gyroscope and accelerometer so it could detect head motion without relying on external hardware...

nozzlegear today at 5:27 PM

> It looks like these cameras are infrared and intended to see gestures from the wearer.

I had an AVP for a while; controlling it with just gestures was sweet. If they're looking to bring those kinds of gestures to other Apple devices via AirPods (i.e. beyond just adding more gestures to the AirPods themselves), I'm intrigued.