Here's the camera used.[1]
It's not exotic. It's a 1936 × 1216 Sony sensor with a C-mount lens. That's below current phone camera resolution. It's monochrome, which makes sense in this application.
It has bigger collecting optics than a phone camera, and without the color filters you get better sensitivity.
I'm not clear on how they get their "down" reference. It's clear how they get heading; that's easy if you can see the stars. But you need an accurate horizon or vertical to get latitude and longitude: one degree of error in the vertical translates to a degree of arc on the ground, roughly 110 km of position error. How good are drone AHRS systems today in attitude? They have a correction system that works if you fly in a circle, but that only corrects for a constant misalignment between the camera and the down reference.
[1] https://www.alliedvision.com/fileadmin/pdf/en/Alvium_1800_U-...
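The tilt-to-position claim is just arc length on a sphere: the star fix moves along the surface by the same angle the "down" reference is tilted. A quick sanity check (mean Earth radius assumed):

```python
import math

EARTH_RADIUS_KM = 6371.0  # mean Earth radius

def position_error_km(vertical_error_deg: float) -> float:
    """A tilted 'down' reference shifts the apparent zenith by the same
    angle, so the celestial fix moves by that arc along the surface."""
    return math.radians(vertical_error_deg) * EARTH_RADIUS_KM

print(position_error_km(1.0))  # ~111 km per degree of vertical error
```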
One degree of error in the vertical for the drone's control system, if it's hovering by blowing air downward at 5 m/s, would produce a ground speed of 87 mm/s (sin(1°) × 5 m/s) in whichever direction the tilt is. Without any correction in the propeller speed it would also cause an average loss of altitude of 0.76 mm/s ((1 − cos(1°)) × 5 m/s, about 2.7 m/hour). But that altitude loss could also be caused by something like a mild downdraft, while the horizontal drift could be caused by an imperceptibly weak breeze.
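The back-of-the-envelope numbers above are just a decomposition of the downwash vector; a small sketch (the 5 m/s figure and constant-thrust assumption are from the comment, not from any real drone):

```python
import math

def hover_drift(tilt_deg: float, downwash_ms: float = 5.0):
    """Split the downward airflow of a tilted, constant-thrust hover into
    horizontal drift and vertical shortfall components."""
    tilt = math.radians(tilt_deg)
    horizontal_ms = math.sin(tilt) * downwash_ms        # sideways ground speed
    altitude_loss_ms = (1 - math.cos(tilt)) * downwash_ms  # sink rate
    return horizontal_ms, altitude_loss_ms

h, v = hover_drift(1.0)
print(f"{h*1000:.0f} mm/s sideways, {v*3600:.1f} m/hour down")
# → 87 mm/s sideways, 2.7 m/hour down
```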
So I don't really know how this is normally done. If you can set the drone on the ground for a few minutes, you should be able to get a very good reference up-vector, but I don't know how long the MEMS gyros can preserve that up-vector without GNSS once it takes off.
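As a rough feel for how long that ground-calibrated up-vector survives: if uncorrected gyro bias is the dominant error, the tilt error just grows linearly with time. The bias numbers below are illustrative guesses for sensor grades, not datasheet values:

```python
# Illustrative bias-drift figures (deg/hour); real parts vary widely.
GYRO_BIAS_DEG_PER_HOUR = {"cheap consumer MEMS": 10.0, "better MEMS": 1.0}

for grade, bias_deg_h in GYRO_BIAS_DEG_PER_HOUR.items():
    # Minutes until uncorrected bias alone tilts the reference by 1 degree
    minutes_to_1deg = 60.0 / bias_deg_h
    print(f"{grade}: ~{minutes_to_1deg:.0f} min to accumulate 1° of tilt")
```

So even a decent MEMS gyro may only hold a 1° up-vector for tens of minutes without some external aiding, which is presumably why the fly-in-a-circle correction exists.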
At sea you can probably look at the horizon with a camera, unless it's foggy.