I’m trying to find a product that will enable me to track the position and orientation of my camera for photogrammetry.
I’d like RTK-level GPS accuracy; the 14 mm noted for your RTK products would be great. I also need yaw, pitch, and roll. Finally, I need to log the GPS and orientation data, and ideally I could also record when my camera takes a photo.
Your SparkFun RTK Express Plus (#18590) looks like a good option with its built-in IMU. I couldn’t find any data on which IMU is inside it, and I have a few specific questions:
Will it log the yaw, pitch and roll data along with the longitude, latitude and altitude?
Will it also log data coming in from the Qwiic connector? I would like to use that to record the 3.3 V signal the camera outputs at shutter release.
Is a time stamp logged along with each of the above data elements?
What is the nominal accuracy for roll, pitch and yaw expected for quasi-static orientation (I’ll be briefly standing still while taking each photo)?
I do not need the position and orientation in real time, but I need to be able to extract that afterwards (back at the office) and use the shutter release times to interpolate camera X,Y,Z,roll,pitch,yaw at the moment each photo was taken.
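The back-at-the-office step described above (interpolating the logged track at each shutter time) can be sketched in a few lines of NumPy. All array names and values here are illustrative; note that angles need unwrapping before linear interpolation so a wrap like 359° → 1° doesn’t pass through 180°:

```python
import numpy as np

# Hypothetical data: log epochs (seconds), one position channel, one angle
# channel, and the shutter-event times extracted from the same log.
track_t   = np.array([0.0, 1.0, 2.0, 3.0])      # log timestamps
track_x   = np.array([10.0, 10.5, 11.0, 11.5])  # easting, metres
track_yaw = np.array([90.0, 92.0, 91.0, 89.0])  # degrees

shutter_t = np.array([0.5, 2.25])

# Positions interpolate linearly without special handling.
x_at_shots = np.interp(shutter_t, track_t, track_x)

# Unwrap angles first, interpolate, then wrap back into [0, 360).
yaw_unwrapped = np.unwrap(np.radians(track_yaw))
yaw_at_shots = np.degrees(np.interp(shutter_t, track_t, yaw_unwrapped)) % 360.0

print(x_at_shots)    # [10.25  11.125]
print(yaw_at_shots)  # [91.   90.5 ]
```

The same pattern repeats for Y, Z, roll, and pitch; roll and pitch rarely wrap in practice, but unwrapping them costs nothing.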
I was thinking that a working system could be built with:
• These products mounted on the camera:
o SparkFun RTK Express Plus (#18590)
o GNSS Multi-Band L1/L2 Helical Antenna (SMA) BT-560 (#17383) – I’d use a short cable and bracket rather than directly attaching the antenna to the unit
Hi, I’ve been working on this for a while, and I solved it as follows: at the base I have an RTK Facet, and on the aircraft a Seagull GPK, which has a connector for the camera’s flash hot shoe.
I also use the autopilot’s IMU data, synchronized with the flash hot shoe. This whole system has given me very good results. What I haven’t been able to do is get the Facet to record at 10 Hz in base mode; it does in rover mode.
The RTK Express Plus contains an IMU that has 3-axis accel and 3-axis gyro. The IMU is internally coupled with the GNSS (and wheel tick inputs) to provide a location when GNSS fails (think cars or scooters and tunnels or urban environments). It’s not really meant for user consumption or device orientation, but the data is available. The raw data from these sensors can be output and logged to the file using the ESF-RAW and ESF-MEAS messages. These messages come in at about 500Hz and they are individually timestamped. You would be able to log them, but then you would need to use u-center or a python script to extract the time, X/Y/Z, and roll/pitch/yaw from the individual ESF messages.
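As a starting point for that extraction script, here is a minimal sketch of a UBX frame parser that pulls the sensor timeTag out of each ESF-MEAS message in a logged file. The frame layout (0xB5 0x62 sync, class/ID, little-endian length, Fletcher checksum) and the ESF-MEAS payload starting with a U4 timeTag follow the u-blox interface description; decoding the individual sensor data words (accel/gyro axes) is left out and would need the dataType table from that document:

```python
import struct

SYNC = b"\xb5\x62"
ESF_MEAS = (0x10, 0x02)  # message class/ID per the u-blox interface description

def ubx_frames(data: bytes):
    """Yield (msg_class, msg_id, payload) for each checksum-valid UBX frame."""
    i = 0
    while (i := data.find(SYNC, i)) != -1:
        if i + 8 > len(data):
            break
        length = struct.unpack_from("<H", data, i + 4)[0]
        end = i + 6 + length + 2  # sync(2) + header(4) + payload + checksum(2)
        if end > len(data):
            break
        body = data[i + 2:i + 6 + length]  # class, id, length, payload
        ck_a = ck_b = 0                    # Fletcher-8 checksum over body
        for b in body:
            ck_a = (ck_a + b) & 0xFF
            ck_b = (ck_b + ck_a) & 0xFF
        if data[end - 2] == ck_a and data[end - 1] == ck_b:
            yield body[0], body[1], data[i + 6:i + 6 + length]
            i = end
        else:
            i += 2  # bad checksum: skip this sync and keep scanning

def esf_meas_time_tags(data: bytes):
    """Extract the timeTag (first U4 of the payload) from every ESF-MEAS."""
    return [struct.unpack_from("<I", p, 0)[0]
            for c, m, p in ubx_frames(data) if (c, m) == ESF_MEAS]
```

You would read the whole .ubx log into `data` and correlate the timeTags against the trigger events; a library like pyubx2 could replace the hand-rolled parser if you prefer.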
The RTK product line has an external DATA port connector. This port can be configured (https://docs.sparkfun.com/SparkFun_RTK_ … ux-channel) for external trigger mode: if a pulse is detected, a message is recorded to the log with the current time/date. This is a very accurate event trigger with a 3.3V input. I suspect it could be wired to your camera easily enough.
Accuracy: GPS receivers can provide very accurate heading/pitch/roll when moving. Sitting still, they’re pretty bad.
I suspect the RTK Express Plus will get you most of what you need, but logging and then extracting the position/orientation of your camera will be a bit of extra work. Assuming you just need the tilt of your camera (accel X/Y/Z only), it would be fairly straightforward to capture the camera trigger event, then pull the nearest ESF-MEAS data record from the log.
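The "pull the nearest record" step is a binary search over the log timestamps. A minimal sketch (function name and inputs are illustrative):

```python
from bisect import bisect_left

def nearest_record(timestamps, t):
    """Index of the logged timestamp closest to trigger time t.
    `timestamps` must be sorted ascending, as a time-ordered log is."""
    i = bisect_left(timestamps, t)
    if i == 0:
        return 0
    if i == len(timestamps):
        return len(timestamps) - 1
    # Compare the neighbour on each side of the insertion point.
    return i if timestamps[i] - t < t - timestamps[i - 1] else i - 1
```

At ~500 Hz the nearest ESF record is within ~1 ms of the trigger, so for a briefly stationary camera this lookup alone may be accurate enough without interpolation.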
My other concern would be the location of the antenna relative to your lens. I’m not exactly sure what you’re attempting, but at 14 mm accuracy, what you’re measuring is the location of the RF reception point (often referred to as the antenna reference point). If the antenna is mounted 12" to the side of the camera, you’d be capturing that location, not where your camera is.
Thank you very much! That’s great advice. Is there any reason not to use a second Seagull GPK as the base station? (Please excuse any errors from Google Translate into Spanish.)
I don’t mind writing a python script to extract the IMU data after the fact.
That’s a good point that for quasi-static positions I may be able to get good values for the 3D tilt of the camera by using just the accelerometers rather than the gyros.
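For reference, the standard static-tilt formulas for that approach look like this. This is a sketch under the usual assumptions: the camera is stationary (so the accelerometer sees only gravity) and the axes follow the common x-forward, y-right, z-down convention; yaw is unobservable from the accelerometer alone and still has to come from elsewhere:

```python
import math

def tilt_from_accel(ax, ay, az):
    """Roll and pitch in degrees from a static accelerometer reading.
    Valid only while the sensor is held still, since any motion
    acceleration is indistinguishable from gravity here."""
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    return math.degrees(roll), math.degrees(pitch)
```

For example, a reading of (0, sin 45°, cos 45°) in units of g gives 45° of roll and 0° of pitch.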
Regarding the offset from the antenna to the lens, I thought I’d do calibration of the system in advance to find the offset from the image plane to the antenna using photogrammetric targets in known positions measured relative to the antenna. I’d have to repeat that for every camera and lens combination, and I’d have to ensure that the antenna fits into a fixed, repeatable position on the camera, but all of that should be doable.
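Once that calibration yields a lever arm in the camera’s body frame, applying it per photo is a single rotation and translation. A sketch, assuming a Z-Y-X (yaw-pitch-roll) Euler convention with the rotation taking body-frame vectors to the world frame; the actual convention must match however roll/pitch/yaw are defined in the log:

```python
import numpy as np

def lens_position(antenna_xyz, rpy_deg, lever_arm_body):
    """Translate the antenna reference point to the lens.
    antenna_xyz: antenna position in the world frame.
    rpy_deg: (roll, pitch, yaw) at the shutter time, degrees.
    lever_arm_body: calibrated offset antenna -> lens, body frame, metres."""
    r, p, y = np.radians(rpy_deg)
    cr, sr = np.cos(r), np.sin(r)
    cp, sp = np.cos(p), np.sin(p)
    cy, sy = np.cos(y), np.sin(y)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    R = Rz @ Ry @ Rx  # body -> world
    return np.asarray(antenna_xyz) + R @ np.asarray(lever_arm_body)
```

A sanity check: with zero roll/pitch and 90° of yaw, a 1 m lever arm along body x ends up 1 m along world y.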