Hi there,
I’m relatively new at working with IMUs, so please bear with me. I have a couple of questions re. the ICM-20948 IMU:
The raw Z-axis acceleration values that I’m getting from the IMUs range from ~2000 at rest to ~32000 at their peak (from animal data collected in the wild). I’ve read mixed comments online, some saying that these values are in units of milli-g, and others saying that they are raw counts (LSB) that need to be divided by a sensitivity factor in LSB/g to give units of g. I was hoping to get some confirmation on this.
The IMUs are also outputting heading acceleration. Does anyone happen to know how this is calculated, and whether we can trust it? For added context, we have accompanying GPS data and are hoping to do some dead reckoning.
Thank you very much in advance!
When the IMU is at rest, it reports 1 g of acceleration along the vertical axis.
That is all you need to work out the scale factor for conversion to g values.
Use 1 g ≈ 9.8 m/s² to convert to m/s².
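A minimal sketch of that conversion, assuming the ~2000-count at-rest reading mentioned above is the 1 g reference (incidentally, ~2000 counts per g is close to the ICM-20948 datasheet's typical ±16 g sensitivity of 2048 LSB/g, which may hint at how the sensor is configured). The `RAW_Z_AT_REST` value is a placeholder; substitute your own stationary measurement:

```python
# Derive a scale factor from the at-rest Z reading, then convert
# raw accelerometer counts to g and to m/s^2.

RAW_Z_AT_REST = 2000.0  # assumed raw Z count with sensor stationary and level (1 g)
G = 9.8                 # m/s^2 per g

def raw_to_g(raw, counts_per_g=RAW_Z_AT_REST):
    """Convert a raw accelerometer count to g."""
    return raw / counts_per_g

def raw_to_ms2(raw, counts_per_g=RAW_Z_AT_REST):
    """Convert a raw accelerometer count to m/s^2."""
    return raw_to_g(raw, counts_per_g) * G

print(raw_to_g(2000))     # → 1.0 (the at-rest reading, by construction)
print(raw_to_ms2(32000))  # ≈ 156.8 m/s^2 at the observed peak
```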
Thank you for your reply.
So essentially the scale factor will vary - there is no “norm” for what the raw aZ values should be at rest, and I should disregard the sensitivity scale factors given in the ICM-20948 documentation?
Thanks again.
The sensor is not calibrated. For accurate measurements, both the scale factors and the offsets need to be carefully determined for each of the three axes and applied to the raw data.
The sensitivity values given by the manufacturer in the data sheet are typical values, averaged over some number of sensors. However, the scale factors that you determine, by following the linked tutorial, should not differ from the sensitivity values in the data sheet by more than a few percent.
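A sketch of applying such a six-parameter calibration (one scale factor and one offset per axis) to raw counts. The numbers below are hypothetical placeholders, chosen near the ICM-20948's typical ±16 g sensitivity of 2048 LSB/g; they are not real calibration results and must be replaced by values you determine for your own sensor:

```python
# Apply a per-axis scale-and-offset calibration to raw accelerometer counts.
# Placeholder calibration values (NOT real results) - determine your own,
# e.g. with a six-position static calibration.
scale  = {"x": 2040.0, "y": 2055.0, "z": 2035.0}  # LSB per g, per axis
offset = {"x": 15.0,   "y": -8.0,   "z": 30.0}    # zero-g offset in LSB

def calibrate(raw, axis):
    """Return calibrated acceleration in g for one axis."""
    return (raw - offset[axis]) / scale[axis]

print(calibrate(2065, "z"))  # → 1.0 with these placeholder values
```

Once calibrated, the corrected axis readings should report 1 g along whichever axis is vertical and 0 g along the horizontal ones when the sensor is stationary.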