Is the IDG300 null drift adjustable? The datasheet didn't seem to suggest any way to do so, unlike the ADXRS gyroscopes.
In short, I realised that the bias is way off the specification. The IDG300 datasheet says it should be 1.5 V ±0.1 V, but mine is way off. When I placed the triad's sensitive axes in the east/west direction, this is what I get:
p=448.8922, r=475.9041, y= 425.1369
p= 448.8828, r= 475.8208, y= 425.1369
(I am down under, so Earth rotation should be insignificant.)
I then used the DMM to look at the yaw rate output; it reads 1.37 V. So, assuming the buffering/impedance matching was done appropriately, where is the angular rate offset coming from?
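For what it's worth, the raw counts above are consistent with the DMM reading if the ADC is 10-bit and referenced to the IDG300's 3.3 V supply (an assumption, since the ADC resolution isn't stated here):

```python
# Sanity check: convert the raw yaw ADC count to volts, assuming a
# 10-bit ADC referenced to the IDG300's 3.3 V supply (assumption).
VREF = 3.3       # ADC reference voltage, volts
COUNTS = 1024    # 2**10 for a 10-bit converter

yaw_counts = 425.1369                 # averaged raw reading from above
yaw_volts = yaw_counts * VREF / COUNTS
print(round(yaw_volts, 3))            # -> 1.37, matching the DMM
```

So the DMM and the ADC agree with each other; the 1.37 V reading sits about 30 mV outside the 1.5 V ±0.1 V window in the datasheet.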
Have you checked the tolerances and the variations with temperature? They are quite large and might explain the value you are getting for the static output. Has your DMM been calibrated recently?
With the respective sensitive axes pointing east-west:
p=448.8922, r=475.9041, y= 425.1369
p= 448.8828, r= 475.8208, y= 425.1369
You are right about the noise! It is scary!
The IDG300 has a 3.3V supply line, so the LSB is about 3.2mV. With an initial calibration tolerance of +/-100mV, isn't that something like a 2*30+ LSB variation?
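The arithmetic in question, sketched out (the ~3.2 mV LSB assumes a 10-bit ADC on the 3.3 V supply):

```python
# How many LSBs does the IDG300's +/-100 mV initial offset tolerance span?
LSB_MV = 3.3 / 1024 * 1000   # ~3.22 mV per count (assumed 10-bit ADC, 3.3 V ref)

offset_tol_mv = 100.0                  # datasheet initial tolerance, +/-100 mV
lsb_per_side = offset_tol_mv / LSB_MV  # ~31 LSB each way
print(round(lsb_per_side))             # -> 31, i.e. a ~62 LSB total spread
```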
The pitch and roll axes are orthogonal on the same chip, so intuitively the difference between them cannot be temperature induced. With up to 80 LSB of variation, there are another 50 LSB unaccounted for. The IDG300 datasheet also specifies +/-300mV over temperature, which works out to a 2*90+ LSB variation (so where is the stipulated temperature compensation?).
In that case, isn't the IDG300 gyro pretty useless? In the extreme scenario of 120 LSB variation, with the specified sensitivity of 2mV per deg/s, the initial bias offset can be nearly 200 deg/s! That is really bad!
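Spelling out that worst case (again assuming the ~3.2 mV LSB of a 10-bit ADC on 3.3 V):

```python
# Worst-case bias in deg/s given the specified 2.0 mV/(deg/s) sensitivity.
SENS_MV_PER_DPS = 2.0        # IDG300 specified sensitivity
LSB_MV = 3.3 / 1024 * 1000   # ~3.22 mV per count (assumed 10-bit ADC)

bias_lsb = 120               # the extreme variation discussed above
bias_dps = bias_lsb * LSB_MV / SENS_MV_PER_DPS
print(round(bias_dps))       # -> 193, i.e. roughly 200 deg/s of bias
```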
I did not know that a DMM has to be calibrated. Perhaps you are talking about a particular make? Sad to say, my el cheapo DMM from China lacks that. Or is there a specific procedure you could enlighten me on? Even if I do prove that the DMM reading is inaccurate, surely the ADC on the ARM cannot be wrong?
I just had a quick look at the data sheet, I’m not sure if it is temperature-compensated.
In industry, all test instruments should be calibrated (checked for accuracy and adjusted if necessary) at least once a year. How do you know if your DMM is accurate?
Board stack comes fully assembled as shown. All sensors are internally temperature compensated. All readings are available through any terminal program in either ASCII or binary format, or with the improved 6DOF v4 IMU Mixer demo application (source code also available).
leon_heller:
I just had a quick look at the data sheet, I’m not sure if it is temperature-compensated.
In industry, all test instruments should be calibrated (checked for accuracy and adjusted if necessary) at least once a year. How do you know if your DMM is accurate?
Leon
I suppose you are talking about a primitive regression-style calibration?
With the IMU 5DOF that I was tinkering with, I would zero-calibrate the IMU at rest before use and just subtract that zero reference from the other measurements. Otherwise, how would you know where your zero is in that huge range?
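A minimal sketch of that zero-calibration step: average a batch of samples while the sensor sits still, then subtract that reference from later readings. (`read_gyro_raw` is a hypothetical sampling function standing in for whatever your ADC/serial interface provides.)

```python
# Minimal zero-bias calibration: average N samples while the IMU is at
# rest, then subtract that reference from subsequent readings.

def calibrate_zero(read_gyro_raw, n_samples=200):
    """Average n_samples raw readings taken while the sensor is stationary."""
    total = 0.0
    for _ in range(n_samples):
        total += read_gyro_raw()
    return total / n_samples

def corrected(raw, zero_ref):
    """Bias-corrected reading, in raw counts."""
    return raw - zero_ref

# Example with canned data standing in for live samples:
samples = iter([425.2, 425.1, 425.1, 425.0])
zero = calibrate_zero(lambda: next(samples), n_samples=4)
print(round(corrected(425.6, zero), 2))  # -> 0.5
```

This only removes the turn-on bias, of course; any drift after the calibration instant still accumulates.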
There’s a thread we started a while back that has some information about the accel and gyro drifts (at least in my setup), as well as sample datasets if anyone’s interested in tinkering:
I’m still interested in making a little daughter board for the IMU 5DOF that contains some buffers to increase the speed at which the ADC can read the sensor outputs. Maybe this will help with some of the bias?