High drift with Optical Tracking Odometry Sensor

I am using the SparkFun PAA5160E1 board and getting results with a very high level of drift. It's hard to tell whether this is the inertial compensation not working properly or something else. Is there any way to access the raw PAA5160E1 data without interference from the IMU, so I can evaluate the inertial compensation?

Also, the “QwiicOTOS” class seems to have lots of functions. Is there any way to find out what they do? I cannot find any kind of manual or user guide.

Did you do the calibration?
You might be looking for these files SparkFun_Qwiic_OTOS_Arduino_Library/src at main · sparkfun/SparkFun_Qwiic_OTOS_Arduino_Library · GitHub

Also: this thread relates to accuracy, but some of the tips might help here too SparkFun Optical Tracking Odometry Sensor (OTOS) Accuracy - #4 by SparkFro

Could you please elaborate on which measurement is drifting, and at what rate? For example, perhaps the heading is gradually changing by a few degrees per second? If so, ensure that you are running the IMU calibration in your code while the sensor is stationary and flat to the ground.

If you need more help, could you also please inform us which library you’re using? We have libraries for Arduino, Python, and Java, so just want to make sure we’re sharing the right code with you.

Thanks!

Hi. Yes, I have done the calibration, although I can see no indication of how long the device needs to be static for the calibration to work properly.

I had found the GitHub material before, but I am not a programmer, so none of it makes any sense to me. I am looking for something in a language I can understand.

Yes, I have done a calibration.

I did a test by moving the sensor around a rectangular track several times, always going to the same four corner points. The result was that the X, Y positions of the corners trended off, all in much the same direction but at different rates. The error even over a short period of time was considerable.

If I click the more info on the installed library then I get:
GitHub - sparkfun/SparkFun_Qwiic_OTOS_Arduino_Library: Arduino library for the SparkFun Optical Tracking Odometry Sensor

The IMU calibration can take up to 612ms, as explained in the calibration example.

Could you please provide numerical data for us? How big is your rectangular track? How far off are your measured coordinates? What do you mean by “considerable”? The optical sensor has a rated accuracy of 3-5%, so if your track length is 100 inches, for example, the measured location could be off by up to 5 inches. Without numeric data, it’s impossible for us to know whether the error you’re seeing is within tolerance or if some other problem is occurring.

Thank you!

Thanks for your reply.

Yes, I understand the calibration function takes a set number of samples. I have not set a number, so I assume it is taking the default of 255. What is not clear is why you may choose a different number of samples, or whether the device needs some time to settle before these samples are taken.

The rectangle I used was quite small: it was contained on a sheet of A4 paper, and I did not go to the edges. The issue is less the drift than the fact that it was trending off in a set direction. I have since sent the device, with its RedBoard, off to a colleague in Japan. He has done some more controlled tests using a robot arm to actuate the device in repeated straight lines, with similar results. If you could tell me how to upload data, I can send you my simple test results. The “upload” button only seems to accept images, and the data is in a text file.

With regard to accuracy, 3-5% is not really enough for our application. However, my colleague has significant experience in inertial measurement, and if we can understand the nature of the error, it may be possible to get the accuracy we need, for example by combining several sensors together.

This is why I am also keen to get hold of the raw data before it has been inertially compensated. Can you tell me if this is possible?

What is not clear is why you may choose a different number of samples

Some applications may require a faster calibration, so reducing the number of samples would help with that at the cost of less accurate calibration. It’s available as an option, but if accuracy is more important for your application, use the default of 255.

If you could tell me how to upload data

Yes, I believe this forum only supports image uploads. For raw data, I would suggest uploading the file to a cloud hosting service like Google Drive or Dropbox, then linking to it here (please ensure it has public access).

3-5% is not really enough for our application

3-5% is the rated accuracy of the optical sensor itself; however, there is also a calibration in the firmware that can reduce the error to <1%. That calibration has only been tested extensively on one type of surface (the foam tiles used in FIRST Tech Challenge). Some very limited testing has been done on other surfaces, and that calibration seems to improve the accuracy on other surfaces as well, but it may not be as effective, or may possibly make the accuracy worse than without the calibration.

This is why I am also keen to get hold of the raw data before it has been inertially compensated. Can you tell me if this is possible?

Sort of. The raw data from the PAA5160 is made available via the I2C registers; however, we are not allowed to provide any documentation about it. You’re welcome to read through the firmware to try to understand it. Also be aware that the raw data was made available only as a debugging tool for internal use, and the data changes every update cycle (2.4ms), so you may end up missing some data if you don’t poll fast enough. We unfortunately cannot provide any additional support for getting the raw data from the PAA5160.

Hi, thanks for this. I have zipped up the data file and added it to Adobe cloud so hopefully you will be able to get it from here:

The data is:
Elapsed time (ms),
X (mm), Y (mm), Heading (Deg),
X Velocity (mm/s), Y Velocity (mm/s), Rate of Rotation (Deg/s),
X Acceleration (mm/s²), Y Acceleration (mm/s²), Rotational Acceleration (Deg/s²)

All rounded to 1 mm.

With regard to the extra calibration, I assume you are referring to Example 3? If these were done on foam tiles, would the calibration hold for other surfaces, or does it need to be calibrated for each surface and then re-zeroed at start-up?

I suspect reading registers like that is beyond my coding ability.

Thank you for the data! I put it into a spreadsheet and plotted it:

I also plotted the tracked motion, and I can see the inconsistency you’re talking about:

Could you please share more about how this data was collected? Was the sensor mounted to the robot arm you mentioned, or was the sensor moved by hand? Given the variation in the heading during the first 2/3 of the data, it looks to me like the sensor was being moved by hand. If so, then poor accuracy is expected, because the sensor only performs well when rigidly mounted to something like a robot chassis.

Also, if you’re testing with a piece of paper, please tape it down to ensure the paper doesn’t move. The firmware runs a sensor fusion algorithm to combine the optical data with the accelerometer, so if the paper moves, that will also negatively affect the accuracy.

With regard to the extra calibration, I assume you are referring to Example 3?

No, sorry, different calibration. The IMU calibration is what’s shown in Example 3, but I was referring to the resolution calibration documented here. It’s not a simple process; it can take many hours to do properly.

Hi thanks again for your efforts.

You are correct, this is my simple hand-held test. I collected the data by marking a rectangle on squared paper, adding posts to the board to give it a fixed 10 mm stand-off, and then using it like a mouse to trace around the rectangle, with occasional short cuts across the middle for good measure. I deliberately rotated the board about halfway through. The paper was fixed to the desk.

This was not meant to be a rigorous test, just a quick look-see to make sure it was working. Nonetheless, the results were a surprise. I was not expecting to see that level of drift; I was expecting to see my rectangle with some scatter around the basic shape. I agree it would have been a surprise if it had produced the stated 3% to 5% with such an ad hoc test.

One concern was whether I was moving it too quickly; since the rectangle was small, the movements may have been too jerky for it.

Thanks for the link to the extra calibration method. I will have a good look through it. I will also see if I can get hold of some of the data collected with the robot. I posted the board and sensor to my colleague in Japan, who has the desktop robot, so he has the data.

Understood, thank you! Since you were testing by hand, the data you shared seems reasonable to me, and the amount of error is actually within the advertised tolerance.

Below is the first loop of your rectangle. When the sensor returned to the origin, the coordinates were (-5, -7), or about 8.6mm from the start location. I calculated the total path length to be 714mm, so that’s an error of 8.6 / 714 = 1.2%.

I looked at the other times when the sensor was returned to the starting location, and it was always around 1%, which is actually pretty good for the sensor given that it was being moved by hand.

A4 paper is 210mm x 297mm, and your data shows the tracked path to be about 150mm x 200mm, which makes sense if you were keeping away from the edges of the paper.

I would be curious to see what happens when your colleague tests the sensor with the robot arm. Assuming a high-quality arm, I would expect the accuracy to be a bit better than your test by hand. However the sensor can typically achieve around 0.5% accuracy at best, so I wouldn’t expect a huge improvement.

Based on all this, it seems to me that the sensor is performing as expected. Your data shows approximately 1% accuracy, which is what’s expected from this sensor. If that’s insufficient for your application, I completely understand! However that’s the limit of what this sensor is actually capable of, so if you need better accuracy, I would suggest looking into alternative solutions, such as encoders.

I hope this helps! Sorry if the accuracy is worse than you were hoping for, but that is within the advertised tolerance, so there does not appear to be any issue with the sensor itself.


Also, in case it helps, here’s a link to the spreadsheet I put together with your data to create those plots and calculate the error:

Thanks again for your help, and for the spreadsheet. As discussed I have obtained a sample of the robotic test data. The robot is a good quality industrial pick and place device. The person who did the test described it as follows:

The instructions given to the robot to move the odometer were pretty simple. I had the robot start from one position, wait 10 minutes, then move 100mm in (roughly) the odometer’s x axis. The robot waits 10 minutes, then continues along the same line another 100mm, and so on, up to a total of 500mm travelled, after which it moves back linearly to the starting position.

The graph shows the odometer data for this movement. The red x on the right is the starting point. As you can observe, the position data shows 5 straight lines with sharp angles between them. The straight paths correspond to the 100mm movements, and the corners are positions where the robot was instructed to wait for 10 minutes. The final long straight line corresponds to the 500mm movement back to the start.

Ideally, the position data should be a straight line. The data presented here indicates a drift in the rotation angle measurement of the odometer, as the expected straight line is instead curved, and furthermore, the end point is not identical to the start point. Since the corners are strongly defined, there doesn’t appear to be any significant drift in the x or y measurements over the course of 10 minutes. (This was also tested and verified by measuring odometer data for 10 minutes without moving it at all.) Furthermore, from the figure and the rotational data from the corresponding log, the odometer drift appears not to be fixed but to slowly increase.

The base data for this is here:

Looking at the data, I tend to agree that the basic device is probably doing what it’s supposed to; the issue looks like drift in the gyro.

Looking at the calibration process, this is an option for us, and we will have a look to see how we can do it using an appropriate rail-like surface.

Thank you for sharing that data as well!

Your colleague is correct that the root problem is the heading angle drifting. By the end of the test, the angle was off by about 45 degrees:

The data indicates this test took place over the course of about 1.3 hours. The gyroscope appears to have an accurate angle for over 15 minutes before the drift became significant. The highest rotation rate appeared to get up to about 0.025dps. The IMU datasheet specifies the angular rate change versus temperature to be 0.01dps/C, so it’s possible that behavior is explained by the room temperature changing by a few degrees. Based on my experience with this particular IMU, that performance seems fairly typical, though I admittedly haven’t tested the drift performance extensively.

The firmware simply integrates the angular rotation rate measured by the onboard IMU’s gyroscope. All gyroscopes will drift over a long period of time, so this behavior is expected (see bias instability). The IMU calibration simply measures and compensates for the bias/offset at the time when the calibration is run, but it will not compensate for long-term drift, so I don’t think this is something you’d be able to address with better IMU calibration. Some IMUs include a magnetometer to compensate for this drift, but the Optical Odometry Sensor does not include a magnetometer since it was only designed to operate for a few minutes at a time, not hours.

Are you able to share what your final application is for the sensor? For how long does it need to track its motion? How accurate do you need the tracking to be? What kind of environment is it operating in? What will it be mounted to, and on what surface will it be moving over? Is the sensor expected to rotate during operation, or maintain a single orientation? Do you have the ability to add other sensors? I might have some ideas to help, but I want to better understand your application to make the best recommendations.


Thanks again. I agree we need to take some steps to compensate for the gyro drift.

The application is a railway push trolley, for which we want a fully non-contacting measurement system. It will operate outdoors, summer and winter, and perform a range of measurements. These need a good measurement of velocity (about 1%) and a similarly accurate step distance between samples. We don’t need absolute distance offsets. We do intend to put more than one sensor on the trolley.

It looks like this may be a viable solution if we do the detailed calibration and use more than one sensor. I assume we can use the Qwiic connector to chain them?

Ah, so something like this? Handcar - Wikipedia Very interesting!

In that case, here’s something to try: disable the gyroscope, and maybe the accelerometer too. By default, the Optical Odometry Sensor performs sensor fusion between the optical data and IMU, however I think disabling the IMU data will help in your application for a couple reasons:

  1. You’re confined to a single degree of freedom by the rails, so there’s no need for the rotation data, and that eliminates the drift problem.
  2. If your track is not perfectly flat (eg. going up and down hills), the accelerometer will incorrectly measure an acceleration that is actually just gravity.

Here’s Arduino code to disable the gyroscope and accelerometer. Add it to the setup() function after calling myOtos.begin(). I haven’t tested this myself, but I think it should work.

sfe_otos_signal_process_config_t config;
myOtos.getSignalProcessConfig(&config); // Get current config
config.enRot = false; // Disable gyroscope rotation data
config.enAcc = false; // Disable accelerometer data
myOtos.setSignalProcessConfig(&config); // Set new config

// If the gyroscope and accelerometer are disabled, you can also remove this line:
// myOtos.calibrateImu();

A couple things to note:

  1. The optical sensor is only capable of tracking up to 2.5m/s, so it could fail to measure properly above that speed.
  2. I haven’t done much testing with the accelerometer disabled, so I’m not sure how the accuracy will be affected. The firmware runs a Kalman filter, which is tuned differently with the accelerometer enabled or disabled. With it enabled, <1% accuracy is achievable. With it disabled, I don’t know what accuracy to expect.
  3. The position wraps at +/-10m, so if your track is longer than 10m, the value will wrap around to -10m. You will need to modify your code to unwrap the measurement.

I’d be curious to have your colleague test again with this new setting to see how it performs. I would suggest doing 2 tests, one with the accelerometer enabled, and one with it disabled, to see how the accuracy compares.

Hope this helps!


Hi, thanks again. Technology has come on a bit since those handcars, but basically that sort of thing. We will give it a go with the simplified approach and see how we get on. I am happy to share the results. Since we are wandering off topic, perhaps we should carry on this conversation offline? The community guidance says to avoid personal email addresses to stop bots finding them, so you can contact me as follows:

I am working for Patko Co. (this site is in Japanese only but auto-translate makes a reasonable fist of it)
株式会社パトコー

Or you can contact me through LinkedIn:
David Thompson | LinkedIn

Sounds good! Please feel free to post your results here, I am interested to know how it performs with those changes.

No worries about wandering a bit, I think it’s still relevant to discuss your application here since it’s related to the original question. Though you’re also welcome to make a new post with the Community tag if you want to highlight the stuff you’re doing!

Thanks!
