IMU for Sign Language (Data) Gloves

Hello, I am attempting to use a combination of an IMU and a pressure sensor (“Qwiic Pressure Sensor - LPS25HB”) for data gloves, to capture hand and arm movements in space. The goal is essentially budget-constrained motion capture for the hands and upper body.

I am looking at two IMUs: the “SparkFun VR IMU Breakout - BNO080 (Qwiic)” and the “SparkFun 9DoF Razor IMU M0”.

I watched some videos https://youtu.be/tFtgKl-UYNg and https://youtu.be/EsgKAawwT9A?t=455.

In this video (https://youtu.be/EsgKAawwT9A?t=455), Ladyada states that the BNO055 (now superseded by the BNO080) uses sensor fusion to give you quaternions and Euler angles and “just gives you the vector,” so you don’t need to do the math. Is this true of any of these IMUs, or is this really something different?

In order of importance

  • I would like to limit or eliminate the math.

  • I would like to limit or eliminate calibrations.

  • I would really like to utilize the Qwiic system to limit soldering.

  • I think that if I utilize the “SparkFun Qwiic Adapter” on a breakout IMU board, I can achieve Qwiic results.

    I am playing catch-up on DIY (30+ years removed from my electronics background). What might be the best choice, or something else altogether?

    Thank you for your assistance.

    Hi MrEdwards,

    This sounds like a really neat but fairly ambitious project to take on with IMUs and other sensors to track an object in space. Getting rotational vectors and angles from an IMU works really well for monitoring a static object's orientation, like that video shows, but once you throw in actually moving an object through space, it gets very tricky very quickly.

    If you want to start with just orientation and motion detection, the BNO080 Qwiic Breakout will work well, as it handles a lot of the math for you in regard to Euler angles and quaternions for monitoring orientation and heading. We have an Arduino Library available to get you started with code, and the Hookup Guide goes over the hardware and the functions in the library. You will need a microcontroller like an Arduino to run the code for the BNO080. If you want to stick with the Qwiic system, the SparkFun RedBoard Qwiic is a good choice since it already has a Qwiic connector on the board.

    The pressure sensor you mention will not really work too well for this since it is not measuring the pressure applied by, say, a finger, but instead is measuring atmospheric pressure. If you are trying to measure the pressure applied by a finger or hand, a force sensitive resistor could work. The Force Sensitive Resistor Hookup Guide covers the basics of those sensors and has some example code as well.

    We also have the Qwiic Flex Glove Controller, which has two flex sensors mounted to it, so you can fairly easily integrate this type of sensor into a glove or other wearable application. Just like the previous products, it has a Hookup Guide to get you started with the basics.

    The one portion of this project that I do not think we really have a good solution for is tracking an object in space. An IMU does a great job of handling orientation and detecting movement, but it does not work well for tracking an object’s position in 3D space. For example, most modern VR controllers use a combination of an IMU and IR LEDs paired with a camera or other visual sensor to track the orientation and position of the controller.

    With the right code, you could potentially create “gestures” that are defined by very specific movements from a standstill, but tracking something like an arm moving from point A to point B is very difficult, since the IMU only returns values for acceleration, heading, and orientation. If you want to get started with creating “gestures,” I would recommend playing around with the Special Functions for the BNO080. We cover those in more detail in the Hookup Guide linked above.

    I hope this information helps you decide on a direction to go with this project and if SparkFun has the right parts to get started. Let us know if you have any follow-up questions and we would be happy to help as much as we can.

    This sounds like a really neat but fairly ambitious project to take on with IMUs and other sensors to track an object in space.

    Yes, it’s a little ambitious, but attainable. I’ve seen online where others have done what I’m trying to do; they just haven’t documented how, so I’m inferring from the information that is available.

    I believe I have settled on the “SparkFun 9DoF Razor IMU M0” because it gives you information in quaternions and does not appear to require a lot of programming to get what you want. In addition, I don’t see that it requires a calibration procedure other than allowing the device to remain still.

    … but instead is measuring atmospheric pressure.

    Yes, this is exactly what I am looking for. I originally thought that an IMU would give me X, Y, and Z data, but nothing in my research shows that I can get Z, or that I can get Z relative to the ground plane. So my hope is that I can use the barometer to calibrate relative to my ground surface; then I know the height of the sensor by the air pressure where it is located.

    I will use the Qwiic system, but I will need the “SparkFun Qwiic Adapter” to connect the Razor IMU to my other devices.

    I plan to send the data from my barometer and IMU to the Adafruit LE 32u4 to be transmitted via Bluetooth. I’ve been reading and watching videos, but one thing I have yet to determine is how you shuttle the information from the IMU and barometer to the processor. I can find information about connecting via I2C, but how do you get the data from one device to another?

    Thank you for all of your help.

    In order of importance

    I would like to limit or eliminate the math.

    I would like to limit or eliminate calibrations.

    I would really like to utilize the Qwiic system to limit soldering.

    Limit math. Limit calibrations. No soldering. What are you, a humanities major? Sorry, my dickishness takes over on Friday nights.

    No, but in all seriousness: are you looking to do broad hand gestures, or actual sign language? I know we’re trying to limit the math involved, but I feel like you’re going to get much more accurate sign language input if you set up multiple force sensitive resistors at each joint of the hand. You can use sensors similar to these:

    https://www.sparkfun.com/products/10264

    or make your own out of different conductive materials. Personally, I think that making FSRs that go over each part of the finger joint is going to give much more definitive resolution for finger movement, which, from what I understand of ASL, is pretty important in addition to the gesture in general.

    If I were trying to build this out, not only would I include a 9dof, but I would forgo the height sensor and go with a microcontroller set up with 10-15 force sensitive resistors sewn throughout the fingers and thumb of the glove, and set up a simple test curve for the FSRs based on an “average” (~my~) hand. I know this would involve more math, but I am confused as to how we’re going to interpret sign gestures when all we’re seeing is relative position of a 9DOF board - not anything else.

    I am looking to do actual sign language. Due to budget constraints, I am going down the road of using Velostat. I plan to start with 5 finger sensors (gross measurement) and work my way up to 14 sensors per hand (precision measurement). The 14 sensors will be two sensors per finger (+10) and the areas between fingers as well as the thumb (+4).

    If I were trying to build this out, not only would I include a 9dof, but I would forgo the height sensor and…

    When I noted that the goal was sign language, I meant more than finger-spelling. Knowing where the hands are in space will require more than a 9dof. After I get the finger positions, I’ll need to know where the hand is located relative to the rest of the body. This will include the elbow and shoulder. I am starting with the hands. There are many similar signs, and where the hand is located at the time makes a difference (check out the ASL signs for father, grandfather, mother, and grandmother).

    I know this would involve more math, but I am confused as to how we’re going to interpret sign gestures when all we’re seeing is relative position of a 9DOF board - not anything else.

    You are right. I should have added more information to make it clear how I planned to get from A to Z.

    Yes, I would like to limit the math and the calibrations. If the sensor/processor can do most if not all of the heavy lifting, I am fine with that. If I must get chest-deep in math, then I’ll do that (with assistance). The calibrations worried me a lot, because reading the specs on the BNO055 suggested frequent calibrations would be necessary. I wasn’t entirely clear on calibration for the BNO085, but having to calibrate whenever you leave the room (change locations), and the elaborate method for calibrating, did not sit well with me.

    In my quest for brevity, I left out information that may have been helpful to others.

    I appreciate your ideas and help. They were very helpful and confirm that the direction I decided on is a good one.

    Thank you