Bluetooth RSSI

For my project, I need to determine whether I am getting closer to or farther from a signal source. The signal would need to be detected through walls, from one room to another. I don’t have RF experience. I thought the best solution might be to interface a Bluetooth module with a microcontroller, and then have another microcontroller with Bluetooth RSSI capability that would indicate the relative proximity to the first module. I have some questions:

  1. Would Bluetooth be much more reliable in this scenario than a radio like the XBee? Less interference? Would WiFi operating at the same location possibly interfere with a signal-strength reading?

  2. Many SparkFun Bluetooth modules are about $65, but the Bluetooth SMD Module - RN-42 is only $15. Why is it so much cheaper, and would it be suitable for an application in which you only want to know signal strength?

  3. The biggest problem is probably figuring out how to create a Bluetooth RSSI indicator. Any advice? Would it be necessary to have another Bluetooth module to find the RSSI? I have only seen general RF detectors for sale that you could interface to an ADC, which would output a voltage proportional to signal strength. I don’t know how well this would work, or whether there would be interference from other signals.

Thank you. :wink:

RSSI is a poor indication of distance, especially for indoor use.

Most of these kinds of radios provide RSSI for the last received data frame, so it’s essentially independent of interference.
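Since each frame’s RSSI reading bounces around a lot indoors, a “getting closer or farther” indicator usually needs some smoothing before you compare readings. A minimal sketch in Python (the smoothing factor and the 2 dB threshold are arbitrary values I picked for illustration, not anything from a datasheet):

```python
class RssiTrend:
    """Smooth noisy per-frame RSSI with an exponential moving average,
    then compare the current average against an earlier reference to
    guess whether the source is getting closer or farther."""

    def __init__(self, alpha=0.1, threshold_db=2.0):
        self.alpha = alpha                # EMA smoothing factor (0..1)
        self.threshold_db = threshold_db  # ignore swings smaller than this
        self.ema = None
        self.reference = None

    def update(self, rssi_dbm):
        """Feed in the RSSI of each received frame (in dBm)."""
        if self.ema is None:
            self.ema = float(rssi_dbm)
            self.reference = float(rssi_dbm)
        else:
            self.ema = self.alpha * rssi_dbm + (1 - self.alpha) * self.ema
        return self.ema

    def trend(self):
        """'closer' if the smoothed RSSI rose by more than the threshold
        since the last call, 'farther' if it fell, else 'unclear'."""
        delta = self.ema - self.reference
        self.reference = self.ema
        if delta > self.threshold_db:
            return "closer"
        if delta < -self.threshold_db:
            return "farther"
        return "unclear"
```

You would call `update()` once per received frame and poll `trend()` every second or so; the dead band between the two thresholds is what keeps multipath flicker from flipping the answer back and forth.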

The XBee, for example, provides RSSI as a data byte in various messages and control info, and also as a pulse-width-modulated output pin.
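To illustrate the “RSSI as a data byte” part: in the XBee’s API mode, a received-data frame (API ID 0x81 on the Series 1 with 16-bit addressing) carries the RSSI of that very packet as one byte, stored as the magnitude of the dBm value. A sketch of pulling it out in Python, with offsets per my reading of the Digi documentation (verify against your module’s manual before trusting it):

```python
def parse_xbee_rx16(frame: bytes):
    """Parse an XBee Series 1 'RX Packet: 16-bit address' API frame
    (API ID 0x81) and return (source_addr, rssi_dbm, payload).

    Frame layout: 0x7E, length MSB, length LSB, frame data, checksum.
    Frame data:   API ID, addr MSB, addr LSB, RSSI, options, RF data.
    """
    if frame[0] != 0x7E:
        raise ValueError("missing start delimiter")
    length = (frame[1] << 8) | frame[2]
    data = frame[3:3 + length]
    checksum = frame[3 + length]
    # All frame-data bytes plus the checksum should sum to 0xFF (mod 256)
    if (sum(data) + checksum) & 0xFF != 0xFF:
        raise ValueError("bad checksum")
    if data[0] != 0x81:
        raise ValueError("not an RX16 frame")
    source = (data[1] << 8) | data[2]
    rssi_dbm = -data[3]        # byte holds |dBm|, e.g. 0x28 -> -40 dBm
    payload = bytes(data[5:])  # skip the options byte
    return source, rssi_dbm, payload
```

So you don’t need a separate RF detector on an ADC: every packet the receiving XBee hands you already tells you how strong that packet was.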

Why is RSSI a poor indication of distance? Any better ideas?

saymyname:
Why is RSSI a poor indication of distance? Any better ideas?

First, the attenuation through walls and such far outweighs the free-space loss. Unless the transmitter/receiver pair are always going through the same walls, whose loss you’ve calibrated, estimating distance with RSSI is going to have huge error bars.

Second, in your environment the multipath will create large variations in RSSI, and thus large errors in the distance estimates.

Third, cross-polarization of the antennas, which probably will not be kept in the same alignment over time (the receiver is moving), will cause some unpredictable variation in RSSI. And that’s assuming you’ve used some broadly omnidirectional type of antenna, not something with much gain; antenna gain and differing “pointing alignment” can have a large effect on RSSI.

Then on the practical side there are transmitter output power variations (which could in theory be calibrated out) and gain variations at the receiver end (same story).

So in a free-space environment with calibrated transmitters and receivers, you could make it work. In your environment and in practice, the errors in your estimate would render it mostly useless, unless all you wanted was a “close by” or “far away” indication with a large in-between zone of “I can’t tell”.
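To put numbers on those error bars: the usual log-distance path-loss model estimates distance as d_ref · 10^((RSSI_ref − RSSI)/(10·n)). A quick Python sketch (the −40 dBm reference at 1 m and path-loss exponent n = 3 are made-up indoor-ish values for illustration, not measurements):

```python
def distance_from_rssi(rssi_dbm, rssi_ref_dbm=-40.0, n=3.0, d_ref_m=1.0):
    """Log-distance path-loss model: estimated distance in metres.
    rssi_ref_dbm is the RSSI measured at d_ref_m; n is the path-loss
    exponent (~2 in free space, roughly 3-4 indoors)."""
    return d_ref_m * 10 ** ((rssi_ref_dbm - rssi_dbm) / (10 * n))

# A 6 dB multipath fade (easily seen indoors) on a nominal -60 dBm
# reading shifts the distance estimate by nearly 60%:
d_nominal = distance_from_rssi(-60)   # ~4.6 m
d_faded = distance_from_rssi(-66)     # ~7.4 m
```

With just a 6 dB swing the estimate moves from roughly 4.6 m to 7.4 m, and a single interior wall can cost far more than 6 dB, which is why “close by / far away / can’t tell” is about the best you can extract.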