Hi there,
I am a novice to Wireless and RF use and could use a pointer or two.
I am using the 2.4GHz Development Nodes and want to program the device so that it can tell how far another device is from its current location. The second step would then be to use that distance to determine the direction of a third transmitter. My problem is that I can’t get my head around how to determine the first distance. I imagine this problem has probably been solved before; I just don’t quite know where to start.
Would it be something like sending a signal that says “I am going to send another signal X milliseconds after I send this one,” then sending it at that time and measuring the delay?
thanks for taking the time,
Peter Gardner
Measuring the delay as you describe will not work. Even if you had the equipment to measure the delay (about 3.3 ns per meter), it wouldn’t help: the delay in receiving the second message would be the same as the delay in receiving the first, so the difference tells you nothing about the distance.
You can get a rough idea of distance from signal strength, depending on what’s between the transmitter and receiver.
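To put a number on “rough”: below is a minimal sketch of the standard log-distance path-loss model. The reference power at 1 m and the path-loss exponent are placeholder values, not anything measured; you’d calibrate both for your own modules and environment, and even then expect large errors indoors.

```python
import math

def rssi_to_distance(rssi_dbm, rssi_at_1m=-40.0, path_loss_exp=2.5):
    """Rough distance estimate from received signal strength.

    Log-distance path-loss model: RSSI(d) = RSSI(1 m) - 10*n*log10(d).
    Both constants are placeholders: calibrate rssi_at_1m with a
    reading at a known 1 m separation, and fit the exponent n
    (~2 in free space, roughly 2.7-4 indoors) to your environment.
    """
    return 10 ** ((rssi_at_1m - rssi_dbm) / (10.0 * path_loss_exp))

# With these assumed constants, a -65 dBm reading maps to ~10 m.
print(f"{rssi_to_distance(-65.0):.1f} m")
```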
If the modules are far enough apart, you could use GPS. This probably wouldn’t work unless the modules are at least 50 meters apart.
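If you do go the GPS route, turning two fixes into a separation is just the haversine formula. The sketch below is generic (not tied to any particular GPS module), and the coordinates in the example are made up.

```python
import math

def gps_distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS fixes (haversine formula).

    Consumer GPS is only good to a few meters, which is why this
    approach breaks down when the modules are close together.
    """
    r = 6_371_000.0  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Two made-up fixes roughly 61 m apart
print(f"{gps_distance_m(40.0000, -105.0000, 40.0005, -105.0003):.0f} m")
```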
You could use ultrasound to measure distance (about 3 ms per meter at room temperature and sea level) if the devices are close enough and there aren’t any barriers between them.
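The ultrasound arithmetic is simple once you can timestamp the ping and its echo; a sketch, assuming a round-trip measurement (the hardware side, which isn’t shown here, is the hard part):

```python
def ultrasound_distance_m(round_trip_s, temp_c=20.0):
    """Distance from an ultrasonic ping-and-echo round-trip time.

    Speed of sound in air is roughly 331.3 + 0.606*T m/s, i.e. about
    343 m/s (~2.9 ms per meter one way) at 20 degrees C. Halve the
    round trip to get the one-way distance.
    """
    speed = 331.3 + 0.606 * temp_c
    return round_trip_s * speed / 2.0

# A 17.5 ms echo corresponds to roughly 3 m with these assumptions.
print(f"{ultrasound_distance_m(0.0175):.2f} m")
```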
Nope. Here’s a mini-tutorial (from my professional endeavors)
The wireless signal travels at the speed of light, about one nanosecond per foot. So a signal from A to B will take x ns if the distance is x ft.
There are three basic ways to measure location with wireless devices, none of which is simple enough for a hobby endeavor. The only way to measure distance independent of x,y location is “radar”, where you measure the round-trip time of an RF signal. For distances of less than thousands of feet, today’s systems measure location first.
Now the so-called ultra-wideband devices do a radar-like measurement at short ranges. But this is exotic and costly at this point in time. And FCC regulations are such that the range is perhaps 30 ft. line of sight.
So, on to measuring location - once you have that, range is simple trig…
- Time Difference Of Arrival (TDOA). Two receivers, A and B. One transmitter, which we’ll call C. C transmits some sort of mark-in-time pulse that the receivers can discern; sometimes this comes naturally from a transmitted spread-spectrum signal’s digital components. So A and B note the time of arrival of C’s signal. The time difference allows a hyperbola to be computed, provided A and B know exactly where one another are. Do this with another pair of receivers and you have intersecting hyperbolae, and thus the location of C (a toy TDOA solver is sketched after this list).
The trouble with this is that A and B have to measure their times with respect to some absolute time reference, and providing A and B with such a reference is the costly problem. Our GPS satellites do it with super-stable oscillators ($$$$$$) which are recalibrated by even more accurate time sources on the ground, given that a radar tells us exactly where the satellite is (because it moves in orbit). In TDOA systems, A and B are given a common time reference by wire or wireless, but it has to be accurate to a few nanoseconds to get accuracies of a few feet. This gets complicated and expensive; it can be done with cat-5 cable and precision circuitry. Also, multipath reflected signals wreak havoc with TDOA.
- Angle of Arrival (AOA). Each receiving site has two or more antennas and receivers. The antennas are separated by some known distance. Each receiver measures the time of arrival of an RF carrier’s wavefront at each antenna. The phase angle difference tells us the angle of arrival of the signal, if we know the frequency and the antenna spacing. (There is ambiguity if the angle exceeds 2 pi, which it can, but there are ways to deal with this.) Triangulation with two or more receiving sites yields the location of the transmitter. The circuitry to measure the phase angle is quite exotic (the trig itself is sketched after this list).
- RF Footprinting. This one is cheap and popular. The concept here is that receivers A, B and so on all note the signal strength of a transmitter C; no pulse precision is needed. A and B report this to a computer program via a network. The computer knows, from a prior survey of the environment in which A and B are located, how the signal strengths seen at A, B, and a third or fourth receiver compare with that original survey, and some statistics give an estimate of location. Now the rub: if the environment changes, the computer will make the wrong assumptions. Examples of changes to the surveyed environment: A or B moves; a wall is built or removed; the furniture changes to a great extent; etc. This technique doesn’t work well outdoors because of the inverse square law: signal strength decreases rapidly at first, but the change per unit of distance becomes small as distances get large. To wit, we can communicate with spacecraft zillions of miles away thanks to this law of physics. Indoors, walls and floors create far greater attenuation gradients, which is what lets this technique work, so long as the environment doesn’t change. Other gotchas include blockage of the transmitter’s signal, say, by your body’s water content (yes, at 2.4 GHz it is quite significant): your body wasn’t in the survey, so the signal strengths at some receivers are distorted (see the fingerprint-matching sketch after this list).
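To make the TDOA geometry concrete, here’s a toy solver. The receiver positions, the simulated transmitter, and the use of scipy’s least_squares are all assumptions for illustration; a real system would feed in measured time differences, with all their nanosecond-level error, instead of simulated ones.

```python
import numpy as np
from scipy.optimize import least_squares

C_LIGHT = 2.998e8  # speed of light, m/s

# Assumed receiver positions in meters; three receivers give the two
# independent pairs (two hyperbolae) needed for a 2-D fix.
receivers = np.array([[0.0, 0.0],
                      [50.0, 0.0],
                      [0.0, 50.0]])

def tdoa_residuals(xy, rx, dt_pairs):
    """Mismatch between measured and predicted range differences.

    Each (i, j, dt) entry says C's signal reached receiver i
    dt seconds after receiver j; each entry pins C to one hyperbola.
    """
    d = np.linalg.norm(rx - xy, axis=1)  # guess-to-receiver distances
    return [(d[i] - d[j]) - dt * C_LIGHT for (i, j, dt) in dt_pairs]

# Fake a transmitter at a known spot to generate "measured" deltas.
true_xy = np.array([30.0, 20.0])
dist = np.linalg.norm(receivers - true_xy, axis=1)
dt_pairs = [(1, 0, (dist[1] - dist[0]) / C_LIGHT),
            (2, 0, (dist[2] - dist[0]) / C_LIGHT)]

fix = least_squares(tdoa_residuals, x0=[10.0, 10.0],
                    args=(receivers, dt_pairs))
print(fix.x)  # recovers roughly [30, 20]
```

Note that a 3 ns timing error already moves each hyperbola by about a meter, which is why the shared time reference dominates the cost.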
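The AOA trig, once the phase difference has somehow been measured, is a one-liner; the spacing and frequency below are assumed values, and the exotic measurement circuitry is entirely sidestepped.

```python
import math

def angle_of_arrival(phase_diff_rad, antenna_spacing_m, freq_hz=2.4e9):
    """Bearing of a plane wave from the inter-antenna phase difference.

    delta_phi = 2*pi*d*sin(theta) / lambda, solved for theta.
    Spacing must be <= lambda/2 (about 6.2 cm at 2.4 GHz) or the
    2-pi ambiguity mentioned above appears.
    """
    wavelength = 2.998e8 / freq_hz
    s = phase_diff_rad * wavelength / (2 * math.pi * antenna_spacing_m)
    return math.degrees(math.asin(max(-1.0, min(1.0, s))))

# A 90-degree phase difference at half-wavelength spacing
# puts the transmitter 30 degrees off boresight.
half_wave = (2.998e8 / 2.4e9) / 2
print(f"{angle_of_arrival(math.pi / 2, half_wave):.1f} deg")
```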
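Finally, the matching step of RF footprinting reduces to a nearest-neighbor lookup against the survey. The survey table below is fabricated for illustration; real deployments survey a dense grid of spots and average many readings at each.

```python
import numpy as np

# Fabricated survey: RSSI (dBm) seen at receivers A, B, C from a
# transmitter standing at each surveyed (x, y) spot, in meters.
survey = {
    (0.0, 0.0): [-40.0, -70.0, -70.0],
    (5.0, 0.0): [-55.0, -58.0, -72.0],
    (0.0, 5.0): [-54.0, -73.0, -57.0],
    (5.0, 5.0): [-62.0, -61.0, -60.0],
}

def locate(live_rssi):
    """Pick the surveyed spot whose RSSI vector is closest (Euclidean)
    to the live reading. Averaging the k nearest spots gives smoother
    estimates; this is the simplest one-neighbor case."""
    live = np.asarray(live_rssi)
    return min(survey,
               key=lambda spot: np.linalg.norm(np.asarray(survey[spot]) - live))

print(locate([-60.0, -62.0, -61.0]))  # -> (5.0, 5.0)
```

And, per the caveats above, every entry in that table silently goes stale the moment the furniture moves.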
Well, there’s a quiz tomorrow.
Thank you for your replies and the physics lessons… I am consistently impressed with the helpfulness of people on this site.
If distance finding isn’t possible, I guess I’ll have to make do with direction finding, which seems far less baffling, though I’d still appreciate tips, existing code, or schematics if anyone has any.