Here is a theory I would like to test out; please tell me if it is plausible.
This supposes that there are no obstacles and that the atmosphere causes minimal interference.
The setup: a stationary cellphone connects to the SMiRF, which is also stationary. The time it took for the SMiRF to detect the cellphone is recorded; this is essentially the time it took for the Bluetooth signal to reach the cellphone (there is probably some offset due to processing time and so on).
time it took = t
The frequency is known to be 2.4 GHz.
So to find the distance:
c is the speed of light
c = fλ
λ = c/f
T (period) = 1/f
t/T = number of waves
(t/T) × λ = distance from the device
Am I totally missing something here? I know that using pure math is not going to guarantee perfect results. What factors could cause errors in this approach? Any discussion is appreciated!
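To sanity-check the algebra: (t/T) × λ = t × f × (c/f) = c × t, so the whole calculation reduces to distance = c × t. Here is a minimal Python sketch of that arithmetic; the one-microsecond travel time is made up purely for illustration, not measured from a real SMiRF.

```python
# Sketch of the proposed calculation. The travel time t is a made-up
# value for illustration, not something measured from a real device.
c = 299_792_458.0       # speed of light, m/s
f = 2.4e9               # Bluetooth carrier frequency, Hz
wavelength = c / f      # lambda = c / f, about 0.125 m
T = 1.0 / f             # period of one cycle, s

t = 1e-6                # hypothetical one-way travel time: 1 microsecond

n_waves = t / T                   # number of full cycles during t
distance = n_waves * wavelength   # (t / T) * lambda

print(distance)   # 299.792458 m
print(c * t)      # same number: the formula reduces to d = c * t
```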
The time it takes for the signal-detection scheme to process the incoming signal, combined with the speed of light, makes this a challenging project.
Radio waves travel at 299,792,458 meters per second, or around 299.8 meters per microsecond. Every microsecond of processing overhead during the detection phase will skew your distance measurement by that amount. If the time to detect the signal varies (and it most likely will), your results will be completely unreliable.
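To put rough numbers on that, here is a small Python sketch; the latency values are arbitrary examples, not measurements of any particular detection scheme.

```python
# How timing uncertainty translates into distance error (illustrative values).
c = 299_792_458.0  # speed of light, m/s

for latency_s in (1e-6, 100e-9, 10e-9, 1e-9):
    error_m = c * latency_s
    print(f"{latency_s * 1e9:7.1f} ns of timing error -> {error_m:8.3f} m of distance error")
```

In other words, meter-level accuracy needs the detection instant pinned down to a few nanoseconds.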
The problem that has to be addressed is synchronizing the clocks of the transmitter and the receiver. They have to agree very closely if you ever hope to get this working. Short of an atomic clock, I don’t know how you are going to get those clocks synchronized and keep them locked together over time. That is why GPS satellites use cesium-based atomic clocks.
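As a back-of-the-envelope illustration of why free-running clocks won’t stay locked, suppose the two crystals disagree by 20 ppm (an assumed, typical cheap-crystal tolerance, not a figure from any datasheet):

```python
# Accumulated error between two free-running clocks that differ by 20 ppm
# (an assumed, typical crystal tolerance), expressed as apparent distance.
c = 299_792_458.0   # speed of light, m/s
drift_ppm = 20.0    # assumed relative frequency error between the clocks

for elapsed_s in (0.001, 0.1, 1.0, 60.0):
    offset_s = elapsed_s * drift_ppm * 1e-6       # clock disagreement after elapsed_s
    print(f"after {elapsed_s:7.3f} s the clocks differ by {offset_s * 1e6:9.3f} us, "
          f"i.e. {c * offset_s:12.1f} m of apparent distance")
```

Within a fraction of a second the drift alone swamps the measurement, which is the same reason GPS reaches for atomic clocks.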
One might be able to put a high-precision clock on the transmitter and have the receiver immediately return the signal; the transmitter could then calculate time of flight from that information. The key thing is that the receiver must echo the signal back as soon as it gets it (or a fixed amount of time after it gets it). If the distances are small (under 300 meters or so), you will need extremely fast processing. However, it can be done. Google “time domain reflectometers”. TDRs are instruments used to test cables by measuring the impedance at various points in a transmission line, and they have the advantage of measuring time of flight against a single internal clock, so there is no synchronization problem.
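A sketch of that round-trip idea, which only ever uses the transmitter’s clock; both times below are hypothetical values chosen to illustrate the arithmetic.

```python
# Two-way (echo) ranging: only the transmitter's clock is involved.
# Both times below are hypothetical, chosen only to illustrate the arithmetic.
c = 299_792_458.0      # speed of light, m/s

t_round = 2.1e-6       # transmit -> echo received, measured on one clock
t_turnaround = 2.0e-6  # known, fixed delay inside the echoing device

one_way_time = (t_round - t_turnaround) / 2.0
distance = c * one_way_time
print(distance)        # roughly 15 m for these made-up numbers
```

The catch is the same as above: the turnaround delay has to be known and stable to nanoseconds, otherwise the error just moves from clock synchronization into the echo path.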
lzhang:
Here is a theory I would like to test out; please tell me if it is plausible.
Read about:
multilateration
time difference of arrival (TDOA) in RF
RF footprinting, i.e., location estimates that use surveyed, a priori knowledge of the overlapping coverage of access devices
I’ve worked with all of these, and there are quite a few commercial products for tracking things like children in an amusement park, people and equipment in a hospital or warehouse, and so on. RF footprinting is the most practical method. Multipath bedevils the other methods, and they are costly, needing DSPs and clever ways to get sub-microsecond common time references.
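For a flavor of how RF footprinting works: survey the received signal strength from several access devices at known locations ahead of time, then match a live reading against that table. A toy Python sketch, where all the RSSI values, location names, and access point labels are invented for illustration:

```python
# Toy RF footprinting: match a live RSSI reading against a surveyed map.
# Every number and name here is invented purely for illustration.
survey = {
    "lobby":     {"ap1": -45, "ap2": -70, "ap3": -80},   # dBm at surveyed spots
    "warehouse": {"ap1": -75, "ap2": -50, "ap3": -65},
    "loading":   {"ap1": -85, "ap2": -68, "ap3": -48},
}

def locate(reading):
    """Return the surveyed spot whose fingerprint best matches the live reading."""
    def mismatch(fingerprint):
        return sum((fingerprint[ap] - reading.get(ap, -100)) ** 2 for ap in fingerprint)
    return min(survey, key=lambda spot: mismatch(survey[spot]))

print(locate({"ap1": -47, "ap2": -72, "ap3": -79}))   # -> lobby
```

It sidesteps the timing problem entirely by using signal strength rather than time of flight, but the accuracy is only as good as the survey grid and how stable the RF environment stays.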