Weak Signal with XBee Pros

Hello All,

I am having the worst luck getting my XBees to establish a reliable connection. I'm using two XBee Pro S1 modules: one is connected to my computer with a SparkFun XBee Explorer Dongle, and the other is attached to an Arduino using a SparkFun XBee Shield, powered by an external 7.4 V 1300 mAh battery. I've gone through all of SparkFun's XBee tutorial. I can get them to connect, but the signal is very weak. I'm supposed to be getting a range of 1 mile line of sight, but I'm only getting maybe 100 feet. I've been looking all over the internet for a solution for the past two days but can't find any reason for such a low signal range. In XCTU's networking tab it says that the signal quality is unknown. Does anyone have any ideas?

I've used many XBee Pro S1 modules, and I am an RF engineer.

One mile line of sight is an unrealistic goal at 2.4 GHz with the simple wire or on-PCB antennas, even with the Pro's 60 mW or so.

SparkFun probably doesn't have RF engineers and takes the marketing range numbers literally.

To get a mile line of sight, you'd need a Yagi on at least one end of the link.

Do you have the XBee S1s at max TX power via the AT config (PL set to 4, saved with WR)?
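If you'd rather check that from a script than through XCTU, here is a minimal pyserial sketch; the port name and 9600 baud are assumptions for a stock Explorer dongle, PL is the 802.15.4 power-level parameter (4 = maximum), and WR saves it to non-volatile memory:

```python
import time
import serial  # pyserial

# Assumed port and baud rate; adjust for your Explorer dongle.
with serial.Serial("/dev/ttyUSB0", 9600, timeout=2) as xbee:
    time.sleep(1.1)            # guard time of silence before the escape sequence
    xbee.write(b"+++")         # enter AT command mode (no carriage return)
    time.sleep(1.1)            # guard time of silence after
    print(xbee.read(3))        # expect b"OK\r"

    xbee.write(b"ATPL\r")      # query current power level (0-4)
    print(xbee.read(8))

    xbee.write(b"ATPL4\r")     # set maximum TX power
    xbee.write(b"ATWR\r")      # write the setting to non-volatile memory
    xbee.write(b"ATCN\r")      # exit command mode
```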

What antennas do you have?

The 1-mile figure also assumes 100% clear line of sight, and that you orient the antenna pattern to favor the direction to the other unit.

Assume an on-PCB or small wire antenna. The on-PCB antenna has about -4 dBi of gain (a loss). The wire antenna has about 2 dBi of gain in the directions perpendicular to the wire; off the tip of the wire the pattern loss is large, so avoid that orientation.

One quick calculation from the laws of physics:

TX power: 60 mW, about 18 dBm (if the XBee is configured for it)

Antenna gain at both ends of the link: 0 dBi (a bit optimistic for PCB antennas)

Assume neither end has a gain antenna.

Assume neither end has a length of coax, with its high losses at 2.4 GHz.

Path length: 1 mile

Frequency: 2.4 GHz

Line-of-sight path loss: about 104 dB. Big.

Calculating received signal strength:

TX radiated power (TX power + antenna gain) = 18 dBm

So we have 18 + 0 - 104 + 0 = -86 dBm received signal strength.
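Here is that budget as a small Python sketch, using the standard free-space path loss formula and the assumed numbers above (60 mW TX, 0 dBi antennas, 1 mile, 2.4 GHz):

```python
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB: 20 * log10(4 * pi * d / wavelength)."""
    wavelength_m = 299_792_458.0 / freq_hz
    return 20.0 * math.log10(4.0 * math.pi * distance_m / wavelength_m)

tx_power_dbm = 10.0 * math.log10(60.0)   # 60 mW is roughly 17.8 dBm
tx_gain_dbi = 0.0                        # no gain antenna assumed on either end
rx_gain_dbi = 0.0
path_loss_db = fspl_db(1609.34, 2.4e9)   # 1 mile at 2.4 GHz, about 104 dB

rx_dbm = tx_power_dbm + tx_gain_dbi - path_loss_db + rx_gain_dbi
print(f"Path loss: {path_loss_db:.1f} dB")
print(f"Received signal strength: {rx_dbm:.1f} dBm")
```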

The XBees, per IEEE 802.15.4, have a single modulation rate (unlike 802.11), so the required signal strength doesn't vary.

The XBee Pro receivers might yield a reasonable frame error rate at about -90 dBm, or a bit lower if you dare, say -95 dBm.

So at roughly -86 dBm received, you have only a few dB of margin. Very marginal.

Vendors tend to spec receiver sensitivity without a frame or bit error rate context, or they state it at 1% FER (frame error rate).

In the real world, you need to allow 10 dB or more of excess received signal strength to accommodate fading.
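Putting the rough numbers together (the -90 dBm threshold and 10 dB fade allowance are the ballpark figures above, not datasheet values):

```python
received_dbm = -86.0     # from the link budget sketch above
required_dbm = -90.0     # rough level for a usable frame error rate, assumed above
fade_margin_db = 10.0    # allowance for real-world fading

margin_db = received_dbm - (required_dbm + fade_margin_db)
print(f"Margin after fade allowance: {margin_db:.1f} dB")   # negative means an unreliable link
```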

A 1-mile path with real line of sight might come from tall buildings, hills, and so on, but not over flat ground.

A 15 dBi gain antenna on one end would help a lot.

There is one more loss to consider: Fresnel zone loss. This is the loss due to inadequate antenna elevation for a given path length and the terrain in the path.

For a line-of-sight path with no terrain, trees, etc. in the way, you need about 32 ft of antenna elevation at each end, or much more on one end only, to avoid losses due to Fresnel zone impingement. Envision a football-shaped oval between the antennas. If the antennas are too low, part of this oval, near its center, touches the earth to some extent, and any lack of 100% clearance of the oval shows up as additional path loss. The Fresnel zone effects are quite frequency dependent.

Google Fresnel Zone and read a bit on this.
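If you want to put a number on that oval for this link, here is the standard first Fresnel zone radius formula as a sketch; the 1 mile and 2.4 GHz inputs are just the figures from this thread, with flat terrain assumed:

```python
import math

def fresnel_radius_m(d1_m: float, d2_m: float, freq_hz: float, zone: int = 1) -> float:
    """Radius of the nth Fresnel zone at a point d1 from one antenna and d2 from the other."""
    wavelength_m = 299_792_458.0 / freq_hz
    return math.sqrt(zone * wavelength_m * d1_m * d2_m / (d1_m + d2_m))

path_m = 1609.34                 # 1 mile
mid_m = path_m / 2.0
r1 = fresnel_radius_m(mid_m, mid_m, 2.4e9)
print(f"First Fresnel zone radius at mid-path: {r1:.1f} m ({r1 * 3.281:.0f} ft)")
# A common rule of thumb is to keep at least ~60% of that radius clear of obstructions.
```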

Using the 900 MHz band, the path loss is about 10 dB less (lower frequency), but antenna gain is harder to get at the lower frequency because the antennas have to be physically larger.
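The difference comes straight out of the same free-space formula (only the frequency term changes; the 1-mile path is assumed as above):

```python
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    wavelength_m = 299_792_458.0 / freq_hz
    return 20.0 * math.log10(4.0 * math.pi * distance_m / wavelength_m)

mile_m = 1609.34
delta_db = fspl_db(mile_m, 2.4e9) - fspl_db(mile_m, 900e6)
print(f"Extra path loss at 2.4 GHz vs 900 MHz over 1 mile: {delta_db:.1f} dB")
# Works out to roughly 8.5 dB, in the ballpark of the "about 10 dB" above.
```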