Newbie to XBee, few simple Questions

So I am using the ID-12 to read RFID tags. I need to send the tag ID wirelessly to a board across the room (a few feet away). I had originally tried using the RF-Link ASK (http://www.sparkfun.com/commerce/produc … ts_id=7816) transmitter/receiver pair. However, it does not like any of the TTL output I feed it from a PIC (not directly from the ID-12, because of baud-rate issues) and seems to have tons of noise.

To get around this problem I was looking at the XBee module, because I am outputting UART data from the ID-12 and the popular XBee looks like it will fit my project perfectly. I was just wondering: do I need to buy an expensive board to program this thing, or if I just buy the XBee 1mW Chip Antenna (http://www.sparkfun.com/commerce/produc … ts_id=8664) will it work for point-to-point UART transmission like I need? The datasheet is a bit fuzzy about how to program it and mentions a “professional developers kit”. However, at the beginning of the spec sheet it says that by default the XBee module works as a UART wire replacement. So for my project I can just use this module without having to buy any programming boards, correct?

Thanks, -WK

The XBee Series 1 modules work point-to-point without any programming. I use them in a project and they seem pretty solid. The developer kit includes PC interface boards (USB and serial), which are useful when programming custom applications but not required to use the modules.

Fortunately for you, the ID-12 outputs data at 9600 bps, and that is what the XBee defaults to (otherwise you would need to reprogram the baud rate or use the PIC as a baud-rate buffer). So yes, you should be able to connect one to the ID-12 and the other to your PIC across the room, and it >should< just work.
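For the record, if the rates hadn't matched, the baud rate is settable from any terminal program with AT commands. This is from memory of the Series 1 command set, so check it against the datasheet:

    +++          (no Enter; the module answers OK after the 1-second guard time)
    ATBD 3       (3 = 9600 bps; 4 = 19200, 5 = 38400, 6 = 57600, 7 = 115200)
    ATWR         (write the setting to flash so it survives a power cycle)
    ATCN         (exit command mode)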

Does the XBee (without enabling ZigBee) give you guaranteed delivery? If not, then it will be problematic. Noise will kill you.

Does the XBee (without enabling ZigBee) give you guaranteed delivery?

It’s actually unclear from the datasheet. By default it operates in "transparent mode" (the serial-extender mode). From my experimenting, it looks to be using broadcast packets, which allow any number of devices in range to communicate but which also disable automatic retransmission. In my application I use broadcast mode (no automatic retransmission) but with a redundant protocol (everything gets sent multiple times over a short period). However, I will be deploying at much larger range (up to 100' indoors and through walls), so the noise profile + signal loss will be much worse. I am using an XBee Pro (60 mW) for the "base station" and the regular XBee (1 mW) for my distributed controllers.

With that said, I have found the 1 mW modules to be very reliable in my short-distance development environment (3 to 5 feet). To top it off, they are sitting on top of high-RF-noise sources (my development computers). Since the OP only needs to transmit a few feet, it may be acceptable; it depends on whether this is a distributed commercial solution or a one-off deployment. If the OP wants to be really confident, just spend the extra $10 and use an XBee Pro on the ID-12. Seriously: 60 mW of output at 2.4 GHz over 6 feet. It would take some serious interference to mess with that.

Yeah, but even that case can fail. There are plenty of mobile sources of RF (cell phones, wifi devices, portable microwave ovens…).

Yeah, but even that case can fail. There are plenty of mobile sources of RF (cell phones, wifi devices, portable microwave ovens…).

Absolutely. It all comes down to the application and whether an occasional failure is acceptable or not (and how you want to define "occasional"). Mind you, he could simply program the unit on a PC first and save the settings to enable automatic retransmission, but that does require interfacing with a PC long enough to configure it. Even then, transmission could still fail with enough interference (I think it will retransmit a packet up to 3 times before giving up).
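For anyone who wants to try that, the setup is only a handful of AT commands over the serial port (again from memory of the Series 1 set, so verify against the manual, especially RR). Giving each module a 16-bit address and pointing DL at the other side switches you from broadcast to unicast, which brings back the MAC-level ACKs and retries:

    +++          (enter command mode)
    ATMY 1       (this module's 16-bit source address)
    ATDH 0
    ATDL 2       (destination = the other module; a DL of FFFF means broadcast, no ACKs)
    ATRR 3       (extra delivery retries on top of the MAC's own, if your firmware supports it)
    ATWR         (save)
    ATCN         (exit)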


The XBee Series 1, in 802.15.4 mode, will retry x times to deliver an errored frame.

Like all communications, and ESPECIALLY wireless, there will be uncorrectable errors. In wireless, the error rate is related to the amount of noise and interference.

It is up to your application to deal with an uncorrectable error, by

  1. wait and try again later

  2. reduce the noise and interference: shorter path, better antennas, …

  3. change channels and try again

  4. report the error to a higher-level application and pass the buck.

  5. tell a human that the link is too unreliable.

That last one will eventually happen, no matter the product.

In wired communications like RS-232, the error rate can be very low, on the order of one errored bit in 10^10. On the Internet, lost packets are common; that is a kind of error caused by congestion rather than by noise and interference as in wireless.

sorry if this is too pedantic.

  6. implement a retry scheme. Since the XBee is 2-way, you can do that. The sender waits for an ACK. If the receiver gets the packet, he sends an ACK. If the sender doesn’t get an ACK in a predefined time, he sends the packet again. This keeps up until the sender times out or receives an ACK.
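A minimal stop-and-wait version of that loop, in C; radio_send(), radio_wait_ack(), and delay_ms() are hypothetical stand-ins for whatever your PIC's UART and timer code provides:

    #include <stdint.h>
    #include <stdlib.h>

    /* hypothetical HAL hooks -- substitute your own UART/timer routines */
    extern void radio_send(const uint8_t *pkt, uint8_t len);
    extern int  radio_wait_ack(uint8_t seq, uint16_t timeout_ms);
    extern void delay_ms(uint16_t ms);

    #define MAX_RETRIES    5
    #define ACK_TIMEOUT_MS 100

    /* Send one packet and wait for its ACK; retransmit on timeout.
       Returns 1 on success, 0 if the link gave up. */
    int send_reliable(const uint8_t *pkt, uint8_t len, uint8_t seq)
    {
        int tries;
        for (tries = 0; tries < MAX_RETRIES; tries++) {
            radio_send(pkt, len);
            if (radio_wait_ack(seq, ACK_TIMEOUT_MS))  /* ACK carries the seq number */
                return 1;
            delay_ms(rand() % 50);  /* randomize the retry delay so two colliding
                                       senders don't stay in lockstep */
        }
        return 0;  /* pass the failure up to the application */
    }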

By the way, there are other schemes that work well. For example, if you have a stream of data (like sensor readings), you can put a sequence number in each packet. Then the receiver knows which sequence number(s) he didn’t receive and sends that to the sender, who then resends the lost packets.
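Receiver-side gap detection is only a few lines on top of that. A sketch, assuming an 8-bit sequence number; request_resend() and deliver() are hypothetical hooks:

    #include <stdint.h>

    extern void request_resend(uint8_t missing_seq);        /* hypothetical NACK to sender */
    extern void deliver(const uint8_t *data, uint8_t len);  /* hand off to the application */

    static uint8_t expected_seq = 0;

    void on_packet(uint8_t seq, const uint8_t *data, uint8_t len)
    {
        /* ask the sender for everything between what we expected
           and what actually arrived (8-bit wraparound is intentional) */
        while (expected_seq != seq) {
            request_resend(expected_seq);
            expected_seq++;
        }
        expected_seq = seq + 1;
        deliver(data, len);
    }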

Also, a simple thing you can do, if you have an RSSI output from the receiver, is to sample it before you send; if it’s above a predefined level, wait for a period of time. Random backoff times work better in general, especially if the interference is cyclical.
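The XBee brings signal strength out on a PWM pin, so how you sample it is up to you; read_rssi() and the threshold below are made up for illustration:

    #include <stdint.h>
    #include <stdlib.h>

    extern uint8_t read_rssi(void);     /* hypothetical: filtered XBee RSSI pin */
    extern void    delay_ms(uint16_t ms);

    #define CHANNEL_BUSY_LEVEL 40      /* made-up threshold; tune on site */

    /* Hold off transmitting while the channel looks busy,
       using a random backoff instead of a fixed wait. */
    void wait_for_quiet_channel(void)
    {
        while (read_rssi() > CHANNEL_BUSY_LEVEL)
            delay_ms(10 + (rand() % 90));
    }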

Philba:
6) implement a retry scheme. Since the XBee is 2-way, you can do that. The sender waits for an ACK. If the receiver gets the packet, he sends an ACK. If the sender doesn’t get an ACK in a predefined time, he sends the packet again. This keeps up until the sender times out or receives an ACK.

By the way, there are other schemes that work well. For example, if you have a stream of data (like sensor readings), you can put a sequence number in each packet. Then the receiver knows which sequence number(s) he didn’t receive and sends that to the sender, who then resends the lost packets.

Also, a simple thing you can do, if you have an RSSI output from the receiver, is to sample it before you send; if it’s above a predefined level, wait for a period of time. Random backoff times work better in general, especially if the interference is cyclical.

End-to-end session-layer ACK is what my (1) was, above.

I implemented a protocol like that, with sequence numbers and ACKs and timeouts on ACKs, for XBees. I ran 10 XBees at once, each transmitting every 5 seconds. The session-layer ACKs and sequence numbers (lost-message detection) worked fine; the ACKs have the message number in them, of course. This is bi-directional in my case. It all yielded 100% reliable comms for 10 nodes in a star topology, assuming all 10 have adequate signal strength. With XBee Pros we got nearly a mile line of sight with the coordinator elevated and a modest-gain omni antenna; the moving XBees had the PC-board chip antenna.

What I found is that 802.15.4 has a start-up problem when there are many nodes (XBees). The CCA (listen before transmitting) is supposed to almost preclude collisions, except for the hidden-node problem. If all XBees are commanded to reply at once, via a broadcast-to-all from the coordinator, the MAC layer fails to error-correct, and the session-layer ACKs are essential.

If all XBees are commanded to reply at once, via a broadcast-to-all from the coordinator, the MAC layer fails to error-correct, and the session-layer ACKs are essential.

Great to see this spelled out. For my project, I suspected this might be the case but had not gotten around to verifying it. It makes sense: when you send a "report all" request, you implicitly synchronize the reply timing. They all perform the CCA at the same time, see that nobody is transmitting, and then all proceed to transmit on top of each other.

I assumed this could be solved by inserting a “random” delay after the response request to break the synchronization. Even if it’s nothing more than “wait node-address times X CPU cycles”, it should let the CCA work, since they won’t all try to send at exactly the same moment (see the sketch below). One thing I don’t know: if two nodes detect activity via CCA and each waits for the current transmitter to finish, will they then attempt to send at the same time (once the first node finishes), or do they implement a random backoff (similar to Ethernet) to avoid it? Would be very interested if you happen to know.
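For what it's worth, that stagger costs almost nothing to implement. A sketch, assuming each node knows its own address; SLOT_MS is a made-up slot width, sized to be longer than one frame's airtime:

    #include <stdint.h>

    extern void delay_ms(uint16_t ms);
    extern void send_report(void);   /* hypothetical: queue this node's reply */

    #define SLOT_MS 8   /* made up; > one frame's airtime at 250 kbps */

    /* Called when the coordinator's broadcast "report all" arrives.
       Each node waits out its own slot, so the otherwise synchronized
       CCAs don't all fire at the same instant. */
    void on_report_request(uint8_t my_address)
    {
        delay_ms((uint16_t)(my_address * SLOT_MS));
        send_report();
    }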

CCA is the listen-before-transmit part of carrier-sense multiple access (CSMA, as in WiFi and Ethernet), and it includes a “random backoff time” notion. This tries to randomize the time until retry, for obvious reasons.

In my session protocol, if two transmit simultaneously, one or both will not get ACKs for that message # at the session level. They will time out at the application level and retransmit. Here’s where you should randomize too.

But I didn’t, and with a few retransmits now and then, ten of them are able to pound away.

The 108-byte data payloads in 802.15.4 at 250 kbps don’t take much time to send.
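(For perspective: 108 bytes × 8 bits ÷ 250 kbps ≈ 3.5 ms of airtime per frame, before preamble and header overhead.)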

Beware that 802.15.4 allows the MAC-layer ACKs to be turned off as an option. Not a good idea.

Also beware that message-sequence-number checking at the MAC layer is optional, and most vendors don’t implement it; they assume the session layer will detect lost and duplicate packets.