XBee CSMA/CA questions

I’m trying to understand the basics of XBee collision avoidance (CSMA/CA). I’ve perused several pieces of XBee and 802.15.4 documentation as well as the wireless forum. I’ve seen a few questions similar to what I’m asking, but have yet to find an explanation of exactly how CSMA/CA works, or what I should expect to observe if I have a situation where RF transmissions from XBee modules overlap.

Here are some details of a simple setup I’ve got going, and what I’m trying to observe …

  • Two transmitters (different addresses) set up in transparent mode; both periodically (almost simultaneously) sending single bytes (9600 baud) to a common receiver.

  • I'm intentionally trying to create collisions (to see how the algorithm works), so xmtr 2 gets its byte to transmit delayed 500 us from xmtr 1 (i.e. 1/2 byte time at 9600 baud, so the two transmissions overlap by about 500 us).

  • Modem parameters for the xmtrs: RN = 1; CA (CCA threshold) = 0x50 (-80 dBm?). (The transmitters are adjacent, so xmtr 2 should certainly see considerable carrier energy from xmtr 1 when it initially tries to transmit.)

  • I’m monitoring data at the receiver. With a delay between bytes > 1 ms or so (i.e. greater than a transmit byte time) all looks as expected at the receiver … byte1, byte 2 … byte1, byte2 … etc. However, when I reduce the delay time to 500 us (i.e. purposely try to induce collisions), data gets corrupted at the receiver. My question is “why does this happen if the collision avoidance algorithm is supposed to be functioning?”

    What I expect to happen is xmtr 2 will back off when it initially tries to transmit (during the overlap period), then send its packet after xmtr 1 finishes. Further, I expect to see xmtr 2’s EC parameter (i.e. CCA failure count) indicate the number of times xmtr 2 had to wait to transmit. However, it just sits at 0.

    Anyway, sorry for the long-winded post, but I hope someone familiar with the details of collision avoidance can shed some light on the situation, or point me to some documentation that may help out … will be happy to clarify, or provide add’l information if necessary. Obviously my understanding is lacking.

    Thanks in advance

    Delay between bytes?? Confusing.

    When you tell the XBee to send n bytes, it composes an 802.15.4 frame for up to about 100 bytes. The frame has quite a number of fields and bits in addition to the user data byte (payload).
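    For a feel for the numbers, here is a rough back-of-the-envelope airtime calculation for a one-byte payload. The overhead figures are approximate (they depend on the addressing mode), and the 32 us-per-byte figure comes from the 250 kbps 802.15.4 rate at 2.4 GHz; treat it as a sketch rather than exact accounting.

```python
# Back-of-the-envelope airtime for a one-byte payload at the 250 kbps
# 802.15.4 rate (32 us per byte).  Overhead values are approximate and
# depend on the addressing mode used.
phy_overhead = 6     # preamble (4) + start-of-frame delimiter (1) + PHY length (1)
mac_overhead = 11    # frame control, sequence number, short addresses, FCS (approx.)
payload = 1          # the single user byte in this experiment
airtime_us = (phy_overhead + mac_overhead + payload) * 32
print(airtime_us, "us on air")   # ~576 us, i.e. shorter than the ~1 ms UART byte time
```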

    If you have CCA (listen before transmitting) enabled, the frame transmission is delayed a bit while the clear channel assessment is done. If energy is present, the transmission backs off, and the backoff window grows roughly exponentially with each busy assessment. The minimum backoff exponent is user-defined in the AT commands, as is the signal threshold that triggers CCA.
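    For reference, here is a minimal sketch of the unslotted 802.15.4 CSMA/CA backoff being described. The mapping of the XBee RN parameter to the initial backoff exponent, and the channel_busy() stand-in for the radio's CCA measurement against the CA threshold, are assumptions for illustration; the constants are the 802.15.4 defaults.

```python
# Minimal sketch of the unslotted IEEE 802.15.4 CSMA/CA backoff, using the
# standard default constants (macMaxBE = 5, macMaxCSMABackoffs = 4) and the
# 320 us unit backoff period at 2.4 GHz.  Assumption: the XBee's RN parameter
# supplies the initial backoff exponent (min_be), and channel_busy() stands in
# for the radio's CCA measurement against the CA threshold.
import random

UNIT_BACKOFF_US = 320        # 20 symbols * 16 us/symbol at 2.4 GHz
MAX_BE = 5                   # macMaxBE
MAX_CSMA_BACKOFFS = 4        # macMaxCSMABackoffs

def csma_ca_delay_us(min_be, channel_busy):
    """Return total backoff delay in microseconds, or None on CCA failure."""
    be = min_be
    total = 0
    for _ in range(MAX_CSMA_BACKOFFS + 1):
        # Wait a random number of unit backoff periods in [0, 2^BE - 1].
        total += random.randint(0, (1 << be) - 1) * UNIT_BACKOFF_US
        if not channel_busy():       # CCA says clear -> transmit now
            return total
        be = min(be + 1, MAX_BE)     # busy -> widen the random window and retry
    return None                      # too many busy CCAs -> CCA failure (EC increments)

# Example: channel busy for the first two assessments, then clear.
busy = iter([True, True, False])
print(csma_ca_delay_us(min_be=1, channel_busy=lambda: next(busy)))
```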

    Often, an 802.11 WiFi packet or beacon will cause a CCA delay if its energy falls within the 2 MHz-wide 802.15.4 channel, or within roughly 5 MHz of it if the WiFi signal is quite strong. So channel selection affects things. Also, Bluetooth (a frequency hopper) can cause CCA delays. If CCA still fails after n iterations of the backoff, you’ll see a CCA fault transmission status - if you look for it in the status.

    It’s best to do all this using the XBee binary API, not the AT commands and transparent serial mode. That way there’s a transmission complete status for each frame, and some of those may report a CCA fault. Also in that transmit complete status is whether or not the receiving unit sent a MAC-layer ACK (assuming you have that enabled, as is prudent, since with it disabled there are no error-correcting retries).
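    As a hedged illustration of what that looks like, the sketch below builds a 16-bit-address TX Request, sends it, and decodes the TX Status reply. It assumes the module is in API mode (AP = 1); the port name '/dev/ttyUSB0' and the destination address 0x0001 are placeholders.

```python
# Rough sketch of sending one byte via the XBee 802.15.4 binary API and
# checking the TX Status reply.  Assumes API mode (AP = 1); the port name
# and the 16-bit destination address 0x0001 are placeholders.
import serial

def api_frame(frame_data: bytes) -> bytes:
    """Wrap frame data with the 0x7E start delimiter, length and checksum."""
    checksum = 0xFF - (sum(frame_data) & 0xFF)
    return bytes([0x7E, len(frame_data) >> 8, len(frame_data) & 0xFF]) + frame_data + bytes([checksum])

def tx_request_16(frame_id: int, dest_addr: int, data: bytes) -> bytes:
    """TX Request with a 16-bit destination address (API identifier 0x01)."""
    return api_frame(bytes([0x01, frame_id, dest_addr >> 8, dest_addr & 0xFF, 0x00]) + data)

STATUS = {0x00: "success", 0x01: "no ACK", 0x02: "CCA failure", 0x03: "purged"}

with serial.Serial('/dev/ttyUSB0', 9600, timeout=2) as xbee:
    xbee.write(tx_request_16(frame_id=0x01, dest_addr=0x0001, data=b'A'))
    reply = xbee.read(7)                       # a TX Status frame is 7 bytes long
    if len(reply) == 7 and reply[3] == 0x89:   # API identifier 0x89 = TX Status
        print("TX status:", STATUS.get(reply[5], hex(reply[5])))
```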

    All of this is similar to what 802.11 does for CCA (CSMA/CA).

    stevech:
    Delay between bytes?? Confusing.

    Sorry, let me elaborate. I’m sending a single (9600 baud) character to each of two xmitting modules … (the character length is therefore approximately 1 ms). The delay I’m referring to is the offset in time between when I input a character to the modules. If this offset is less than 1 ms, I’m intentionally trying to cause collisions. Again, I’m purposely attempting to make transmissions interfere so I can observe how CSMA/CA mitigates this situation.
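    For concreteness, a minimal sketch of that test harness might look like the following (port names are placeholders, and OS scheduling makes the 500 us offset approximate at best from a PC):

```python
# Minimal sketch of the experiment: one byte to each transmitter, offset by
# roughly half a byte time at 9600 baud, repeated once per second.  Port
# names are placeholders; time.sleep() gives only an approximate offset.
import serial
import time

with serial.Serial('/dev/ttyUSB0', 9600) as xmtr1, \
     serial.Serial('/dev/ttyUSB1', 9600) as xmtr2:
    while True:
        xmtr1.write(b'A')      # byte for transmitter 1
        time.sleep(0.0005)     # ~1/2 byte time (10 bits at 9600 baud is ~1.04 ms)
        xmtr2.write(b'B')      # byte for transmitter 2, deliberately overlapping
        time.sleep(1.0)        # once per second
```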

    stevech:
    When you tell the XBee to send n bytes, it composes an 802.15.4 frame for up to about 100 bytes. The frame has quite a number of fields and bits in addition to the user data byte (payload).

    Got it … thanks.

    stevech:
    If you have CCA (listen before transmitting) enabled, the frame transmission is delayed a bit while the clear channel assessment is done. If energy is present, the transmission backs off, and the backoff window grows roughly exponentially with each busy assessment. The minimum backoff exponent is user-defined in the AT commands, as is the signal threshold that triggers CCA.

    Here’s where I think I might be confused. I believe I have CCA enabled (I have set config. parameters as RN = 1; CA = 0x50 … i.e. a -80 dBm CCA threshold) … but I’m not sure I see evidence of this. I see data corruption at the receiver (the two bytes received in reverse order, or one of them not getting through), and I don’t see the CCA failure counter incrementing … it just stays at 0. (But maybe this is OK. I thought the counter should increment with each failed attempt … but you indicate a fault is only reported after n failed iterations.) Again, I’m just trying to understand whether I indeed have CCA enabled … and what I should expect to observe (given that I am seeing corrupted receive data).
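    For reference, one way to poll the EC counter from a script is sketched below, assuming transparent mode with the default command-mode settings (roughly 1 s guard times around the '+++' escape); the port name is a placeholder.

```python
# Sketch: poll the EC (CCA failure) counter over the UART using AT command
# mode.  Assumes transparent mode with default command-mode settings
# (roughly 1 s guard times around '+++'); the port name is a placeholder.
import serial
import time

def read_ec(port='/dev/ttyUSB0'):
    with serial.Serial(port, 9600, timeout=2) as s:
        time.sleep(1.1)             # guard time before the escape sequence
        s.write(b'+++')             # enter AT command mode (no carriage return)
        time.sleep(1.1)             # guard time after the escape sequence
        s.read(3)                   # expect 'OK\r'
        s.write(b'ATEC\r')          # query the CCA failure count (hex reply)
        reply = s.read(16).strip()
        s.write(b'ATCN\r')          # leave command mode
        return int(reply, 16) if reply else None

print(read_ec())
```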

    stevech:
    Often, an 802.11 WiFi packet or beacon will cause a CCA delay if its energy falls within the 2 MHz-wide 802.15.4 channel, or within roughly 5 MHz of it if the WiFi signal is quite strong. So channel selection affects things. Also, Bluetooth (a frequency hopper) can cause CCA delays. If CCA still fails after n iterations of the backoff, you’ll see a CCA fault transmission status - if you look for it in the status.

    Again, the EC (CCA failures) parameter is always 0.

    stevech:
    It’s best to do all this using the XBee binary API, not the AT commands and transparent serial mode. That way there’s a transmission complete status for each frame, and some of those may report a CCA fault. Also in that transmit complete status is whether or not the receiving unit sent a MAC-layer ACK (assuming you have that enabled, as is prudent, since with it disabled there are no error-correcting retries).

    I’ve tried running with the MM (MAC Mode) parameter set to 0 (802.15.4 + MaxStream header w/Acks), as well as set to 2 (802.15.4 w/Acks). I do need to spend more time learning and trying out API mode vs. AT commands and transparent mode.

    stevech:
    All of this is similar to what 802.11 does for CCA (CSMA/CA).

    Thanks for your feedback and comments … they are appreciated.

    Perhaps your received data errors are due to the transparent serial mode characteristics. There are many serial-to-x transparent adapters, where x is some wireless media, or more commonly, ethernet (802.3). Usually, the technique is to collect incoming data from the serial port until either the internal buffer is full or some time period t elapses with no arriving characters. When either condition occurs, the so-far buffered characters are made into the right form of frame or packet and sent as a batch. (The XBee has a setting for time t).
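    A toy model of that packetization rule, just to make the flush conditions concrete (the sizes and timeout are illustrative; the real XBee counts the idle gap in character times via its RO setting):

```python
# Toy model of the transparent-mode packetization rule: buffer serial bytes
# until either the buffer fills or no new byte arrives within the
# packetization timeout (the XBee's RO setting, in character times).
# Sizes and timing here are illustrative only.
def packetize(byte_stream, max_payload=100, timeout_char_times=3):
    """byte_stream yields (byte, idle_gap_in_char_times_before_this_byte)."""
    frames, buf = [], bytearray()
    for byte, gap in byte_stream:
        if buf and gap >= timeout_char_times:
            frames.append(bytes(buf))   # idle gap expired -> flush one RF frame
            buf = bytearray()
        buf.append(byte)
        if len(buf) >= max_payload:     # buffer full -> flush one RF frame
            frames.append(bytes(buf))
            buf = bytearray()
    if buf:
        frames.append(bytes(buf))
    return frames

# Two bytes separated by a long idle gap become two separate RF frames.
print(packetize([(0x41, 0), (0x42, 10)]))   # -> [b'A', b'B']
```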

    If you send characters so rapidly that the rate exceeds the rate that the other media can support (802.15.4), you will see errors like you describe (buffer overflow) unless you utilize flow control on the serial interface. The rate that 802.15.4 (at 2.4GHz) can accommodate is some fraction of 250Kbps, where the fraction is determined by the data coding per 802.15.4, CCA delays, and retransmissions triggered by MAC ACK timeouts, plus the time lost waiting on those timeouts.

    In the transparent serial mode, the RS-232 CTS signal can be used for flow control into the XBee, if you enable that in the XBee config. registers. Your PC or device must then stop sending immediately when CTS goes false. When CTS goes false, it means there is no more buffer available - the “pipe is full”.
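    As a small sketch, with the XBee's CTS pin wired to the host and hardware flow control enabled, pyserial can be asked to honour it (whether writes actually block on CTS depends on the OS/driver, so treat this as illustrative; the port name is a placeholder):

```python
# Sketch: enable hardware flow control on the host side so writes respect
# the XBee's CTS line (D7 is CTS by default).  Whether write() actually
# blocks on CTS depends on the OS/driver; the port name is a placeholder.
import serial

xbee = serial.Serial('/dev/ttyUSB0', 9600, rtscts=True)   # honour CTS/RTS
xbee.write(b'a burst of bytes that may exceed what the RF link can carry')
xbee.close()
```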

    In the XBee API mode, flow control is inherent if you send an API frame and then wait to get a transmission complete status report via the serial port. This occurs AFTER the receiving end sends its MAC ACK, and after that MAC ACK has been received by the transmitting XBee. That transmission complete status also reports transmission failure error codes such as CCA fault and MAC ACK timeout.

    So, even with ideal RF conditions and no WiFi interference, a sustained throughput of roughly 50Kbps is about all you can expect. And for that, you must use CTS flow control or the binary API for reliability.

    Otherwise, just don’t cram so much data into the transparent serial pipeline, and/or do your own batches-of-bytes and get something back from your receiving application saying, OK, send more. This is an application-layer protocol, of the kind XMODEM and the like use, but it can be made much simpler. Remember that the 802.15.4 frame user payload is about 100 bytes.

    stevech:
    If you send characters so rapidly that the rate exceeds the rate that the other media can support (802.15.4), you will see errors like you describe (buffer overflow) unless you utilize flow control on the serial interface … So, even with ideal RF conditions and no WiFi interference, a sustained throughput of roughly 50Kbps is about all you can expect … Otherwise, just don’t cram so much data into the transparent serial pipeline, and/or do your own batches-of-bytes and get something back from your receiving application saying, OK, send more.

    Thanks for the detailed explanation … I appreciate the information. I don’t think the data errors I’m seeing could be due to any sort of buffer overflow or to exceeding channel capacity, though.

    Again, I simply send one (different) character to each of two transmitting XBees, slightly offset (by perhaps 500 us … i.e. 1/2 a byte time), with the intention of creating collisions due to this overlap. I do this so I can monitor how the receiver/network handles it: do the bytes come out in the same time order I presented them to the transmitting modules? Do I see any sort of CCA failures? It’s just a very simple experiment to see how collision avoidance operates.

    What I haven’t mentioned is that I do this only once per second. So the aggregate data rate is two bytes of data per second … hardly enough to overload anything, I would assume.

    I’ll keep playing around with it, and also try out API mode as you suggested previously.

    Thanks again.

    I don’t understand how you are getting corrupted data. XBees communicate among themselves at 250 kbps over the air (although that link is half duplex - I know, half duplex is a dumb term), so any corruption you get is most likely not the XBees. Additionally, at 9600 baud, you are nowhere close to being able to choke the XBees without some sort of interference. In practice, you can usually get about 70 kbps out of a pair before you start losing packets.

    My last point is that the XBee, depending on configuration, has a large buffer that it will wait to fill (or to go idle) before sending the data on its merry way. Check out the packetization timeout (ATRO).