Hello,
I have an Arduino-based robot that is controlled over serial commands, and I decided to use a pair of XBee Series 2 modules to control it wirelessly. I’ve used XBee in AT and API mode before, but I’ve been away from it for a long time. What I had in mind was simply to flash the radios with the default settings for AT mode, plug in the shield and the USB dongle, and send data with a terminal app the same way I would with the serial cable.
So, I did all this and it works perfectly at first. Using the XCTU terminal I can send a few command characters and the robot reacts… but sometimes, when I send more than a few characters, the robot simply hangs and I get no reaction for a few seconds. After a while the data gets through and it works fine again.
I assume this behavior is not expected, and the same thing does not happen when I use the Assemble Packet feature of XCTU and send a lot of bytes at a time.
I think I’m missing some setting somewhere, or some header/footer to signal packet start/end. I checked some documentation but couldn’t find anything: everything assumes that in AT mode the link works transparently as a drop-in replacement for a serial cable.
I’m using Series 2 with all default settings, 19200 bps.
Yes, serial port extension mode is transparent. It isn’t really called “AT mode”, since responding to AT commands is a brief state entered by:
idle time
+++
AT commands/responses
idle time
revert to transparent serial mode
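For anyone following along, here is a minimal sketch of that sequence from the Arduino side, assuming the XBee sits on a SoftwareSerial port on pins 2/3 at 19200 bps with the default guard time (GT = 1000 ms) and '+' command character:

```cpp
// Minimal sketch: enter AT command mode, run one query, drop back to transparent mode.
#include <SoftwareSerial.h>

SoftwareSerial xbee(2, 3);   // RX, TX - adjust to your shield's jumpers

void enterCommandMode() {
  delay(1100);               // guard time of silence before the escape sequence
  xbee.print("+++");         // no carriage return after +++
  delay(1100);               // guard time after; the radio should answer "OK"
}

void setup() {
  Serial.begin(19200);
  xbee.begin(19200);

  enterCommandMode();
  xbee.print("ATVR\r");      // example query: firmware version
  delay(100);
  while (xbee.available()) Serial.write(xbee.read());
  xbee.print("ATCN\r");      // revert to transparent serial mode
}

void loop() {}
```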
From what you said, it doesn’t sound like you have a flow control problem; that would mean you are sending more than about 100 characters in a burst, faster than they can be transmitted over the air at roughly 100 kbps.
Some things to check (there’s a query sketch after this list):
ACK option enabled to get error correction? I don’t recall what the factory default is. Also, retry limit setting.
Clear channel assessment (CCA) threshold factory default?
Tried a different channel number (vastly different number) in case there’s some persistent interference?
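A rough way to query those from a sketch, reusing enterCommandMode() and the xbee port from the sketch above. Note that RR (retries) and CA (CCA threshold) are 802.15.4 / Series 1 register names and may simply return ERROR on Series 2 ZB firmware:

```cpp
// Sketch only: read back the link-related registers and echo the replies.
void queryRegister(const char *cmd) {
  xbee.print(cmd);            // e.g. "ATRR\r"
  delay(100);
  Serial.print(cmd);          // echo which command was asked
  while (xbee.available()) Serial.write(xbee.read());
}

void checkLinkSettings() {
  enterCommandMode();
  queryRegister("ATRR\r");    // MAC retries (Series 1 register)
  queryRegister("ATCA\r");    // CCA threshold (Series 1 register)
  queryRegister("ATCH\r");    // operating channel
  xbee.print("ATCN\r");       // leave command mode
}
```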
Thanks for the reply, stevech
I read your suggestions, but couldn’t get some of them to work. For instance, I can’t change the channel in the XCTU Modem Configuration, and when I try to change it in AT mode with ATCH I get an ERROR. I couldn’t find any setting for the CCA threshold, and although there are some related settings in the manual, I couldn’t change them in Modem Configuration either.
The ACK-related problem was that I was using a broadcast address, which means there is no ACK. After I changed it to the radio’s unicast address it started working better.
I improved it further by changing the Packetization Timeout (RO). It defaults to 3, which means the radio waits 3 character times before sending a partial buffer. Since I need real-time control, I set it to 0 so characters are sent as they arrive.
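For reference, here is roughly the same change expressed as AT commands, as a sketch reusing the enterCommandMode() helper and xbee port from the earlier post. The ATDH/ATDL values below are only placeholders; substitute the remote radio’s actual SH/SL:

```cpp
// Sketch only: point this radio at its partner (unicast) and disable buffering.
// "13A200" / "4012345A" are placeholders for the remote XBee's SH/SL.
void configureUnicast() {
  enterCommandMode();
  xbee.print("ATDH 13A200\r");    // destination high = remote SH
  delay(50);
  xbee.print("ATDL 4012345A\r");  // destination low = remote SL (placeholder)
  delay(50);
  xbee.print("ATRO 0\r");         // packetization timeout 0: send bytes as they arrive
  delay(50);
  xbee.print("ATWR\r");           // save settings to non-volatile memory
  delay(50);
  xbee.print("ATCN\r");           // back to transparent mode
}
```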
It’s working a lot better, but I’m still getting a few lost and out-of-order bytes.
Thanks for the help
Yes, a broadcast (not unicast) packet in '15.4 gets no error detection/correction at the MAC layer - the sending unit has to run some protocol of its own to get a response from every node that was supposed to receive the broadcast, if applicable.
Be sure the MAC-layer ACK is enabled and the retry count is set to a reasonable number.
Changing channels… you can only use the channel numbers defined by 802.15.4, and pay attention to hexadecimal versus decimal entries. As I recall, the numbers for the US are 11 through 26 decimal (0x0B through 0x1A).
The timer - that’s on the sending end, not the receiving end. It simply says the transmitting device will send all chars (bytes) it has accumulated in the last x amount of time, where x is restarted each time another byte arrives. Normally you want to accumulate chars and not send each one individually, if the data is organized as groups of bytes. If you’re sending text with a CR/LF at the end, the authoring device can put a delay of x after the CR/LF to give time for transmission and error correction. Some serial-to-wireless devices - I can’t recall if the XBee does this - will send the accumulated chars after x amount of idle time OR on receipt of a certain char such as CR.
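If the authoring device is itself a microcontroller (or you script the PC side the same way), the idea looks roughly like this. It assumes newline-terminated command lines and the xbee port from the earlier sketch, and the 15 ms pause is only a guess to tune:

```cpp
// Sender-side pacing sketch: accumulate one command line, hand it to the
// radio in one go, then pause to give the link time to transmit and ACK.
const size_t BUF_LEN = 64;
char buf[BUF_LEN];
size_t pos = 0;

void loop() {
  while (Serial.available()) {
    char c = Serial.read();
    if (pos < BUF_LEN - 1) buf[pos++] = c;
    if (c == '\n') {                          // end of one command line
      xbee.write((const uint8_t *)buf, pos);  // send the whole line at once
      pos = 0;
      delay(15);                              // pause so the radio can empty its buffer
    }
  }
}
```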
The serial mode does work reliably if the hardware logic levels are correct, you don’t overrun the 100-byte buffer, you use error correction, and so on. But if the data HAS to be error free, you’ll need your own wrapper around the data that adds sequence numbering, a byte count, and the like - which is why non-trivial uses of XBee use the binary API mode.
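A bare-bones illustration of such a wrapper - a hypothetical application-level frame, not the XBee API format: a start marker, a sequence number, a length byte, the payload, and a one-byte checksum:

```cpp
// Hypothetical application-level framing over the transparent serial link.
const uint8_t FRAME_START = 0x7E;  // arbitrary start marker
uint8_t txSeq = 0;                 // incremented for every frame sent

void sendFrame(Stream &out, const uint8_t *payload, uint8_t len) {
  uint8_t sum = txSeq + len;
  out.write(FRAME_START);
  out.write(txSeq++);              // receiver watches this for gaps or reordering
  out.write(len);
  for (uint8_t i = 0; i < len; i++) {
    out.write(payload[i]);
    sum += payload[i];
  }
  out.write((uint8_t)(0xFF - sum)); // one-byte checksum over seq, len and payload
}
```

The receiver checks the checksum and the sequence number, and discards (or asks for a resend of) anything that doesn’t line up.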