Help with XBee Series 2 / Arduino Network

I am trying to develop a wireless sensor network with 5 signal sources. The sensor data is read in as an analog signal by an Arduino Uno microcontroller and shipped off via an XBee Series 2 radio. I am using an Arduino library that uses API mode to communicate between XBee radios (http://code.google.com/p/xbee-arduino/) and a Python program that reads the transmitted data from the USB serial port to which an XBee Explorer dongle is connected. The Rx code is run from the Windows command line and is as follows:

#! /usr/bin/python
 
from xbee import ZigBee
import serial
import struct
 
PORT = 'COM8'
BAUD_RATE = 115200
 
# render binary data as a hex string (note: this shadows the built-in hex())
def hex(bindata):
    return ''.join('%02x' % ord(byte) for byte in bindata)
 
# Open serial port
ser = serial.Serial(PORT, BAUD_RATE)
 
# Create API object
xbee = ZigBee(ser,escaped=True)
 
# Continuously read and print packets
while True:
    try:
        response = xbee.wait_read_frame()
        sa = hex(response['source_addr_long'][4:])
        rf = hex(response['rf_data'])
        datalength=len(rf)
        # 12 hex characters = 6 raw bytes: a 2-byte short ('h')
        # followed by a 4-byte float ('f')
        if datalength==12:
            h=struct.unpack('h',response['rf_data'][0:2])[0]
            t=struct.unpack('f',response['rf_data'][2:])[0]
            print sa,' ',rf,' t=',t,'h=',h
        # otherwise, show the raw payload that was received
        else:
            print sa,' ',rf
    except KeyboardInterrupt:
        break
         
ser.close()
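As a sanity check on that unpacking, the 6-byte payload layout can be exercised in isolation (runnable on its own, with made-up values; the explicit '<' little-endian prefix is my addition, matching the AVR's native byte order):

```python
import struct

# Build a payload the way the Arduino side would: a 2-byte signed
# short followed by a 4-byte float, little-endian (AVR byte order).
# The values here are made up for the demo.
payload = struct.pack('<hf', 42, 123.25)
assert len(payload) == 6          # prints as 12 hex characters

# Unpack the same way the receive loop does
h = struct.unpack('<h', payload[0:2])[0]
t = struct.unpack('<f', payload[2:6])[0]
print(h, t)  # -> 42 123.25
```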

The Arduino Unos are programmed with a modified version of this code (http://code.google.com/p/xbee-arduino/s … es2_Tx.pde). I took out the delay() at the end, added a micros() call inside the if() block that executes when the transmit status comes back as SUCCESS, and moved the analogRead() into a timer interrupt service routine, as shown below (configured for 2 Hz):

volatile int sense;         // latest sample, shared between the ISR and loop()

void setup()
{
  // initialize timer1 
  noInterrupts();           // disable all interrupts
  TCCR1A = 0;
  TCCR1B = 0;
  TCNT1  = 0;

  OCR1A = 31250;            // compare match register 16MHz/256/2Hz
  TCCR1B |= (1 << WGM12);   // CTC mode
  TCCR1B |= (1 << CS12);    // 256 prescaler 
  TIMSK1 |= (1 << OCIE1A);  // enable timer compare interrupt
  interrupts();             // enable all interrupts
}

ISR(TIMER1_COMPA_vect)          // timer compare interrupt service routine
{
  sense = analogRead(11);       // read the sensor
}

void loop(){ ...

The radios are set to operate at 115,200 baud. The current design uses hardware timer interrupts to sample at a fixed rate of 200 Hz, and each data point is recorded as a 2-byte (16-bit) integer. I’ve been told that the max packet size is 80 bytes, implying that I can send 40 data points (covering 1/5th of a second) per packet.
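A quick back-of-the-envelope check on that budget (the 80-byte figure is taken at face value here; the actual ZigBee payload limit depends on radio options):

```python
SAMPLE_RATE_HZ = 200        # fixed hardware-interrupt sampling rate
BYTES_PER_SAMPLE = 2        # each data point is a 16-bit integer
MAX_PAYLOAD = 80            # quoted maximum packet size, in bytes

points_per_packet = MAX_PAYLOAD // BYTES_PER_SAMPLE            # 40
data_per_packet_s = points_per_packet / float(SAMPLE_RATE_HZ)  # 0.2

# A full packet therefore covers 0.2 s of data: to keep up with
# the sensor, one packet must go out at least every 200 ms.
print(points_per_packet, data_per_packet_s)
```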

The problem I have run into is that, regardless of the baud rate, interrupt frequency, or number of nodes transmitting, the coordinator XBee receives packets only every 500 ms. I infer this from the value returned by micros(), which I transmit in the packet alongside the sensor data (as a float) for debugging.
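An independent cross-check is to timestamp frames on the receiving side as well; here is a minimal sketch of the idea (the function name, threshold, and bookkeeping are mine, not part of the program above):

```python
# Track the last arrival time per source and flag any gap well beyond
# the expected ~500 ms interval. In the real receive loop this would
# be fed with (sa, time.time()) right after wait_read_frame().
last_seen = {}

def check_gap(source, now, expected=0.5, slack=0.25):
    """Return the gap in seconds if this frame arrived late, else None."""
    prev = last_seen.get(source)
    last_seen[source] = now
    if prev is not None and (now - prev) > expected + slack:
        return now - prev
    return None

# Synthetic demonstration: frames at t = 0.0, 0.5, 2.0 seconds
print(check_gap('node-a', 0.0))   # -> None (first frame)
print(check_gap('node-a', 0.5))   # -> None (on time)
print(check_gap('node-a', 2.0))   # -> 1.5  (a late packet)
```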

I pared the system down to a single end device sampling via hardware interrupts at 2 Hz, and it could still only send data every 500 ms. This is a problem because even if I fill the packet with 40 data points, the transmitter will still not be able to keep up with the sensor.
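For reference, the fill-then-send scheme that sentence describes can be modeled in Python (a sketch of the Arduino-side logic for illustration only; the names are mine):

```python
PACKET_POINTS = 40        # 80-byte payload / 2 bytes per sample

buffer = []               # filled by the (simulated) timer ISR
sent_packets = []         # stands in for handing a frame to the XBee

def isr_sample(value):
    """Model of the timer ISR: store one 16-bit reading."""
    buffer.append(value & 0xFFFF)

def loop_once():
    """Model of loop(): transmit only once a full packet is buffered."""
    if len(buffer) >= PACKET_POINTS:
        sent_packets.append(buffer[:PACKET_POINTS])
        del buffer[:PACKET_POINTS]

# Simulate one second of sampling at 200 Hz
for i in range(200):
    isr_sample(i)
    loop_once()

print(len(sent_packets))  # -> 5 packets per second at 200 Hz
```

Even with full packets, this scheme needs one successful transmission every 200 ms, so a 500 ms floor on the packet interval still falls behind.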

On top of this, in the pared-down setup I noticed that roughly once per minute there was a full 1 or 1.5 second gap between successful transmissions, instead of the normal ½ second gap. When I scale up to 5 end devices transmitting simultaneously, the ½ second intervals between transmissions from each end device are still apparent, but the number of ‘lost packets’ increases significantly: roughly every 10th transmission from EACH device appears after a 1 to 2 second gap.

All the transmitters and receivers are sitting on the same table, so distance/interference isn’t a problem.

How should I go about getting the data transmitted faster?