I’ve been running some experiments to quantify the delay involved in
USRP + GNU Radio (e.g., round-trip time). I downloaded UCLA’s 802.15.4
implementation for GNU Radio from the SVN server at acert.ir.bbn.com.
However, I got some results that I cannot explain, and I hope someone
here can shed some light on them.
When both the transmitter and the receiver are USRP + UCLA 802.15.4 on
GNU Radio, I sometimes find that a packet finishes being decoded by
ucla_ieee802_15_4_packet_sink long after it is received by the USRP.
To be more precise, the packet is delivered to the message queue (to the
upper layer) only after the next packet, which can be transmitted as
much as half a second later, arrives. My guess is that the USRP does not
have enough samples to fill a USB packet for transfer to the host side,
so it waits until the next packet arrives. Is this correct? Has this
happened to anyone else experimenting with packet radio? Is there any
way to flush the buffer in the USRP so the delay can be avoided?
The above URL is a trace of my experiment. Most of the lines mark the
entry and exit timestamps of the blocks, recorded with gettimeofday().
From line 414 to line 438 you can see that the packet is almost fully
decoded, but it is not inserted into the queue until line 512, which is
when the next packet arrives.
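For anyone who wants to reproduce this kind of trace, the instrumentation above can be sketched as a small entry/exit logger. My actual trace uses gettimeofday() inside the C++ blocks; this is just a minimal Python analogue with a hypothetical work() function standing in for a block's processing:

```python
import time

def traced(label):
    """Log wall-clock entry/exit timestamps around a call, analogous
    to the gettimeofday() instrumentation used in the trace."""
    def wrap(fn):
        def inner(*args, **kwargs):
            print(f"{label} enter {time.time():.6f}")
            try:
                return fn(*args, **kwargs)
            finally:
                print(f"{label} exit  {time.time():.6f}")
        return inner
    return wrap

@traced("packet_sink.work")
def work(samples):
    # Placeholder for the block's real signal processing.
    return len(samples)
```

Comparing consecutive exit/enter timestamps across blocks is what exposes where the packet sits idle.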
Thanks for reading this long post. I’m looking forward to the responses.
Hsin-Mu (Michael) Tsai
Electrical and Computer Engineering Department
Carnegie Mellon University