About my background, before you read this: I'm just a software developer and have little expertise when it comes to hardware.
I support the development of a real-time application that uses
802.15.4 on an AT86RF230 chip. That RF chip is IEEE 802.15.4-2003 compliant.
It has a data rate of 250 kbit/s. Let's say I want to send 10
bytes. 2 bytes of CRC are added, plus 6 bytes from the physical layer
(preamble + length field + delimiter).
So I send 144 bits in total: the 12-byte PSDU is 12 × 8 = 96 bits, and
the physical-layer overhead is 6 × 8 = 48 bits. Theoretically I get
0.576 ms per transmission at 250 kbit/s.
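
In Python, the same arithmetic as a quick sanity check (the variable names are just mine for illustration):

    # Theoretical airtime of one 802.15.4 frame at the 2.4 GHz data rate.
    payload_bytes = 10       # application payload
    crc_bytes     = 2        # FCS appended by the MAC
    phy_bytes     = 6        # 4 B preamble + 1 B delimiter (SFD) + 1 B length field
    data_rate     = 250e3    # bit/s

    total_bits = (payload_bytes + crc_bytes + phy_bytes) * 8   # 144 bits
    print(total_bits / data_rate * 1e3)                        # -> 0.576 ms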
Now I use my USRP with an XCVR, start the sender, and record some
transmissions into cfiles. I plot the Q channel and calculate the
transmission length: the number of samples with a high amplitude,
divided by the sample rate. That's the time during which the RF chip
is active. I get 3.45 ms.
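
For reference, this is roughly how I measure that offline. The file name, sample rate, and threshold are placeholders for my setup, and I use the complex magnitude rather than just the Q channel:

    import numpy as np

    sample_rate = 4e6     # whatever rate the flow graph recorded at (placeholder)
    threshold   = 0.1     # amplitude separating TX from the noise floor (eyeballed)

    # GNU Radio cfiles are raw interleaved float32 I/Q, i.e. complex64.
    iq = np.fromfile("capture.cfile", dtype=np.complex64)

    # Samples above the noise floor count as "RF chip active".
    active = np.abs(iq) > threshold
    print("%.3f ms" % (active.sum() / sample_rate * 1e3))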
So the difference between the TX time I can theoretically assume and
what I see in reality is abnormally high: roughly a factor of six.
1.) Can I use an oscilloscope to find out whether there is a signal on
the IF? The oscilloscope only operates up to something like 100 MHz, so
I want to probe the IF rather than the 2.4 GHz carrier; I read that the
RF frontend down-converts the signal before it reaches the ADC. I can
attach probes to the microcontroller board with the RF chip. Can I also
probe the USRP2 with the XCVR (or RFX)? Then I could correlate the
state changes of the RF chip with the transmission the USRP records.
2.) I suspect the difference results from state changes on the RF chip
(PLL_ON -> TX). Is anybody aware of a similar performance analysis of
ZigBee for real-time applications? As I stated above, I'm not a
hardware expert.
3.) I use the GNU Radio Companion and record from the USRP2 directly
into a cfile. Is there any way to apply a filter so I can
automatically detect the start and end points of a high-amplitude
burst that represents sending activity? I tried using the integrator
to lower the amplitude (I'm not interested in demodulation at this
point, just TX start and end), but that doesn't work very well.
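
To make the question concrete: offline with NumPy, the kind of detection I'm after would look something like the sketch below. The window length and threshold are values I would have to tune; none of this comes from an existing GNU Radio block.

    import numpy as np

    sample_rate = 4e6     # must match the recording (placeholder)
    win         = 64      # moving-average window in samples (tuning parameter)
    threshold   = 0.1     # envelope level separating TX from noise (tuning parameter)

    iq = np.fromfile("capture.cfile", dtype=np.complex64)

    # Envelope: magnitude smoothed by a moving average (a simple low-pass).
    env = np.convolve(np.abs(iq), np.ones(win) / win, mode="same")

    # Rising/falling edges of the thresholded envelope mark TX start and end.
    # Assumes the capture begins and ends in silence.
    gate   = (env > threshold).astype(np.int8)
    edges  = np.diff(gate)
    starts = np.flatnonzero(edges == 1)
    ends   = np.flatnonzero(edges == -1)

    for s, e in zip(starts, ends):
        print("burst at %.3f ms, length %.3f ms"
              % (s / sample_rate * 1e3, (e - s) / sample_rate * 1e3))

Ideally I'd like the same thing as a block in the flow graph, so the start/end detection happens while recording.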
I hope my terminology isn't too crude.