Forum: GNU Radio USRP Delays

sri ram (Guest)
on 2008-11-28 21:48
(Received via mailing list)
Hi everyone,
I am interested in knowing the delay jitter of the total transmission time of a packet/waveform. Specifically, I want to know the time between when the flowgraph is started in Python (tb.run or tb.start) and when the first sample is transmitted into the air from the USRP hardware, and whether I can reduce the jitter in this time (across runs) to as low a value as possible.

I use usrp_siggen.py with gnuradio-3.1.2 to transmit a square waveform. I try to start the flowgraph at a precise time x (in microseconds) by computing y = x - time.time(), calling time.sleep(y), and then tb.start(), time.sleep(0.1), and tb.stop(). I also use an interp of 32 at the Tx. I have a receiver that logs all data (with -d 64), and I observe the samples just out of the USRP source.
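
In case it helps, the scheduling part of my script is roughly the sketch below (not the full usrp_siggen.py code; tb is the usrp_siggen top block built elsewhere, and x is the target start time in the same units as time.time(), i.e. seconds since the epoch):

    import time

    def transmit_at(x, tb):
        # Sleep until the absolute wall-clock time x, then run the
        # flowgraph for about 100 ms.
        y = x - time.time()   # seconds remaining until the target start time
        if y > 0:
            time.sleep(y)     # granularity/jitter here depends on the OS scheduler
        tb.start()            # non-blocking: starts the flowgraph threads
        time.sleep(0.1)       # let the flowgraph transmit for ~0.1 s
        tb.stop()             # stop producing samples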

For transmissions that are precisely spaced in time using appropriate values of x, the inter-transmission delay measured at the receiver is off by more than 2 ms (despite using nice to increase the priority of this process).

1. Is it possible to reduce the jitter between successive Tx delay measurements to a few tens of microseconds or less?

2. Is there a way to run the usrp_siggen code as a kernel module to improve the delay jitter performance?

Thanks in advance for your help,
Sri