Delay

Dear All,

I have made a little program of my own (based on libusrp) that reads data
from a file and sends it to the DAC of the USRP (basic daughterboard),
while simultaneously receiving data and saving it to a file, all at a
2 MHz sample rate.

It seems to work well, but when I connect the DAC directly to the ADC I
see a delay of 7.65 ms. I know that my program introduces some delay, but
not that much (I believe). Does anyone know what the minimum delay would
be, and what the contributing sources are? Does it vary between different
computers?

BR/
Per Z.

On Tue, Mar 04, 2008 at 11:32:46AM +0100, Per Z. wrote:


To reduce the delay, you’ll want to specify the fusb_* parameters to
the constructor. This is assuming you’re running under GNU/Linux.

Eric
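For a rough feel of how those parameters translate into delay, here is a small sketch. It assumes the fast-USB layer holds fusb_nblocks buffers of fusb_block_size bytes each, and that samples are complex 16-bit (4 bytes per sample); the default values used below (4096-byte blocks, 16 blocks) are an assumption about libusrp, not verified against the source.

```python
def usb_buffer_latency_s(fusb_block_size, fusb_nblocks,
                         sample_rate_hz, bytes_per_sample=4):
    """Worst-case delay contributed by the fast-USB buffering:
    total buffered bytes, converted to samples, divided by the rate."""
    total_samples = fusb_block_size * fusb_nblocks / bytes_per_sample
    return total_samples / sample_rate_hz

# Assumed defaults (4096-byte blocks x 16 blocks) at the 2 MHz sample
# rate from the original post:
print(usb_buffer_latency_s(4096, 16, 2e6))  # 0.008192 s, i.e. ~8 ms
```

If those assumed defaults are right, the USB buffering alone would account for most of the 7.65 ms observed, which is why shrinking the fusb_* values (or the application's own buffers) brings the delay down.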

Per-


After implementing Eric’s advice, please post the minimum delay value
you obtain.
I’m interested to hear. Thanks.

-Jeff


Changing the fusb_* parameters didn't change my results. By reducing the
buffer size (of the reads and writes), the delay is reduced to around
1 ms. I do have some problems with underruns, and they are more frequent
when I use small buffer sizes, but for short runs (0.25 s) it's quite OK.

Thanks Per. This gives some idea of the range of typical in-to-out delay
to expect (1 to 7 ms).

-Jeff
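The numbers in the thread are consistent with simple buffering arithmetic: at 2 MS/s, each buffered sample costs 0.5 µs of delay. A minimal check (the buffer sizes below are illustrative, not Per's actual values):

```python
def buffer_delay_ms(buf_samples, sample_rate_hz):
    """One-way delay added by holding buf_samples in a buffer."""
    return buf_samples / sample_rate_hz * 1e3

fs = 2e6  # sample rate from the original post
# 16384 buffered samples give ~8 ms (near the high end first measured);
# 2048 samples give ~1 ms (the reduced-buffer result), at the cost of
# servicing the device 8x as often, hence the more frequent underruns.
for n in (16384, 2048):
    print(n, buffer_delay_ms(n, fs), "ms")
```

This also shows why the trade-off exists: smaller buffers lower the latency linearly, but shorten the deadline the host has to refill them.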
