I have made my own little program (based on libusrp) that reads data from a file and sends it to the DAC of the USRP (basic daughterboard), while simultaneously receiving data and saving it to a file, all at a 2 MHz sample rate. It seems to work well. But when I connect the DAC directly to the ADC, I see a delay of 7.65 ms. I know that my program introduces some delay, but not that much (I believe). Does anyone know what the minimum delay would be, and what the contributing sources are? Does it vary between different computers?
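For reference, a full-duplex loop like the one described above might look roughly like this with the pre-UHD libusrp C++ classes. This is only a sketch: the usrp_standard_rx/tx make() argument lists and return types vary between libusrp releases, so treat the constructor calls, decimation/interpolation values, and file names as assumptions.

#include <usrp_standard.h>
#include <cstdio>

int main()
{
    // 64 MS/s ADC / decim 32 = 2 MS/s; 128 MS/s DAC / interp 64 = 2 MS/s.
    usrp_standard_rx *rx = usrp_standard_rx::make(0, 32);
    usrp_standard_tx *tx = usrp_standard_tx::make(0, 64);
    if (!rx || !tx) { fprintf(stderr, "could not open USRP\n"); return 1; }

    FILE *fin  = fopen("tx.dat", "rb");   // interleaved signed 16-bit I/Q
    FILE *fout = fopen("rx.dat", "wb");

    const int BUFSIZE = 4096;             // bytes per read()/write() call
    short buf[BUFSIZE / sizeof(short)];
    bool underrun = false, overrun = false;

    rx->start();
    tx->start();

    while (fread(buf, 1, BUFSIZE, fin) == (size_t)BUFSIZE) {
        tx->write(buf, BUFSIZE, &underrun);        // one buffer to the DAC
        if (underrun) fputs("uU", stderr);

        int n = rx->read(buf, BUFSIZE, &overrun);  // one buffer from the ADC
        if (overrun) fputs("uO", stderr);
        if (n > 0) fwrite(buf, 1, n, fout);
    }

    tx->stop();
    rx->stop();
    fclose(fin);
    fclose(fout);
    return 0;
}

Every buffer of BUFSIZE bytes sitting between the file reader and the converters adds latency, which is one of the contributing sources asked about.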
After implementing Eric’s advice, please post the minimum delay value you obtain. I’m interested to hear. Thanks.
Changing the fusb_* parameters didn’t change my results. By reducing the buffer size (of the reads and writes), the delay is reduced to around 1 ms. I have some problems with underruns, and they are more frequent when I use small buffer sizes; still, for short runs (0.25 s) it’s quite OK.

BR/
Per Z.
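The ~1 ms figure is consistent with simple buffering arithmetic: at 2 MS/s with 16-bit I and Q, each complex sample is 4 bytes, so every buffer held between the host and the converters adds delay in proportion to its size. A quick back-of-the-envelope check (the buffer sizes below are only illustrative, not the actual fusb_block_size/fusb_nblocks defaults):

#include <cstdio>

int main()
{
    const double sample_rate      = 2e6;  // complex samples per second
    const double bytes_per_sample = 4.0;  // 16-bit I + 16-bit Q

    // Illustrative amounts of data in flight; the real fusb defaults
    // depend on the libusrp version and operating system.
    const int sizes[] = { 512, 2048, 4096, 16384, 65536 };
    for (int bytes : sizes) {
        double ms = bytes / (bytes_per_sample * sample_rate) * 1e3;
        printf("%6d bytes in flight = %6.3f ms of delay\n", bytes, ms);
    }
    return 0;
}

By this estimate, 4096 bytes buffered corresponds to about 0.5 ms, and a few pipelined USB blocks are enough to account for the 7.65 ms originally observed; shrinking the buffers trades that delay against the underrun risk described above.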
Thanks Per. This gives some idea of the range of typical in-to-out delay to expect (1 to 7 msec).
-Jeff