I’ve been using a waveform driver with known amplitude to calibrate the
counts reading on the USRP/LFRX when I use the usrp_cfile_rx.py script
file with decimation of 32 (2 MHz sampling).
I performed the calibration using 0 dB gain on the USRP, and 10 dB gain,
for various frequencies from 10 to 150 kHz. The difference in measured
voltage between 0 dB and 10 dB was a factor of about 3.04 for all
frequencies.
I know that the voltage gain with 10 dB should be a factor of 3.2. Does
anyone know if this is purely in an ideal state? I’m guessing that is
the case and my setup is working properly?
On Fri, 2008-12-05 at 08:09 -0800, dan s wrote:
> Does anyone know if this is purely in an ideal state? I’m guessing
> that is the case and my setup is working properly?
A 10 dB power delta corresponds to a factor of ~3.16 in amplitude.
An amplitude change of 3.04 is 20*log10(3.04) ≈ 9.66 dB.
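The dB-to-amplitude arithmetic above is easy to sanity-check yourself; a minimal Python sketch (the function names are my own, not from any GNU Radio script):

```python
import math

def db_to_voltage_ratio(db):
    """Convert a power gain in dB to a voltage (amplitude) ratio."""
    return 10 ** (db / 20.0)

def voltage_ratio_to_db(ratio):
    """Convert a voltage (amplitude) ratio to a power gain in dB."""
    return 20 * math.log10(ratio)

# A 10 dB gain step should ideally scale the voltage by ~3.162.
print(db_to_voltage_ratio(10))

# The measured factor of 3.04 works out to ~9.66 dB of actual gain.
print(voltage_ratio_to_db(3.04))
```

So the measured 3.04 amplitude ratio falls about 0.34 dB short of the nominal 10 dB step.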
The AD9862 datasheet says the PGA gain error is ±0.3 dB. I suspect if
you look at the tolerances in your measurement setup you’ll find you are