On Tue, Mar 15, 2011 at 4:48 PM, Scott Johnston
[email protected] wrote:
I wrote a simple GRC graph to compute power spectra of the output of the
USRP. My test setup is a signal generator outputting a tone at the same
frequency that I set as the USRP center frequency, directly connected to the
USRP. But when I examine the output in MATLAB, the first vector is exactly
what I expect: a peak at the center and then the noise floor. The following
vectors show the peak at lower and lower power levels until, after 10 or so,
it disappears into the noise. My code is attached. Does anybody know what could
be causing this?
My guess is that you are seeing a DC offset correction loop closing.
This would occur if the RF tone of interest is downconverted to DC
(which sounds like what you described). I don’t recall how DC offset
correction is actually implemented in USRP land, but it may very well
take a few tens/hundreds of samples before it is completely corrected,
which lines up with what you’re seeing.
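To make the behavior concrete, here is a minimal sketch (not the actual USRP implementation) that models DC offset correction as a slow single-pole feedback loop: the loop estimates the DC component and subtracts it, so a tone sitting exactly at DC is gradually nulled while an offset tone passes through untouched. The loop gain `mu` and sample rate are assumed illustrative values.

```python
import numpy as np

fs = 1e6    # sample rate in Hz (assumed for illustration)
mu = 1e-3   # loop gain (assumed); time constant ~1/mu samples
n = np.arange(50_000)

def dc_blocker(x, mu):
    """Single-pole DC offset correction loop:
    y[n] = x[n] - dc_est;  dc_est += mu * y[n]"""
    dc = 0.0 + 0.0j
    y = np.empty_like(x)
    for i, s in enumerate(x):
        y[i] = s - dc
        dc += mu * y[i]
    return y

tone_dc  = np.exp(2j * np.pi * 0.0   * n / fs)  # RF tone downconverted to DC
tone_off = np.exp(2j * np.pi * 100e3 * n / fs)  # tone 100 kHz off the LO

out_dc  = dc_blocker(tone_dc,  mu)
out_off = dc_blocker(tone_off, mu)

p = lambda v: float(np.mean(np.abs(v) ** 2))
print(p(out_dc[:5000]),  p(out_dc[-5000:]))   # DC tone decays into the loop
print(p(out_off[:5000]), p(out_off[-5000:]))  # offset tone stays near unit power
```

The tone at DC starts at full power and is driven to zero over a few loop time constants, matching the "peak fades after 10 or so vectors" symptom; the 100 kHz tone is unaffected.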
Try changing your sig gen to output a tone at, say, 100 kHz from your
RF tuning frequency; the peak should then not attenuate over time.
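As a quick sanity check on that suggestion, here is a hedged sketch (sample rate and FFT size are assumptions) showing that a tone 100 kHz from the center frequency lands 100 kHz from DC in the baseband spectrum, well clear of the DC bin where the offset loop operates:

```python
import numpy as np

fs = 1e6      # assumed sample rate, 1 MS/s
nfft = 1024   # assumed FFT size
n = np.arange(nfft)

# Baseband view of an RF tone placed 100 kHz above the LO
x = np.exp(2j * np.pi * 100e3 * n / fs)

spec = np.abs(np.fft.fft(x)) ** 2
peak_bin = int(np.argmax(spec))
peak_hz = float(np.fft.fftfreq(nfft, d=1/fs)[peak_bin])
print(peak_bin, peak_hz)  # peak near 100 kHz, not in bin 0 (DC)
```

In MATLAB you would see the same thing: the peak shifts 100 kHz off center and holds constant power across successive vectors.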