On Tue, Jul 29, 2008 at 5:42 AM, isaacgerg [email protected] wrote:

> If this is true, why is it that the residual carrier is reported as zero?
> Are you suggesting that this value is not correct?
What GRC is reporting as “residual carrier” is not what you seem to
think it is. It refers to how close the software was able to tune the
USRP to your requested frequency. Usually this is in the millihertz
range, or zero, which is what you are seeing.
However, the tuning calculation assumes the USRP clock is exactly 64
MHz, which it never is. Each USRP clock will be slightly different in
frequency, depending on manufacturing tolerances, temperature, age,
and other factors. That is why it is rated at 20 ppm (parts per
million): the actual frequency can lie anywhere from -20 ppm to +20
ppm away from 64 MHz, or ±1280 Hz. When tuned to a 24 MHz center
frequency, the USRP/BasicRX may therefore actually end up anywhere
within ±480 Hz of that frequency. And it's not even fixed; it will
still drift within this range, primarily due to thermal transients.
This difference from
exactly 24 MHz is not what the “residual carrier” term is reporting,
and there is no way for the software to “know” what this offset is.
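To make the arithmetic above concrete, here is a small Python sketch; the helper name is mine, not anything from GNU Radio, and the numbers are just the ones quoted above:

```python
# Worst-case frequency uncertainty from a clock with a given ppm rating.
# (Hypothetical helper for illustration; not a GNU Radio API.)

def ppm_offset_hz(center_freq_hz, ppm):
    """Return the worst-case frequency error in Hz for a given ppm rating."""
    return center_freq_hz * ppm / 1e6

print(ppm_offset_hz(64e6, 20))   # 64 MHz master clock -> prints 1280.0
print(ppm_offset_hz(24e6, 20))   # 24 MHz tuned frequency -> prints 480.0
```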
So even if two USRPs are tuned to what they each think is 24 MHz, and
their tune functions report a "residual carrier" of 0, one will be
receiving at a slightly different frequency than the other is
transmitting, perhaps by nearly a kilohertz (2*480 Hz). As many
have mentioned, this will cause your received data to rotate in phase
at the difference in frequency, resulting in "flipped bits". (The
exact effect depends on which type of modulation you are using.)
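You can see the flipping in a few lines of NumPy. This is a bare sketch, not GNU Radio code, and the symbol rate and offset values are made up for illustration: BPSK symbols received with an uncorrected frequency offset rotate in phase, so a naive hard decision on the real part eventually inverts even though the transmitter only sent +1:

```python
import numpy as np

symbol_rate = 1000.0          # symbols per second (assumed)
freq_offset = 100.0           # Hz of uncorrected carrier offset (assumed)
n = 20
bits = np.ones(n)             # transmit all-ones BPSK (+1)
t = np.arange(n) / symbol_rate
rx = bits * np.exp(2j * np.pi * freq_offset * t)   # constellation rotates
decisions = np.real(rx) > 0
# After a quarter turn of phase the sign of the real part inverts,
# so later "received bits" flip even though only +1 was ever sent.
```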
Thus, your USRP is working, and 24 MHz is an okay frequency to tune
the BasicRX. You just haven't implemented any receiver
synchronization in your software.
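As a minimal illustration of what that synchronization accomplishes, here is a sketch that assumes the offset is already known (in a real receiver it must be estimated, e.g. with a Costas loop): multiplying by a counter-rotating complex exponential de-rotates the constellation, and the hard decisions come out right again.

```python
import numpy as np

symbol_rate = 1000.0
freq_offset = 100.0           # assumed residual offset in Hz
n = 20
t = np.arange(n) / symbol_rate
rx = np.exp(2j * np.pi * freq_offset * t)          # rotated +1 symbols
# De-rotate with the conjugate exponential (offset assumed known here):
corrected = rx * np.exp(-2j * np.pi * freq_offset * t)
decisions = np.real(corrected) > 0                 # all decisions correct
```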
Corgan Enterprises LLC