I tried Greg Heckler’s suggestion of minimizing the reference
divider: running rx_cfile.py with a frequency of 1.57542G results in a
divisor of 8, whereas a frequency of 1.574G decreases it to 2. (I
probed at the chip and saw the same inexplicable factor of two disparity
against the data sheet.)
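The motivation for minimizing the divider, as I understand it: in an integer-N synthesizer the reference phase noise is multiplied up by N = f_target / f_comparison, so a smaller R divider (higher comparison frequency) means a smaller N and less multiplied noise. A rough sketch of the arithmetic — the 64 MHz reference here is assumed purely for illustration, not taken from the actual DBS-RX tuning code:

```python
import math

def pll_params(f_target_hz, f_ref_hz, r_div):
    """Comparison frequency, feedback divider N, and the factor (in dB)
    by which reference phase noise is multiplied at the PLL output."""
    f_cmp = f_ref_hz / r_div
    n = f_target_hz / f_cmp
    noise_gain_db = 20 * math.log10(n)
    return f_cmp, n, noise_gain_db

# Assuming (for illustration only) a 64 MHz reference: going from
# R = 8 down to R = 2 shrinks N by 4x, i.e. about 12 dB less
# multiplied reference noise at the 1.57542 GHz output.
_, n8, g8 = pll_params(1.57542e9, 64e6, 8)
_, n2, g2 = pll_params(1.57542e9, 64e6, 2)
```

That 12 dB is the whole appeal of a small divider; whether it matters in practice depends on whether the reference noise dominates, which the lack of improvement above suggests it doesn't.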
But I saw no particular improvement, so I guess I was already at the
noise floor of the MAX2118 synthesizer. After surfing a bit of the lore
regarding local-clock issues for GPS, though, it seems the USRP’s TCXO
is more or less par for the course: roughly 10 ppb-per-second jitter is
what a reasonable receiver needs, and the USRP is delivering this.
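For scale, here is what that 10 ppb/s figure translates to at the GPS L1 carrier — a back-of-envelope check, not a measurement:

```python
# A fractional frequency instability of 10 ppb per second, multiplied
# up to the GPS L1 carrier, appears as apparent carrier drift.
L1_HZ = 1575.42e6          # GPS L1 carrier frequency
STABILITY_PER_S = 10e-9    # ~10 ppb fractional frequency change per second

drift_hz_per_s = STABILITY_PER_S * L1_HZ   # about 15.75 Hz/s
```

That is consistent with the size of the common-mode wander in the plot below.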
Could FPGA jitter be contributing? It can be hundreds of picoseconds,
but I don’t know whether there would be low-frequency content in the
jitter spectrum. Should I try moving the R193/R194 resistor on the dbs_rx
to select clock_p (which is apparently a clean 64 MHz feed from the
chip)? The MAX2118 data sheet says 27 MHz max, but maybe that’s just
for the crystal oscillator, and the reference divider might be okay with
a higher external frequency.
Attached is a plot of carrier frequency over ten seconds for two
different satellites. Clearly the gyrations are overwhelmingly
common-mode. (The difference in doppler rate is just visible as
the traces start to converge, about 0.2 Hz/sec.) The blue trace is
about 6 dB stronger, as can be seen from the lower loop noise.
The y axis is carrier frequency in Hz; the x axis is time in milliseconds.
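One way to put a number on the common-mode claim: difference the two traces, which cancels anything shared (the clock wander), and fit a line to recover the differential doppler rate. A sketch on synthetic stand-in data, since the real traces are only in the attached plot — the noise levels and the 0.2 Hz/s rate here are made up to mimic it:

```python
import numpy as np

# Synthetic stand-ins for the two carrier traces: a shared random-walk
# clock wander plus opposite per-satellite doppler ramps.
t = np.arange(0, 10, 0.001)                       # 10 s at 1 ms spacing
rng = np.random.default_rng(0)
common = np.cumsum(rng.normal(0, 0.5, t.size))    # shared clock wander
trace_a = common + 0.1 * t + rng.normal(0, 1, t.size)  # stronger satellite
trace_b = common - 0.1 * t + rng.normal(0, 2, t.size)  # weaker, noisier one

# Differencing removes the common-mode wander exactly; a linear fit
# then estimates the differential doppler rate (Hz/s).
rate = np.polyfit(t, trace_a - trace_b, 1)[0]
```

The same differencing on the real traces would give a cleaner number than eyeballing the convergence.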
Digikey stocks some small SMD OCXOs. They cost $120 (ouch) and
dissipate 1.5 W (ouch). But fitting one to a USRP might be as easy
as tripling the 20 MHz oscillator to 60 MHz (and the edge rate is
apparently fast enough that the tripler would just be a filter).
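The “tripler is just a filter” idea rests on a fast-edged clock being essentially a square wave, whose third harmonic sits only about 9.5 dB below the fundamental (the textbook 4/(kπ) Fourier series). A quick numerical check of that ratio — this is the idealized square-wave result, not a measurement of the actual oscillator:

```python
import numpy as np

fs = 1.28e9                    # simulation sample rate
f0 = 20e6                      # oscillator fundamental
n = 1 << 16                    # an integer number of periods fits exactly
t = (np.arange(n) + 0.5) / fs  # half-sample offset avoids exact zero crossings
square = np.sign(np.sin(2 * np.pi * f0 * t))     # idealized fast-edge clock
spec = np.abs(np.fft.rfft(square)) / n

def bin_amp(f_hz):
    return spec[int(round(f_hz * n / fs))]

# Ideal square wave: 3rd harmonic is 1/3 of the fundamental (~ -9.5 dB),
# so a bandpass at 3 * f0 = 60 MHz would have plenty to work with.
ratio = bin_amp(3 * f0) / bin_amp(f0)
```

So the filter only has to make up roughly 10 dB plus whatever the real edges fall short of ideal, which seems plausible for a simple bandpass.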