Decimation Rates

I’m using a USRP and the usrp_rx_cfile program to record data for later
processing. When I graph the recorded data, it looks choppy as if I
need to sample at a higher rate. I have done a few runs at various
decimation rates and I’m confused by the resulting file sizes. When I
set the decimation rate to 4, 8, or 16 for a constant period, the
resulting recordings are approximately the same size. At decimation
rates of 32 or higher, the recording sizes reduce dramatically. I would
expect the recording sizes to scale inversely with the decimation
rate. The fact that this isn’t happening makes me
suspicious of whether changing the decimation rate is really doing
anything. I thought the USB throughput might be the limiting factor
when decimation rates were low, so I tried recording shorts rather
than floats to reduce bandwidth usage, but I saw the same results.
What am I missing?
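
For reference, here is a quick Python sketch of how the file sizes
should scale with decimation, assuming a USRP1 with a 64 MS/s ADC
clock and complex samples on disk (the constants and helper below are
illustrative, not part of usrp_rx_cfile):

    # Expected capture size vs. decimation rate (assumes USRP1, 64 MS/s ADC).
    # Complex floats take 8 bytes/sample on disk; complex shorts take 4.
    ADC_RATE = 64e6

    def expected_bytes(decim, seconds, bytes_per_sample=8):
        sample_rate = ADC_RATE / decim   # complex samples/s after decimation
        return sample_rate * seconds * bytes_per_sample

    for decim in (4, 8, 16, 32, 64):
        mb = expected_bytes(decim, seconds=10) / 1e6
        print("decim %3d: ~%.1f MB for a 10 s capture" % (decim, mb))

Comparing these expected sizes against the actual files shows how many
samples are going missing.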

On Thu, Oct 04, 2007 at 10:52:17PM -0400, Andrew B. wrote:

> anything. I thought the USB throughput might be the limiting factor
> when decimation rates were low, so I tried recording shorts rather
> than floats to reduce bandwidth usage, but I saw the same results.
> What am I missing?

Is it reporting overruns? Do you see uOuOuO… on stdout?
If so, you’re not keeping up and are dropping samples.
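
If the console output was redirected to a file, a small check like
this will count them (the log filename is a placeholder, not anything
usrp_rx_cfile writes on its own):

    # Count USRP overrun markers ("uO") in a saved console log.
    # "capture.log" is a placeholder -- use whatever file you redirected
    # usrp_rx_cfile's stdout/stderr into.
    def count_overruns(logfile="capture.log"):
        with open(logfile) as f:
            return f.read().count("uO")

    print("overruns reported: %d" % count_overruns())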

If you’re running decim = 8, you’re producing data at 8 MS/s (32 MB/s
across the USB, and to disk if you’re storing as shorts). You may be
running into filesystem or disk throughput problems. If you’re
writing data to an ext3 filesystem (the default under Linux), edit
/etc/fstab so that it’s mounted as ext2 instead of ext3 and
reboot. Ext3 filesystems don’t handle continuous streaming well
because they periodically “go dead” while committing their journal.
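
As a rough guide, the sustained rates involved look like this, again
assuming a 64 MS/s USRP1 ADC, 16-bit I/Q over the USB, and either
complex floats or shorts on disk (illustrative numbers only):

    # Sustained throughput needed per decimation rate (assumes 64 MS/s ADC).
    # USB carries 16-bit I/Q (4 bytes/complex sample); on disk, complex
    # floats take 8 bytes/sample and shorts take 4.
    ADC_RATE = 64e6

    for decim in (4, 8, 16, 32):
        rate = ADC_RATE / decim   # complex samples/s
        print("decim %2d: USB %5.1f MB/s, disk %5.1f MB/s (floats) / %5.1f MB/s (shorts)"
              % (decim, rate * 4 / 1e6, rate * 8 / 1e6, rate * 4 / 1e6))

If the required rate exceeds what the disk can sustain, samples get
dropped and the recordings plateau in size regardless of the
decimation setting, which would match what you’re seeing.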

Also, when you say “a constant period”, are you referring to watching
the clock, or are you specifying the -N command line argument? Use
the -N argument.
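
If it helps, translating a wall-clock capture length into a -N sample
count is straightforward, again assuming the 64 MS/s USRP1 ADC clock
(the helper name below is mine, not part of the tool):

    # Samples to request with -N for a given capture length
    # (assumes a 64 MS/s USRP1 ADC clock).
    ADC_RATE = 64e6

    def nsamples_for(seconds, decim):
        return int(ADC_RATE / decim * seconds)

    # e.g. a 5-second capture at decim 32:
    print(nsamples_for(5, 32))   # 10000000 -> pass "-N 10000000"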

Eric