
Hi,

I am working on a USRP project: streaming data with a USRP N200 and a BasicRX daughterboard, using GRC 3.6.4.1 on an Ubuntu 12.04 machine. The N200 decimation factor is set to 8 to achieve a streaming rate of 12.5 MSPS, and the daughterboard subdevice is set to A:A for single-channel real sampling. The I/Q wire format is set to 16-bit integers for each of I and Q, and the host data format is set to complex float 32.
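
For reference, the byte budgets implied by the two formats differ by a factor of two. A minimal sketch of that arithmetic (this assumes, as I understand it, that the network link carries the wire format while the file sink records samples in the host format; the rates and durations are the ones from my setup above):

```python
# Byte-budget sketch for a 10 s capture at 12.5 MSPS.
# Assumption: the wire format (sc16) applies to the network link,
# while the file on disk holds the host format (complex float 32).
SAMPLE_RATE = 100e6 / 8   # 100 MS/s master clock, decimation 8 -> 12.5 MSPS
DURATION_S = 10

# Wire format: 16-bit I + 16-bit Q = 4 bytes per complex sample.
wire_bytes_per_sample = (16 + 16) // 8
# Host format: 32-bit float I + 32-bit float Q = 8 bytes per complex sample.
host_bytes_per_sample = (32 + 32) // 8

wire_bytes = SAMPLE_RATE * DURATION_S * wire_bytes_per_sample
host_bytes = SAMPLE_RATE * DURATION_S * host_bytes_per_sample

print(f"over the wire: {wire_bytes/1e6:.0f} MB")  # 500 MB
print(f"host format  : {host_bytes/1e6:.0f} MB")  # 1000 MB
# A 500,000,000-byte file at 8 bytes/sample holds:
print(500_000_000 / host_bytes_per_sample, "complex samples")
```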
The settings above are also in the attachments. For analyzing the data, I am using Matlab R2013b on Windows 7.

My question: when I stream data for about 10 seconds, I get a file of around 500,000,000 bytes, which seems reasonable:

  (16 bits I + 16 bits Q) per sample * 12.5 MSPS * 10 seconds * (1/8 byte per bit) = 500 Mbytes

But when I use read_complex_binary.m, provided by GNU Radio, to read the file in Matlab, I only get half of the expected samples (~62,500,000) instead of 125,000,000. Could anyone please explain why this is happening? Doesn't a wire format of 16 bits I and 16 bits Q give me 32 bits per sample? Any help would be appreciated.
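
In case it helps anyone reproduce this: read_complex_binary.m reads the file as interleaved 32-bit floats (I, Q, I, Q, ...) and pairs them into complex values. A minimal NumPy equivalent of that layout, assuming the file really is host-format complex float 32 (the round-trip file path and data here are just a synthetic check, not my actual capture):

```python
import os
import tempfile

import numpy as np

def read_complex_binary(path, count=-1):
    """Read a GNU Radio complex-float32 capture: interleaved
    float32 I/Q pairs, i.e. 8 bytes per complex sample."""
    return np.fromfile(path, dtype=np.complex64, count=count)

# Round-trip check on synthetic data.
path = os.path.join(tempfile.gettempdir(), "test.cf32")
ref = (np.arange(4) + 1j * np.arange(4)).astype(np.complex64)
ref.tofile(path)                    # writes 4 * 8 = 32 bytes
data = read_complex_binary(path)
assert np.array_equal(data, ref)
print(len(data), "samples,", data.nbytes, "bytes")  # 4 samples, 32 bytes
```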