For complex IQ data values, what are the min/max ranges, both for data
coming from the USRP and for data sent to the USRP?
There are three obvious guesses:
- Normalized to +/-1.0.
- Direct match to the ADC/DAC ranges.
- Normalized to a nominal ADC/DAC range.
The first is right out, since the values coming back are typically
much, much larger than that.
The second would make the RX range (the receive path uses a 12-bit ADC)
either 0 to 4095 or -2048 to +2047, while the TX range (the transmit
path uses a 14-bit DAC) would be either 0 to 16383 or -8192 to +8191.
However, while I see both positive and negative values in the RX data
stream, I also see values well above even the unsigned range.
This leads me to suspect that the data is being normalized to signed
16-bit values in both directions, yielding a range of -32,768 to
+32,767, by the simple expedient of left-shifting the ADC output by
four places and right-shifting the samples headed to the DAC by two
places (with sign extension, of course). This seems reasonable, as it
would make migrating to higher (or lower) resolution ADCs and DACs
transparent, at least until devices with more than 16 bits of
resolution are used.
Is that correct, at least where the useful range of values for complex
IQ data is concerned?
Thanks.