Have some questions regarding received signal levels and what seems like
LO leakage on the transmit side.
We are using an N210 with a WBX daughterboard (UHD driver). Both issues
seem to be largely independent of carrier frequency.
-
Receive sensitivity - We’re cabling in a CDMA2000 carrier (1.2288 Mcps
chip rate, ~1.23 MHz bandwidth) from a femtocell base station we have.
The signal level at the USRP input is about -80 dBm. We’ve done a simple
test with GRC to receive this signal, write it to a file, and do PN
correlations in Matlab to verify the integrity of the data. With the
signal level at -80 dBm and the rx_gain in the GRC GUI set to maximum
(38 dB), the I/Q samples only toggle about 3 bits (values of roughly
+/- 4 or 5). These values seem a bit low for this signal level. On other
(non-USRP) SDR receivers we use at maximum gain, over-the-air signals
from base stations at about -90 to -95 dBm toggle about 6 bits on the
ADC.
In short, I’m wondering if this is expected behavior or if there is
another analog gain setting to change on the receive side.
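For reference, the receive test boils down to roughly the following.
This is only a minimal sketch of what GRC generates (exact block and
module names depend on the GNU Radio version); the device address and
output filename are placeholders:

    from gnuradio import gr, uhd, blocks

    class rx_to_file(gr.top_block):
        def __init__(self):
            gr.top_block.__init__(self)
            src = uhd.usrp_source(
                "addr=192.168.10.2",   # placeholder N210 address
                uhd.stream_args(cpu_format="fc32", channels=[0]),
            )
            src.set_samp_rate(4e6)      # 4 MS/s around the ~1.23 MHz carrier
            src.set_center_freq(880e6)  # CDMA2000 downlink from the femtocell
            src.set_gain(38)            # rx_gain in dB (max of our GRC slider)
            sink = blocks.file_sink(gr.sizeof_gr_complex, "rx_880MHz.dat")
            self.connect(src, sink)

    if __name__ == "__main__":
        rx_to_file().run()   # stop with Ctrl-C once enough samples are captured

We then load the recorded file in Matlab and correlate against the known
PN sequence.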
-
Transmit leakage
We’ve set up a simple program in GRC to take the received IQ from the
same base station and retransmit it on a different frequency.
The receive frequency is in the 800 MHz band, and we’ve tried
transmitting on both 500 MHz and 1 GHz. Looking at the output on a
spectrum analyzer, as well as plotting the IQ in Matlab, there is strong
carrier leakage; in fact the carrier dominates the signal for all
practical input signal levels. With the input at -30 dBm and the rx_gain
set to the maximum of 38 dB, the signal finally begins to overtake the
carrier, but that input level is obviously not practical. Furthermore,
the gain parameter in GRC’s UHD: USRP Sink block seems to have no
effect, based on observing the live spectrum of the transmitted signal.
The DC offset of the receive data seems to be negligible - we write the
samples to a file before they go out.
I’m wondering what could be causing such strong LO leakage.
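For what it’s worth, the DC-offset check we do in Matlab amounts to the
following numpy sketch on the recorded file (the filename is a
placeholder):

    import numpy as np

    # Raw complex-float samples written by the GRC file sink.
    iq = np.fromfile("rx_880MHz.dat", dtype=np.complex64)
    print("DC offset (mean):", iq.mean())
    print("RMS level       :", np.sqrt(np.mean(np.abs(iq) ** 2)))
    print("peak |I|, |Q|   :", np.abs(iq.real).max(), np.abs(iq.imag).max())

The mean comes out tiny compared to the RMS level, which is why I say
the DC offset of the data we send looks negligible.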
On 08/02/2011 08:10 AM, Delgado, Christopher wrote:
> and the rx_gain in the GRC GUI set to maximum (38 dB), the I/Q
> samples only toggle about 3 bits (values of roughly +/- 4 or 5). These
> values seem a bit low for this signal level. On other (non-USRP) SDR
> receivers we use at maximum gain, over-the-air signals from base
> stations at about -90 to -95 dBm toggle about 6 bits on the ADC. In
> short, I’m wondering if this is expected behavior or if there is
> another analog gain setting to change on the receive side.
You didn’t state what frequency you are using or what decimation, but
the noise figure should be under 6 dB, and that is the only true measure
of sensitivity.
You can always add more digital gain in the FPGA, but I don’t think you
need that.
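If all you want is larger sample values on the host side, a
multiply-const block after the UHD source accomplishes the same thing as
digital gain in the FPGA - neither adds any real sensitivity. A rough
sketch, with a placeholder device address, filename, and scale factor
(exact block names vary between GNU Radio versions):

    from gnuradio import gr, uhd, blocks

    class rx_scaled(gr.top_block):
        def __init__(self):
            gr.top_block.__init__(self)
            src = uhd.usrp_source(
                "addr=192.168.10.2",   # placeholder N210 address
                uhd.stream_args(cpu_format="fc32", channels=[0]),
            )
            src.set_samp_rate(4e6)
            src.set_center_freq(880e6)
            src.set_gain(38)
            scale = blocks.multiply_const_cc(16.0)   # ~24 dB digital gain
            sink = blocks.file_sink(gr.sizeof_gr_complex, "rx_scaled.dat")
            self.connect(src, scale, sink)

    if __name__ == "__main__":
        rx_scaled().run()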
Are you sure you are using the correct antenna input? Also, the
daughterboard is static sensitive. To check whether yours is working,
please feed in a sine wave at a known level from a signal generator and
send us a screenshot of the uhd_fft display along with the settings you
used.
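Something along the lines of the following should do (the exact option
flags may differ on your install; check uhd_fft.py --help):

    uhd_fft.py -f 880e6 -s 4e6 -g 38

and note the frequency, sample rate, gain, and generator level you used.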
> GRC’s UHD: USRP Sink block seems to have no effect, based on
> observing the live spectrum of the transmitted signal. The DC offset
> of the receive data seems to be negligible - we write the samples to a
> file before they go out.
> I’m wondering what could be causing such strong LO leakage.
I am confused here. Is this carrier leakage out of the transmit port,
or leakage of TX into the RX?
Matt
Thanks for the feedback.
Regarding the first part, our receive signal from the BTS is at about
880 MHz and our sample rate is set to 4 MS/s.
We are definitely using the correct antenna ports, assuming they are
labeled correctly on the hardware. The uhd_fft.py screenshot is attached
- this is from a signal generator outputting at -60 dBm.
Regarding the transmit leakage, I am referring to leakage out of the
transmit port. I see the CDMA spectrum sitting well below a dominant
carrier on the spectrum analyzer over the ranges I mentioned. In the IQ
data in Matlab I also see the data riding on the dominant carrier in the
time domain. All the GRC program does is retransmit the received samples
on a different frequency. The CDMA spectrum hiding under the carrier has
the correct bandwidth (~1.23 MHz).
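For completeness, the relay flowgraph is essentially the following
(minus the file-sink tap we use to check the data). This is only a
sketch of what GRC generates; the device address and gain values are
placeholders:

    from gnuradio import gr, uhd

    class relay(gr.top_block):
        def __init__(self):
            gr.top_block.__init__(self)
            src = uhd.usrp_source(
                "addr=192.168.10.2",   # placeholder N210 address
                uhd.stream_args(cpu_format="fc32", channels=[0]),
            )
            snk = uhd.usrp_sink(
                "addr=192.168.10.2",
                uhd.stream_args(cpu_format="fc32", channels=[0]),
            )
            src.set_samp_rate(4e6)
            snk.set_samp_rate(4e6)
            src.set_center_freq(880e6)   # BTS downlink we receive
            src.set_gain(38)             # RX gain in dB
            snk.set_center_freq(1e9)     # retransmit frequency
            snk.set_gain(20)             # TX gain in dB (placeholder value)
            self.connect(src, snk)

    if __name__ == "__main__":
        relay().run()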