USRP / DBSRX calibration

Like Eric Matlis a while ago (see the link to the thread at the bottom), I
would like to calibrate my USRP/DBSRX combination for measurements. As
noted there, the USRP is not a measurement device, but nevertheless I’d
like to give it a try. My application is GPS-based measurements in the GSM
downlink band.

Using a noise generator, I came up with the following figures:

USRP settings: DBSRX, gain 52, decimation factor 32
Calibration bandwidth: 190 kHz around the center frequency; 3 dB cable loss
accounted for

Offset (to convert from dB to dBm): -177.7
Noise floor: -112.5 dBm, sensitivity -165 dBm/Hz (NF: 8.7 dB at 20 °C)
Linear measurement range: -100 dBm to -50 dBm
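
(As a rough cross-check, these numbers hang together if one assumes a
thermal noise floor of kT, about -174 dBm/Hz near room temperature. The
little script below only re-derives the noise figure and the in-band floor
from the sensitivity and bandwidth quoted above; it is a sanity check, not
part of the calibration procedure itself.)

import math

k = 1.380649e-23          # Boltzmann constant, J/K
T = 293.15                # 20 degrees C, in kelvin
bw = 190e3                # calibration bandwidth, Hz
sens_dbm_hz = -165.0      # measured sensitivity, dBm/Hz

kT_dbm_hz = 10 * math.log10(k * T * 1000)      # thermal floor, about -173.9 dBm/Hz
nf_db = sens_dbm_hz - kT_dbm_hz                # about 8.9 dB, close to the 8.7 dB above
floor_dbm = sens_dbm_hz + 10 * math.log10(bw)  # about -112.2 dBm vs. the -112.5 dBm measured

print("kT = %.1f dBm/Hz, NF = %.1f dB, floor = %.1f dBm"
      % (kT_dbm_hz, nf_db, floor_dbm))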

I noticed that (with gain 52) the USRP goes into saturation above -50
dBm and does not quite show linear behaviour below -100 dBm.
Furthermore, -165 dBm/Hz is a better sensitivity than my spectrum
analyzer
offers, which I find rather odd. Do these figures make sense?

Best regards,
Jens E.

Link to relevant thread:
http://lists.gnu.org/archive/html/discuss-gnuradio/2007-03/msg00414.html

Replying a bit late here…

Jens E. wrote:

Calibration bandwidth: 190 kHz around the center frequency; 3 dB cable loss
accounted for

That’s a lot of cable loss. Are you sure it’s that much?

Offset (to convert from dB to dBm): -177.7
Noise floor: -112.5 dBm, sensitivity -165 dBm/Hz (NF: 8.7 dB at 20 degrees)
Linear measurement range: -100 dBm to -50 dBm

I noticed that (with gain 52) the USRP goes into saturation above -50
dBm and does not quite show linear behaviour below -100 dBm.

Not sure what you mean above. With a gain of 52, the system gain is
pretty high. For a strong signal like -50 dBm you would want to use a
lower gain.

Furthermore, -165 dBm/Hz is a better sensitivity than my spectrum analyzer
offers, which I find rather odd. Do these figures make sense?

Spectrum analyzers rarely have good noise figures. -165 dBm/Hz
translates to a roughly 9 dB noise figure, which is a little high for the
DBSRX, but not unreasonable. It is better than my spectrum analyzer.

Matt

Matt,

thanks for your reply.

Using a noise generator, I came up with the following figures:

USRP settings: DBSRX, gain 52, decimation factor 32
Calibration bandwidth: 190 kHz around the center frequency; 3 dB cable loss
accounted for

That’s a lot of cable loss. Are you sure it’s that much?

The figure I posted was wrong; it was actually 1.3 dB.

I noticed that (with gain 52) the USRP goes into saturation above -50
dBm and does not quite show linear behaviour below -100 dBm.

Not sure what you mean above. With a gain of 52, the system gain is
pretty high. For a strong signal like -50 dBm you would want to use a
lower gain.

I was trying to measure signals starting at roughly -100 dBm (400 kHz).
That’s why I chose a high gain. The “saturation effect” I’m seeing
around -50 dBm might well be due to numerical limitations of the
algorithm (integral over PSD estimate) used to calculate the power in
the band. I have to check on that again.
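
Roughly, that band-power calculation looks like the sketch below; the
Welch-style averaging and the exact normalisation are assumptions for
illustration, not necessarily what my script does, but it shows where a
scaling mistake could masquerade as saturation.

import numpy as np

def band_power_dbm(samples, fs, f_lo, f_hi, cal_offset_db=-177.7, nfft=1024):
    """Integrate a PSD estimate over [f_lo, f_hi] and convert to dBm."""
    # Averaged, windowed periodogram of the complex baseband samples.
    nseg = len(samples) // nfft
    segs = samples[:nseg * nfft].reshape(nseg, nfft)
    win = np.hanning(nfft)
    spec = np.fft.fftshift(np.fft.fft(segs * win, axis=1), axes=1)
    psd = (np.abs(spec) ** 2).mean(axis=0) / (fs * np.sum(win ** 2))   # power per Hz
    freqs = np.fft.fftshift(np.fft.fftfreq(nfft, 1.0 / fs))
    # Integral over the band of interest: sum of PSD bins times the bin width.
    band = (freqs >= f_lo) & (freqs <= f_hi)
    p = np.sum(psd[band]) * (fs / nfft)
    # The -177.7 offset from the calibration turns the dB value into dBm.
    return 10 * np.log10(p) + cal_offset_db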

Should I be able to measure linearly over more than 50 dB dynamic range
with the USRP/DBSRX combination?

Furthermore, -165 dBm/Hz is a better sensitivity than my spectrum analyzer
offers, which I find rather odd. Do these figures make sense?

Spectrum analyzers rarely have good noise figures. -165 dBm/Hz
translates to a roughly 9 dB noise figure, which is a little high for the
DBSRX, but not unreasonable. It is better than my spectrum analyzer.

Thank you very much for that insight. I was rather surprised, given the
20 dB$ price difference.

Jens

Hi,

During my tests with the DBSRX board, I noticed that applying the same
Local Oscillator offset trick (by 4 MHz) that is used with the RFX boards
enhanced the DBSRX SFDR by about 14 dB, as shown in the following
figures:

http://rapidshare.com/files/79323579/dbsrx.tar.gz

I used a 1000.25 MHz sine wave to do the tests. To simulate the DBSRX LO
offset I tweaked the usrp_fft.py set_freq() function as follows:

r = self.u.tune(0, self.subdev, target_freq+4000000)
self.u.set_rx_freq(0,4000000)
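
Folded into usrp_fft.py’s set_freq(), the same trick might look roughly
like the sketch below. The lo_offset parameter and the wrapper are just a
generalisation of the two hard-coded lines above, not code I have tested:

def set_freq(self, target_freq, lo_offset=4e6):
    # Tune the DBSRX front end lo_offset Hz above the wanted frequency, then
    # re-tune the USRP's digital down-converter by the same offset so the
    # band of interest ends up back at baseband (as in the two lines above).
    r = self.u.tune(0, self.subdev, target_freq + lo_offset)
    self.u.set_rx_freq(0, lo_offset)
    return r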

Firas A.



Hi.

I’m trying to estimate the noise temperature of the DBSRX and USRP as
part of a student project where we evaluate the USRP as an alternative
receiver for a radio telescope. I tried to convert the -165 dBm/Hz figure
to kelvin but I’m not getting a sane value. I may be way off here, but I
would be very pleased to find out what the noise temperature might be,
in kelvin.

Thanks
Staffan J.

Staffan J. wrote:

I tried to convert the -165 dBm/Hz figure to kelvin but I’m not getting a
sane value.

The MAX2118 chip that is used to do the downconversion has a very high
noise figure of about 10 dB. The DBS_RX board has a GaAsFET amplifier in
front of it, which has a roughly 0.9 dB noise figure.
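
To put those figures in kelvin (Staffan’s question): a noise figure F and
an equivalent noise temperature are related by T = 290 * (F - 1), and a
noise density of -165 dBm/Hz is just k * (290 K + T) expressed in dBm/Hz.
A quick sketch using only the numbers already quoted in this thread (treat
the output as ballpark values, not a calibration):

k = 1.380649e-23   # Boltzmann constant, J/K
T0 = 290.0         # standard reference temperature, K

def nf_db_to_temp(nf_db):
    # Equivalent (excess) noise temperature for a given noise figure in dB.
    return T0 * (10 ** (nf_db / 10.0) - 1.0)

print(nf_db_to_temp(0.9))    # GaAsFET preamp alone: about 67 K
print(nf_db_to_temp(10.0))   # MAX2118 alone: about 2610 K
print(nf_db_to_temp(9.0))    # the ~9 dB system NF from earlier: about 2010 K

# Going the other way: -165 dBm/Hz divided by k gives the total noise
# temperature seen at the input, about 2290 K, i.e. roughly 2000 K of
# receiver noise on top of the 290 K reference source.
p_w_per_hz = 10 ** ((-165.0 - 30.0) / 10.0)   # dBm/Hz -> W/Hz
print(p_w_per_hz / k)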

Unless you have the DBS_RX right at the receive horn, you’ll need your
own LNA chain in front of the DBS_RX. Go with an LNA with the lowest
noise figure you can afford. The LNAs produced by Radio Astronomy
Supplies are pretty good, and there’s a fellow in Switzerland who
produces semi-custom LNAs for the 21 cm band that are really well built.
I use three LNAs in series in my system, all from Down East Microwave.
My first LNA is attached directly to the feedhorn, followed by a ceramic
filter cut for 1420 MHz, followed by another pair of LNAs. I did it this
way because I had a lot of feedline to drive, and you could certainly get
away with less front-end gain. But your first low-noise amplifier element
had better be right at the feedhorn, with no feedline in between
(high-quality RF connectors between the feedhorn and the first LNA are
OK, but keep them to an absolute minimum).
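
The arithmetic behind “put the first LNA at the feedhorn” is the Friis
cascade formula: each stage’s excess noise temperature is divided by the
total gain in front of it, so a low-noise first stage with decent gain
makes everything behind it nearly irrelevant. A small sketch (the
0.5 dB / 20 dB figures for the external LNA are assumptions for
illustration, not measured values):

def cascade_noise_temp(stages, T0=290.0):
    # stages: list of (noise_figure_dB, gain_dB), ordered from the antenna
    # towards the receiver. Returns the equivalent noise temperature of the
    # whole cascade (Friis formula).
    total, gain = 0.0, 1.0
    for nf_db, gain_db in stages:
        f = 10 ** (nf_db / 10.0)
        total += T0 * (f - 1.0) / gain
        gain *= 10 ** (gain_db / 10.0)
    return total

# USRP/DBSRX receiver on its own, using the ~9 dB noise figure quoted
# earlier in the thread: about 2000 K.
print(cascade_noise_temp([(9.0, 0.0)]))

# The same receiver behind one good LNA right at the feedhorn
# (0.5 dB NF, 20 dB gain assumed): about 56 K.
print(cascade_noise_temp([(0.5, 20.0), (9.0, 0.0)]))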

My overall system temperature with this setup was about 95 K.


Marcus L.

Jens E. wrote:

I was trying to measure signals starting at roughly -100 dBm (400 kHz).
That’s why I chose a high gain. The “saturation effect” I’m seeing
around -50 dBm might well be due to numerical limitations of the
algorithm (integral over PSD estimate) used to calculate the power in
the band. I have to check on that again.

Should I be able to measure linearly over more than 50 dB dynamic range
with the USRP/DBSRX combination?

I would think so. Also, one thing to keep in mind is that the DBSRX has
a very complex collection of gain settings. You have four interacting
parts:

  • RF Gain (GC1, set by a DAC)
  • Baseband gain (GC2, set by serial bus)
  • DL, a 1-bit scale control set by serial bus
  • Programmable gain amp, in the AD9862 ADC, set digitally

The current formula for setting all of these is most likely NOT
optimal. In fact, it may be quite far from optimal. If you care about
dynamic range, you should take a look at the code in
gr-usrp/src/db_dbs_rx.py and optimize it. You might even want to call
the individual gain controls from your application instead of the main
gain control function.
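
In very rough terms, driving the individual controls from the application
could look like the sketch below. The method names are placeholders, not
the actual API; look at db_dbs_rx.py for the real calls and their value
ranges before copying anything:

def set_dbsrx_gains(subdev, rf_gain, bb_gain, dl_on, pga_gain):
    # Placeholder method names for illustration only; check db_dbs_rx.py
    # for the real gain-control functions and their units/ranges.
    subdev._set_gc1(rf_gain)    # RF gain (GC1), set via a DAC
    subdev._set_gc2(bb_gain)    # baseband gain (GC2), set over the serial bus
    subdev._set_dl(dl_on)       # DL, the 1-bit scale control on the serial bus
    subdev._set_pga(pga_gain)   # programmable gain amp in the AD9862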

I like the concept of dB$ !!!

In any case, a spectrum analyzer typically has a step attenuator at the
front, and on most of them it is intentionally difficult to set it to
0 dB. Mine has a minimum of 10 dB, which just adds 10 dB to the noise
figure right there.

Matt