GRC and gr_fft_vcc function showing different signal power in dB

Hi all,

I am new to USRP, and I started by trying to calibrate the received power
in the USRP using a known sine signal. I fed a 100.2 MHz tone at -30 dBm
into the USRP, which sampled it at 1 MHz and passed it to a 1024-point FFT
scope in GNU Radio Companion. I saw a level of about -52 dB at 100.2 MHz.

I also modified usrp_spectrum_sense.py to compute the FFT of the same
signal using the gr_fft_vcc function and then wrote the magnitude-squared
values to a CSV file. When I took 10*log10(value), the plot showed a level
of about -30 dB at 100.2 MHz. Following are my queries:

  1. Is the power level in GRC -52 dB or -52 dBm? Either way, why is
     there a -22 dB loss? (My antenna and channel gains were set to 0 dB.)

  2. Why do the FFT squared amplitudes in dB differ between the two, and
     which one is correct?

Regards,
Hemant

On 10/17/2012 05:21 AM, Hemant Saggar wrote:

at 100.2 MHz. Following are my queries:

  1. Is the power level in GRC -52 dB or -52 dBm? Either way, why is
     there a -22 dB loss? (My antenna and channel gains were set to 0 dB.)

The WX GUI FFT plotter is scaled in dBFS.

0 dBFS corresponds to a sample magnitude of 1.0, which is full scale.
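To illustrate the dBFS convention, here is a minimal NumPy sketch (illustrative only, not GNU Radio source; the tone bin and FFT size are chosen for convenience) showing that a full-scale complex tone, with the FFT normalized by its length, reads 0 dBFS:

```python
import numpy as np

fft_size = 1024

# Full-scale complex tone (amplitude 1.0) placed exactly on bin 256
# so there is no spectral leakage; the bin choice is arbitrary.
n = np.arange(fft_size)
x = 1.0 * np.exp(2j * np.pi * 256 * n / fft_size)

# Normalizing the FFT by its length makes a full-scale tone read 0 dBFS.
X = np.fft.fft(x) / fft_size
power_dbfs = 10 * np.log10(np.abs(X) ** 2 + 1e-20)

print(power_dbfs.max())  # ~0.0 dBFS for a full-scale tone
```

A tone at half of full scale (amplitude 0.5) would read about -6.02 dBFS by the same calculation.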

  2. Why do the FFT squared amplitudes in dB differ between the two, and
     which one is correct?

See ./gr-blocks/lib/nlog10_ff_impl.cc

This uses gr_fft_vcc and applies various scaling compensations.
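As a rough sketch of the kind of scaling compensation involved (illustrative NumPy, not the actual GNU Radio implementation; the Hann window and half-scale tone are assumptions), the raw FFT magnitude squared carries a large size- and window-dependent offset, while dividing by the window's coherent gain (the sum of its taps) recovers the tone's true level:

```python
import numpy as np

fft_size = 1024
window = np.hanning(fft_size)  # window choice is an assumption

# Half-scale tone (amplitude 0.5, i.e. -6.02 dBFS) on an exact bin
n = np.arange(fft_size)
x = 0.5 * np.exp(2j * np.pi * 256 * n / fft_size)

# Raw FFT of the windowed input, roughly what gr_fft_vcc emits
X_raw = np.fft.fft(x * window)
peak = np.abs(X_raw).max()

# Uncompensated: 10*log10 of the raw magnitude squared, far too high
uncomp_db = 10 * np.log10(peak ** 2)

# Compensated: divide by the window's coherent gain before squaring
comp_db = 10 * np.log10((peak / window.sum()) ** 2)

print(uncomp_db, comp_db)  # comp_db is about -6.02 dBFS
```

This is why taking 10*log10 of the raw magnitude-squared output of gr_fft_vcc, as in the modified usrp_spectrum_sense.py, will not agree with the scaled dBFS display in GRC.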

-josh
