Hi all,
I am new to USRP, and I started by trying to calibrate the received power in the USRP using a known sine signal. I fed a 100.2 MHz tone at -30 dBm into the USRP, which sampled it at 1 MHz and passed the samples to a 1024-point FFT scope in GNU Radio Companion. The scope showed a level of about -52 dB at 100.2 MHz.
I also modified usrp_spectrum_sense.py to take the FFT of the same signal using gr_fft_vcc, and printed the magnitude-squared values to a CSV file. When I plotted 10*log10(value), the level at 100.2 MHz was about -30 dB. Following are my queries:
- Is the power level shown in GRC -52 dB or -52 dBm? If it is dBm, what accounts for the 22 dB loss? (My antenna and channel gains were set to 0 dB.)
- Why are the FFT squared amplitudes in dB different in the two cases, and which one is correct?
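As background for the second question: the absolute dB level read off an FFT depends on the normalization applied. Here is a minimal NumPy sketch (illustrative only, not the actual GRC or usrp_spectrum_sense.py code; the tone frequency and amplitude are made-up example values) showing how an un-normalized FFT, like the raw output of gr_fft_vcc, differs from one scaled by 1/N:

```python
import numpy as np

fs = 1e6            # sample rate, 1 MHz as in my flow graph
n = 1024            # FFT size
f = 100 * fs / n    # illustrative tone, placed exactly on a bin to avoid scalloping
t = np.arange(n) / fs
x = np.exp(2j * np.pi * f * t)   # unit-amplitude complex tone

spec = np.fft.fft(x)             # un-normalized FFT (like gr_fft_vcc)
raw_db = 10 * np.log10(np.max(np.abs(spec) ** 2))        # peak, no scaling
norm_db = 10 * np.log10(np.max(np.abs(spec / n) ** 2))   # peak, with 1/N scaling

# The two readings of the same signal differ by 20*log10(n), about 60.2 dB here
print(raw_db, norm_db)
```

On top of the 1/N factor, the window function and the reference impedance also shift the absolute level, so relating an FFT bin value to dBm generally needs a calibration against a known source, which is what I am attempting.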
Regards,
Hemant