I was able to gather results, and I am really confused by them. I generated
a -30 dB signal, based on the FFT plot shown, and transmitted it using a
USRP. My spectrum analyzer received the signal at -50 dBm (-80 dB), while my
receiver, which also uses a USRP, received the signal and plotted it at
-30 dB. My question is: what is the unit of the FFT plot? Is it dB or dBm?
--
Best,
Gerome Jan M. Llames
Engineering Research and Development for Technology (ERDT) Scholar
University of San Carlos - Technological Campus
Nasipit Talamban, Cebu City, Philippines, 6000
Mobile: +639271525124
Email: [email protected]
“Design is not just what it looks like and feels like. Design is how it
works.” - Steve Jobs
They’re dBFS, or dB Full Scale — that is, relative to the “full scale”
range defined in GNU Radio as -1.0 … +1.0.
USRPs are not calibrated instruments, so you can’t map this to dBm or
any absolute power measurement. Only relative measurements are valid.
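The dBFS idea can be sketched numerically: with full scale defined as an amplitude of 1.0, an FFT plot in dBFS is just 20·log10 of the normalized bin magnitudes. A minimal NumPy sketch — the function name `fft_dbfs` and the 1/N normalization are illustrative conventions, not GNU Radio API:

```python
import numpy as np

def fft_dbfs(samples, full_scale=1.0):
    """FFT magnitude spectrum in dB relative to full scale (dBFS)."""
    n = len(samples)
    # Normalize by N so a full-scale complex tone reads about 0 dBFS.
    mag = np.abs(np.fft.fft(samples)) / (n * full_scale)
    # Floor tiny bins to avoid log10(0).
    return 20 * np.log10(np.maximum(mag, 1e-12))

# A complex tone at amplitude 10**(-30/20), placed on an exact FFT bin,
# peaks at -30 dBFS -- the same kind of number the FFT plot shows.
t = np.arange(1024)
tone = 10 ** (-30 / 20) * np.exp(2j * np.pi * 102 * t / 1024)
print(fft_dbfs(tone).max())  # ~ -30.0
```

The -30 figure here is relative to full scale only; nothing in it says anything about absolute power at the antenna.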
In order for GNU Radio to display calibrated power units, it would need
to have a rigorously-defined interface to each and every piece of
hardware, so that power levels displayed in the FFTs appear in
convenient calibrated power units, like dBm. This, in turn, would
require that every piece of hardware that connects to GNU Radio be
rigorously calibrated over its entire operating parameter space,
including sample rate, tuning frequency, and gain setting.
This isn’t, as you might imagine, practical.
So, what GNU Radio receives are digitized voltage samples that are
mostly linearly proportional to the voltage received at the antenna
terminals of the device. These are in turn, for purposes of convenience
and generality, converted into floating-point numbers in the range
[-1.0, +1.0] within a flowgraph. But without calibration on the part of
the end user, they are “unitless”, and you have to determine the
proportionality in the context of your own application.
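One common way to “determine the proportionality” is a single-point calibration against a trusted instrument. A sketch, assuming the spectrum-analyzer reading is accurate, and noting that the resulting offset is only valid for one fixed gain, tuning frequency, and sample rate — the helper name is hypothetical, not part of GNU Radio:

```python
# Hypothetical one-point calibration helper -- not part of GNU Radio.
def calibration_offset(dbfs_reading, dbm_reference):
    """dB offset such that dbfs_reading + offset == dbm_reference."""
    return dbm_reference - dbfs_reading

# Numbers from this thread: the flowgraph plots the tone at -30 dB(FS),
# while the spectrum analyzer shows -50 dBm for the same signal.
offset = calibration_offset(-30.0, -50.0)
print(offset)          # -20.0
print(-30.0 + offset)  # -50.0, i.e. the relative reading mapped to dBm
```

Change the gain setting or retune the USRP and the offset must be remeasured, which is exactly why GNU Radio doesn’t attempt this automatically.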
On 08/03/2015 12:35 PM, John Ackermann N8UR wrote:
It might be helpful to clarify that since this is a voltage ratio,
it’s 20·log rather than the 10·log used for power (e.g., doubling
voltage is 6 dB, doubling power is 3 dB), so the scaling will look
different than on a typical spectrum analyzer. (It would be nice if the
instrumentation blocks could have an option to select voltage, or
power into 50 or 75 ohms…)
Yes, and we need 50 ohm and 75 ohm terminator blocks in GRC…
But, seriously, one can always prefix any instrumentation block with a
scaling function in the ax + b form, with a multiplier and an adder, to
achieve whatever pleasant-and-convenient scaling one wants.
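A plain-NumPy sketch of that ax + b idea, together with the 20·log vs. 10·log distinction John raised — `scale_db` is a hypothetical helper, not a GRC block; in an actual flowgraph the same thing could be built from Multiply Const and Add Const blocks:

```python
import numpy as np

def scale_db(trace_db, a=1.0, b=0.0):
    """Apply ax + b to a dB trace: a is a slope correction, b a dB offset."""
    return a * np.asarray(trace_db) + b

# Shift a dBFS trace by a -20 dB calibration offset to read in dBm.
print(scale_db([-30.0, -60.0], b=-20.0))  # [-50. -80.]

# Why voltage ratios use 20*log10: power into a resistive load goes as V**2.
print(20 * np.log10(2.0))  # doubling voltage: ~6.02 dB
print(10 * np.log10(2.0))  # doubling power:  ~3.01 dB
```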