Dear all,
I’m calculating the RMS value of a signal with the following setup:
RF vector signal generator -> USRP/rtl-sdr/… -> “RMS” block in GRC.
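For clarity, the GRC part boils down to something like this minimal stand-alone sketch (with the SDR source swapped for a simulated CW source just to show which block I mean; the alpha value, sample rate and settling time are arbitrary):

    # Minimal sketch of the measurement chain (GNU Radio Python API).
    # The SDR source is replaced here by a simulated CW source of
    # amplitude 1.0, so the probe should settle near an RMS of 1.0.
    import math
    import time
    from gnuradio import gr, blocks, analog

    samp_rate = 1e6
    tb = gr.top_block()
    src = analog.sig_source_c(samp_rate, analog.GR_COS_WAVE, 100e3, 1.0)
    rms = blocks.rms_cf(1e-4)        # "RMS" block: single-pole averaged RMS
    probe = blocks.probe_signal_f()  # lets us read the value from Python
    tb.connect(src, rms, probe)

    tb.start()
    time.sleep(0.5)                  # let the averager settle
    tb.stop()
    tb.wait()
    print("RMS = %.3f (%.2f dB)" % (probe.level(),
                                    20 * math.log10(probe.level())))

In the real measurement the source block is of course the USRP/rtl-sdr source instead of the simulated one.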
Please bear with me - I’m not interested in the exact (absolute) voltage
level.
What I can’t explain is why the calculated RMS value is consistently
higher when the signal is modulated (e.g. has a higher peak-to-average
power ratio) than for CW (an unmodulated sine wave).
For instance, the RMS value shown in GRC for band-limited Gaussian noise
is always around 2.5 dB higher than the RMS of a CW signal of equivalent
power (equivalent according to the generator level setting and a
spectrum analyzer with a power meter function). Similarly, a 100% AM
modulated signal shows an RMS around 1.3 dB higher.
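Just to spell out my expectation: for signals of equal average power the RMS should come out the same regardless of the peak-to-average ratio. A quick, purely simulated check (NumPy, no hardware involved, numbers chosen arbitrarily):

    # Sanity check: equal average power => equal RMS, regardless of PAPR.
    import numpy as np

    n = 1000000
    rng = np.random.default_rng(0)
    t = np.arange(n)

    # Complex CW (baseband tone) with amplitude 1, i.e. average power 1.
    cw = np.exp(2j * np.pi * 0.01 * t)

    # Complex Gaussian noise scaled to unit average power.
    noise = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)

    # 100% AM modulated tone, normalized to unit average power.
    am = (1 + np.cos(2 * np.pi * 0.001 * t)) * np.exp(2j * np.pi * 0.01 * t)
    am /= np.sqrt(np.mean(np.abs(am) ** 2))

    def rms_db(x):
        return 20 * np.log10(np.sqrt(np.mean(np.abs(x) ** 2)))

    def papr_db(x):
        p = np.abs(x) ** 2
        return 10 * np.log10(np.max(p) / np.mean(p))

    for name, x in [("CW", cw), ("noise", noise), ("AM", am)]:
        print("%-5s  RMS = %5.2f dB   PAPR = %5.2f dB"
              % (name, rms_db(x), papr_db(x)))

All three signals print the same RMS (0 dB) even though their peak-to-average ratios differ, which is why the 2.5 dB and 1.3 dB offsets above surprise me.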
This effect appears with a USRP, an rtl-sdr dongle as well as some
custom hardware, so it doesn’t seem to be device-specific. Also, all the
receiver-side hardware effects I can think of would result in lower
gain for modulated signals. If anything, I would expect to see a lower
RMS when turning on the modulation.
Some more details are in this blog post:
https://www.tablix.org/~avian/blog/archives/2015/04/signal_power_in_gnu_radio/
Any ideas would be welcome. At this point I have a feeling I’m missing
something obvious here…
Thanks,
Tomaž