RMS value of a signal changes with PAPR

Dear all,

I’m calculating the RMS value of a signal with the following setup

RF vector signal generator -> USRP/rtl-sdr/… -> “RMS” block in GRC.

Please bear with me - I’m not interested in the exact (absolute) voltage
level.

What I can’t explain is why the calculated RMS value is consistently
higher when the signal is modulated (i.e. has a higher peak-to-average
power ratio) than for CW (an unmodulated sine wave).

For instance, the RMS value shown in GRC for band-limited Gaussian
noise is always around 2.5 dB higher than the RMS of a CW of equivalent
power (equivalent power according to the generator level setting and a
spectrum analyzer with a power meter function). Similarly, a 100% AM
modulated signal shows an RMS around 1.3 dB higher.

This effect appears with a USRP, rtl-sdr dongle as well as some custom
hardware, so it doesn’t seem to be something device-specific. Also, all
hardware effects on the receiver side I can imagine result in lower
gain for modulated signals. If anything, I would expect to see a lower
RMS when turning on the modulation.

Some more details are in this blog post:

https://www.tablix.org/~avian/blog/archives/2015/04/signal_power_in_gnu_radio/

Any ideas would be welcome. At this point I have a feeling I’m missing
something obvious here…

Thanks,
Tomaž

Hi Tomaž,

this points to your signal generator simply defining “the same power”
differently for modulation, noise and a sine wave.

Generally, things get a bit hairy as soon as you try to compare powers.
For example, your spectrum analyzer will have an adjustable bandwidth
per frequency step.
When observing a narrowband signal, you might, for example, set that
bandwidth to 200kHz. Now, your sine will have exactly the same power in
that window as if you observed it with a 2MHz bandpass. For pseudowhite
noise, however, the measured power will be proportional to the observed
bandwidth – because “white” means the power spectral density is
constant, so your 2MHz filter will see ten times the noise power of
your 200kHz filter.
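That scaling is easy to check numerically; here's a minimal NumPy sketch (the sample rate and the two bandwidths are just example values):

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 10e6                    # assumed sample rate: 10 MHz
n = 1 << 20
# complex white noise: flat power spectral density across the band
x = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)

def power_in_band(x, fs, bw):
    """Power of x falling inside a bw-wide window centred on DC."""
    X = np.fft.fft(x) / len(x)
    f = np.fft.fftfreq(len(x), 1.0 / fs)
    return np.sum(np.abs(X[np.abs(f) <= bw / 2]) ** 2)

p_200k = power_in_band(x, fs, 200e3)
p_2m = power_in_band(x, fs, 2e6)
print(10 * np.log10(p_2m / p_200k))  # ≈ 10 dB: ten times the bandwidth
```

The sine, by contrast, would put all its power into a single bin, so both windows would report the same power for it.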

You already take care of that by using a filter of your own – however,
your blog post doesn’t say much about the spectral properties of that
filter. Typically, when designing a filter, you have to make a choice
(at a fixed length) between a sharp transition from pass- to stopband
and low integrated power of stopband “leakage”. The typical way of
determining the total power that passes through a filter is observing
its response to white noise – which is exactly what you’re doing. So my
blind guess would be that your spectrum analyzer’s filters are way
sharper than your GNU Radio-implemented filter.

You could play with different filter designs, longer filters, and even
cascading a well-suppressing filter with a sharp-edged one, decimating
as far as each step allows. You should then look at the spectral shape.
Luckily, Parseval’s theorem says that the energy of the Fourier
transform is proportional to the energy in the time domain, so just
looking at the sum of squares of the taps will give you a value that’s
proportional to the total power your filter lets through.
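A quick sketch of that check (the filter here is just an example window-design lowpass, with SciPy assumed available): the noise power gain of an FIR filter is the sum of its squared taps, which you can verify by actually pushing white noise through it.

```python
import numpy as np
from scipy.signal import firwin

taps = firwin(129, 0.1)        # example lowpass, cutoff at 0.1 x Nyquist

# Noise power gain of the filter: sum of squared taps (Parseval).
noise_gain = np.sum(taps ** 2)

# Cross-check by filtering white noise and comparing input/output power.
rng = np.random.default_rng(0)
x = rng.standard_normal(1 << 20)
y = np.convolve(x, taps, mode="valid")
measured = np.mean(y ** 2) / np.mean(x ** 2)
print(noise_gain, measured)    # the two should agree closely
```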

Regarding the modulated signal: I’d assume that your signal generator
uses some pulse shape – maybe a root-raised cosine, or so – to convert
the individual symbols to a baseband signal. These pulse-shaping
filters might or might not be truly accounted for by the “constant
power” setting, and, just like noise, they have a non-zero bandwidth.
Depending on your filtering, you might see more or less of the complete
spectral shape of your pulse (or its symbol-rate spectral repetitions).

I hope I guessed in the right direction,
best regards,
Marcus

[1] by default, some daughterboards/USRPs have adjustable analog
bandwidth

Tomaz,

Looks like you are measuring power on the spectrum analyzer with
bandwidth integration, and it’s giving you -95 dBm in both cases, which
is good if it’s doing the math, but what happens when you put the
analyzer in linear mode? Or leave it in log mode, open up the RBW >>
100 kHz and put it in noise marker mode? The reason I’m suggesting this
is that the spectrum analyzer (assuming it’s not a signal analyzer) is
taking the video average of the log detector output, versus the log of
the average, and that equates to a 1.4 dB error for CW vs AWGN.
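The average-of-log vs log-of-average gap is easy to reproduce in simulation. For an ideal log detector on complex AWGN the gap works out to about 2.5 dB (suspiciously close to the 2.5 dB figure from the original post), while for CW the two averages agree exactly; the figure a real analyzer shows depends on its detector and built-in corrections. A sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1 << 20

# CW: constant-envelope tone with unit power
cw = np.exp(2j * np.pi * 0.1 * np.arange(n))
# complex AWGN with the same average power
noise = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)

def log_of_avg(x):  # true average power, in dB
    return 10 * np.log10(np.mean(np.abs(x) ** 2))

def avg_of_log(x):  # what a video-averaged log detector reports
    return np.mean(10 * np.log10(np.abs(x) ** 2))

print(log_of_avg(cw) - avg_of_log(cw))        # 0 dB for CW
print(log_of_avg(noise) - avg_of_log(noise))  # ≈ 2.5 dB for AWGN
```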

Maybe the ALC loop in your signal generator is also using a log
detector, which gives you a false power for AWGN, which the spectrum
analyzer then falsely corrects with its log detector. Maybe GR is
giving you the correct answer. You need an amp and a power meter to get
a true power measurement.

In GR I use the complex_to_mag_squared block and an integrator (with
decimation) to get the mean-square power. Maybe try that to see what it
yields.
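In stream terms that chain just computes mean(|x|²) over each block of samples; here's a NumPy equivalent of the mag-squared + integrate-with-decimation chain (a sketch, not the GR blocks themselves):

```python
import numpy as np

rng = np.random.default_rng(0)
# stand-in for received samples: unit-power complex noise
n = 1 << 16
x = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)

decim = 4096
mag_sq = np.abs(x) ** 2                  # complex_to_mag_squared
# integrate-and-decimate: one output per `decim` inputs
chunks = mag_sq[: len(mag_sq) // decim * decim].reshape(-1, decim)
mean_sq = chunks.mean(axis=1)            # mean-square power per chunk
power_db = 10 * np.log10(mean_sq)
print(power_db)                          # each ≈ 0 dB for unit-power input
```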

Lou



Discuss-gnuradio mailing list

[email protected]

https://lists.gnu.org/mailman/listinfo/discuss-gnuradio


View this message in context:
http://gnuradio.4.n7.nabble.com/RMS-value-of-a-signal-changes-with-PAPR-tp53281p53283.html
Sent from the GnuRadio mailing list archive at Nabble.com.

Dear Lou,

On 13. 04. 2015 17:57, madengr wrote:

Looks like you are measuring power on the spectrum analyzer with
bandwidth integration, and it’s giving you -95 dBm in both cases, which
is good if it’s doing the math, but what happens when you put the
analyzer in linear mode? Or leave it in log mode, open up the RBW >>
100 kHz and put it in noise marker mode? The reason I’m suggesting this
is that the spectrum analyzer (assuming it’s not a signal analyzer) is
taking the video average of the log detector output, versus the log of
the average, and that equates to a 1.4 dB error for CW vs AWGN.

Thank you very much for this suggestion. Indeed, it appears that I was
taking an average of the log when doing power measurements with the
signal analyzer (technically, my instrument is indeed a signal
analyzer, not a spectrum analyzer).

After some poking around the menus I found the “Average Mode” setting.
When setting it to “Power” instead of “Log” (which is the default), the
power measurements match what I’m seeing in GRC.

I didn’t look for this previously, because the instrument has a special
“channel power” measurement mode which I assumed set everything
correctly (it automatically sets, for instance, the detector to “RMS”
mode, optimal RBW and so on). I’ve gone through the manual, and it’s a
bit cryptic on this topic (“The average value is always correctly
displayed irrespective of the signal characteristic.”)

Maybe the ALC loop in your signal generator is also using a log
detector, which gives you a false power for AWGN, which the spectrum
analyzer then falsely corrects with its log detector. Maybe GR is
giving you the correct answer. You need an amp and a power meter to get
a true power measurement.

Now everything indeed seems to point to the signal generator. I don’t
have an RF power meter though. I did just measure the RMS of the output
signal with a digital oscilloscope (at a low frequency), and that also
shows the RMS increasing with PAPR, but that is not a very accurate
measurement.

It’s not the ALC loop I think, because I can switch it off and the
difference remains. Anyway, at least now I know where to look. Thanks.

In GR I use complex_to_mag_squared block and integrator (with decimate) to
get the mean square power. Maybe try that to see what it yields.

I tried several flow graphs for calculating signal power in GR (the
standalone “RMS” block, “Complex conjugate” + “Multiply”, “Complex to
real” + “Complex to imag” + “Multiply”, etc.). I haven’t tried
complex_to_mag_squared, but the others all come to within 0.1 dB of
each other.

Best regards
Tomaž

Now that’s disturbing, that the analyzer gives false readings for
integrated-bandwidth measurements even when it claims to be in RMS
mode; although maybe that’s just as opposed to peak detector mode. It’s
probably still defaulting to a log detector. You can measure multi-tone
signals, but each tone must fall within its own RBW to get a correct
RMS measurement; then integrate all of those to get the total power. I
suppose the moral of the story is that traditional spectrum analyzers
in log detector mode are only valid for measuring single tones within
an RBW, and you should use a thermistor-based power sensor to validate
everything.

Maybe you can now try calibrating your setup to dBm by measuring a
strong single tone, then adjusting the K offset in the LOG10 block to
read the same as the spectrum analyzer. With the WBX card I can vary
the source over 40 dB and the power computed by GR varies maybe +/- 0.1
dB. I have not tried it with AWGN, so I’d be interested to see your
results; I’m guessing it will be off due to the shape of the filter.
Maybe instead use the spectrum plot for calibration near the center of
the band, then normalize by the bin size to get dBm/Hz for noise.
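That bin-size normalization is just a subtraction in dB (a sketch; the sample rate, FFT size, and bin reading are example values):

```python
import numpy as np

fs = 2e6                 # example sample rate
nfft = 1024              # example FFT size
bin_hz = fs / nfft       # width of one FFT bin in Hz

# Suppose a calibrated spectrum plot reads -80 dBm in one bin of noise:
p_bin_dbm = -80.0
psd_dbm_hz = p_bin_dbm - 10 * np.log10(bin_hz)
print(psd_dbm_hz)        # noise density in dBm/Hz
```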

Lou

Tomaž Šolc wrote

After some poking around the menus I found the “Average Mode” setting.
When setting it to “Power” instead of “Log” (which is the default), the
power measurements match what I’m seeing in GRC.

I didn’t look for this previously, because the instrument has a special
“channel power” measurement mode which I assumed set everything
correctly (it automatically sets, for instance, the detector to “RMS”
mode, optimal RBW and so on). I’ve gone through the manual, and it’s a
bit cryptic on this topic (“The average value is always correctly
displayed irrespective of the signal characteristic.”)



Dear Marcus,

thanks for your detailed answer.

I did carefully consider the effect of filters in my signal path. At one
point I removed the extra band-pass filter that is shown in the flow
graph in my blog post and the difference in power was still there
regardless.

For all my measurements, the majority of the signal power was in the
receiver’s pass band. Even the “noise” signal was limited to 100 kHz by
the signal generator, so only the generator’s out-of-band emissions
were affected by the filter shape. As you said, these are non-zero, but
as far as I can see they don’t contribute significantly to the measured
signal power.

Best regards
Tomaž





Tomaž Šolc, research assistant
SensorLab, Jožef Stefan Institute
http://sensorlab.ijs.si
mail: [email protected]
blog: http://www.tablix.org/~avian/blog
