Frequency response from fading simulator block not working as expected

Hello,

To simulate a multipath scenario, I'm trying to validate the
"Frequency Selective Fading Model" block, but the results are not what
I would expect.
The setup consists of two taps delayed 20 microseconds from each other,
so the frequency response should show very deep fades regularly spaced
in frequency. Instead, I got a flat frequency response, and the same
happens with other standardized power delay profile scenarios.
The flowgraph was: Random Source -> OFDM Mod -> Throttle -> Frequency
Selective Fading Model -> WX GUI FFT Sink. The sample rate was 1 MHz,
so in the Frequency Selective block I set PDP Delays to [0, 20]
(in samples, i.e. 0 and 20 microseconds) and PDP Magnitudes to [1, 1].
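For reference, here is the static picture I was expecting. This is a small
numpy sketch (not using the GNU Radio block itself) of a fixed two-tap
channel with equal magnitudes and a 20 µs spacing at 1 MHz; the nulls of
such a channel repeat every 1/(20 µs) = 50 kHz:

```python
import numpy as np

fs = 1e6                    # sample rate (1 MHz)
h = np.zeros(32)
h[0] = 1.0                  # first path at 0 us
h[20] = 1.0                 # second path at 20 samples = 20 us
H = np.abs(np.fft.fft(h, 4096))

print(f"null spacing = {1 / 20e-6 / 1e3:.0f} kHz")
print(f"peak-to-null depth = {20 * np.log10(H.max() / H.min()):.1f} dB")
```

This static response has deep, regularly spaced notches, which is what I
expected to see (on average) on the FFT sink.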
Please tell me if I have misunderstood any of these parameters, or
whether this block could really be improved.

Thanks and Regards,
Ricardo

On 02/28/2014 02:13 PM, Ricardo Yoshimura wrote:

> and PDP Magnitudes as [1, 1].
> Please tell me if I misunderstood any of these parameters or this block
> could be really improved.

Did you increase the number of taps?
Also, note that this block models a time-variant channel. If you
average, you might not see the fades. For a static channel, use the
"Channel Model" block instead.

If you use the default settings, you should see fades, right?
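To illustrate the point about the number of taps: the sketch below is a
rough stand-in for the block's internal interpolating FIR (an assumption
about its behavior, not the actual implementation), modeling each path as
a truncated sinc. A path whose delay falls outside the filter's tap window
is simply lost, which leaves the response flat:

```python
import numpy as np

def two_path_response(delay_samps, ntaps, nfft=4096):
    # Approximate a [1, 1] two-path channel as a truncated-sinc FIR of
    # length ntaps; a path delayed beyond the window contributes ~nothing.
    n = np.arange(ntaps)
    h = np.sinc(n) + np.sinc(n - delay_samps)
    return np.abs(np.fft.fft(h, nfft))

def depth_db(H):
    # Peak-to-null depth of the magnitude response, in dB.
    return 20 * np.log10(H.max() / H.min())

H_short = two_path_response(20, 8)   # 20-sample delay outside an 8-tap filter
H_long = two_path_response(20, 64)   # both paths fit: deep fades appear

print(f"8 taps : {depth_db(H_short):5.1f} dB peak-to-null")
print(f"64 taps: {depth_db(H_long):5.1f} dB peak-to-null")
```

With only 8 taps the second path at 20 samples never makes it into the
filter, so the response stays flat; with 64 taps the 50 kHz-spaced
notches show up clearly.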

M

Hi Martin,

That's right! Once I increased the number of taps, the deep fades
became visible, reaching depths of about 20 dB.

Thanks and Regards,
Ricardo
