I am generating my own “OFDM” waveform, which doesn't actually modulate any data. Step by step:
I am simulating 1024 subcarriers: I first generate an array of 1024 zeros and
want to enable only the center 100 subcarriers. To do this, I map the
center frequency at the first index in the array, then the positive
frequency subcarriers, then the negative frequency subcarriers. To enable
the center 100 subcarriers, I set the first 50 positive frequency
subcarriers to (1.472) * complex(1,1), and do the same with the first 50
negative frequency subcarriers. Finally, I take the 1024-point IFFT of the
data and multiply the result by sqrt(1024).
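In code, the generation step is roughly this (a NumPy sketch of the procedure above, not my actual script; the names are just for illustration):

```python
import numpy as np

NFFT = 1024                      # IFFT size / number of subcarriers
bins = np.zeros(NFFT, dtype=complex)

amp = 1.472 * complex(1, 1)      # value given to each active subcarrier
bins[1:51] = amp                 # first 50 positive-frequency bins (index 0 is DC)
bins[-50:] = amp                 # first 50 negative-frequency bins

# 1024-point IFFT, scaled by sqrt(1024)
time_samples = np.fft.ifft(bins) * np.sqrt(NFFT)
```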
If I then take the FFT to double-check what I've done, and plot the
magnitude against subcarrier index, I get what I expect:
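The double-check is just the forward transform of the same buffer (continuing from the snippet above):

```python
import numpy as np
import matplotlib.pyplot as plt

# undo the sqrt(1024) scaling and go back to the frequency domain
check = np.fft.fft(time_samples) / np.sqrt(NFFT)

plt.plot(np.abs(check))
plt.xlabel("subcarrier index")
plt.ylabel("magnitude")
plt.show()
```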
Now, I want to transmit this out of the USRP2, so I write the typical GNU
Radio Python script to read the 8-byte complex samples from a file source
and pump them to the USRP2, repeating the samples in the file on EOF so the
transmission runs forever.
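The script is essentially the following sketch (written from memory of the gr-usrp2 interface; the filename, Ethernet interface, and center frequency are placeholders, and I may be misremembering the exact usrp2 method names):

```python
from gnuradio import gr, usrp2

class file_tx(gr.top_block):
    def __init__(self):
        gr.top_block.__init__(self)
        # 8-byte complex samples, repeat on EOF
        src = gr.file_source(gr.sizeof_gr_complex, "ofdm_1024.dat", True)
        sink = usrp2.sink_32fc("eth0", "")   # interface, MAC addr (auto-discover)
        sink.set_interp(32)                  # interpolation rate of 32
        sink.set_center_freq(2.45e9)         # placeholder carrier frequency
        self.connect(src, sink)

if __name__ == "__main__":
    file_tx().run()
```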
I start up the spectrum analyzer and get a clean noise floor:
Now, when I transmit from the USRP2 using an interpolation rate of 32:
Well, I can certainly see the waveform. But I have two questions…
1. My calculated bandwidth of the 100 active bins and the bandwidth I
observe disagree by a factor of two:
(((100 MHz) / 32) / 1024) * 100 = 305.175781 kHz, where 32 is the
interpolation rate, 1024 is the size of the IFFT, and 100 is the number of
active bins. On the analyzer, however, I see about 610 kHz.
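Spelled out as code, my expected-bandwidth arithmetic is:

```python
dac_rate = 100e6          # USRP2 DAC clock, Hz
interp   = 32             # interpolation rate
nfft     = 1024           # IFFT size
active   = 100            # number of active bins

sample_rate = dac_rate / interp        # 3.125 MHz complex sample rate
bin_spacing = sample_rate / nfft       # ~3051.76 Hz per subcarrier
expected_bw = bin_spacing * active     # ~305.18 kHz
print("%.3f kHz" % (expected_bw / 1e3))
```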
2. I'm not seeing a clean power falloff outside of the active bins; I am
seeing a lot of gradual rolloff instead. Do I need to be applying a low
pass or high pass filter for this? Or is this the result of weirdness
created by the interpolating CIC filter and the dual half-band filters?
I'm not too experienced at this level of communications…
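If a filter is the answer, I assume it would go between the file source and the USRP2 sink, something like this (the cutoff and transition width here are just guesses on my part):

```python
from gnuradio import gr

samp_rate = 3.125e6   # 100 MHz / 32
# taps for a low-pass just wide enough for the ~305 kHz of active bins
taps = gr.firdes.low_pass(1.0,        # gain
                          samp_rate,  # sampling rate
                          200e3,      # cutoff frequency (illustrative)
                          50e3)       # transition width (illustrative)
lpf = gr.fir_filter_ccf(1, taps)      # no decimation, complex in/out, float taps
# self.connect(src, lpf, sink)
```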
I'd greatly appreciate any feedback. I can provide any code, but my hunch
is that I have a more fundamental misunderstanding than a coding error.