I’ve been working with the OFDM examples that ship with GNU Radio (benchmark_tx.py and benchmark_rx.py). I’ve noticed that my bit error rate (BER) increases as the length of my cyclic prefix (CP) increases. This is the opposite of what theory predicts, since a larger CP should reduce inter-symbol interference.
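To spell out what I’d expect from theory, here is a toy numpy model (nothing to do with the actual GNU Radio blocks, and the 4-tap channel is just something I made up): once the CP covers the channel’s delay spread, the multipath convolution becomes circular and a one-tap equalizer recovers every subcarrier; with a shorter CP, ISI leaks in and the BER goes up, not down.

    # Toy OFDM link (not the GNU Radio flowgraph): a CP longer than the
    # channel memory removes ISI, a too-short CP does not.
    import numpy as np

    rng = np.random.default_rng(0)

    def run(fft_len, cp_len, n_syms=200):
        # Static 4-tap multipath channel (made-up taps, 3 samples of memory).
        h = np.array([1.0, 0.9, 0.7, 0.5])
        H = np.fft.fft(h, fft_len)                # per-subcarrier response

        # Random QPSK data on every subcarrier.
        bits = rng.integers(0, 2, size=(n_syms, fft_len, 2))
        tx_sym = ((2 * bits[..., 0] - 1) + 1j * (2 * bits[..., 1] - 1)) / np.sqrt(2)

        # IFFT, cyclic prefix insertion, serialize.
        td = np.fft.ifft(tx_sym, axis=1)
        with_cp = np.hstack([td[:, -cp_len:], td]) if cp_len else td
        stream = with_cp.reshape(-1)

        # Pass through the multipath channel (linear convolution).
        rx = np.convolve(stream, h)[: stream.size]

        # Strip CP, FFT, one-tap equalize, hard-decide QPSK.
        rx = rx.reshape(n_syms, fft_len + cp_len)[:, cp_len:]
        rx_sym = np.fft.fft(rx, axis=1) / H
        rx_bits = np.stack([rx_sym.real > 0, rx_sym.imag > 0], axis=-1).astype(int)
        return np.mean(rx_bits != bits)

    for cp in (0, 1, 2, 4, 16):
        print(f"cp_len={cp:3d}  BER={run(64, cp):.4f}")
    # With this toy channel the BER should drop to ~0 once cp_len covers the
    # channel memory (3 samples here) and stay non-zero below that.

That is the opposite of what I’m measuring over the air.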
I googled the issue and found a paper documenting the same behaviour (page 31):
The paper explains this by saying that a longer CP adds overhead by making the frames longer, which leads to worse performance. I’m having a hard time understanding how that explains the increase in BER. I see little change in system load between CP=1 and CP=128 when monitoring system resources, and the number of packets received and the number of correct packets don’t change much from run to run; only the number of bit errors in the packets that do fail changes.
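To make the overhead argument concrete (assuming the examples’ default 512-point FFT; substitute whatever FFT length you actually run), a quick back-of-the-envelope calculation only seems to explain a throughput/airtime hit, not more bit errors inside the packets that are received:

    # CP overhead for a 512-point FFT (an assumption; swap in your --fft-length).
    # This affects rate and airtime, not the BER of correctly synchronized symbols.
    fft_len = 512
    for cp_len in (1, 128):
        efficiency = fft_len / (fft_len + cp_len)   # fraction of airtime carrying data
        print(f"cp_len={cp_len:3d}: symbol is {fft_len + cp_len} samples, "
              f"{100 * (1 - efficiency):.1f}% of it is prefix overhead")
    # cp_len=  1: symbol is 513 samples, 0.2% of it is prefix overhead
    # cp_len=128: symbol is 640 samples, 20.0% of it is prefix overhead

So a CP of 128 costs roughly 20% of the airtime, but I don’t see how that by itself puts more errors into the payload.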
I played around with a lot of different input parameters (FFT size, TX/RX gain, occupied tones, bandwidth, etc.), but the issue persists. There’s also no direct correlation between any of these parameters and BER, even though all of them can increase the complexity of the system.
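One sanity check I can think of (again a toy numpy loopback, not the benchmark scripts) is that in an ideal AWGN-only channel the BER should not depend on CP length at all, so if the real system does show a CP dependence, the cause has to be somewhere else (sync, AGC, frame detection, hardware):

    # AWGN-only loopback: BER should be essentially flat across CP lengths.
    import numpy as np

    rng = np.random.default_rng(1)

    def awgn_ber(fft_len, cp_len, snr_db=8.0, n_syms=2000):
        bits = rng.integers(0, 2, size=(n_syms, fft_len, 2))
        tx = ((2 * bits[..., 0] - 1) + 1j * (2 * bits[..., 1] - 1)) / np.sqrt(2)
        td = np.fft.ifft(tx, axis=1) * np.sqrt(fft_len)      # unit-power samples
        with_cp = np.hstack([td[:, -cp_len:], td]) if cp_len else td
        noise_std = 10 ** (-snr_db / 20) / np.sqrt(2)
        rx = with_cp + noise_std * (rng.standard_normal(with_cp.shape)
                                    + 1j * rng.standard_normal(with_cp.shape))
        freq = np.fft.fft(rx[:, cp_len:], axis=1) / np.sqrt(fft_len)
        rx_bits = np.stack([freq.real > 0, freq.imag > 0], axis=-1).astype(int)
        return np.mean(rx_bits != bits)

    for cp in (1, 8, 32, 128):
        print(f"cp_len={cp:3d}  BER={awgn_ber(512, cp):.4f}")
    # The printed BER should be roughly constant for every cp_len value.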
I’m using a USRP2 with an XCVR2450 daughterboard and the latest version of GNU Radio as of August (3.6, I think) on Ubuntu.
Has anyone else had this issue? Any solutions or insight into the source of the problem?
Regards,
Dave.