GRC question: How to measure BER when Tx and Rx are on different machines?

Hi all:

I have Tx and Rx on different machines. To measure the BER, I need to
compare a reference stream with the output stream of the demodulator.

I suspect that my way to obtain the reference stream is wrong.

Here is how I obtain the reference stream:

1: At the Tx, I used Random Source to generate a 0/1 sequence. This
sequence is saved into a DAT file.

Random Source (Byte) --> File Sink (the file is saved as input.DAT)

2: Then I use the DAT file as the input of the Modulator; this way, I
have the same input data every time I run the flowgraph (a rough Python
sketch of the whole procedure is below, after step 4).

File Source (input.DAT) --> Modulator

3: I made a copy of the DAT file and put it on the Rx machine

4: I tried to compare the output of the demodulator and this DAT file,
but something goes wrong here. The output of the demodulator is a 0/1
sequence but the DAT file is not.
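To make steps 1, 2 and 4 concrete, here is a minimal sketch of what I am
doing in Python (this assumes a GNU Radio version where these blocks live
in gnuradio.blocks, plus numpy; the fixed-seed vector_source_b just stands
in for the Random Source block, and demod_out.dat is only a placeholder
name for the demodulator output file on the Rx machine):

#!/usr/bin/env python
# Rough sketch: write a fixed 0/1 reference sequence to input.dat on the
# Tx side, then (on the Rx side) compare it bit for bit against the
# demodulator output to get a BER.
import numpy as np
from gnuradio import gr, blocks

def write_reference(filename="input.dat", nbits=10000):
    # Fixed seed so the same 0/1 bytes are produced on every run
    # (stand-in for the GRC Random Source block).
    bits = np.random.RandomState(0).randint(0, 2, nbits).astype(np.uint8)
    tb = gr.top_block()
    src = blocks.vector_source_b(bits.tolist(), False)
    sink = blocks.file_sink(gr.sizeof_char, filename)
    tb.connect(src, sink)
    tb.run()

def measure_ber(ref_file="input.dat", rx_file="demod_out.dat"):
    # Assumes both files hold one bit per byte (values 0 or 1) and that
    # the two streams are already aligned (no delay between them).
    ref = np.fromfile(ref_file, dtype=np.uint8)
    rx = np.fromfile(rx_file, dtype=np.uint8)
    n = min(len(ref), len(rx))
    errors = np.count_nonzero(ref[:n] != rx[:n])
    print("BER = %g (%d errors in %d bits)" % (errors / float(n), errors, n))

if __name__ == "__main__":
    write_reference()
    # measure_ber()   # run this part on the Rx machine after demodulation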

Could anyone guide me with this problem?

Thanks a lot!

Rachel

On Tue, Nov 2, 2010 at 4:32 PM, Rachel Li [email protected] wrote:

4: I tried to compare the output of the demodulator and this DAT file, but
something goes wrong here. The output of the demodulator is a 0/1 sequence
but the DAT file is not.

Sounds like a packed/unpacked problem. Some blocks work with packed
representations of data (using all bits in the byte). Others work with
just the LSB of the byte. Experiment with using packed_to_unpacked /
unpacked_to_packed blocks.
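
For example, something along these lines would unpack a packed byte file
into one bit per byte so it can be compared directly against an unpacked
0/1 demodulator output (just a sketch, assuming a GNU Radio version where
these blocks live in gnuradio.blocks; use unpacked_to_packed_bb instead,
or flip the endianness, depending on which side of your link is packed
and what bit order the modulator uses):

#!/usr/bin/env python
# Rough sketch: convert a packed byte file (8 bits per byte) into an
# unpacked file (1 bit per byte) for a direct bit-by-bit comparison.
from gnuradio import gr, blocks

tb = gr.top_block()
src = blocks.file_source(gr.sizeof_char, "input.dat", False)
# 1 bit per output byte, MSB first -- try gr.GR_LSB_FIRST if the bit
# order does not match your modulator.
unpack = blocks.packed_to_unpacked_bb(1, gr.GR_MSB_FIRST)
sink = blocks.file_sink(gr.sizeof_char, "input_unpacked.dat")
tb.connect(src, unpack, sink)
tb.run()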

-Steven