Corrupted points when writing to file


I have a Python app which seems to corrupt points when I use it to
demodulate a data stream stored in a file and write the demodulated
result back to a file. I've attached the Python code and an image of
the problem.
The modulated waveform is shown in the top plot and the demodulated
signal in the bottom plot.

The signal was acquired with a LeCroy digital oscilloscope; it's a 100
sine, AM modulated at 10 kHz.

As you can see from the image, while the original modulated data shows
no corruption across its 10002 points, the demodulated data has the
first 875-900 points incorrectly set to close to zero. This is
repeatable.

Could it have anything to do with the fact that I'm generating a GUI?
Could the time it takes to create the GUI window interfere with the
demodulation? If I were to make this just a script, i.e. with no scope
or FFT sink and no GUI box, would that change things? (And P.S., how
would I do that?)


Eric H. Matlis, Ph.D.
Aerospace & Mechanical Engineering Dept.
120 Hessert Center for Aerospace Research
University of Notre Dame
Notre Dame, IN 46556-5684
Phone: (574) 631-6054
Fax: (574) 631-8355
