Could not open large files (KILL)

Dear All,
I'm using a USRP to capture data at regular intervals and save it to files. I use the level() method of gr.probe_signal_vc with a vector length of 1024 (samplelength). If the program runs for a short time it works fine and the files can be opened, but if I run it long enough that the data saved in the files grows large, e.g. 1.7 GB, I get the error message "Killed" when I try to open them. Can anyone help? Do I need to flush the buffers and call os.fsync(), and if so, what should I pass to that method? Is it the file handles (objects)? Thanks
Sunday Iliya

newvector2 = gr.probe_signal_vc(samplelength)
magnitude = gr.probe_signal_vf(samplelength)
direct = gr.probe_signal_vc(samplelength)

# Grab the latest vector from each probe and append it as text.
newvector3 = newvector2.level()
newvector4 = magnitude.level()
newvector5 = direct.level()
for yki in range(samplelength):
    capturedata.write('%s,' % newvector3[yki])
    magnitudefile.write('%s,' % newvector4[yki])
    directusrpfile.write('%s,' % newvector5[yki])
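
For reference on the os.fsync() part of the question: it takes an integer file descriptor rather than the file object itself, so if you do call it, the usual pattern is to flush first and then pass fileno(). A minimal sketch for one of the files, assuming capturedata is an ordinary Python file object:

    import os

    # flush() empties Python's own buffer; os.fsync() then asks the OS to
    # push its cache out to disk. It wants the integer descriptor from
    # fileno(), not the file object itself.
    capturedata.flush()
    os.fsync(capturedata.fileno())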

On Wed, Sep 4, 2013 at 8:35 AM, sunday iliya
[email protected] wrote:

Sunday Iliya
You haven't told us which program you are using to open the files.
It's more likely that the program simply can't handle files that
large.

You should at least close all of your file handles before exiting the
program, and you shouldn’t have to flush them yourself in that case.
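
A sketch of that, assuming the three files in the snippet are ordinary Python file objects (the file names here are made up): a with block flushes and closes all of them automatically, even if the flowgraph stops on an exception.

    with open('capturedata.txt', 'w') as capturedata, \
         open('magnitude.txt', 'w') as magnitudefile, \
         open('directusrp.txt', 'w') as directusrpfile:
        for yki in range(samplelength):
            capturedata.write('%s,' % newvector3[yki])
            magnitudefile.write('%s,' % newvector4[yki])
            directusrpfile.write('%s,' % newvector5[yki])
    # All three files are flushed and closed when the block exits.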

                directusrpfile.write('%s,' %newvector5[yki])

This seems like a very inefficient way of doing things. Why not use a
file_sink block instead? It runs as a direct part of the flowgraph and
stores the data in binary, which will be much more compact and faster
to deal with.
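
A rough sketch of that, assuming GNU Radio 3.6-style names to match the gr.probe_signal_vc usage above (newer releases spell it blocks.file_sink), a hypothetical top block tb, and a placeholder usrp_source block; the item size just has to match the stream you tap (use gr.sizeof_gr_complex * samplelength if you connect after a stream-to-vector block):

    # Write the raw complex samples straight to disk as binary.
    sink = gr.file_sink(gr.sizeof_gr_complex, "capture.dat")
    tb.connect(usrp_source, sink)

    # The binary file loads back in one call later on:
    import numpy
    data = numpy.fromfile("capture.dat", dtype=numpy.complex64)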


Tom
Visit us at GRCon13 Oct. 1 - 4
http://www.trondeau.com/grcon13