FasterCSV locking file

I run this in a loop and FasterCSV locks up the file. How do I
release it after processing the file so that the next export statement
can use the same file?

analysis.export("c:\\temp\\report.txt")
hasresults=false
rows=FasterCSV.open("c:\\temp\\report.txt",
  {:headers=>:first_row, :col_sep=>",", :skip_blanks=>true})
File.delete("c:\\temp\\report.txt")

Permission denied - c:\temp\report.txt

On Dec 24, 2007 2:20 AM, Junkone wrote:

I run this in a loop and FasterCSV locks up the file. How do I
release it after processing the file so that the next export statement
can use the same file?

analysis.export("c:\\temp\\report.txt")
hasresults=false

rows=FasterCSV.open("c:\\temp\\report.txt",
  {:headers=>:first_row, :col_sep=>",", :skip_blanks=>true})

becomes:

rows=FasterCSV.read("c:\\temp\\report.txt",
  {:headers=>:first_row, :col_sep=>",", :skip_blanks=>true})

File.delete("c:\\temp\\report.txt")

Or use FasterCSV.foreach or FasterCSV.open with a block.
Or call rows.close when you're done.
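
If you keep the non-block open, a rough sketch of the explicit-close
route (untested, adapted from the snippet above; the ensure makes sure
the handle is released even if processing raises):

  rows = FasterCSV.open("c:\\temp\\report.txt",
    {:headers => :first_row, :col_sep => ",", :skip_blanks => true})
  begin
    rows.each do |row|
      # process each row here
    end
  ensure
    rows.close   # release the file handle so the delete below can succeed
  end
  File.delete("c:\\temp\\report.txt")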

The best approach is most probably read; after that, foreach (which
yields row arrays) or open with a block (which yields a proper FasterCSV
object). Do your processing inside open's block and let the library
close the file for you. It's safer (with regard to exceptions) than
closing the file manually.
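
A minimal sketch of the block form (untested), assuming the export and
delete code around it stays the same:

  FasterCSV.open("c:\\temp\\report.txt",
    {:headers => :first_row, :col_sep => ",", :skip_blanks => true}) do |csv|
    csv.each do |row|
      # do your processing here
    end
  end                                   # file is closed here, even on exceptions
  File.delete("c:\\temp\\report.txt")   # no longer raises Permission denied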

This is the same pattern as IO.open vs. IO.new vs. IO.read.
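
For comparison, a quick sketch of the same pattern with plain File/IO:

  File.read("c:\\temp\\report.txt")                 # read: opens, reads, and closes for you
  File.open("c:\\temp\\report.txt") { |f| f.read }  # open with a block: closed when the block exits
  f = File.new("c:\\temp\\report.txt")              # new: you own the handle...
  f.close                                           # ...so you must close it yourself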