2009/4/20 [email protected] [email protected]:
> I am doing a batch run, where I have several .csv files with the same
> name in different sub-directories. At the end of the batch run, I want
> to cat all the individual files together, and load them into the
> database.
>
> Here is the code:
>
> Dir["**/store_need*"].each do |path|
>   single_file_nm = path.to_s
Why do you convert a string to a string and then do string
interpolation in the next line?
>   single_file = File.open("#{single_file_nm}", "r")
The string interpolation is superfluous, and you should rather use the
block form of File.open.
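For example, something like this (untested; it also drops the
superfluous to_s and interpolation, and still assumes store_need_file
has been opened somewhere):

File.open(path, "r") do |single_file|
  single_file.each do |line|
    store_need_file << line
    puts line
  end
end   # the file is closed automatically when the block ends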
>   single_file.each do |line|
>     store_need_file << line
Where do you open store_need_file?
>     puts line
>   end
>   single_file.close
> end
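By the way, for your version to work at all store_need_file has to be
opened before the loop and closed afterwards, roughly like this
(untested; the output file name is just a placeholder):

File.open("store_need_all.csv", "w") do |store_need_file|
  Dir["**/store_need*"].each do |path|
    File.open(path, "r") do |single_file|
      single_file.each do |line|
        store_need_file << line
        puts line
      end
    end
  end
end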
> I am recursively looking for the store_need.csv files, and then
> reading them into a single store_need_file. The problem is that this
> code is not reading the file at all. I can see that the path is
> defined correctly.
This is a shorter variant if your individual CSV files are small:
File.open "load.csv", "w" do |io|
  Dir["**/store_need*"].each do |path|
    # note: puts instead of write to get the terminating newline
    io.puts(File.read(path))
  end
end
If your individual files are large I would probably not bother to
combine them. If you still want to do it, I would do something like
this (untested):
NL = "\r\n".freeze
BL = 1024   # block size in bytes for each read

File.open "load.csv", "wb" do |fout|
  buffer = ""

  Dir["**/store_need*"].each do |path|
    File.open path, "rb" do |fin|
      # read into the reusable buffer until EOF
      while fin.read(BL, buffer)
        fout.write(buffer)
      end
    end

    # only needed if files are not terminated properly:
    # fout.write(NL)
  end
end
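If you are on Ruby 1.9 you could also let IO.copy_stream do the
chunked copying for you, e.g. (untested):

File.open "load.csv", "wb" do |fout|
  Dir["**/store_need*"].each do |path|
    File.open path, "rb" do |fin|
      IO.copy_stream(fin, fout)
    end
  end
end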
Cheers
robert