Writing lines to a file

I have a text file with a Magic card number and related info on every line

eg: 072 GTC U Mental Vapors

I’m trying to read the file line by line and append each line to a file
named foozled.txt.
I wrote a to_s method; it prints to the screen correctly with output:
072 |GTC| U | Mental Vapors
but I need to append this output to the foozled file.

how can this be done?
At the moment it prints nil on every line of the foozled file.

------------> snipped code
File.open(cardList, "r") do |file|
  file.each_line do |line|
    parse = Card.new(line, line, line, line)
    File.open('foozled.txt', 'a') do |out|
      out.puts parse.to_s # appends only nil to the file, but prints to screen
    end
  end
end

Does this work?

File.foreach(cardList) do |line|
  parse = Card.new( *Array.new( 4, line.to_s ) )
  File.open('foozled.txt', 'a') { |f| f.puts parse.to_s }
end

Can you show the class internals, particularly the initialize and to_s
methods?

On 04/29/2013 02:27 PM, Joel P. wrote:

File.foreach(cardList) do |line|
  parse = Card.new( *Array.new( 4, line.to_s ) )
  File.open('foozled.txt', 'a') { |f| f.puts parse.to_s }
end
no, it gives the same output: nils in the file, but it prints fine to the screen

I think I know what the problem is: you’re using puts or print in #to_s
instead of returning the string. The puts and print methods return nil.
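A minimal sketch of the difference — puts writes its argument to $stdout but always returns nil, whereas a bare string literal is the method's return value:

```ruby
# puts prints its argument but returns nil, so a #to_s built
# around puts hands nil back to the caller.
printed = puts("072 |GTC| U | Mental Vapors")
p printed # nil

# Returning the string instead gives the caller something to write.
def formatted
  "072 |GTC| U | Mental Vapors"
end
p formatted
```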

On 04/29/2013 02:50 PM, Joel P. wrote:

Can you show the class internals, particularly the initialize and to_s
methods?

sure I can
-------->
class Card
  attr_accessor :cnumber, :set, :rarity, :cname

  def initialize(cnumber, set, rarity, cname)
    @cnumber = cnumber[0..3]
    @set = set[4..6]
    @rarity = rarity[8..9]
    @cname = cname[10..50]
  end

  def to_s
    puts "#{cnumber} |#{set}| #{rarity}| #{cname}"
  end

  def to_set
    puts "#{set}"
  end
end

On 04/29/2013 02:51 PM, Joel P. wrote:

I think I know what the problem is: you’re using puts or print in #to_s
instead of returning the string. The puts and print methods return nil.

you are right! thanks Joel, newb mistakes I’m making
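For anyone following along, a sketch of the corrected class — same slicing as posted above, with the puts calls replaced by return values so #to_s and #to_set hand back strings:

```ruby
class Card
  attr_accessor :cnumber, :set, :rarity, :cname

  def initialize(cnumber, set, rarity, cname)
    @cnumber = cnumber[0..3]
    @set     = set[4..6]
    @rarity  = rarity[8..9]
    @cname   = cname[10..50]
  end

  # Return the formatted string; the caller decides whether to
  # puts it to the screen or write it to a file.
  def to_s
    "#{cnumber} |#{set}| #{rarity}| #{cname}"
  end

  def to_set
    set
  end
end
```

With this version, f.puts parse.to_s writes the formatted line instead of nil.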

On Apr 29, 2013 11:20 AM, “peteV” [email protected] wrote:

I have a text file with on every line a magic card number and such info

eg: 072 GTC U Mental Vapors

I’m trying to read the file line for line and append each line to a file
named foozled.txt.
file.each_line do |line|
  parse = Card.new(line, line, line, line)
  File.open('foozled.txt', 'a') do |out|
    out.puts parse.to_s # appends only nil to the file but prints to screen
  end
end
end

Just want to say, opening the same output file for every line in the
input file seems wasteful. The general algorithm:

File.open(outfile, 'a') do |cout|
  File.open(infile, 'r') do |cin|
    cin.each_line { |line| cout.puts process(line) }
  end
end

Or you could do the whole operation with a single read, string
processing, and a single write.

On 04/29/2013 05:35 PM, Joel P. wrote:

Or you could do the whole operation with a single read, string
processing, and a single write.

thanks again guys, I’ll investigate both ways some more and learn from
it.

On 04/29/2013 07:34 PM, tamouse mailing lists wrote:

On Apr 29, 2013 4:37 PM, “Joel P.” [email protected] wrote:

Or you could do the whole operation with a single read, string
processing, and a single write

This can be a problem if the source file is large, but certainly, yes.

the source file is 17,000+ lines :)

On Apr 29, 2013 4:37 PM, “Joel P.” [email protected] wrote:

Or you could do the whole operation with a single read, string
processing, and a single write

This can be a problem if the source file is large, but certainly, yes.

On Apr 29, 2013 6:38 PM, “peteV” [email protected] wrote:

the source file is 17,000+ lines :)

well, ok, that would have meant opening the file, appending a line, and
closing it 17,000 times in the original. it is sometimes hard to conceive
of computers doing actual work, but that is a lot of unnecessary opens
and closes.

otoh, 17,000 lines may not cost that much ram if read in all at one time,
filtered to a buffer, and appended en masse. so joel does have a point.

a billion lines is more into the problem space of too much to buffer (or,
in the case i am familiar with, a 4GB database file).