Memory leak problem

I have a txt file with some data that I need to import into the database.
I'm using Ruby to import the data, but I have a major problem: when I
have a large amount of data I run out of memory.

File.open("#{RAILS_ROOT}/public/files/Neighborsville.TXT").each do |line|
  @stringArray = line.split("|")
  @i += 1
  puts @i
  @pid = @stringArray[0]
  @chain_id = @stringArray[1]
  @business = Business.find_by_pid_and_chain_id(@pid, @chain_id)
  # Check PID + CHAIN_ID
  @business.pid = @stringArray[0]
  @business.chain_id = @stringArray[1]
  @business.cityname = @stringArray[17]
  @business.state = @stringArray[18]
  @business.business = Business.find_by_pid_and_chain_id(@pid, @chain_id)
  @business.city = City.new
  @business.business_category = get_category_id(@stringArray[40])
  @business.address = @stringArray[8] + " " + @stringArray[9] + " " +
                      @stringArray[10] + " " + @stringArray[11] + " " +
                      @stringArray[12] + " " + @stringArray[13] + " " +
                      @stringArray[14]
  if @chain_id == nil
    @chain_id = ""
  end
  business.save
end

I believe that on every cycle of the loop Ruby allocates new blocks of
memory for my instances of Business. Can someone help me please?

Thanks,

Elioncho

Probably not related, but should business.save at the end there be
@business.save?
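For what it's worth, here is a sketch of a lower-memory version of that loop. The field layout is hypothetical (only pid, chain_id, and a few address columns are shown), and the ActiveRecord calls from the original are stubbed out as comments, since the point is only the memory pattern: `File.foreach` streams the file one line at a time, and local variables (rather than `@instance` variables) let each row's objects be garbage-collected as soon as the iteration ends.

```ruby
# Hypothetical pipe-delimited layout: pid|chain_id|address fields...
def parse_row(line)
  fields = line.chomp.split("|")
  {
    "pid"      => fields[0],
    "chain_id" => fields[1].to_s,          # "" when chain_id is missing
    "address"  => fields.drop(2).join(" ") # join remaining address columns
  }
end

# File.foreach reads one line at a time, so memory stays flat even for a
# very large file. Everything here is a local variable, so each row's
# objects become garbage as soon as the block finishes.
def import(path)
  count = 0
  File.foreach(path) do |line|
    row = parse_row(line)
    count += 1
    # In the real import you would look up and save the record here, e.g.:
    # business = Business.find_by_pid_and_chain_id(row["pid"], row["chain_id"])
    # business.save
  end
  count
end
```

This avoids holding the whole file (or all the built records) in memory at once; whether it fixes the growth you are seeing depends on what ActiveRecord itself caches per save.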