Problem downloading a large file with Apache + Mongrel

Hi all, I'm using Apache as a frontend to Mongrel, with the mod_xsendfile
module loaded in Apache. I need to serve downloads of around 1 GB. This is
my download routine:

def download
  @stream = Stream.find(params[:id])
  send_file(@stream.location,
            :filename    => @stream.name,
            :disposition => 'attachment',
            :x_sendfile  => true)
end

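In case it's relevant, mod_xsendfile also has to be enabled on the Apache side or the X-Sendfile header is ignored and the Rails process ends up streaming the file itself. My vhost has something like this (module path and download directory are just examples for my setup):

```apache
# Load mod_xsendfile (path is illustrative)
LoadModule xsendfile_module modules/mod_xsendfile.so

<VirtualHost *:80>
  # Honor the X-Sendfile header set by send_file
  XSendFile On
  # Newer mod_xsendfile versions require the directory to be whitelisted
  XSendFilePath d:/dm
</VirtualHost>
```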

This is how the request is processed:

Processing StreamsController#download (for 127.0.0.1 at 2010-09-22 12:32:10) [GET]
  Parameters: {"id"=>"6596"}
  SQL (0.0ms)   SET NAMES 'utf8'
  SQL (0.0ms)   SET SQL_AUTO_IS_NULL=0
  Stream Columns (15.0ms)   SHOW FIELDS FROM streams
  Stream Load (0.0ms)   SELECT * FROM streams WHERE (streams.id = 6596)
  CACHE (0.0ms)   SELECT * FROM streams WHERE (streams.id = 596)
Sending X-Sendfile header d:/dm/predator_720x480_5mbps_30fps_17minclip.264.filepart
Completed in 57513ms (View: 0, DB: 15) | 200 OK [http://src/streams/download/6596]

The very first time it works: the full 1.1 GB file downloads successfully.
But when I try to download the same 1.1 GB file again, I get a "failed to
allocate memory" error. What could be the solution? Can I call cache.clear
before send_file? One more thing: to download the 1.1 GB file again, I have
to restart my servers.

Thanks!