I’m trying to build a simple proxy to Amazon S3 that would let a client
request a URL that Rails maps to an S3 object and streams back to the
client.
For S3 integration I use the highly recommended AWS::S3 gem
(http://amazon.rubyforge.org/), and to send a file I simply use the
send_file function.
The concept works, but I can’t get the file to be sent while it’s still
being downloaded. Even better would be to download the file to memory
instead of to disk, though I’m really not familiar with Ruby’s IO class…
If anyone has a suggestion…
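For the in-memory part, Ruby’s standard library has StringIO, which
behaves like a File handle but lives entirely in memory. A minimal
illustration (placeholder data, not S3-specific):

```ruby
require 'stringio'

# StringIO behaves like a File handle, but the "file" is just a string
# held in memory, so chunks can be collected without touching the disk.
buffer = StringIO.new
buffer.write 'first chunk, '
buffer.write 'second chunk'
buffer.rewind
data = buffer.read   # the whole payload, assembled in memory
```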
I first tried this:
# download the file entirely, then send it to the client
send_data AWS::S3::S3Object.value('object_name', 'bucket_name')
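Spelled out as a full action it would look something like this
(:filename and :type are standard send_data options; the bucket and key
names are placeholders):

```ruby
# Buffer the whole object in memory, then hand it to the client in one
# go. Works, but the client waits until the entire S3 download finishes.
def show
  data = AWS::S3::S3Object.value('object_name', 'bucket_name')
  send_data data,
            :filename => 'object_name',
            :type     => 'application/octet-stream'
end
```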
Then I figured I needed to use the streaming feature of AWS::S3:

# start a thread that downloads the file in the background
Thread.start do
  open('/tmp/tempfile', 'wb') do |file|
    AWS::S3::S3Object.stream('object_name', 'bucket_name') do |chunk|
      file.write chunk
    end
  end
end
Then send the file as it’s being written:

send_file '/tmp/tempfile', :stream => true
This doesn’t work; the sent file is often between 0 and 4 bytes… not
cool. Looking at the send_file source below, I suspect it’s a race:
options[:length] is set from File.size(path) before the background
thread has written much. I’m guessing I could put it all together, but I
don’t know how.
Here’s the send_file function that could be overridden:
def send_file(path, options = {}) #:doc:
  raise MissingFile, "Cannot read file #{path}" unless
    File.file?(path) and File.readable?(path)

  options[:length]   ||= File.size(path)
  options[:filename] ||= File.basename(path)
  send_file_headers! options

  @performed_render = false

  if options[:stream]
    render :text => Proc.new { |response, output|
      logger.info "Streaming file #{path}" unless logger.nil?
      len = options[:buffer_size] || 4096
      File.open(path, 'rb') do |file|
        if output.respond_to?(:syswrite)
          begin
            while true
              output.syswrite(file.sysread(len))
            end
          rescue EOFError
          end
        else
          while buf = file.read(len)
            output.write(buf)
          end
        end
      end
    }
  else
    logger.info "Sending file #{path}" unless logger.nil?
    File.open(path, 'rb') { |file| render :text => file.read }
  end
end
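For what it’s worth, here is a sketch of how the override might look: a
hypothetical send_s3_object that mirrors send_file’s :stream branch but
pulls its chunks from S3Object.stream instead of a local file, so no
temp file (and no race) is involved. It assumes the gem’s S3Object.about
returns the object’s metadata headers, including 'content-length'; I
haven’t verified the headers come out right.

```ruby
# Hypothetical send_file-style helper: same header/render structure as
# send_file's :stream branch above, but the response Proc reads chunks
# straight from S3, writing each one to the client as it arrives.
def send_s3_object(object_name, bucket_name, options = {})
  # assumption: S3Object.about exposes the object's metadata headers
  about = AWS::S3::S3Object.about(object_name, bucket_name)
  options[:length]   ||= about['content-length']
  options[:filename] ||= File.basename(object_name)
  send_file_headers! options

  render :text => Proc.new { |response, output|
    AWS::S3::S3Object.stream(object_name, bucket_name) do |chunk|
      output.write chunk
    end
  }
end
```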
Please, any ideas, anyone?