X-Accel-Redirect for embedded videos and images in a page

Hi,

What I have seen so far are download scripts that require the user to
click a link that redirects to a download page, from which the
download starts automatically, à la SourceForge. I’d like to know
whether I can use X-Accel-Redirect to protect videos and images that
are embedded in a webpage?

I am trying to implement the tutorial from Alex Kovyrin:
http://blog.kovyrin.net/2006/11/01/nginx-x-accel-redirect-php-rails/lang/en/

But instead of proxying to apache, I proxy php scripts to spawn-fcgi.

The protection of the files works, but when I access a jpeg image, I
get the raw data displayed in my web browser. How do I make it display
as an image?

In my php script I have added:

header('Content-Type: image/jpeg');

But it didn’t work.

Is this a php or fastcgi issue?

OK, I got it working: my fastcgi params were once again messed up.
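For reference, a minimal sketch of the fastcgi block that usually needs to be right for PHP behind spawn-fcgi (the document root and listen address here are assumptions for illustration, not my actual config):

```nginx
location ~ \.php$ {
    # spawn-fcgi assumed to be listening on 127.0.0.1:9000
    fastcgi_pass   127.0.0.1:9000;
    fastcgi_index  index.php;

    # Without SCRIPT_FILENAME, PHP cannot locate the script and the
    # response (including its Content-Type header) comes back wrong.
    fastcgi_param  SCRIPT_FILENAME  /var/www/html$fastcgi_script_name;

    # Pulls in the standard CGI variables shipped with nginx.
    include        fastcgi_params;
}
```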

Now that I have my down.php file working: let’s say I have an
index.php page. How can index.php call down.php to serve the images
displayed inside index.php? Basically, what I am trying to create is a
simple anti-leech script.
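One way to sketch that arrangement, assuming an internal nginx location named /protected/ aliased to the real files on disk, and a session flag set by index.php (all names here are hypothetical, not from the tutorial):

```php
<?php
// down.php -- hand the actual file transfer off to nginx.
// Assumes nginx has:
//   location /protected/ { internal; alias /var/www/files/; }

// basename() strips any directory components, so a request like
// "?f=../../etc/passwd" cannot escape the protected directory.
function accel_path($file) {
    return '/protected/' . basename($file);
}

// Only run the request handler when served over FastCGI, so the file
// can also be included or tested from the command line.
if (PHP_SAPI !== 'cli') {
    session_start();
    $file = isset($_GET['f']) ? $_GET['f'] : '';

    // Simple anti-leech check: serve only when the session says the
    // visitor already loaded index.php. (Checking the Referer header
    // is another, weaker, option.)
    if (empty($_SESSION['allowed']) || $file === '') {
        header('HTTP/1.1 403 Forbidden');
        exit;
    }

    header('Content-Type: image/jpeg');
    header('X-Accel-Redirect: ' . accel_path($file));
    // No body needed: nginx replaces the response with the file itself.
}
```

index.php would set `$_SESSION['allowed'] = true;` and embed images as `<img src="/down.php?f=photo.jpg">`; the browser never sees the real path under /protected/, and direct hits on down.php without the session flag get a 403.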

Best regards,

Hello Thomas,

Saturday, March 22, 2008, 5:23:49 PM, you wrote:

> I am trying to implement the tutorial from Alex Kovyrin:
> http://blog.kovyrin.net/2006/11/01/nginx-x-accel-redirect-php-rails/lang/en/
>
> But instead of proxying to apache, I proxy php scripts to spawn-fcgi.
>
> The protection of the files does work, but when I want to access a
> jpeg image, I get the raw data displayed in my web-browser. How to
> make it as an image?
>
> In my php script I have added:
>
> header('Content-Type: image/jpeg');
>
> But it didn't work.
>
> Is this a php or fastcgi issue?

Recompile nginx with the --with-debug option, set up ‘error_log /path
debug;’ in your config, and check what really happens inside nginx ;)
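Concretely, that comes down to something like this (the paths are illustrative):

```nginx
# Built with: ./configure --with-debug && make && make install
# The "debug" level logs every step of request processing, including
# the headers nginx receives back from the FastCGI upstream.
error_log  /var/log/nginx/error.log  debug;
```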

I have been making some tests with X-Accel-Redirect:

On my PIII 500 with 384MB of RAM and a 10/100 ethernet card running
Ubuntu 7.10 server edition, I have a 1 GB file to serve. Both my
server and my client box are on the same LAN, and I have a DSL router
to connect to the internet with about 1MB upload and 8MB download (I
am not 100% sure about the figures).

When I uploaded the file to the server on the LAN, the speed was
3.3MB/s.

Downloading the file from the internet uses 100% cpu on my server, and
the speed reaches the limit of my DSL connection: roughly 1.3MB/s.
First question: is the 100% cpu usage normal for my poor little
server?

Sometimes the download gets suddenly interrupted. Where could that
problem come from? Is it the router getting hammered by upload and
download traffic at the same time?

Then I tested downloading the same file without going over the
internet, thereby bypassing the router. Now I get 1.7MB/s download,
which I guess is the absolute limit of my server’s power. I have not
had any disconnection so far, but as the file is big, I cannot test
many times. The initial disconnections probably came from my DSL
router.

My production server will be a Conroe-L cpu with a 100 Mbit
connection. I don’t think it will be capable of managing a Rails app
and transferring files at full speed. I know limit_rate works on a
per-client basis, but is there a way I can limit the total transfer
rate?
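nginx has no single directive for a global bandwidth cap, but one common workaround is to combine per-connection limit_rate with a cap on the number of simultaneous connections, which bounds the aggregate. A sketch using today’s directive names (limit_conn_zone; older nginx versions used limit_zone instead), with made-up numbers:

```nginx
http {
    # One shared counter for the whole server ($server_name is
    # constant, so every download increments the same counter).
    limit_conn_zone $server_name zone=perserver:10m;

    server {
        location /protected/ {
            internal;
            alias /var/www/files/;

            # At most 50 simultaneous downloads in total...
            limit_conn perserver 50;

            # ...each capped at 100 KB/s, so the aggregate transfer
            # rate stays under roughly 5 MB/s.
            limit_rate 100k;
        }
    }
}
```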
