Secure permission structure for server blocks?

Hello everybody,

I have been trying to wrap my head around this for weeks now. What is the
most secure way to organise the permissions of the web root directories
(WRD) for several server blocks, especially when you have PHP applications
like WordPress that download and create files in the WRD? The latter makes
it difficult to control each file’s owner, group and permissions.

By “secure” I mean the following: hijacked websites (e.g. injections in
WordPress) must not be able to read from or write to any directory outside
their own WRD! I am open to more security tips, but the main topic here is
the directory permission structure.

I haven’t found any solution to my problem on the web yet.

Thank you

Stadtpirat

Since the problem comes from PHP being a dynamic language, you can create
several pools using different user/group pairs.
You could use 644 (or 640) permissions on a specific directory, with the
owner being the PHP user and the group being the web server group, which
gets read-only access.

This is only a rough idea of the big picture; there are details to check
(such as verifying PHP isolation/jail/chroot inside the pools).
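The pool idea above might look something like this in a php-fpm pool file. This is a sketch only; the pool name, user/group names, and socket path are assumptions for illustration, not taken from the thread:

```ini
; Hypothetical pool file, e.g. /etc/php-fpm.d/site1.conf
; (names and paths are illustrative assumptions)
[site1]
; this pool's workers run as a dedicated per-site user
user = site1-php
group = site1-php

; private socket for this site only
listen = /run/php-fpm/site1.sock
; let only the web server connect to it
listen.owner = site1-php
listen.group = nginx
listen.mode = 0660
```

A second site would get its own pool file with a different user and socket, so a compromise of one pool's user cannot read or write the other site's files.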

My (quick) 2 cents,

B. R.

On Mon, Sep 09, 2013 at 02:22:50AM -0700, - - wrote:

Hi there,

> I am trying to wrap my head around this for weeks now. What is the most secure
> way to organise the permissions of the web root directories (WRD) for several
> server blocks. Especially when you have PHP applications like Wordpress that
> download and create files in the WRD? Latter makes it difficult to control the
> file’s owner, group and permissions.

nginx doesn’t “do” php.

Once you accept that, then the model you have for securing things may
become clearer. The “nginx” user and the “php” user are completely
separate, unless you choose to make them not be.

One nginx instance runs as one user, and accesses the files it is told to. It
needs to be able to read files in the web root directory that it serves
directly. It does not (in general) know or care what php is, or where
the files that the php server (typically, a fastcgi server) reads are.

So your php-running user (configured in the fastcgi server) should be
able to read the files it needs; and if you want it to write anything,
it needs to be able to write those things. How you configure that is
not nginx’s concern.

If your fastcgi server writes files that nginx must later serve, then
nginx will need to be able to read them. If it doesn’t, then nginx
doesn’t need to care.
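The read/write split described here can be sketched with plain directory permissions. This is a minimal illustration, not a prescription: the layout and the idea that the PHP user writes only into a single uploads directory are assumptions, and a temporary directory stands in for the real web root so the sketch runs without root:

```shell
# Hypothetical layout: code is read-only for the pool user, and only
# public/uploads is writable by it. A mktemp dir stands in for /var/www/site1.
root=$(mktemp -d)
mkdir -p "$root/public/uploads"

# code directory: owner rwx, group (web server / pool) r-x, others nothing
chmod 750 "$root/public"
# uploads: the one place the pool user may write
chmod 770 "$root/public/uploads"

code_perm=$(ls -ld "$root/public" | cut -c1-10)
up_perm=$(ls -ld "$root/public/uploads" | cut -c1-10)
echo "$code_perm $up_perm"
```

In a real deployment you would additionally `chown` the code to a deploy user and put the pool user in the group, so PHP can read but not modify its own code.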

> For as “secure” is the following in my understanding: Hijacked websites (e.g.
> injections in Wordpress) must not be able to read or write do any other directory
> outside it’s own WRD! I am open for more security tips, but the main topic is
> about directory permission structure.

So - let the php user not be able to write any file that php will read;
but do let it be able to write files that nginx will serve. That should
stop any php-side injections. If you also want to stop any javascript
injections, you’ll want to prevent the php user from writing any files
that nginx will serve (and even that will probably not be enough).

If you care about multiple web root directories or multiple name-based
servers, then use multiple php users, and let each one only write to
the appropriate places which nginx will read.

(You can also use multiple nginx instances, each running as a different
user, if you want to use that to also restrict what files can be served
directly.)
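Wiring one pool per server block might look like the following. Again a sketch under assumptions: the server name, root, and socket path are invented for illustration, and each site’s `fastcgi_pass` points at its own pool’s socket:

```nginx
# Hypothetical server block for one site; a second site would point
# fastcgi_pass at its own pool's socket (e.g. site2.sock).
server {
    server_name site1.example.com;
    root /var/www/site1/public;

    location ~ \.php$ {
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        # site1's dedicated php-fpm pool, running as its own user
        fastcgi_pass unix:/run/php-fpm/site1.sock;
    }
}
```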

> I haven’t found any solution to my problem in the web, yet.

If you can write down what exactly you are trying to prevent, and what
you are not trying to prevent, then the solution may become clear –
or it may become clear that it is not possible.

But any uncertainty or lack of clarity in the requirements will make it
hard to confirm that the proposed solution is adequate.

f

Francis D. [email protected]
