Limit issue, I think it's a bug

I have:

    limit_req_zone  $binary_remote_addr  zone=one:10m   rate=5r/s;
    limit_req   zone=one  burst=5;

in the http block.

Whatever changes I make to that directive, even if I set the rate to
100r/s, it won't serve more than 2 requests per second.

Maybe I'm doing it wrong?


I realize this is a two-year-old topic, but I'm having the same issue.
It doesn't matter what I set the rate to; it only allows about 2
requests per second. This is my config:

in http:

    limit_req_zone $binary_remote_addr zone=flood:10m rate=15r/s;

in server location:

    limit_req zone=flood;

That should really be all that is necessary, right? I’m using Nginx
1.1.6.


Hello!

On Sat, Nov 26, 2011 at 05:03:24AM -0500, talisto wrote:

That should really be all that is necessary, right? I’m using Nginx
1.1.6.

How do you test it?

Maxim D.

On Sat, Nov 26, 2011 at 5:03 AM, talisto [email protected] wrote:

That should really be all that is necessary, right? I’m using Nginx
1.1.6.

Maxim D. Wrote:
How do you test it?

I’ve been testing it with Siege, but I can actually generate error 503
responses just by refreshing my web browser fast enough. My results
with Siege are somewhat random; sometimes the very first request fails,
then a couple will pass, then another will fail, all within a couple
seconds. That’s with my rate set to 15r/s, which should be more than
enough… in Siege I’m only using 5 concurrent users with 1 request per
second, yet it still fails, often on the first or second hit. In my
browser, it’s a bit more reliable; I have a simple page which makes 2
ajax requests; the 2nd ajax request will always fail.

As soon as I remove the limit_req line, I can flood my server with as
many requests as Siege can handle and it never generates a 503, so I
know that the errors aren’t being caused by something else. I’m not
sure why the limit_req isn’t working properly though.

You are truncating your response. Please don't top-post or trim the
quoted text; it makes it difficult to track the thread and for us to
help you.

From a review of the documentation this is by design.
Siege with 5 concurrent users at 1r/s could easily exceed 15r/s if
there is other traffic or if it is not behaving predictably.
Are you sure there are no other requests to the site, that the
browser test only makes 2 requests, and that Siege only sends 10
requests? We need logs.
From what you have sent, you will get 503 because you are limiting
requests and have not defined a burst. Set a burst value (optionally
with nodelay) and see if it then allows 15r/s.
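For example, a minimal sketch along these lines (the burst value of 20
is only a placeholder, not a recommendation):

    limit_req_zone  $binary_remote_addr  zone=flood:10m  rate=15r/s;

    # burst=20 is illustrative: allow short spikes of up to 20 requests
    # above the 15r/s rate, and serve them immediately (nodelay)
    # instead of queueing them
    limit_req  zone=flood  burst=20  nodelay;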
And you cannot claim a bug without evidence; please see the DEBUG
README, show the relevant configuration, set limit_req_log_level
appropriately, and send the output.
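A minimal sketch of the logging side, assuming a typical error_log
path (adjust to your setup):

    # rejected requests are logged at "notice"; delayed ones are logged
    # one level lower ("info")
    limit_req_log_level  notice;
    # the path below is only an example; use your own error_log location
    error_log  /var/log/nginx/error.log  notice;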

Stefan C.
http://scaleengine.com/contact
“People who enjoy having meetings should never be allowed to be in
charge of anything.”

Hello!

On Sat, Nov 26, 2011 at 07:22:51AM -0500, talisto wrote:

In my browser, it's a bit more reliable; I have a simple page which makes 2
ajax requests; the 2nd ajax request will always fail.

As soon as I remove the limit_req line, I can flood my server with as
many requests as Siege can handle and it never generates a 503, so I
know that the errors aren’t being caused by something else. I’m not
sure why the limit_req isn’t working properly though.

The limit_req is expected to generate 503 as soon as it sees more
requests than the configured burst allows. As you have no burst set,
it will generate 503 as soon as it sees requests arriving less than
1/15 of a second (about 67 ms) apart. This is expected behaviour.

While flooding your server with requests (with limit_req set), you
should see about 15 successful requests per second; the others will
return 503.
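To make that concrete with the configuration quoted earlier (the burst
value of 5 is only illustrative):

    # equivalent to burst=0: the second of two back-to-back requests
    # (less than ~67 ms apart at rate=15r/s) is rejected with 503
    limit_req  zone=flood;

    # burst=5 is illustrative: up to 5 excess requests are queued and
    # delayed down to the 15r/s rate instead of being rejected
    limit_req  zone=flood  burst=5;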

Maxim D.
