Hi there!
I have nginx serving several different domains.
I've already applied request limits **per IP** to protect against certain types of attacks.
However, those limits only cover traffic coming from a single IP.
Now I would like to allow no more than X requests per second **to a domain**.
Why would I want to do that in the first place? Well, my server runs several websites with nginx (with uwsgi in the background). I want to prevent uwsgi or nginx from slowing down, timing out, or going down completely due to a sudden, huge spike in traffic. It has happened in the past, and I'm already tuning my uwsgi application to make it faster and more scalable.
In the meantime, however, I would like to apply some protective limits so that no more than X requests per second reach a given domain.
I found an example on the web, but I'm not sure it would work. I've tested it locally and it does, but I want to be certain, which is why I'm making this post.
Is the following approach correct to limit requests per second to a domain?
In this example, if it's correct, I would be limiting traffic to siteA.com and siteB.com to 10 requests per second, and traffic to siteC.com to 50 requests per second.
Notice the variable `$host` used as the key, which lets me create two different zones, one more restrictive than the other.
```nginx
http {
    limit_req_zone $host zone=restrictive:10m rate=10r/s;
    limit_req_zone $host zone=powerful:10m rate=50r/s;

    server {
        server_name siteA.com;
        location / {
            limit_req zone=restrictive;
        }
    }

    server {
        server_name siteB.com;
        location / {
            limit_req zone=restrictive;
        }
    }

    server {
        server_name siteC.com;
        location / {
            limit_req zone=powerful;
        }
    }
}
```
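One detail worth mentioning: with no `burst` parameter, `limit_req` rejects any request above the configured rate immediately (by default with a 503). If smoothing out short spikes rather than hard-rejecting them were the goal, a common variant is to add a burst queue. This is only a sketch, and the numbers are illustrative assumptions, not recommendations:

```nginx
# Sketch only: same "restrictive" zone as above, but tolerating short bursts.
# "burst=20" queues up to 20 excess requests instead of rejecting them;
# "nodelay" serves queued requests immediately while still enforcing
# the 10r/s average over time.
location / {
    limit_req zone=restrictive burst=20 nodelay;
}
```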
What do you think?
Thanks in advance.