
nginx-limit-traffic-rate-module's Introduction

Notes on the limit_traffic_rate module
=====

To install, compile nginx with this ./configure option:

    --add-module=path/to/this/directory

The nginx directive limit_rate limits the speed of a single connection, and limit_conn 
limits the number of connections keyed by a given variable. A browser usually opens 
only one connection to the server, so its speed is effectively capped at limit_rate; 
a multi-threaded download tool, however, can open several connections and exceed that limit.
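
For contrast, with the stock directives alone each connection is throttled independently. A rough sketch (standard nginx directives; the path and value are illustrative):

    location /download/ {
        limit_rate  100k;   # each connection is capped at 100 KB/s
        # a client opening 4 parallel connections can still reach ~400 KB/s in total
    }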

The limit_traffic_rate module provides a way to limit the total download rate 
per client IP or per download URL, even when several connections are open. The 
limit is defined with the directives below.

The limit_traffic_rate module uses a shared memory pool. The directive 
syntax is the same as limit_zone. 

    http {
        #limit_traffic_rate_zone   rate $request_uri 32m;
        limit_traffic_rate_zone   rate $remote_addr 32m;
        
        server {
            location /download/ {
                limit_traffic_rate  rate 20k;
            }
        }
    }

Changelog
  v0.2 
    *   modified algorithm: rate = (limit - last_rate)/conn + last_rate (see the worked example below)
  v0.1
    *   first release
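
A worked instance of that formula, with illustrative numbers only (the precise meaning of last_rate is defined by the module's implementation):

    limit = 20k, conn = 4, last_rate = 8k
    rate  = (20k - 8k) / 4 + 8k = 11k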

License
=====

Same as nginx:

/* 
 * Copyright (C) 2010 Simon([email protected])
 *
 * Redistribution and use in source and binary forms, with or without
 * modification, are permitted provided that the following conditions
 * are met:
 * 1. Redistributions of source code must retain the above copyright
 *    notice, this list of conditions and the following disclaimer.
 * 2. Redistributions in binary form must reproduce the above copyright
 *    notice, this list of conditions and the following disclaimer in the
 *    documentation and/or other materials provided with the distribution.
 *
 * THIS SOFTWARE IS PROVIDED BY AUTHOR AND CONTRIBUTORS ``AS IS'' AND
 * ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
 * IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
 * ARE DISCLAIMED.  IN NO EVENT SHALL AUTHOR OR CONTRIBUTORS BE LIABLE
 * FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
 * DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS
 * OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION)
 * HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT
 * LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY
 * OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF
 * SUCH DAMAGE.
 */


nginx-limit-traffic-rate-module's People

Contributors

bigplum


nginx-limit-traffic-rate-module's Issues

Does it work with 1.9.12?

Does this module work with nginx 1.9.12? I tried it, and it does not work for me.

limit_traffic_rate_zone rate $remote_addr not limiting by IP

I have just installed a fresh copy of Nginx for our HTTP download file server. This module is amazing (when working properly) and just what we are looking for.

However, we have only managed to configure it to limit per connection: if a user from the same IP opens another connection, the download speed is not shared between the connections, and the new one gets full speed again.

For example, we have set the speed limit to 20kb/s. If a user downloads a file over a single connection (one thread), that's fine and the speed is limited. But if they open multiple connections (for example with download software), they can get many 20kb/s connections: 5 connections bring 100kb/s.

How can I limit the connection speed across all connections from the same IP?

Current config:

    limit_traffic_rate_zone rate $remote_addr 32m;

    location / {
        limit_traffic_rate  rate 20k;
    }
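
For reference, a sketch of the full placement based on the README's example (directive names are from the README; keying the zone on $remote_addr is what should give one shared budget per client IP):

    http {
        limit_traffic_rate_zone  rate $remote_addr 32m;   # one budget per client IP

        server {
            location / {
                limit_traffic_rate  rate 20k;   # total rate shared by that IP's connections
            }
        }
    }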

GET /32k: limit not applied!!!

limit_traffic_rate_zone rate $server_addr 50m;
limit_traffic_rate rate 10971520;

[root@localhost data]# ab -c 200 -n 10000 http://192.168.1.243/32k
This is ApacheBench, Version 2.3 <$Revision: 655654 $>
Copyright 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/
Licensed to The Apache Software Foundation, http://www.apache.org/

Benchmarking 192.168.1.243 (be patient)
Completed 1000 requests
Completed 2000 requests
Completed 3000 requests
Completed 4000 requests
Completed 5000 requests
Completed 6000 requests
Completed 7000 requests
Completed 8000 requests
Completed 9000 requests
Completed 10000 requests
Finished 10000 requests

Server Software: nginx/v0.0.1T
Server Hostname: 192.168.1.243
Server Port: 80

Document Path: /32k
Document Length: 32768 bytes

Concurrency Level: 200
Time taken for tests: 18.086 seconds
Complete requests: 10000
Failed requests: 0
Write errors: 0
Total transferred: 330180000 bytes
HTML transferred: 327680000 bytes
Requests per second: 552.92 [#/sec] (mean)
Time per request: 361.716 [ms] (mean)
Time per request: 1.809 [ms] (mean, across all concurrent requests)
Transfer rate: 17828.42 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    2    2.7      2    29
Processing:    42  351  232.7    163   646
Waiting:        0    7   11.4      5   115
Total:         66  354  232.3    165   649

Percentage of the requests served within a certain time (ms)
50% 165
66% 629
75% 631
80% 632
90% 634
95% 635
98% 636
99% 636
100% 649 (longest request)

limit_traffic_rate problem

    { ngx_string("limit_traffic_rate"),
      NGX_HTTP_MAIN_CONF|NGX_HTTP_SRV_CONF|NGX_HTTP_LOC_CONF|NGX_CONF_TAKE2,
      ngx_http_limit_traffic_rate,
      NGX_HTTP_LOC_CONF_OFFSET,
      0,
      NULL },

If limit_traffic_rate is used in a server{} block, does it work?
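
Judging from the NGX_HTTP_SRV_CONF flag in the command definition above, the directive appears to be accepted in http{}, server{}, and location{} contexts, so a sketch like the following should at least be parsed (whether the limit then behaves as intended at server{} level is the open question here):

    http {
        limit_traffic_rate_zone  rate $remote_addr 32m;

        server {
            limit_traffic_rate  rate 20k;   # directive placed at server{} level
        }
    }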

License?

Hello bigplum,
What is the license under which I can use this excellent module?

Thanks,
-Umesh

deny limit exceeding connections

I need to deny new connections if they would exceed the limit. I can't risk slowing down current connections.

Is there a way to access the last_rate variable or do you have another suggestion for me?

simple limit config

Hello sir,
what would be an ideal config for limiting download managers to at most 8 threads and a total speed of no more than 4 MB/s?
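
A hedged sketch of one way to express that, combining the stock limit_conn directives (to cap simultaneous connections per IP) with this module (to cap the total rate per IP); the zone names, sizes, and the 4m rate value are illustrative assumptions, not a recommendation from the author:

    http {
        limit_conn_zone          $binary_remote_addr zone=perip:10m;   # stock nginx
        limit_traffic_rate_zone  rate $remote_addr 32m;                # this module

        server {
            location /download/ {
                limit_conn          perip 8;   # at most 8 simultaneous connections per IP
                limit_traffic_rate  rate 4m;   # ~4 MB/s total, shared across those connections
            }
        }
    }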

limit_traffic_rate_filter issues

2017/11/28 09:52:35 [alert] 103990#0: shared memory zone "rate" was locked by 49547
2017/11/28 09:52:43 [crit] 49552#0: ngx_slab_alloc() failed: no memory in limit_traffic_rate_filter "rate"
2017/11/28 09:52:43 [alert] 103990#0: worker process 49552 exited on signal 11
2017/11/28 09:52:43 [alert] 103990#0: shared memory zone "rate" was locked by 49552
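
The ngx_slab_alloc() failure indicates the shared memory zone ran out of space; one possible (unverified) mitigation is simply allocating a larger zone, though the worker exiting on signal 11 looks like a separate crash that a bigger zone alone would not explain:

    # assumption: a larger zone postpones the slab allocation failures
    limit_traffic_rate_zone  rate $remote_addr 128m;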

Question: limiting the number of simultaneous / concurrent open connections to a file in a real-world DDoS scenario

So, in a DDoS scenario, the config could be the following.

    limit_req_zone   $binary_remote_addr zone=one:10m rate=30r/m;
    limit_conn_zone  $binary_remote_addr zone=addr:10m;

    location ~ \.mp4$ {
        limit_conn  addr 1;             # limit open connections from same IP
        limit_req   zone=one burst=5;   # limit max number of requests from same IP

        mp4;
        limit_rate_after  1m;
        limit_rate        1m;

        expires max;
        valid_referers none blocked networkflare.com *.networkflare.com;
        if ($invalid_referer) {
            return 403;
        }
    }

My config above would be good against a single IP, or a few IPs, spamming / flooding the mp4 files on the server, but it would not hold up against a real, larger DDoS.

The real-world scenario: an attacker could have over 1000 machines to hit you with.

They could easily bypass the above: all 1000 machines connect and download a file at the same time, and they don't spam / flood the requests (if they did, they would be blocked and served a 503 status code). Instead, each machine just keeps its connection open, downloading the requested file at 1 MB/s; multiplied by 1000 machines, the server's 1 Gbit port is maxed out (a successful attack). When a machine finishes downloading the file, it instantly opens it again and repeats, constantly consuming the bandwidth. Because each machine has its own IP, none of them share an IP, and they are not spamming or flooding enough to trigger the limits, so they easily bypass the firewall.

A possible way to fix the above is to use limit_conn with $uri or $request_uri to limit simultaneous connections to download a single file at once.

    limit_conn_zone  $request_uri zone=peruri:10m;   # limit max number of open connections to a single file

    limit_req_zone   $binary_remote_addr zone=one:10m rate=30r/m;
    limit_conn_zone  $binary_remote_addr zone=addr:10m;

    location ~ \.mp4$ {
        limit_conn  peruri 1;           # limit max connections downloading a single file at any time
        limit_conn  addr 1;             # limit open connections from same IP
        limit_req   zone=one burst=5;   # limit max number of requests from same IP

        mp4;
        limit_rate_after  1m;
        limit_rate        1m;

        expires max;
        valid_referers none blocked networkflare.com *.networkflare.com;
        if ($invalid_referer) {
            return 403;
        }
    }

The added lines:

    limit_conn_zone  $request_uri zone=peruri:10m;   # limit max number of open connections to a single file
    limit_conn       peruri 1;                       # limit max connections downloading a single file at any time

would make it so only one person may access that particular file URL at a time. Still not good enough: instead of spamming / flooding or slow-lorising a single file on a mass scale, the attackers could adapt and open different mp4 URLs simultaneously. If each of the 1000 IPs connects and opens a different file, we are back to square one.

How your module could save the day! 👍

    http {
        limit_traffic_rate_zone  rate $request_uri 32m;   # keyed by the requested URL

        server {
            location /download/ {   # the folder where all MP4 files sit
                # limit total traffic to the download folder, across any number of
                # connections / requests, to half of the server's 1 Gbit port capacity
                limit_traffic_rate  rate 500m;
            }
        }
    }

So, with the above, is my understanding correct that your module could fix all the issues listed: no matter how many downloads of however many different MP4 file URLs are taking place simultaneously, the module will not let the total bandwidth output exceed half of the server's 1 Gbit port capacity?

Or is the above config wrong for your module, and would it still only limit per file ($request_uri)?
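
A note that is only my assumption from the zone-key semantics described in the README, not something confirmed by the author: keying the zone on $request_uri gives each URL its own budget, so different files are limited separately. To cap the server as a whole regardless of URL or client IP, the key would have to be the same for every request, as the earlier benchmark issue does with $server_addr:

    # assumption: a key that is identical for all requests yields one shared budget
    limit_traffic_rate_zone  rate $server_addr 32m;

    location /download/ {
        limit_traffic_rate  rate 500m;   # one budget for everything under /download/
    }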

I didn't write the above to poke holes; I do so to help make things better and to address potential problems (if any). Sorry for the lengthy post. I think it's a great module with a lot of potential, and I strive to maximize that potential. 👍

[branch v1.0] Possible null pointer dereference

cppcheck

[ngx_http_limit_traffic_rate_filter_module.c:235] -> [ngx_http_limit_traffic_rate_filter_module.c:243]:
(warning) Possible null pointer dereference: node - otherwise it is redundant to check it against null.

Question about limit_traffic_rate module

Hi,

I just installed a fresh new version of Nginx (v1.8.0) with the limit_traffic_rate module.

I am using the module inside a proxy_pass section.
When I use a wget client on the same server, the rate limitation is perfect (2 Mbps in my case), but when I use another application such as VLC (I don't know whether it is multi-threaded or not), the traffic can jump to 8 Mbps from time to time.

So I am wondering whether this could be caused by a bug (which I don't think) or by the rate-limitation computation. Let me explain my thought: sometimes the bitrate can go very low (50 Kbps), so if the algorithm averages over a certain period of time, it could let the same IP go very high for a while, as long as the overall average does not exceed the 2 Mbps (about 250 KB/s) mentioned above.

So my question is: is limit_traffic_rate an upper limit at all times, or an upper limit averaged over a certain period (e.g. since the connection was initiated by the end user)?

I hope I made it clear.

Anyway thanks for the great job.
