A Rate-Limiting HTTP Proxy (1): Nginx OpenResty Installation

https://golang.org/pkg/net/http/

Theory
http://calvin1978.blogcn.com/articles/ratelimiter.html
http://mp.weixin.qq.com/s?__biz=MzIwODA4NjMwNA==&mid=2652897781&idx=1&sn=ae121ce4c3c37b7158bc9f067fa024c0#rd
http://www.kissyu.org/2016/08/13/%E9%99%90%E6%B5%81%E7%AE%97%E6%B3%95%E6%80%BB%E7%BB%93/

Lua nginx proxy
https://openresty.org/en/components.html
Lua web framework
https://github.com/idevz/vanilla
https://github.com/362228416/openresty-web-dev

Getting started with Nginx + Lua + OpenResty
OpenResty
http://blog.csdn.net/qq362228416/article/details/53537103
https://github.com/362228416/openresty-web-dev/tree/master/demo1

Install the software on Mac
Install PCRE
>wget https://ftp.pcre.org/pub/pcre/pcre-8.40.tar.gz
Then configure, make, and make install it.
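A typical build sequence for this kind of source tarball (assuming the archive unpacks into pcre-8.40/ and the default /usr/local prefix, for which sudo may be needed) looks like:
>tar -xzf pcre-8.40.tar.gz
>cd pcre-8.40
>./configure
>make
>sudo make install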

Install OpenSSL
Find the latest version here https://www.openssl.org/source/
>wget https://www.openssl.org/source/openssl-1.1.0e.tar.gz
>./config
Then make and sudo make install the software.

>wget https://openresty.org/download/openresty-1.11.2.3.tar.gz
>./configure --prefix=/Users/carl/tool/openresty-1.11.2.3
Then make and sudo make install the software on this machine.

During the make step, there is an exception:
src/event/ngx_event_openssl.c:2048:21: error: use of undeclared identifier 'SSL_R_NO_CIPHERS_PASSED'

It seems the OpenSSL version is too new for this OpenResty release. Roll back to an older version and try again:
https://www.openssl.org/source/openssl-1.0.2k.tar.gz
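One way to pin the older OpenSSL (a sketch based on nginx's standard --with-openssl option, not from the original notes; it assumes the OpenSSL source tree is unpacked next to the OpenResty tree) is to point the OpenResty configure at the unpacked source and rebuild:
>wget https://www.openssl.org/source/openssl-1.0.2k.tar.gz
>tar -xzf openssl-1.0.2k.tar.gz
>cd openresty-1.11.2.3
>./configure --prefix=/Users/carl/tool/openresty-1.11.2.3 --with-openssl=../openssl-1.0.2k
>make && sudo make install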

Alternatively, install Homebrew on the Mac and use it to install OpenResty:
>/usr/bin/ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"
>brew update
>brew install pcre openssl
>brew install homebrew/nginx/openresty

After installation, the install directory is /usr/local/Cellar/openresty/1.11.2.3.

Install the software on CentOS
>wget https://openresty.org/download/openresty-1.11.2.3.tar.gz
>sudo yum install -y zlib-devel pcre-devel
>sudo yum install openssl-devel
>./configure --prefix=/home/ec2-user/users/carl/openresty-1.11.2.3
Then make and sudo make install the software on this machine.

Install the software on Raspberry Pi
Install PCRE
>wget https://ftp.pcre.org/pub/pcre/pcre-8.40.tar.gz
Then configure, make, and make install it, as on the Mac.
>sudo apt-get install libssl-dev

>wget https://openresty.org/download/openresty-1.11.2.3.tar.gz
>./configure --prefix=/home/carl/tool/openresty-1.11.2.3
Then make and sudo make install the software on this machine.

Run the First Code
>resty -e 'print("hello,hello")'
It works well on Mac and CentOS, but something goes wrong on the Raspberry Pi.

Exception
error while loading shared libraries: libpcre.so.1: cannot open shared object file: No such file or directory

Solution:
Check the missing library
>ldd /opt/openresty/bin/openresty
linux-vdso.so.1 (0x7eed9000)
/usr/lib/arm-linux-gnueabihf/libarmmem.so (0x76fca000)
libdl.so.2 => /lib/arm-linux-gnueabihf/libdl.so.2 (0x76fb0000)
libpthread.so.0 => /lib/arm-linux-gnueabihf/libpthread.so.0 (0x76f88000)
libcrypt.so.1 => /lib/arm-linux-gnueabihf/libcrypt.so.1 (0x76f49000)
libluajit-5.1.so.2 => /home/carl/tool/openresty-1.11.2.3/luajit/lib/libluajit-5.1.so.2 (0x76ed7000)
libm.so.6 => /lib/arm-linux-gnueabihf/libm.so.6 (0x76e5c000)
libpcre.so.1 => not found
libssl.so.1.0.0 => /usr/lib/arm-linux-gnueabihf/libssl.so.1.0.0 (0x76e02000)
libcrypto.so.1.0.0 => /usr/lib/arm-linux-gnueabihf/libcrypto.so.1.0.0 (0x76c94000)
libz.so.1 => /lib/arm-linux-gnueabihf/libz.so.1 (0x76c6d000)
libc.so.6 => /lib/arm-linux-gnueabihf/libc.so.6 (0x76b2c000)
/lib/ld-linux-armhf.so.3 (0x54b92000)
libgcc_s.so.1 => /lib/arm-linux-gnueabihf/libgcc_s.so.1 (0x76afe000)

Actually, I have that file
>find /usr/ -name "libpcre.so.1"
/usr/local/lib/libpcre.so.1

Add /usr/local/lib to LD_LIBRARY_PATH, and it works well.
export LD_LIBRARY_PATH="/lib:/usr/lib:/usr/local/lib"
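To keep this across shell sessions (assuming a bash login shell, which the original notes do not state), the export can be appended to ~/.bashrc:
>echo 'export LD_LIBRARY_PATH="/lib:/usr/lib:/usr/local/lib"' >> ~/.bashrc
>source ~/.bashrc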

Create Work Directory
https://openresty.org/en/getting-started.html
>mkdir logs/ conf/

Add the nginx configuration
>cat conf/nginx.conf
worker_processes  1;
error_log logs/error.log;
events {
    worker_connections 1024;
}
http {
    server {
        listen 8080;
        location / {
            default_type text/html;
            content_by_lua '
                ngx.say("<p>hello, world</p>")
            ';
        }
    }
}
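On OpenResty versions in this range, the same handler can also be written with the content_by_lua_block directive, which avoids quoting the Lua code as a string; a minimal sketch of the alternative location block:
        location / {
            default_type text/html;
            content_by_lua_block {
                ngx.say("<p>hello, world</p>")
            }
        }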

Add the nginx binary directory to the PATH as well
PATH="/opt/openresty/nginx/sbin:$PATH"

List all the parameters for the nginx command
>nginx -h
nginx version: openresty/1.11.2.3
Usage: nginx [-?hvVtTq] [-s signal] [-c filename] [-p prefix] [-g directives]

Options:
  -?,-h         : this help
  -v            : show version and exit
  -V            : show version and configure options then exit
  -t            : test configuration and exit
  -T            : test configuration, dump it and exit
  -q            : suppress non-error messages during configuration testing
  -s signal     : send signal to a master process: stop, quit, reopen, reload
  -p prefix     : set prefix path (default: /usr/local/Cellar/openresty/1.11.2.3/nginx/)
  -c filename   : set configuration file (default: /usr/local/etc/openresty/nginx.conf)
  -g directives : set global directives out of configuration file

Start the nginx Service
>nginx -p /Users/carl/work/openresty/ -c conf/nginx.conf
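The configuration can be checked before starting, and reloaded later without stopping the server, using the standard -t and -s flags listed above:
>nginx -p /Users/carl/work/openresty/ -c conf/nginx.conf -t
>nginx -p /Users/carl/work/openresty/ -c conf/nginx.conf -s reload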

Access the Test Page
http://localhost:8080/

Download the Perf Tool (https://acme.com/software/http_load/)
>wget https://acme.com/software/http_load/http_load-09Mar2016.tar.gz
Untar it, then make and sudo make install; the binary is installed at /usr/local/bin/http_load.
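http_load reads its targets from a plain-text file with one URL per line; the urls file used below is assumed to contain just the local test page:
>cat urls
http://localhost:8080/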

Why is the performance only OK on my laptop?
>http_load -p 20 -s 5 urls
16313 fetches, 20 max parallel, 326260 bytes, in 5.00376 seconds
20 mean bytes/connection
3260.15 fetches/sec, 65203 bytes/sec
msecs/connect: 1.48479 mean, 1333.77 max, 0.08 min
msecs/first-response: 1.35895 mean, 1333.71 max, 0.074 min
HTTP response codes:
  code 200 -- 16313

On the Mac, we can also try other benchmark tools, such as Apache ab or weighttp.
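For example, a roughly comparable Apache ab run (the request count and concurrency here are assumed, not from the original notes) would be:
>ab -n 20000 -c 20 http://localhost:8080/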

On CentOS
>nginx -p /home/ec2-user/users/carl/work/openresty/ -c conf/nginx.conf

Verify the service
>curl http://localhost:8080
<p>hello, world</p>

>http_load -p 20 -s 5 urls
152385 fetches, 20 max parallel, 3.0477e+06 bytes, in 5.00008 seconds
20 mean bytes/connection
30476.5 fetches/sec, 609530 bytes/sec
msecs/connect: 0.0353686 mean, 0.637 max, 0.013 min
msecs/first-response: 0.611693 mean, 1.402 max, 0.16 min
HTTP response codes:
  code 200 -- 152385

On Raspberry Pi
>nginx -p /home/carl/work/openresty/ -c conf/nginx.conf
>curl http://localhost:8080
<p>hello, world</p>

Wow, my tiny machine can handle that traffic
>http_load -p 10 -s 5 urls
10862 fetches, 10 max parallel, 217240 bytes, in 5 seconds
20 mean bytes/connection
2172.4 fetches/sec, 43448 bytes/sec
msecs/connect: 0.889646 mean, 4.761 max, 0.2 min
msecs/first-response: 2.25265 mean, 6.981 max, 1.42 min
HTTP response codes:
  code 200 -- 10862


References:
Golang
http://blog.nella.org/a-rate-limiting-http-proxy-in-go/
http://siberianlaika.ru/node/29/
https://golang.org/pkg/net/http/httputil/
https://godoc.org/github.com/elazarl/goproxy
https://github.com/golang/go/wiki/RateLimiting
http://stackoverflow.com/questions/20298220/rate-limiting-http-requests-via-http-handlerfunc-middleware
https://github.com/buger/gor

http://www.gorillatoolkit.org/pkg/http
https://github.com/parnurzeal/gorequest
http://stackoverflow.com/questions/24455147/go-lang-how-send-json-string-in-post-request
http://stackoverflow.com/questions/27034517/golang-net-http-request

Node.js
https://github.com/joshdevins/node-rate-limiter-proxy

Python
https://github.com/inaz2/proxy2/blob/master/proxy2.py
https://twistedmatrix.com/documents/15.3.0/api/twisted.web.proxy.html
https://github.com/fmoo/twisted-connect-proxy
http://blog.laplante.io/2013/08/a-basic-man-in-the-middle-proxy-with-twisted/
http://stackoverflow.com/questions/9465236/python-twisted-proxy-and-modifying-content
http://stackoverflow.com/questions/3118602/convert-http-proxy-to-https-proxy-in-twisted
