1. OpenResty + Lua use cases (in the project's own words)

OpenResty® is a high-performance web platform based on Nginx and Lua. It bundles a large number of well-designed Lua libraries, third-party Nginx modules, and most of their dependencies, making it easy to build dynamic web applications, web services, and dynamic gateways that handle extremely high concurrency and scale very well.

By assembling a collection of well-designed Nginx modules (mostly developed by the OpenResty team itself), OpenResty® effectively turns Nginx into a powerful general-purpose web application platform. Web developers and systems engineers can use the Lua scripting language to orchestrate the various C and Lua modules that Nginx supports, and quickly build high-performance web application systems capable of handling 10K or even 1000K+ concurrent connections on a single machine.

The goal of OpenResty® is to run your web service entirely inside the Nginx server, taking full advantage of Nginx's non-blocking I/O model to deliver consistently high-performance responses not only to HTTP clients, but also when talking to remote backends such as MySQL, PostgreSQL, Memcached, and Redis.
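For example, with the bundled lua-resty-redis library, a request handler can query Redis over Nginx's non-blocking cosocket API. A minimal sketch (it assumes a Redis instance on 127.0.0.1:6379 and a key named "greeting", both placeholders for your own environment):

```nginx
location /redis-demo {
    content_by_lua_block {
        local redis = require "resty.redis"
        local red = redis:new()
        red:set_timeout(1000)  -- 1 second, in milliseconds

        -- connect() yields the current request coroutine instead of
        -- blocking the Nginx worker process
        local ok, err = red:connect("127.0.0.1", 6379)
        if not ok then
            ngx.status = 502
            ngx.say("failed to connect to redis: ", err)
            return
        end

        local res, err = red:get("greeting")
        if not res then
            ngx.say("failed to get key: ", err)
            return
        end
        ngx.say("greeting = ", res)

        -- return the connection to the pool instead of closing it
        red:set_keepalive(10000, 100)
    }
}
```

The same cosocket model underlies the bundled drivers for MySQL (lua-resty-mysql), Memcached (lua-resty-memcached), and so on, which is what the paragraph above means by "consistent" high-performance access to backends.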

2. Installation

yum install readline-devel pcre-devel openssl-devel postgresql-devel gcc 

wget https://openresty.org/download/openresty-1.11.2.4.tar.gz
tar -xzf openresty-1.11.2.4.tar.gz
cd openresty-1.11.2.4

./configure --prefix=/usr/local/openresty1.11.2.4 \
            --with-luajit \
            --without-http_redis2_module \
            --with-http_iconv_module \
            --with-http_postgres_module \
            --with-http_drizzle_module

make && make install

(Note: --with-http_postgres_module needs postgresql-devel from the yum line above, and --with-http_drizzle_module additionally requires libdrizzle, which is not in the default CentOS repositories; drop that flag if you do not need Drizzle/MySQL access via ngx_drizzle.)
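Since the build above enables ngx_postgres, a location can talk to PostgreSQL directly from the Nginx configuration. A minimal sketch — the address, database name, credentials, and the table queried are all placeholders, not anything mandated by the module:

```nginx
http {
    upstream database {
        # hypothetical connection parameters -- adjust to your environment
        postgres_server 127.0.0.1 dbname=testdb user=testuser password=secret;
    }

    server {
        listen 8080;

        location /users {
            postgres_pass  database;
            postgres_query "SELECT id, name FROM users";
            # the bundled rds-json module renders the result set as JSON
            rds_json       on;
        }
    }
}
```

A GET to /users would then return the query result as a JSON array, with no application server in between.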

3. Official benchmark data

HelloWorld

Testing the performance of a HelloWorld server does not mean much, but it does tell us where the ceiling is.

The HelloWorld server based on OpenResty is described in the GettingStarted document.
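For reference, the server in that document is essentially the following nginx.conf (a sketch along the lines of the official GettingStarted listing; note that content_by_lua_block requires a reasonably recent release such as the 1.11.2.4 built above — the 0.8.x era used content_by_lua with a quoted string instead):

```nginx
worker_processes 1;

events {
    worker_connections 1024;
}

http {
    server {
        listen 8080;

        location / {
            default_type text/html;
            content_by_lua_block {
                ngx.say("<p>hello, world</p>")
            }
        }
    }
}
```

The 20-byte document length in the ab output below matches this "<p>hello, world</p>" body plus a trailing newline.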

Below is the result using the command ab -c10 -n50000 http://localhost:8080/ on my ThinkPad T400 laptop with ngx_openresty 0.8.54.6:

Server Software: ngx_openresty/0.8.54
Server Hostname: localhost
Server Port: 8080

Document Path: /
Document Length: 20 bytes

Concurrency Level: 10
Time taken for tests: 2.459 seconds
Complete requests: 50000
Failed requests: 0
Write errors: 0
Total transferred: 8550342 bytes
HTML transferred: 1000040 bytes
Requests per second: 20335.69 [#/sec] (mean)
Time per request: 0.492 [ms] (mean)
Time per request: 0.049 [ms] (mean, across all concurrent requests)
Transfer rate: 3396.04 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    0   0.1      0      8
Processing:     0    0   0.2      0      8
Waiting:        0    0   0.1      0      8
Total:          0    0   0.2      0      8

Percentage of the requests served within a certain time (ms)
50% 0
66% 0
75% 0
80% 0
90% 1
95% 1
98% 1
99% 1
100% 8 (longest request)

So on my laptop, for a single nginx worker, we've got 20k+ r/s. For comparison, HelloWorld servers using nginx + php-fpm 5.2.8 give 4k r/s, an Erlang R14B2 raw gen_tcp server gives 8k r/s, and node.js (http://nodejs.org/) v0.4.8 yields 5.7k r/s.