Node.js, Go, Python, OpenResty Benchmark

    xiaoxiao · 2025-12-27

    On a whim, I decided to run a quick benchmark of API servers written in several different languages.

    Preface

    I have used quite a few web frameworks by now: Python-httplib, Python-Flask, Python-Tornado, Node-http, Node-Express, Node-koa, Node-restify, and Go-http. Recently, while working on an OpenAPI gateway, I used the open-source component Kong. It is nicely designed, but it kept getting in my way, and some features still required digging into the lower layers and implementing things myself. I then found that Kong is built on OpenResty, which in turn is a "bundle" of Nginx packed with many convenient modules, and it appears to perform quite well. So I took this opportunity to benchmark a bare-bones API server written in each of these languages.

    Every server is a Hello World server that returns the string "Hello, World" as the response body. The client side uses ApacheBench, e.g. ab -k -c 10 -n 50000, keeping the total request count at 50000 and varying the concurrency level (-c) per run (see the results below). The server and client are two adjacent physical machines in the same server room. Machine specs: Intel(R) Xeon(R) CPU E5-2430 0 @ 2.20GHz, 24 cores, 100 GB RAM.
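    For reference, here is a small driver sketch (my addition, not part of the original post) that invokes ab at the three concurrency levels used below and pulls out the two figures shown in the result tables. It assumes ab is on the PATH and that the server under test is already listening on 127.0.0.1:8080.

    package main

    import (
        "fmt"
        "os/exec"
        "regexp"
    )

    // Extract the "Requests per second" and mean "Time per request" lines from ab's output.
    var (
        rpsRe = regexp.MustCompile(`Requests per second:\s+([\d.]+)`)
        tprRe = regexp.MustCompile(`Time per request:\s+([\d.]+) \[ms\] \(mean\)`)
    )

    func main() {
        for _, c := range []int{1, 10, 100} {
            // -k keep-alive, -c concurrency, -n total requests (matches ab -k -c 10 -n 50000).
            out, err := exec.Command("ab", "-k", "-c", fmt.Sprint(c), "-n", "50000",
                "http://127.0.0.1:8080/").CombinedOutput()
            if err != nil {
                fmt.Printf("c=%d: ab failed: %v\n", c, err)
                continue
            }
            rps := rpsRe.FindSubmatch(out)
            tpr := tprRe.FindSubmatch(out)
            if rps != nil && tpr != nil {
                fmt.Printf("c=%-4d rps=%s  tpr(ms)=%s\n", c, rps[1], tpr[1])
            }
        }
    }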

    This is only a quick and rough test.

    Test server implementations

    Node.js, single process

    var http = require('http');

    var server = http.createServer((req, res) => {
      res.writeHead(200, {'Content-Type': 'text/plain'});
      res.end("Hello, World");
    });

    server.listen(8080);

    Node.js Cluster(24)

    const cluster = require('cluster');
    const http = require('http');
    const numCPUs = require('os').cpus().length;

    if (cluster.isMaster) {
      for (var i = 0; i < numCPUs; i++) {
        cluster.fork();
      }
      cluster.on('exit', (worker, code, signal) => {
        console.log(`worker ${worker.process.pid} died`);
      });
    } else {
      http.createServer((req, res) => {
        res.writeHead(200);
        res.end("Hello, World");
      }).listen(8080);
    }

    Python-Tornado

    import tornado.ioloop
    import tornado.web

    class MainHandler(tornado.web.RequestHandler):
        def get(self):
            self.write("Hello, world")

    def make_app():
        return tornado.web.Application([
            (r"/", MainHandler),
        ])

    if __name__ == "__main__":
        app = make_app()
        app.listen(8080)
        tornado.ioloop.IOLoop.current().start()

    Go-Http

    package main

    import (
        "io"
        "net/http"
    )

    func main() {
        http.HandleFunc("/", sayhello)
        http.ListenAndServe(":8080", nil)
    }

    func sayhello(w http.ResponseWriter, r *http.Request) {
        io.WriteString(w, "hello world")
    }

    OpenResty(Nginx+lua)

    worker_processes 1;   # presumably raised to 24 for the "Nginx-lua 24x" runs below
    error_log logs/error.log;

    events {
        worker_connections 1024;
    }

    http {
        server {
            listen 8080;
            location / {
                default_type text/html;
                content_by_lua '
                    ngx.say("<p>hello, world</p>")
                ';
            }
        }
    }
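    Before running ab, each implementation can be sanity-checked with a single GET. A minimal sketch (again my addition, not from the original post), assuming the server under test is listening on 127.0.0.1:8080:

    package main

    import (
        "fmt"
        "io"
        "net/http"
    )

    func main() {
        // Fetch one response from whichever Hello World server is currently running.
        resp, err := http.Get("http://127.0.0.1:8080/")
        if err != nil {
            panic(err)
        }
        defer resp.Body.Close()

        body, err := io.ReadAll(resp.Body)
        if err != nil {
            panic(err)
        }
        fmt.Printf("status=%d body=%q\n", resp.StatusCode, body)
    }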

    Results

    The minimal Hello World server for each language/framework was benchmarked at several concurrency levels. In the tables below, rps is the requests per second reported by ab, and tpr is ab's mean time per request in milliseconds. Results:

    c = 1

    lang             rps        tpr (ms)
    node 1x          2451.25    0.408
    node 24x         1119.81    0.893
    Py-Tornado       1301.68    0.768
    Go-Http          7108.64    0.141
    Nginx-lua 1x     7385.98    0.135
    Nginx-lua 24x    7368.34    0.136

    c = 10

    lang             rps        tpr (ms)
    node 1x          3944.75    2.535
    node 24x         5645.11    1.771
    Py-Tornado       1318.85    7.582
    Go-Http          70085.24   0.143
    Nginx-lua 1x     24753.79   0.404
    Nginx-lua 24x    24824.98   0.403

    c = 100

    lang             rps        tpr (ms)
    node 1x          4042.27    24.739
    node 24x         5816.23    17.193
    Py-Tornado       1283.43    78.261
    Go-Http          77451.38   1.373
    Nginx-lua 1x     25001.29   4.080
    Nginx-lua 24x    70333.04   1.619

    Conclusion:

    OpenResty (Nginx + Lua) and Go are in the top performance tier. Node sits in the second tier, and Python brings up the rear... Go is an absolute beast, and OpenResty is not bad at all.