Channel: Nginx Forum - How to...

Re: How to handle concurrent requests with fastcgi?

nginx itself processes requests concurrently (it is event driven), and embedded Lua does the same (hence its performance). But as soon as you proxy-pass anything, the backend can block if it cannot process multiple requests over the same connection, and there is nothing nginx can do about that (other than moving as much backend code/processing as possible into Lua).
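As a minimal sketch of that hand-off point: the moment a request hits a `fastcgi_pass` (or `proxy_pass`), the backend's concurrency model takes over. The address and PHP-FPM reference here are assumptions for illustration.

```nginx
# fastcgi_pass hands each request off to the backend; if that backend serves
# one request per connection, it caps concurrency no matter how many
# connections nginx itself can juggle.
location ~ \.php$ {
    include       fastcgi_params;
    fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
    fastcgi_pass  127.0.0.1:9000;   # e.g. a PHP-FPM pool (assumed address);
                                    # its worker count, not nginx, is the limit
}
```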

For example, when you have a BI application doing a lot of queries, you can use Lua to cache the queries/results and serve them from a Lua cache instead of passing each request to your BI app/database, which would otherwise limit the number of concurrent users on nginx's side.
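One way to sketch that caching pattern, assuming OpenResty (which provides `lua_shared_dict` and `content_by_lua_block`); the cache name, TTL, and backend location/address are made up for illustration:

```nginx
http {
    lua_shared_dict bi_cache 64m;        # shared-memory cache, visible to all workers

    server {
        location /report {
            content_by_lua_block {
                local cache = ngx.shared.bi_cache
                local key   = ngx.var.request_uri
                local body  = cache:get(key)
                if not body then
                    -- cache miss: hit the BI backend once, then reuse the answer
                    local res = ngx.location.capture("/bi-backend",
                                                     { args = ngx.var.args })
                    if res.status ~= 200 then
                        return ngx.exit(res.status)
                    end
                    body = res.body
                    cache:set(key, body, 60)  -- keep the result for 60 seconds
                end
                ngx.say(body)
            }
        }

        location /bi-backend {
            internal;                        # only reachable via subrequest
            proxy_pass http://127.0.0.1:8080;  # assumed BI app address
        }
    }
}
```

While an entry is fresh, repeated identical queries never leave nginx, so the BI app only sees one request per cache key per minute.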

Inside nginx everything is fast and can easily handle 100k requests; as soon as it needs to talk to the outside, the outside determines how many requests you can handle. Hence the use of load balancing (multiple backends, or spreading requests between daemons), and hence the use of embedded processing/caching (Lua) whenever you can.
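The load-balancing half of that advice can be sketched with an `upstream` block; the upstream name and backend addresses below are assumptions:

```nginx
# Spread requests across several backend daemons so no single blocking
# backend caps overall concurrency.
upstream bi_backends {
    least_conn;                 # pick the least-busy daemon for each request
    server 10.0.0.11:8080;
    server 10.0.0.12:8080;
    server 127.0.0.1:8081;      # e.g. an extra daemon on the same host
    keepalive 32;               # reuse upstream connections instead of reopening
}

server {
    location / {
        proxy_pass http://bi_backends;
        proxy_http_version 1.1;
        proxy_set_header Connection "";  # required for upstream keepalive
    }
}
```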
