Channel: Nginx Forum - How to...

Re: How to handle concurrent requests with fastcgi?

I'm a bit confused again.

"but as soon as you have to proxy-pass anything the backend can be blocking if it can't process multiple requests over the same connection"

My backend (my FastCGI application) doesn't really block. It accepts connections on the listener socket and defers them to other threads for processing (which they do using a fiber pool), while the listener thread keeps accepting new connections. Why do I need load balancing to get concurrent FastCGI queries in this case, or is this a limitation of nginx's FastCGI module? As I said, I'm fairly sure I'm doing the same thing that Lua does (coroutine sockets), just on the backend side.

I presume that's what embedded Lua is doing here: taking the queries, putting them into a coroutine-socket pool, kicking them off, and then working on the next one while the earlier ones complete. Does this only work if it's done on the nginx side? That is actually how my backend is architected.
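To make the accept-and-defer setup I'm describing concrete, here is a minimal sketch (my assumptions: plain TCP stands in for the FastCGI protocol, and the fiber pool is reduced to one thread per connection; the echo handler is purely illustrative):

```python
import socket
import threading

def handle(conn):
    # Worker: process one request; the listener is not blocked by this work.
    data = conn.recv(1024)
    conn.sendall(b"echo:" + data)
    conn.close()

def serve(listener, n_conns):
    # Listener thread: accept connections and defer each one to a worker
    # thread, then immediately go back to accepting. This is the
    # "listener keeps listening" behavior described above.
    for _ in range(n_conns):
        conn, _addr = listener.accept()
        threading.Thread(target=handle, args=(conn,)).start()

listener = socket.socket()
listener.bind(("127.0.0.1", 0))  # ephemeral port for the example
listener.listen()
port = listener.getsockname()[1]
threading.Thread(target=serve, args=(listener, 2), daemon=True).start()

# Two clients in sequence; each is served by its own worker thread,
# so neither request blocks the accept loop.
results = []
for _ in range(2):
    c = socket.create_connection(("127.0.0.1", port))
    c.sendall(b"hi")
    results.append(c.recv(1024))
    c.close()
```

The point of the sketch is that concurrency lives entirely on the backend side of the socket: nginx (or any client) just opens connections, and the backend decides how to multiplex the work.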

