Re: proxy_pass for large number of connections
In that case, whatever is running on http://127.0.0.1:9090 needs to be able to handle many (proxy_passed) connections on its own, unless you have multiple instances which you then load balance using the same...
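A minimal sketch of the multiple-instance setup this reply describes, load balancing across several copies of the stream. Only port 9090 comes from the thread; the second instance on 9091 and the upstream name are assumptions:

```nginx
# Hypothetical upstream group: 9090 is the stream from the thread,
# 9091 is an assumed second instance of the same stream.
upstream iptv_backend {
    server 127.0.0.1:9090;
    server 127.0.0.1:9091;
}

server {
    listen 8080;
    location / {
        proxy_pass http://iptv_backend;  # nginx round-robins across the group
        proxy_buffering off;             # common choice for live streams
    }
}
```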
Re: proxy_pass for large number of connections
http://127.0.0.1:9090 is a local IPTV stream running on VLC, so it can serve a lot of connections. I need to configure nginx to proxy_pass the local IPTV stream to a large number of users!
Re: proxy_pass for large number of connections
If the backend on 9090 can handle connections non-blocking, there is no real limit, so what is currently the problem?
Re: proxy_pass for large number of connections
Port 9090 is working without problems, but the problem is with port 8080 (nginx): sometimes upload throughput drops (see the attached picture). Maybe it is a network adapter problem as well, but I want to know whether nginx can make...
Re: proxy_pass for large number of connections
Attachments don't work too well here; please use some pastebin service. Do you have logfile entries that show what is going on? The logs will often tell why a client link is dropped.
Re: proxy_pass for large number of connections
There is no attachment, it is only a live stream; also, there is nothing in the log!
Re: proxy_pass for large number of connections
Enable debug-level error logging; if something is getting dropped, it will get logged. In your first post you attached an image; this forum doesn't handle that well, so if you want anyone to see that...
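Debug-level error logging as suggested here can be enabled with the `error_log` directive; note it only takes effect if the nginx binary was compiled with the `--with-debug` option (check the output of `nginx -V`). The log path is an assumption:

```nginx
# Requires an nginx binary built with --with-debug
error_log /var/log/nginx/error.log debug;
```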
Re: proxy_pass for large number of connections
Here is a link to the picture: http://s4.postimg.org/g9d62snt9/nginx_upload.jpg
Re: proxy_pass for large number of connections
Ok, but that can be anything: a client's poor/low upload, saturation somewhere along the way, your storage not being fast enough, etc. If there is no error logged anywhere, it's not the backend or nginx but...
Re: proxy_pass for large number of connections
Thank you very much, I just wanted to make sure that nginx is not the problem.
Session and proxy_pass
Is it possible to store sessions in a table to monitor active authenticated users? When a user connects over proxy_pass, I need to store their username in a table and display it over some...
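nginx has no built-in session table, but one possible sketch (an assumption, not from the thread): log the authenticated username per request with a custom `log_format`, then have an external job load that log into your table. `$remote_user` is populated by basic auth; the log format name and file paths are hypothetical:

```nginx
# Assumed custom format; $remote_user is set when auth_basic succeeds
log_format authlog '$time_iso8601 $remote_user $remote_addr';

server {
    listen 8080;
    location / {
        auth_basic           "restricted";
        auth_basic_user_file /etc/nginx/htpasswd;  # assumed path
        access_log           /var/log/nginx/auth.log authlog;
        proxy_pass           http://127.0.0.1:9090;
    }
}
```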
nginx upstream module - delay in processing the 2nd request
I went through Evan Miller's nginx tutorial and wrote a simple nginx upstream module that does a GET from the backend Redis server. The first request works perfectly fine, but for the second request...
Re: try_files not working as expected
Not so similar. We want to perform a kind of rewrite with try_files. If the user hits "/some/location/", try_files should search for /api.php?r="/some/location" (api.php exists). I think the problem is at...
Re: try_files not working as expected
Because the last try_files value is taken into consideration, which seems to always evaluate as true. Without it, the api is always true.
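A sketch of the try_files pattern under discussion; as the reply notes, the last parameter of `try_files` is the unconditional fallback, so /api.php receives every URI that doesn't match a real file. `$uri` is nginx's built-in variable; the rest is an assumption based on the thread:

```nginx
location / {
    # Serve a real file or directory if one exists;
    # otherwise fall back to api.php with the original URI as a query arg.
    try_files $uri $uri/ /api.php?r=$uri;
}
```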
No client certificate CA names sent
I have nginx and want it to verify client certificates. So I bought a commercial certificate for the server, and a non-commercial one for clients. Basically, I've generated the client certificates with easy-rsa...
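The "No client certificate CA names sent" message (typically from `openssl s_client`) usually means the server isn't advertising any acceptable CAs. In nginx, the CA names sent in the certificate request come from the file given to `ssl_client_certificate`. A hedged sketch; the file paths are assumptions:

```nginx
server {
    listen 443 ssl;
    ssl_certificate         /etc/nginx/ssl/server.crt;      # assumed path
    ssl_certificate_key     /etc/nginx/ssl/server.key;      # assumed path
    # CA that signed the client certs (e.g. the easy-rsa CA);
    # nginx sends this CA's name to clients during the TLS handshake.
    ssl_client_certificate  /etc/nginx/ssl/easyrsa-ca.crt;  # assumed path
    ssl_verify_client       on;
}
```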
IP.Board bug
This is an IP.Board bug. This forum engine can send a "40" HTTP status code instead of the correct "403". My solution to the problem: http://ramzes.ws/ipboard/upstream-sent-invalid-status-40
Single Concurrency
I have a legacy application / database server that has very limited connectivity options. It does have CGI built in, but CGI obviously has overhead. I can't use FastCGI as there are reliability...
Re: Single Concurrency
Any backend load-balancing solution depends on having multiple backends; if each instance of your legacy application can only handle 1 connection and you add another 9, you can still handle only 10 concurrent users. nginx...
Re: Single Concurrency
I agree in the sense that traditional load balancing expects multiple backends. What I am trying to ascertain is whether I can utilise load balancing in nginx as a way to load balance requests to a single...
Re: Single Concurrency
sacon Wrote:
> So say we had an 8-core CPU. If I fired up 10 to 12 processes on the
> backend, each with its own port, could I use Nginx to...
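The scheme in the quote can be sketched as an upstream group with one entry per backend process, each capped at a single connection via `max_conns` (available in open-source nginx since 1.11.5; earlier versions needed nginx Plus). The port numbers are hypothetical:

```nginx
upstream legacy_pool {
    # One entry per backend process, each limited to one connection.
    server 127.0.0.1:9001 max_conns=1;
    server 127.0.0.1:9002 max_conns=1;
    server 127.0.0.1:9003 max_conns=1;
    # ... one line per process, up to the 10-12 mentioned in the thread
}

server {
    listen 80;
    location / {
        proxy_pass http://legacy_pool;
        # If one process is busy or fails, try the next one.
        proxy_next_upstream error timeout;
    }
}
```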