1. Commonly used nginx commands
Start nginx: ./sbin/nginx
Stop nginx: ./sbin/nginx -s stop (fast stop) or ./sbin/nginx -s quit (graceful stop)
Reload the configuration: ./sbin/nginx -s reload (graceful restart) or service nginx reload
Start with a specified configuration file: ./sbin/nginx -c /usr/local/nginx/conf/nginx.conf
Show the nginx version: ./sbin/nginx -v
Check that the configuration file is correct: ./sbin/nginx -t
Show help information: ./sbin/nginx -h
2. nginx status codes
499: the client closed the connection on its own while the server was still processing the request, usually because server-side processing took too long.
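When 499s come from slow backends, one mitigation is to tune the proxy timeouts and, if appropriate, let the request finish even after the client gives up. A hedged nginx config sketch (the location path, upstream name, and values are illustrative assumptions, not from the original text):

```nginx
location /api/ {
    proxy_pass http://backend;      # "backend" is an assumed upstream group
    proxy_read_timeout 60s;         # allow the upstream more time to respond
    # Finish processing even if the client disconnects early
    # (otherwise nginx logs 499 and aborts the upstream request):
    proxy_ignore_client_abort on;
}
```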
3. How nginx achieves high concurrency
nginx runs one master process and multiple worker processes, and each worker process can handle many requests.
When a request comes in, a worker process picks it up — but it does not stay with that request from start to finish. Wherever the request would block (for example, after the server forwards the request to an upstream backend and is waiting for the response), the worker moves on and handles other requests. Once the upstream server returns, an event fires, the worker picks the request back up, and processing continues from there.
Because of the nature of web serving, most of each request's lifetime is spent in network transmission; relatively little time is actually spent on the server machine itself. This is the secret to handling high concurrency with only a few processes. As @skoo put it, a web server is a network-I/O-intensive application, not a compute-intensive one.
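The worker-process idea above can be sketched with a tiny event loop. This is NOT nginx's actual code, just a minimal illustration in Python: one process watches several connections and only touches the ones that are ready, instead of blocking on a single request from start to finish.

```python
import selectors
import socket

def run_event_loop(server_sockets, total_requests):
    """One process multiplexes many connections via readiness events."""
    sel = selectors.DefaultSelector()
    for sock in server_sockets:
        sock.setblocking(False)
        sel.register(sock, selectors.EVENT_READ)
    handled = 0
    while handled < total_requests:
        # select() returns only the connections with data ready to read,
        # so the worker never sits blocked on one idle connection.
        for key, _ in sel.select(timeout=1):
            data = key.fileobj.recv(1024)
            key.fileobj.sendall(b"echo:" + data)
            handled += 1
    sel.close()
    return handled

# Three "clients" send requests over socket pairs; one loop serves them all.
pairs = [socket.socketpair() for _ in range(3)]
for i, (client, _server) in enumerate(pairs):
    client.sendall(b"req%d" % i)

handled = run_event_loop([server for _client, server in pairs], 3)
responses = [client.recv(1024) for client, _server in pairs]
print(handled, responses)
```

nginx does the same thing with epoll/kqueue under the hood; the point is that waiting on the network costs the worker nothing, so a handful of processes can serve a very large number of connections.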
4. nginx functions
As an HTTP server (in place of apache; serving PHP requires a FastCGI processor such as php-fpm)
Reverse proxy
Load balancing
Virtual hosting
FastCGI: nginx does not run PHP or other languages itself, but it can hand requests off via FastCGI to a language runtime or framework for processing
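The roles above can be illustrated in one hypothetical server block (all names, addresses, and paths here are illustrative assumptions): an upstream group for reverse proxying and load balancing, plus a location that hands PHP off to a FastCGI backend.

```nginx
upstream backend {
    server 10.0.0.1:8080;            # load balancing across two backends
    server 10.0.0.2:8080;
}

server {
    listen 80;
    server_name example.com;         # virtual hosting: one site per server block

    location / {
        proxy_pass http://backend;   # reverse proxy to the upstream group
    }

    location ~ \.php$ {
        fastcgi_pass 127.0.0.1:9000; # hand PHP off to php-fpm via FastCGI
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
    }
}
```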
5. Possible causes of 502 errors
(1) The FastCGI process has not been started
(2) There are not enough FastCGI worker processes
(3) FastCGI execution time is too long; raise the timeouts:
fastcgi_connect_timeout 300;
fastcgi_send_timeout 300;
fastcgi_read_timeout 300;
(4) The FastCGI buffer is too small
nginx, like apache, buffers the response from the backend; the buffer parameters can be tuned:
fastcgi_buffer_size 32k;
fastcgi_buffers 8 32k;
(5) The proxy buffer is too small
If you are proxying, adjust:
proxy_buffer_size 16k;
proxy_buffers 4 16k;
(6) PHP script execution time is too long
In php-fpm.conf, change the 0s in <value name="request_terminate_timeout">0s</value> to a time long enough for your scripts
6. nginx configuration
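A minimal nginx.conf sketch showing the typical structure — worker settings, an events block, and an http block containing a server block (every value here is an illustrative assumption):

```nginx
worker_processes  4;                 # commonly one per CPU core

events {
    worker_connections  1024;        # max simultaneous connections per worker
}

http {
    include       mime.types;
    default_type  application/octet-stream;

    server {
        listen       80;
        server_name  localhost;

        location / {
            root   html;
            index  index.html index.htm;
        }
    }
}
```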
7. Differences between nginx and apache
Lightweight: nginx is also a web server, but it uses less memory and fewer resources than apache
Good under concurrency: nginx handles requests asynchronously and non-blocking, while apache is blocking; under high concurrency nginx keeps resource consumption low and performance high
Highly modular design; writing a module is relatively simple
The core difference: apache is a synchronous multi-process model in which one connection corresponds to one process, while nginx is asynchronous, so many connections (even on the order of millions) can correspond to one process
8. Differences between FastCGI and CGI
CGI:
Based on the content of the request, the web server forks a new process to run an external program (a C program, a Perl script, and so on). That process hands the processed data back to the web server, the web server sends the final content to the user, and the freshly forked process then exits. If the next user request is also for a dynamic script, the web server forks a new process again, over and over.
FastCGI:
When the web server receives a request, it does not fork a new process: the process was started when the web server came up, and it does not exit. The web server delivers the request content to this process directly via inter-process communication (FastCGI can also use other transports, such as TCP). The process handles the request, returns the result to the web server, and then waits for the next request itself instead of exiting.
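The contrast above can be shown with a toy simulation (an illustrative sketch, not real CGI/FastCGI code) that counts how many worker processes get "started" to serve three requests under each model.

```python
class Stats:
    workers_started = 0

def spawn_worker():
    # Stands in for fork()+exec of a CGI program, or starting a FastCGI daemon.
    Stats.workers_started += 1
    return lambda request: "handled:" + request

def cgi_style(requests):
    # CGI: fork a brand-new process for every request; it exits afterwards.
    return [spawn_worker()(req) for req in requests]

def fastcgi_style(requests):
    # FastCGI: start the worker once; it handles request after request.
    worker = spawn_worker()
    return [worker(req) for req in requests]

Stats.workers_started = 0
cgi_results = cgi_style(["a", "b", "c"])
cgi_starts = Stats.workers_started      # one process start per request

Stats.workers_started = 0
fastcgi_results = fastcgi_style(["a", "b", "c"])
fastcgi_starts = Stats.workers_started  # a single long-lived process

print(cgi_starts, fastcgi_starts)
```

Both models produce the same responses; the difference is that CGI pays the process start-up cost on every request, while FastCGI pays it once.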
These are a few common nginx interview questions.