[Reprint] Tomcat + Nginx + Redis for load balancing and session sharing (1)

http://www.cnblogs.com/zhrxidian/p/5432886.html

 

Anyone who runs a project eventually hits the same problem: when the project needs to be updated, we may have to shut down the server temporarily. That can lead to situations like these:

 

1. Users who are still in the middle of an operation get cut off abruptly (we can watch the logs and deploy when nobody is active, but there will always be exceptions).

2. Users who don't know what is going on may suspect the site has been attacked, which lowers their trust in it and drives away potential customers; this is especially damaging for financial Internet companies.

 

After doing some research, I decided to use Tomcat + Nginx + Redis to achieve load balancing and session sharing. What follows is a record of my hands-on process. If there are any mistakes or omissions, you are welcome to point them out; if you don't like it, please don't flame.

 

1. A brief introduction to Nginx and getting it running

Nginx is a lightweight, high-performance HTTP and reverse proxy server. A reverse proxy means that when a user sends a request, the proxy server receives it, forwards it to the real (backend) server, and then returns the backend server's response to the client; to the client, the proxy itself looks like the server. This may seem like an extra step and a bit of trouble, but it actually brings many benefits, which I will show in the demo below.

First, go to the official Nginx website and download it. I'm working on my own machine, so naturally I downloaded the Windows version. Once the download is complete you can simply unpack it onto any drive; no installation is required. Next, open cmd, change into the nginx directory, and run start nginx.

 

 

A window flashes by, which means nginx has started; we can find its process in Task Manager.

Now enter localhost in the browser. A page appears; it is a bit plain, but it is indeed the nginx welcome page, much like the welcome page you see at localhost:8080 right after Tomcat starts.

 

2. Using Nginx to implement a reverse proxy

Now we build a Maven project based on SpringMVC + Spring + MyBatis; I won't repeat the setup process here. The function is very simple: it just renders a page. You can of course use other frameworks instead.

Run the demo. My Tomcat port is 8080; enter localhost:8080 in the browser and our page appears.

 

At this point we are still accessing the Tomcat server directly. Now I want to reach Tomcat through nginx, that is, to see our demo page by entering just localhost.

This requires modifying nginx's core configuration file, nginx.conf, which lives in the conf folder of its directory. First we need to understand what some of the directives in this file do.

  • worker_processes: the number of worker processes; more than one can be configured

  • worker_connections: the maximum number of connections per worker process

  • server: each server block defines a virtual server handled by the proxy

  • listen: the port to listen on, 80 by default

  • server_name: the domain name(s) of this server; there can be several, separated by spaces (we are local, so it is localhost)

  • location: matches the request path; when configured as /, all requests are matched here

  • index: the file(s) served by default when no specific page is requested; there can be several, separated by spaces

  • proxy_pass: forwards matched requests to the given address or to a defined server cluster

  • upstream name{ }: defines a server cluster with the given name

Now that we know what these directives do, we can see that the part of the file we need to modify is the server block. Its original code (with the comments removed) also shows why typing localhost brings up the nginx welcome page: the root location serves index.html.
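Roughly, the stock server block in nginx.conf looks like this (comments stripped):

server {
    listen       80;                      # listen on port 80
    server_name  localhost;               # domain served by this virtual server

    location / {                          # matches every request path
        root   html;                      # serve files from the html directory
        index  index.html index.htm;      # default page, i.e. the nginx welcome page
    }

    error_page   500 502 503 504  /50x.html;
    location = /50x.html {
        root   html;
    }
}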

Let's make a small change to this code: instead of serving static files, direct the requests to the Tomcat server running our demo.
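A minimal sketch of that change, assuming Tomcat runs locally on port 8080:

server {
    listen       80;
    server_name  localhost;

    location / {
        # forward every request to Tomcat instead of serving static files
        proxy_pass   http://localhost:8080;
    }
}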

 

 

Then run nginx -s reload in cmd to make nginx reload its configuration.

After the reload, enter localhost again and you can see that the page now shown is our demo.

 

At this point the reverse proxy is complete: every request has to go through the proxy server before it reaches the real server, which protects the site to some extent.

 

3. Use Nginx to achieve load balancing

Load balancing means that the proxy server distributes the requests it receives evenly across the backend servers.

The advantages of load balancing may not be obvious when traffic and concurrency are low. But leaving aside the extreme traffic and concurrency of Taobao's Double 11 or the railway ticket rush, even an ordinary website's flash-sale event can put enormous pressure on a single server and may bring it down. Load balancing can noticeably reduce, or even eliminate, such situations. Let's look at how to implement it.

First, start another Tomcat server; let's call it tomcat2 (the original one is tomcat1). Copy the project from tomcat1 to tomcat2 and slightly change the text on the page so we can tell which Tomcat a request was routed to. tomcat2's port is 8081 here. Enter localhost:8081 in the browser to check it.

 

 

With both servers ready, we need to define a server cluster outside the server block, using the upstream directive mentioned above. The cluster is named test here.

At the same time, we modify the server block again so that requests are forwarded to the server cluster, as sketched below.
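A minimal sketch of the two pieces together, assuming the two Tomcats run locally on ports 8080 and 8081:

# the server cluster: nginx distributes requests across these backends (round-robin by default)
upstream test {
    server localhost:8080;   # tomcat1
    server localhost:8081;   # tomcat2
}

server {
    listen       80;
    server_name  localhost;

    location / {
        # forward requests to the cluster defined above instead of a single Tomcat
        proxy_pass   http://test;
    }
}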

 

Reload nginx, enter localhost in the browser, refresh a few times, and you can see the two pages alternating.

 

This is load balancing. Suppose our site is running and one of the Tomcats goes down; the other one can still be reached. When updating, you can also shut down just one of them at a time and update them in turn. On top of that, it effectively relieves the pressure on each server. Isn't that great?

Of course, the nginx configuration above is very simple; we could also configure nginx to serve and cache static resources, among other things, which I won't go into here.
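Just for reference, a static-resource rule could look something like the sketch below (assuming the assets are copied under nginx's html directory; this is not part of the demo):

location ~* \.(css|js|png|jpg|gif)$ {
    root     html;       # serve these files straight from nginx
    expires  7d;         # let browsers cache them for 7 days
}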

 

4. Summary

It took quite a while to write all of this up bit by bit, so let me summarize.

As a reverse proxy server, nginx can cache our project's static files and provide reverse proxying and load balancing, which effectively reduces server pressure; it is worth using even if the project is not large.

In addition, you have probably noticed a problem. Requests can now be distributed across the two Tomcats, which is fine as long as no login or other authentication is required, but consider the following situation:

We log in on tomcat1, so the user's session exists only on tomcat1. Then the request to open the personal-center page is routed to tomcat2, and there is a problem: tomcat2 tells us we are not logged in, which is obviously not what we want.

This is where session sharing comes in: how do we share sessions across two servers? I'll leave that for next time; as a busy programmer it may take me a few days. I have uploaded this demo's source code and will reuse it next time. The nginx configuration is not included, so please experiment with it yourselves.

Attached download address: http://download.csdn.net/detail/zhrxidian/9517266
