How to Handle Heavy Website Traffic

Today's high-traffic sites handle tens of millions, or even hundreds of millions, of page views every day. How do they cope with that load? Here is a summary of some common approaches:

First, confirm whether the server hardware is adequate for the current traffic.

An ordinary P4-class server can generally support up to about 100,000 unique IPs per day. If your traffic is larger than that, you must first move to a higher-performance dedicated server; otherwise, no amount of software optimization will fully solve the problem.

Second, optimize database access.

An important reason for excessive server load is excessive CPU load, so reducing the CPU load on the server can effectively break the bottleneck. Serving static pages keeps CPU load to a minimum. Making the front end completely static is ideal, since it avoids database access entirely, but for frequently updated sites full static generation often cannot support certain features.
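As a minimal sketch of the static-generation idea (the template and article data below are made-up placeholders, not from any particular blog engine): render the page once, write the HTML to disk, and let the web server serve the file without ever touching the database.

```python
import os

def render_article(title: str, body: str) -> str:
    """Fill a trivial HTML template with article data (hypothetical template)."""
    return f"<html><head><title>{title}</title></head><body>{body}</body></html>"

def publish(slug: str, title: str, body: str, out_dir: str = "public") -> str:
    """Render a page once and write it to disk; the web server then serves
    this static file directly, with no database access per request."""
    os.makedirs(out_dir, exist_ok=True)
    path = os.path.join(out_dir, f"{slug}.html")
    with open(path, "w", encoding="utf-8") as f:
        f.write(render_article(title, body))
    return path

page_path = publish("hello-world", "Hello", "<p>First post</p>")
```

Regenerating the file only when the article changes is what trades CPU per request for a little disk space.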

Caching is another solution: store dynamic data in cache files, and have dynamic pages read those files directly instead of querying the database. WordPress and Z-Blog both make extensive use of this kind of cache. I once wrote a hit-counter plug-in for Z-Blog based on the same principle.
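A minimal sketch of this file-cache pattern (the loader callback and the 5-minute timeout are assumptions for illustration, not taken from WordPress or Z-Blog):

```python
import json
import os
import time

CACHE_DIR = "cache"
CACHE_TTL = 300  # seconds; assumed refresh interval, tune per site

def cached(key: str, load_from_db):
    """Return data from a cache file if it is fresh; otherwise call
    load_from_db() once and write its result to the cache file."""
    os.makedirs(CACHE_DIR, exist_ok=True)
    path = os.path.join(CACHE_DIR, f"{key}.json")
    if os.path.exists(path) and time.time() - os.path.getmtime(path) < CACHE_TTL:
        with open(path, encoding="utf-8") as f:
            return json.load(f)          # cache hit: no database access
    data = load_from_db()                # cache miss: one database query
    with open(path, "w", encoding="utf-8") as f:
        json.dump(data, f)
    return data

# Hypothetical usage: the page calls cached() instead of querying directly.
recent_posts = cached("recent_posts", lambda: [{"id": 1, "title": "Hello"}])
```

Within the timeout window, every page view is served from the file and the database is never touched.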

If database access really cannot be avoided, try to optimize the SQL queries. Avoid SELECT * statements: have each query return only the columns it needs, and avoid issuing large numbers of SQL queries within a short time.
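For example (using an in-memory SQLite table with made-up columns), selecting only the columns the page actually renders instead of SELECT *:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE posts (id INTEGER, title TEXT, body TEXT)")
conn.execute("INSERT INTO posts VALUES (1, 'Hello', 'a very long body ...')")

# Wasteful: fetches every column, including the large body text,
# even when the page only shows a list of titles.
# rows = conn.execute("SELECT * FROM posts").fetchall()

# Better: return only the needed columns, filtered by a bound parameter.
rows = conn.execute("SELECT id, title FROM posts WHERE id = ?", (1,)).fetchall()
```

The same principle applies to batching: one query that fetches a list of titles beats issuing one query per title in a loop.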

Third, forbid external hotlinking.

Hotlinking of a site's images or files by external sites tends to add a lot of load, so external hotlinking of images and files should be strictly limited. Fortunately, hotlinking can currently be controlled simply by checking the HTTP referer: Apache can be configured to forbid hotlinking directly, and for IIS there are third-party ISAPI filters that achieve the same thing. Of course, hotlinking is still possible with a forged referer, but deliberate referer forgery is rare enough that you can ignore it, or deal with it by non-technical means, such as adding a watermark to your images.
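In Apache, such a referer check can be expressed with mod_rewrite; the following is only a sketch of the idea, and `example.com` is a placeholder for your actual domain:

```apache
# Deny image requests whose referer is neither empty nor our own site.
RewriteEngine On
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
RewriteRule \.(gif|jpe?g|png)$ - [F,NC]
```

Allowing an empty referer keeps direct visits and privacy-conscious browsers working; only cross-site embeds get the 403.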

Fourth, control downloads of large files.

Large file downloads consume a lot of bandwidth, and on non-SCSI disks they also consume a lot of CPU, reducing the site's responsiveness. Therefore, try not to offer downloads of files larger than 2 MB; if you must, put the large files on a separate server. Many Web 2.0 sites offer free photo sharing and file sharing, so you can also try uploading images and files to those sites.

Fifth, use different hosts to divert the main traffic.

Put files on different hosts and give users different mirrors to download from. For example, if your RSS feed accounts for a large share of traffic, use a service such as FeedBurner or FeedSky to serve the feed from a different host; most of the traffic pressure from feed readers then falls on FeedBurner's host, and the RSS feed no longer ties up too many of your own resources.

Sixth, analyze traffic with statistics software.

Install traffic statistics and analysis software on the site, so you can see at once which pages consume the most traffic and which need optimization; solving traffic problems still depends on accurate statistical analysis. The traffic analysis software I recommend is Google Analytics. I find it very good, and I will later explain some tips and tricks for using Google Analytics in detail.

Reproduced from: https://www.cnblogs.com/in-loading/archive/2012/02/07/2341551.html


Origin: blog.csdn.net/weixin_34191845/article/details/93700328