Web front-end performance & SEO optimization

References: https://www.2cto.com/kf/201604/498725.html and https://www.cnblogs.com/EnSnail/p/5671345.html

Browser access optimization

The browser request processing flow is illustrated below:
[Figure 1: browser request processing flow]

1. Reduce HTTP requests and set HTTP caching sensibly

HTTP is a stateless application-layer protocol, which means that each HTTP request must establish a communication link and transfer data, and on the server side each request typically occupies an independent thread. These communication and service overheads are expensive, so reducing the number of HTTP requests can effectively improve access performance.

The main ways to reduce HTTP requests are to merge CSS files, merge JavaScript files, and merge images. Combine the JavaScript and CSS needed for one page visit into a single file each, so that the browser only needs one request per type. Images can also be combined: several images are merged into one, and if each image has a different hyperlink, CSS offsets can map mouse clicks on different regions to different URLs.
Caching is powerful, and proper cache settings can greatly reduce HTTP requests. Take a site's home page as an example: with no browser cache, it might send 78 requests totalling over 600 KB of data; on the second visit, after the browser has cached resources, there are only 10 requests totalling just over 20 KB. (Note that the effect differs if the page is refreshed with F5: in that case the number of requests is unchanged, but requests for cached resources get a 304 response, with headers only and no body, which still saves bandwidth.)

  What is a reasonable setting? The principle is simple: the more you can cache, and the longer you can cache it, the better. For example, image resources that rarely change can be given a very long expiration time via Expires in the HTTP header; resources that change infrequently but might change can use Last-Modified for conditional revalidation. Keep resources in the cache as long as possible. The specific settings and principles of HTTP caching will not be covered in detail here.
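
As a rough illustration of these headers (a sketch only, not taken from the article; the one-year lifetime, port, and file name are assumptions), a Node.js handler for a rarely changing image might set them like this:

<code class="hljs javascript">// Minimal sketch: serving an image with long-lived caching headers in Node.js.
var http = require('http');
var fs = require('fs');

http.createServer(function (req, res) {
    var body = fs.readFileSync('./logo.png');            // hypothetical image file
    res.setHeader('Content-Type', 'image/png');
    res.setHeader('Cache-Control', 'public, max-age=31536000');          // ~1 year
    res.setHeader('Expires', new Date(Date.now() + 31536000000).toUTCString());
    res.setHeader('Last-Modified', fs.statSync('./logo.png').mtime.toUTCString());
    res.end(body);
}).listen(8080);
</code>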

2. Use the browser cache

For a website, static resource files such as CSS, JavaScript, logos, and icons are updated infrequently, yet they are needed by almost every HTTP request. Caching these files in the browser can improve performance considerably. By setting the Cache-Control and Expires attributes in the HTTP header, the browser cache can be configured with lifetimes of several days or even several months.

At times, changes to static resource files need to reach client browsers promptly. This can be achieved by changing the file name: instead of modifying the content of an existing JavaScript file, generate a new JS file under a new name and update the references to it in the HTML files.
Websites that use a long-lived browser-caching strategy should update static resources incrementally. For example, if 10 icon files need updating, it is not appropriate to update all 10 at once; update them one at a time with some interval in between, so that a large portion of users' browser caches does not expire simultaneously and cause a concentrated wave of cache refreshes, a sudden spike in server load, and network congestion.
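
A minimal sketch of the file-renaming idea, assuming a Node.js build step; the hashing scheme and file names are illustrative, not the article's:

<code class="hljs javascript">// Sketch: generate a content-hashed file name so a changed file gets a new name
// (and therefore a new URL), while unchanged files keep their cached copies.
var fs = require('fs');
var crypto = require('crypto');

function fingerprint(path) {
    var content = fs.readFileSync(path);
    var hash = crypto.createHash('md5').update(content).digest('hex').slice(0, 8);
    var newPath = path.replace(/\.js$/, '.' + hash + '.js'); // e.g. app.js -> app.3f2a1b7c.js
    fs.writeFileSync(newPath, content);
    return newPath; // the <script src="..."> reference in the HTML is then updated to this name
}

console.log(fingerprint('app.js')); // hypothetical file
</code>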

3. Enable compression

Compressing files on the server side and decompressing them in the browser effectively reduces the amount of data transferred. If possible, also combine external scripts and styles, merging several files into one. Text files compress very well, often by more than 80%, so compressing HTML, CSS, and JavaScript with GZip gives good results. However, compression puts some pressure on both the server and the browser, so weigh the trade-off when bandwidth is plentiful but server resources are tight.
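
Compression is normally switched on in the web server or reverse proxy configuration; as a rough sketch of the idea only (a hand-rolled example, not a production setup), a Node.js handler could gzip a text response like this:

<code class="hljs javascript">// Sketch: gzip-compress an HTML response when the client advertises support for it.
var http = require('http');
var zlib = require('zlib');

http.createServer(function (req, res) {
    var body = '<html><body>Hello, compressed world!</body></html>';
    if (/\bgzip\b/.test(req.headers['accept-encoding'] || '')) {
        res.setHeader('Content-Encoding', 'gzip');
        res.setHeader('Content-Type', 'text/html');
        res.end(zlib.gzipSync(body));   // text content usually shrinks dramatically
    } else {
        res.end(body);                  // fall back to the uncompressed version
    }
}).listen(8080);
</code>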

4. CSS Sprites

  Merging background images into a single sprite is another effective way to reduce the number of requests.
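
A minimal sketch of the sprite technique (the image name and pixel offsets are made-up values): several icons live in one combined image, and CSS background-position selects which region to show.

<code class="hljs css">/* Sketch: icons.png is one combined image containing several 32x32 icons. */
.icon {
    width: 32px;
    height: 32px;
    background-image: url(icons.png);  /* one HTTP request covers all icons */
}
.icon-home   { background-position:   0     0; }   /* first icon in the sprite */
.icon-search { background-position: -32px   0; }   /* shift left to show the second icon */
.icon-user   { background-position: -64px   0; }   /* third icon */
</code>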

5. Lazy load images (I still don't fully understand this part)

  This strategy does not really reduce the total number of HTTP requests, but it can reduce the number of requests under certain conditions, or at least when the page first loads. For images, only those in the first screen need to be loaded when the page first renders; the rest can be loaded as the user scrolls down. If the user is only interested in the first screen's content, the remaining image requests are saved entirely.
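
One common way to implement this (a sketch, assuming IntersectionObserver support; the data-src attribute and file names are conventions assumed for illustration, not part of the article):

<code class="hljs javascript">// Sketch: images below the fold carry their real URL in data-src and a tiny
// placeholder in src; the real image is only requested when it scrolls into view.
// Markup assumed: <img src="placeholder.gif" data-src="real-photo.jpg">
var lazyImages = document.querySelectorAll('img[data-src]');

var observer = new IntersectionObserver(function (entries) {
    entries.forEach(function (entry) {
        if (entry.isIntersecting) {
            var img = entry.target;
            img.src = img.getAttribute('data-src'); // trigger the real request now
            observer.unobserve(img);                // each image only needs this once
        }
    });
});

lazyImages.forEach(function (img) { observer.observe(img); });
</code>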

6. Put CSS at the top of the page and JavaScript at the bottom

The browser renders the whole page only after it has downloaded the complete CSS, so the best practice is to put CSS at the top of the page and let the browser download it as early as possible. If the CSS is placed elsewhere, such as in the BODY, the browser may start rendering the page before the CSS has been downloaded and parsed, causing the page to jump from an unstyled state to a styled state and giving a poor user experience; so put CSS in the HEAD.

JavaScript is the opposite: the browser executes a script immediately after loading it, which can block the rest of the page and make it display slowly, so JavaScript is best placed at the bottom of the page. However, scripts that are needed while the page is being parsed should not be placed at the bottom.

Lazy Load JavaScript (load a script only when it is actually needed; such scripts generally do not carry the page's informational content). With the popularity of JavaScript frameworks, more and more sites use them. However, a framework often includes a great deal of functionality that not every page needs, and downloading scripts that are not needed wastes both bandwidth and execution time. There are two current approaches: one is to build a dedicated mini version of the framework for pages with particularly heavy traffic, and the other is lazy loading.
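
A minimal sketch of the lazy-load approach: the script is only injected when the feature that needs it is actually used (the file name, element id, and function are assumptions for illustration):

<code class="hljs javascript">// Sketch: load a heavy charting script only when the user opens the report panel.
function loadScript(url, onLoad) {
    var script = document.createElement('script');
    script.src = url;
    script.onload = onLoad;            // run the caller's code once the script is ready
    document.body.appendChild(script); // the download starts here, not at page load
}

document.getElementById('show-report').addEventListener('click', function () {
    loadScript('charts.js', function () {   // hypothetical file
        renderReport();                      // hypothetical function defined by charts.js
    });
});
</code>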

7. Asynchronous request callbacks (that is, extract some of the behavior and styling and load the informational content gradually)

  
Some pages may have a requirement like this: use a script tag to request data asynchronously. Something like:

<code class="hljs javascript"> Javascript:
    /*Callback 函数*/
    function myCallback(info){ 
        //do something here 
    } 

 HTML:
  Callback返回的内容 :
   myCallback('Hello world!');</code>

Written directly in the page like this, the returned content simply invokes the callback defined above.

8. Reduce cookie transmission

On the one hand, cookies are included in every request and response, so an oversized cookie seriously affects data transfer. Which data really needs to be written into cookies should therefore be considered carefully, and the amount of data carried in cookies should be minimized. On the other hand, sending cookies with requests for static resources such as CSS and scripts is pointless; consider serving static resources from a separate domain so that cookies are not sent with those requests, reducing cookie traffic.

9. JavaScript code optimization

(1). DOM
  a. HTMLCollection (a collection object that behaves like an array)
  In scripts, document.images, document.forms, and getElementsByTagName() all return collections of type HTMLCollection. They are usually treated like arrays because they have a length property and their elements can be accessed by index. However, their access performance is much worse than a real array's, because the collection is not a static result: it represents a live query, and every access to the collection re-runs that query to refresh the result. "Accessing the collection" includes reading its length property as well as accessing its members.
  Therefore, when you need to traverse an HTMLCollection, convert it to an array first where possible to improve performance. Even if you do not convert it, access it as little as possible; for example, when iterating, save the length property and the members you need into local variables and use those (a combined sketch follows this list).
  b. Reflow & Repaint
  Beyond the point above, DOM operations must also take the browser's reflow and repaint into account, because both are resource-intensive.
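
A small sketch combining both points above: cache the collection's length instead of re-reading it on every pass, and batch DOM insertions with a DocumentFragment so the page reflows once rather than once per element (the element id and attribute are illustrative assumptions):

<code class="hljs javascript">// a. Cache the HTMLCollection length so the live collection is not re-queried on every pass.
var images = document.getElementsByTagName('img');
var count = images.length;                 // read the length once
for (var i = 0; i < count; i++) {
    images[i].setAttribute('loading', 'lazy');
}

// b. Build new nodes off-document and insert them in one operation,
//    so the browser reflows/repaints once instead of 100 times.
var fragment = document.createDocumentFragment();
for (var j = 0; j < 100; j++) {
    var li = document.createElement('li');
    li.textContent = 'item ' + j;
    fragment.appendChild(li);
}
document.getElementById('list').appendChild(fragment); // single reflow here
</code>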

(2). Be careful when using with
  with(obj){ p = 1; } actually modifies the execution environment inside the code block, placing obj at the front of its scope chain. When a non-local variable is accessed inside the with block, the search starts at obj and, if nothing is found there, continues up the scope chain. Using with therefore effectively lengthens the scope chain; every scope-chain lookup takes time, and an overly long chain degrades lookup performance.
  Therefore, unless you are certain that only properties of obj are accessed inside the with block, use with with caution. Instead, cache the properties you need to access in local variables.
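
A small sketch of the suggested alternative: cache the object in a local variable instead of using with, which keeps the scope chain short.

<code class="hljs javascript">// Instead of:  with (document.body.style) { color = 'red'; margin = '0'; }
// cache the object in a local variable:
var style = document.body.style;
style.color = 'red';
style.margin = '0';
</code>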

(3). Avoid using eval and Function
  Every time eval or the Function constructor is applied to source code represented by a string, the script engine has to convert that source into executable code. This is a very resource-intensive operation, often more than 100 times slower than a simple function call.
  The eval function is particularly inefficient. Since the content of the string passed to eval cannot be known in advance, eval interprets the code in its calling context, which means the compiler cannot optimize that context; the code can only be interpreted by the browser at runtime. This has a big impact on performance.
  The Function constructor is slightly better than eval, because code created with it does not affect the surrounding code, but it is still slow.
  In addition, eval and Function also hinder JavaScript compression tools from compressing the code effectively.
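
For the common case of turning a JSON string into an object (an illustration not taken from the article), a sketch of the alternative to eval:

<code class="hljs javascript">var text = '{"name":"apple","price":3}';

// Slow and risky: the engine must compile and run arbitrary source at runtime.
// var data = eval('(' + text + ')');

// Preferred: a dedicated parser, no code execution involved.
var data = JSON.parse(text);
console.log(data.name); // "apple"
</code>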

(4). Reduce scope-chain lookups
  The scope-chain lookup problem mentioned above deserves special attention in loops. If a variable from a non-local scope needs to be accessed inside a loop, cache it in a local variable before the loop and write the value back once the loop has finished. This matters most for global variables, because they sit at the top of the scope chain and cost the most lookups to reach.
  
  Inefficient way of writing:
  

<code class=" hljs javascript">// 全局变量 
var globalVar = 1; 
function myCallback(info){ 
    for( var i = 100000; i--;){ 
        //每次访问 globalVar 都需要查找到作用域链最顶端,本例中需要访问 100000 次 
        globalVar += i; 
    }
} 
</code>

More efficient version:

<code class=" hljs javascript">// 全局变量 
var globalVar = 1; 
function myCallback(info){ 
    //局部变量缓存全局变量 
    var localVar = globalVar; 
    for( var i = 100000; i--;){ 
    //访问局部变量是最快的 
    localVar += i; 
    } 
    //本例中只需要访问 2次全局变量
    在函数中只需要将 globalVar中内容的值赋给localVar 中
    globalVar = localVar; 
}</code>

To reduce scope-chain lookups, you should also limit the use of closures.

(5). Data access
  Data access in JavaScript covers literals (strings, regular expressions), variables, object properties, and array items. Access to literals and local variables is the fastest, while access to object properties and array items carries more overhead. It is recommended to put data into a local variable when:
  a. any object property is accessed more than once;
  b. any array member is accessed more than once.
  In addition, lookups deep inside objects and arrays should be kept as shallow as possible.
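
A short sketch of rule (a): when the same object property is read repeatedly, copy it into a local variable first (the object shape here is made up for illustration):

<code class="hljs javascript">var config = { page: { size: 20 } };
var items = [];
for (var i = 0; i < 1000; i++) { items.push(i); }

// Less efficient: two property lookups (config.page, page.size) on every iteration.
for (var j = 0; j < items.length; j++) {
    if (j % config.page.size === 0) { /* start a new page */ }
}

// Better: resolve the deep property once, then use local variables inside the loop.
var pageSize = config.page.size;
var len = items.length;            // array length cached as well
for (var k = 0; k < len; k++) {
    if (k % pageSize === 0) { /* start a new page */ }
}
</code>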

(6). String concatenation
  Concatenating strings with the "+" operator is relatively inefficient, because each operation allocates new memory and produces a new string for the result, which is then assigned to a new variable. A more efficient approach is the array join method: put the strings to be concatenated into an array and call join once at the end to get the result. However, since using an array also has some overhead, this approach is only worth considering when many strings need to be concatenated.
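
A sketch of the pattern described above (on modern engines plain "+" concatenation is often just as fast, so treat this as the article's advice rather than a universal rule):

<code class="hljs javascript">// Concatenating with "+" creates an intermediate string on every iteration.
var rows = '';
for (var i = 0; i < 1000; i++) {
    rows += '<li>item ' + i + '</li>';
}

// The join approach: collect the pieces in an array and join once at the end.
var parts = [];
for (var j = 0; j < 1000; j++) {
    parts.push('<li>item ' + j + '</li>');
}
var rowsJoined = parts.join('');
</code>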

10. CSS selector optimization

  Most people assume that browsers parse CSS selectors from left to right. For a selector such as #toc A { color: #444; }, left-to-right parsing would seem efficient, because the leading ID selector immediately narrows the search scope. In fact, browsers parse selectors from right to left: for the selector above, the browser first matches every A element and then has to walk up and check each one's ancestors, which is far less efficient than it first appears. Given this behavior, there are a number of things to pay attention to when writing selectors; interested readers can look into them further.

The essence of CDN acceleration
A CDN (content delivery network) is, in essence, still a cache: data is cached at the location closest to the user so that the user can fetch it at the highest possible speed, at the so-called first hop of network access.

[Figure 2: CDN acceleration]

Reverse proxy

A traditional proxy server sits on the browser's side and forwards the browser's HTTP requests out to the Internet, whereas a reverse proxy server sits on the website's side, in the site's data center, and receives HTTP requests on behalf of the website's web servers, as shown below:

[Figure 3: reverse proxy]

A reverse proxy also contributes to website security: all requests from the Internet must pass through the proxy server, which in effect puts a barrier between the web servers and potential network attacks.

Beyond security, the proxy server can also speed up web requests by enabling its caching function. The first time a user requests a piece of static content, it is cached on the reverse proxy server, so later users requesting the same content can be served directly from the reverse proxy, which speeds up responses and relieves load on the web servers. In fact, some websites cache dynamic content on proxy servers as well: Wikipedia and some blog and forum sites cache popular entries, posts, and blogs on reverse proxies to speed up access. When such content changes, an internal notification mechanism tells the reverse proxy that its cache is invalid, and the proxy reloads the latest dynamic content and caches it again.

In addition, a reverse proxy can provide load balancing; an application cluster built behind a load balancer increases the system's overall processing capacity and thus the website's performance under high concurrency.


1. How Search Engines Work

  When we type a keyword into the search box and click search, we get back results. Behind the scenes, the search engine does a great deal of work.

  A search engine site such as Baidu keeps a very large database in the background that stores a huge number of keywords, each of which maps to many URLs. These URLs are collected from across the Internet by Baidu's programs, called "search engine spiders" or "web crawlers". These industrious "spiders" crawl the web every day, following one link after another, downloading content, analyzing and distilling it, and extracting its keywords; content judged useful is stored in the database, while content judged to be spam or duplicate is discarded and the crawl continues, looking for the latest useful information to save for user searches. When a user searches, the URLs related to the keyword are retrieved and displayed to the visitor.

  A keyword corresponds to many URLs, so there is a ranking problem: the URL that best matches the keyword is ranked first. During the process of crawling page content and extracting keywords, there is one issue: whether the "spider" can understand the content. If a site's content is Flash or JS, the spider cannot make sense of it and will be confused, no matter how appropriate the keywords are. Conversely, if the site's content is written in the spider's language, the spider can understand it, and that language is, in effect, SEO.

2. Introduction to SEO

  Full name: Search Engine Optimization. SEO has existed for as long as search engines have.

  Purpose: optimization work that improves the number of a site's pages included in search engines' organic search results and their ranking. In short, we want search engines such as Baidu to index more of a carefully crafted website and to rank it near the top when people search.

  Categories: white hat SEO (白帽SEO) and black hat SEO (黑帽SEO). White hat SEO improves and standardizes website design, making the site friendlier to both search engines and users; the site then earns reasonable traffic from search engines, which search engines encourage and support. Black hat SEO exploits and amplifies flaws in search engine policies to gain more user traffic; most such behavior deceives search engines and is generally neither supported nor encouraged by search engine companies. This article is about white hat SEO, so what can white hat SEO do?

  1. Carefully set the title, keywords and description of the website to reflect the positioning of the website and let search engines understand what the website does;

  2. Optimize the website content: match the content to the keywords and increase keyword density appropriately;

  3. Set up the site's robots.txt file properly (a minimal example follows this list);

  4. Generate a search engine friendly sitemap;

  5. Add external links to promote on various websites;
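
As an illustration of points 3 and 4 above (the paths and domain are made up, not from the article), a minimal robots.txt might look like this:

<code class="hljs">User-agent: *
Disallow: /admin/
Disallow: /tmp/
Sitemap: https://www.example.com/sitemap.xml
</code>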

3. Front-end SEO

  Through the structural layout design of the website and the optimization of the webpage code, the front-end page can be understood by both browser users and "spiders".

  (1) Optimization of website structure and layout: try to be as simple as possible, straight to the point, and advocate a flat structure.
  
  Generally speaking, the fewer levels a site's structure has, the easier it is for "spiders" to crawl and the easier it is to get indexed. Typically, once the directory structure of a small or medium-sized website exceeds three levels, "spiders" are unwilling to crawl further down: "what if they get lost in the dark?" And according to surveys, if visitors have not found the information they need within 3 jumps, they are likely to leave. So a three-level directory structure is also an experience requirement. To that end we need to:

  1. Control the number of homepage links

  The home page of a website carries the highest weight. If the home page has too few links and no "bridges", the "spider" cannot continue down into the inner pages, which directly hurts how many pages get indexed. However, the home page should not have too many links either: too many links without real substance hurt the user experience, lower the home page's weight, and reduce indexing effectiveness.

  Therefore, for small and medium-sized enterprise websites, it is recommended to keep home-page links under 100. Links can take the form of page navigation, bottom navigation, anchor-text links, and so on; they should serve a good user experience and guide users to the information they need.

  2. Flatten the directory hierarchy and try to let the "spider" reach any inner page of the site within 3 jumps. A flattened directory structure, for example "Plants" -> "Fruit" -> "Apple", "Orange", "Banana", lets bananas be found at the third level.

  3. Navigation optimization

  Navigation should use text as much as possible; it can be combined with image navigation, but the image markup must be optimized: <img> tags must carry "alt" and "title" attributes to tell search engines where the navigation points, so that users still see a text hint even if the image fails to display (a markup sketch appears at the end of this list).

  Secondly, breadcrumb navigation should be added to every page. Benefits: from the user-experience perspective, it lets users know where they are and where the current page sits within the whole site, helping them quickly understand how the site is organized and giving them a better sense of place, while also providing a way back to each level, which is convenient to use; for the "spider", it makes the site structure clear and adds a large number of internal links, which makes crawling easier and reduces the bounce rate.

  4. The structure of the website – details that cannot be ignored

  1) Page header: logo and main navigation, as well as user information.

  2) Page body: the main text on the left, including breadcrumb navigation and the article text; popular and related articles on the right. Benefits: they retain visitors and keep them on the site longer; for the "spider", these articles are related links that strengthen the page's relevance and can also enhance its weight.

  3) Bottom of the page: Copyright information and friendly links.

  Special note on pagination navigation. Recommended style: "Home 1 2 3 4 5 6 7 8 9 drop-down box", so the "spider" can jump directly to any page number and the drop-down lets the user pick a page to jump to. The style "first page, next page, last page" is not recommended; when there are many pages, the "spider" has to crawl down many times to reach them all, which is tiring and easy to give up on.

  5. Control the size of the page, reduce http requests, and improve the loading speed of the website.

  A page is best kept under 100 KB; if it is too large, it loads slowly. Slow pages make for a poor user experience and cannot retain visitors, and once a request times out, the "spider" will leave as well.
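
A small markup sketch covering points 3 and 4 above (text-plus-image navigation with alt/title attributes, and breadcrumb navigation); the class names and link targets are illustrative assumptions:

<code class="hljs html"><!-- Main navigation: text links, with alt/title on any image used -->
<nav class="main-nav">
    <a href="/" title="Home"><img src="logo.png" alt="Site logo" title="Home"> Home</a>
    <a href="/fruit/" title="Fruit">Fruit</a>
</nav>

<!-- Breadcrumb navigation: tells users and spiders where the current page sits -->
<div class="breadcrumb">
    <a href="/">Plants</a> &gt; <a href="/fruit/">Fruit</a> &gt; <span>Apple</span>
</div>
</code>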

  (2) Web page code optimization
  
  1. <title> title: emphasize only the key points, put important keywords as early as possible, do not repeat keywords, and try not to set the same <title> content on every page (a combined markup sketch appears at the end of this list).

  2. <meta> keywords tag: just list a few important keywords for the page; remember not to over-stuff.

  3. <meta> description tag: the description of the page; it should summarize the page's content, must not be too long or overly stuffed with keywords, and should differ from page to page.

  4. Tags inside <body>: make the code as semantic as possible, use the right tag in the right place, and do the right thing with the right tag, so that both readers and "spiders" understand it at a glance. For example, use h1-h6 for headings and the <nav> tag for the page's main navigation.

  5. <a> tags: for in-page links, add a "title" attribute to explain them, so that both visitors and "spiders" know where they lead. For external links to other websites, add rel="nofollow" to tell the "spider" not to follow them, because once the "spider" crawls off to an external link it will not come back.

  6. Use an <h1> tag for the body's main title: the "spider" considers it the most important. If you do not like <h1>'s default style, restyle it with CSS. Use <h1> for the main title and <h2> for subtitles; do not use h tags indiscriminately elsewhere.

  7. <br> tags: use them only for line breaks within text content, for example:

<p>
   First line of text<br/>
    Second line of text<br/>
    Third line of text
</p>

 8. Tables should use the <caption> table-title tag.

  9. <img> should specify the "alt" attribute.

 10. <strong> and <em> tags: use them when emphasis is needed. The <strong> tag is valued highly by search engines; it can highlight keywords and mark important content. The <em> tag's emphasis is second only to <strong>.

    <b> and <i> tags: they are purely presentational and have no effect on SEO.

 11. Do not use the special entity &nbsp; for text indentation; set indentation with CSS. Do not use the special entity &copy; for the copyright symbol; the © character can be typed directly via an input method.

 12. Make clever use of the CSS layout: put the HTML for important content first, since content near the top is considered the most important, and the "spider" reads and crawls those content keywords first.

 13. Do not output important content with JS, because the "spider" cannot read it.

 14. Minimize the use of iframe frames, because "spiders" generally do not read their content.

 15. Use display:none with caution: for text you do not want to display, set a z-index or position it outside the visible area instead, because search engines filter out content hidden with display:none.

 16. Keep the amount of code to a minimum.

 17. If JS code performs DOM operations, place it before the closing body tag and after the HTML content.
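
Pulling several of the points above together (1, 2, 3, 5, 6, 13, 17), a hedged sketch of a page skeleton; the titles, keywords, file names, and URLs are illustrative only:

<code class="hljs html"><!DOCTYPE html>
<html>
<head>
    <!-- 1. Concise title with the important keyword first, unique per page -->
    <title>Apple growing guide - Example Orchard</title>
    <!-- 2. A few keywords only, no stuffing -->
    <meta name="keywords" content="apple, fruit growing, orchard">
    <!-- 3. A short, page-specific summary -->
    <meta name="description" content="How to plant, prune and harvest apple trees.">
</head>
<body>
    <!-- 6. One h1 for the main title, h2 for subtitles -->
    <h1>Apple growing guide</h1>
    <h2>Planting</h2>
    <p>Important content appears early in the HTML, as plain text rather than JS output.</p>
    <!-- 5. External link marked nofollow -->
    <a href="https://other-site.example/" rel="nofollow" title="External reference">External reference</a>
    <!-- 17. DOM-manipulating script placed just before the closing body tag -->
    <script src="page.js"></script>
</body>
</html>
</code>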
