[Life] A Front-End SEO Optimization Plan

I. How search engines work

Behind the scenes, a search engine maintains a very large database storing vast numbers of keywords, and each keyword in turn maps to many URLs. These entries are gathered, little by little, by programs called "search engine spiders" or "crawlers" that download pages from across the Internet. As new websites keep appearing, these hard-working "spiders" crawl the web every day, moving from one link to the next, downloading content, analyzing it, and extracting keywords. If the "spider" finds a keyword that is not yet in the database and would be useful to users, it stores it in the back-end database; if it judges the content to be spam or a duplicate, it discards it and keeps moving, looking for the newest, most useful content to save for searchers.

When a user searches, the engine retrieves the URLs associated with the keyword and shows them to the visitor. Because one keyword corresponds to many URLs, there is a ranking problem: the URLs that best match the keyword are placed at the front.

While the "spider" crawls pages and extracts keywords, one problem arises: can the "spider" understand the page? If your site's content is built from Flash, JS, and the like, the spider cannot read it and comes away with nothing, no matter how relevant the keywords are. Conversely, if a search engine can recognize a site's content, it will raise the site's weight and treat the site as friendlier. The process of making a site recognizable in this way is what we call SEO.

II. An introduction to SEO

SEO (Search Engine Optimization) means optimizing for search engines. SEO appeared together with search engines; the two promote each other in a symbiotic relationship. SEO exists to increase the number of pages a search engine indexes and to improve their ranking in its natural (organic) results. The purpose of the optimization is to raise the site's weight with the search engine and increase its search-engine friendliness, so that users find the site near the top of the results.

SEO comes in two categories: white-hat SEO and black-hat SEO. White-hat SEO improves and standardizes website design, makes the site friendlier to both search engines and users, and earns the site reasonable traffic from the search engine; this is what search engines encourage and support. Black-hat SEO exploits and amplifies flaws in search-engine policy to grab extra traffic; most such behavior deceives the search engine, and search-engine companies generally neither support nor encourage it. This article is about white-hat SEO. So what can white-hat SEO do?

1. Choose the site's title, keywords, and description carefully, so that they reflect the site's positioning and tell the search engine what the site is about;
2. Optimize page content: match content to keywords and increase keyword density appropriately;
3. Set up the site's robots.txt file sensibly;
4. Generate a site map, to be friendly to search engines;
5. Increase external links and promote the site on other websites.
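As an illustration of points 3 and 4, here is a minimal robots.txt sketch; the disallowed path and the sitemap URL are hypothetical placeholders:

User-agent: *
# keep back-end pages out of the index (hypothetical path)
Disallow: /admin/

# tell spiders where the site map lives
Sitemap: https://www.example.com/sitemap.xml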

III. Why do SEO

SEO raises a site's weight and strengthens its search-engine friendliness, and in doing so improves rankings, brings in traffic, improves the experience of (potential) users, and helps promote sales.

IV. Front-end SEO guidelines

The front end is a very important link in building a website: front-end work is primarily responsible for the page's HTML + CSS + JS, and optimizing these well lays a solid foundation for the SEO work. Through structural layout design and code optimization, the front end can produce pages that browser users can read (a better user experience) and that the "spider" can also understand (better search-engine friendliness).
Front-end SEO Considerations:

1. Make the site structure as simple as possible and optimize the layout: get straight to the point, and prefer a flat structure

In general, the fewer structural levels a site has, the easier it is for the "spider" to crawl and the more readily its pages get indexed. The directory structure of most small and medium sites should go no deeper than three levels; beyond that, the "spider" is unwilling to crawl further down. And according to survey data, a visitor who has not found the needed information after three jumps is likely to leave, so a three-level directory structure is also what the user experience demands.
To achieve this, we should:

(1) Control the number of links on the home page. The home page is where the site's weight is highest. If the home page has too few links, there are no "bridges" and the "spider" cannot continue down to the inner pages, which directly hurts the number of pages indexed. But the home page cannot carry too many links either: a pile of links with no substance easily harms the user experience, lowers the home page's weight, and leads to poor indexing.

(2) Flatten the directory hierarchy. Try to ensure the "spider" can reach any inner page of the site in no more than three jumps.

(3) Optimize the navigation. Navigation should use text wherever possible. Image navigation is also acceptable, but the image code must be optimized: the <img> tag must carry "alt" and "title" attributes that tell the search engine where the navigation points, so that users still see a prompt text even when the image fails to display. Second, every page should have breadcrumb navigation; a sketch follows this paragraph. The benefits: for the user, breadcrumbs show where the current page sits within the whole site, help the user quickly grasp the site's organization, build a better sense of location, and give every page a way back up, which is convenient to operate; for the "spider", breadcrumbs make the site's structure clear, add internal links, ease crawling, and reduce the bounce rate.
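A minimal sketch of text navigation plus breadcrumbs; the page names and URLs are hypothetical:

<nav>
    <a href="/" title="Home">Home</a>
    <a href="/articles/" title="Articles">Articles</a>
    <!-- image navigation: alt and title tell the spider where the link points -->
    <a href="/about/"><img src="about.png" alt="About us" title="About us" /></a>
</nav>
<!-- breadcrumbs: show the current page's position in the site -->
<div class="breadcrumbs">
    <a href="/">Home</a> &gt; <a href="/articles/">Articles</a> &gt; Front-end SEO
</div>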

(4) The structure and layout of the website: details that cannot be ignored.

Head of the page: the logo, the main navigation, and user information.

Main body of the page: the article text on the left, including the text itself and breadcrumbs; popular articles and related articles on the right. The benefits: this retains visitors and keeps them on the site longer; for the "spider", these articles are internal links that strengthen the page's relevance and also raise its weight.

Bottom of the page: copyright information and friendly links.

Pay special attention to how pagination navigation is written. Recommended form: "Home 1 2 3 4 5 6 7 8 9 drop-down box" (see the sketch below), so that the "spider" can jump straight to any page by its number and the user can jump directly via the drop-down box. The form "First Next Last" is not recommended: especially when there are many pages, the "spider" has to crawl down many, many times to reach everything, gets "tired", and easily gives up.
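A sketch of the recommended pagination markup, assuming hypothetical /list/N URLs:

<div class="pagination">
    <a href="/list/1">Home</a>
    <a href="/list/1">1</a>
    <a href="/list/2">2</a>
    <a href="/list/3">3</a>
    <!-- drop-down box: lets the user jump straight to a page -->
    <select onchange="location.href = '/list/' + this.value;">
        <option value="1">Page 1</option>
        <option value="2">Page 2</option>
        <option value="3">Page 3</option>
    </select>
</div>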

(5) Use the layout to put the important HTML code first. Search engines crawl HTML content from top to bottom. By taking advantage of this feature, you can let the main code be read first and move advertising and other unimportant code below it. For example, with the code of the left and right columns unchanged, merely changing their styles with float: left; and float: right; lets you freely swap the positions at which the two columns display, so the important code can stay in front where the crawler reads it first. The same applies to multi-column layouts; a sketch follows.
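A minimal two-column sketch: the main content comes first in the source, and the floats decide where each column displays; the class names are hypothetical:

<style>
    .main    { float: left;  width: 70%; }  /* main content: first in the source, read first by the spider */
    .sidebar { float: right; width: 30%; }  /* swap the two float values to swap display positions */
</style>
<div class="main">Article text</div>
<div class="sidebar">Ads and other unimportant content</div>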

(6) Control the page size, reduce the number of HTTP requests, and raise the site's loading speed. A page should preferably not exceed 100 KB; a page that is too large loads slowly. When the site is very slow, the user experience suffers and visitors cannot be retained, and once a request times out, the "spider" leaves.

2. Optimize the page code

(1) Highlight the important parts: design the title, description, and keywords sensibly.

<title> tag: the page title. Emphasize only the key points, put important keywords as near the front as possible, do not repeat keywords, and try to give each page a <title> of its own rather than the same one everywhere.

<meta keywords> tag: the keywords. Include a few of the page's important keywords; remember not to stuff in too many.

<meta description> tag: the page description. It needs to summarize the page's content concisely; remember not to make it too long or over-stuff it with keywords, and make it different on every page.
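A sketch of a page head that follows these rules; the site name, keywords, and description are hypothetical:

<head>
    <meta charset="utf-8" />
    <!-- important keywords near the front, not repeated, unique per page -->
    <title>Front-end SEO guide - Example Site</title>
    <meta name="keywords" content="SEO, front-end, site structure, performance" />
    <meta name="description" content="How to structure pages, write semantic HTML, and tune performance so search engines index a site well." />
</head>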

(2) Write semantic HTML code that conforms to W3C standards. Semantic markup means using the appropriate tag in the appropriate place, doing the right thing with the right tag, so that both human readers of the source code and the "spider" understand it at a glance. For example: h1-h6 are headings, ranked by level; the <nav> tag is for the page's main navigation; list code should use ul or ol; important text should use strong; and so on.
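A small sketch with hypothetical content; only the choice of tags matters:

<h1>Site name</h1>                  <!-- the page's single most important heading -->
<nav>                               <!-- the main navigation -->
    <ul>
        <li><a href="/">Home</a></li>
        <li><a href="/news/">News</a></li>
    </ul>
</nav>
<h2>Article title</h2>
<p>Body text with a <strong>key term</strong> emphasized.</p>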

(3) <a> tags. For links within the site, add a "title" attribute that explains the link, so that both visitors and the "spider" know where it goes. For external links, links pointing to other sites, add the rel="nofollow" attribute to tell the "spider" not to crawl them, because once the "spider" crawls off to an external link, it will not come back.

<a href="https://www.360.cn" title="360安全中心" class="logo"></a>

(4) Use the <h1> tag for the heading text. The h1 tag carries weight of its own: the "spider" considers it the most important thing on the page. A page may have at most one h1 tag, placed on the page's most important title; for example, the logo on the home page can carry the h1 tag. Use <h2> tags for subtitles, and do not use h heading tags casually in other places.

(5) <img> should use the "alt" attribute to describe the image:

<img src="cat.jpg" width="300" height="200" alt=""  />

When the network is slow or the image address is broken, the alt attribute shows its value: it lets the user know what the image was for even though the image does not display. Setting the image's width and height at the same time improves the page's loading speed.

(6) Tables should use the <caption> element, which defines a caption (title) for the table. The caption tag must immediately follow the table tag, and each table may define only one caption:

<table border='1'>
    <caption>Table caption</caption>
    <tbody>
        <tr>
            <td>apple</td>
            <td>100</td>
        </tr>
        <tr>
            <td>banana</td>
            <td>200</td>
        </tr>
    </tbody>
</table>

(7) <br> tags: use them only for line breaks within a block of text, for example:

<p> 
    First line of text<br/>
    Second line of text<br/>
    Third line of text
</p>

(8) <strong> and <em> tags: use them where emphasis is needed. The <strong> tag receives high priority from search engines: it highlights keywords and marks out important content. The <em> tag's emphasis is second only to <strong>'s. The <b> and <i> tags are for display only and have no effect on SEO.

(9) Do not use special characters such as &nbsp; to indent text; set indentation with CSS instead. As for special symbols such as the copyright sign ©, you can type the character directly with your input method rather than using an entity.

(10) Do not output important content with JS, because the "spider" does not read content inside JS; important content must be placed in the HTML.

(11) Minimize the use of iframes, because the "spider" generally does not read the content inside them.

(12) Use display: none with caution. For text you do not want shown, set a z-index or a text-indent negative enough to push it outside the browser viewport, because search engines filter out the content inside display: none.
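A sketch of the indent technique, with a hypothetical class name:

/* hidden from view without display:none, so the spider still reads the text */
.seo-text {
    text-indent: -9999px;   /* pushed far outside the viewport */
    overflow: hidden;
}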

3. Optimize front-end performance

(1) Reduce the number of HTTP requests. The browser communicates with the server mainly over HTTP.
Establishing a connection requires a three-way handshake between browser and server, and every handshake costs time. Moreover, a browser limits how many resource files it will request concurrently from one domain (the limit differs between browsers); once the number of HTTP requests reaches that limit, further resource requests are left waiting, which is deadly for performance. Reducing the number of HTTP requests can therefore optimize site performance to a large extent.

CSS Sprites merge multiple images into one picture as a way of reducing HTTP requests; the individual images are then shown through the CSS background properties. This approach can also reduce the total byte count of the images; a sketch follows.

Merging CSS and JS files: there are now many front-end build tools, such as grunt, gulp, and webpack, which can merge multiple CSS files or multiple JS files into one before release, precisely to reduce the number of HTTP requests.

Lazyload, commonly known as lazy loading, lets you avoid loading parts of the page at the outset, issuing no request until the user actually needs the content, at which point it is loaded immediately. This controls how many resources a page requests at once.
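A minimal CSS Sprites sketch, assuming a hypothetical icons.png that packs several 32x32 icons side by side:

/* one combined image means one HTTP request for all the icons */
.icon      { background-image: url(icons.png); width: 32px; height: 32px; }
.icon-home { background-position: 0 0; }        /* first icon in the sheet */
.icon-user { background-position: -32px 0; }    /* second icon, 32px further along the sheet */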

(2) Control the loading priority of resource files.
When the browser loads the HTML, it parses the content from top to bottom; when it reaches a link or script tag, it loads the content of the corresponding href or src. To show the first screen to the user quickly, the CSS needs to be loaded up front and must not be delayed by JS loading. Under normal circumstances, CSS goes in the head and JS at the bottom, as sketched below.
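A sketch of that ordering:

<head>
    <link rel="stylesheet" href="style.css" />   <!-- CSS first, so the page can render right away -->
</head>
<body>
    <p>Page content</p>
    <script src="main.js"></script>              <!-- JS last, so it does not block rendering -->
</body>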

(3) Put CSS and JS in external files wherever possible (separating structure, presentation, and behavior). This keeps the page code clean and also helps future maintenance.

<link rel="stylesheet" href="asstes/css/style.css" />

<script src="assets/js/main.js"></script>

(4) Make use of the browser cache.
The browser cache stores a network resource locally and waits for the next request for that resource: if the resource has not changed on the server, there is no need to request it again, and it is read directly from the local copy.
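Caching is driven by HTTP response headers set on the server; a sketch of one common policy (the values are illustrative):

Cache-Control: max-age=31536000
ETag: "abc123"

max-age lets the browser reuse the resource for up to a year without asking the server again; the ETag lets it revalidate cheaply once the cached copy expires.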

(5) Reduce reflows (Reflow).
Basic principle: a reflow happens when a DOM change affects an element's geometric properties (its width and height). The browser must recalculate the element's geometry, which invalidates the affected part of the render tree, and the browser then re-checks the geometry and visibility of the other nodes in the DOM tree; this is why Reflow is inefficient. If reflows happen too often, CPU usage climbs sharply. To reduce reflows: when a DOM operation needs to change styles, prefer toggling a class attribute rather than setting styles one by one through the style attribute, as sketched below.
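A sketch of the class-based approach; the element id and class name are hypothetical:

/* stylesheet: every changed property grouped into one class */
.expanded { width: 200px; height: 100px; }

// script: one class change, so at most one reflow
document.getElementById('box').className = 'expanded';

// avoid: setting box.style.width and box.style.height one after another,
// which can trigger a separate reflow for each assignment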

(6) Reduce DOM manipulation.
(7) Replace image icons with an icon font (IconFont).
(8) Do not use CSS expressions; they hurt efficiency.
(9) Use a CDN cache to speed up access for users and reduce the load on the server.
(10) Enable GZIP compression: pages load faster, and the amount of content the search-engine spider can fetch grows too.
(11) Set up pseudo-static URLs. If the pages are dynamic, you can enable the pseudo-static feature so that the spider "mistakes" them for static pages, since static pages better suit the spider's appetite; it is better still if the URL contains keywords.

Dynamic address: http://www.360.cn/index.php
Pseudo-static address: http://www.360.cn/index.html

Conclusion: understand SEO correctly, do not over-optimize, and keep the website content-first.

Origin blog.csdn.net/Umbrella_Um/article/details/104846690