Front-end extreme performance optimization manual

Essential knowledge points on the road to front-end optimization.

  • What happens from entering a url in the browser to the page being displayed, and what the front end can optimize at each step
  1. DNS recursive query resolution - DNS prefetch optimization;
  2. TCP three-way handshake and four-way teardown - the differences between http1.0/1.1/2.0, and between http and https;
  3. http caching - 304 and CDN;
  4. The browser rendering mechanism - the importance of CSS and JS order, the cost of @import, debounce, throttling, reflow, repaint, GPU acceleration, etc.;
  5. How to relieve the JS main thread - web workers, time slicing
  • Image optimization - sprites, webp, svg;
  • Packaging optimizations, e.g. webpack
  • Basic operations knowledge: nginx

This article summarizes the basic points of front-end performance optimization in a rough order. You can check your projects against it step by step to find performance bottlenecks. If there are any errors or omissions, corrections and additions are welcome.

Some of the principles and details are covered in the reference articles, which are well worth reading.

webpack

By default, webpack 4 already ships with good internal optimizations, but they cannot cover every business scenario.
If you find that packaging is slow or the bundle is too large during development, review the configuration.

Code block analysis plug-in webpack-bundle-analyzer

npm i webpack-bundle-analyzer -D
  • Modify webpack.config.js
// Add at the top
const { BundleAnalyzerPlugin } = require('webpack-bundle-analyzer');

// Add the following to plugins: []
new BundleAnalyzerPlugin({
  analyzerMode: 'server',
  analyzerHost: '127.0.0.1',
  analyzerPort: 8000,
  reportFilename: 'report.html',
  defaultSizes: 'parsed',
  openAnalyzer: true,
  generateStatsFile: false,
  statsFilename: 'stats.json',
  statsOptions: null,
  logLevel: 'info'
})
  • Start the local development server and open http://127.0.0.1:8000 in the browser

webpack-bundle-analyzer

  • webpack 4 default code-splitting strategy
  1. The new chunk is shared between modules, or comes from node_modules
  2. The new chunk is larger than 30kb before compression
  3. The number of concurrent requests when loading chunks on demand is at most 5
  4. The number of concurrent requests at initial page load is at most 3

For example, antd UI components are referenced frequently, but because each module is smaller than 30kb none of them is split into an independent chunk, so the same code is duplicated across many chunks.
In this case the default configuration does not fit the business, so modify the strategy:

  1. A vendor package for the core framework libraries (react, react-dom, react-router-dom, mobx, mobx-react)
  2. A libs package for large third-party libraries (moment, antd, lodash)
  3. A common package for other shared modules from node_modules
// Default configuration
splitChunks: {
  chunks: 'all',
  name: false,
}

// Modified configuration
splitChunks: {
  chunks: 'all',
  name: false,
  cacheGroups: {
    vendor: {
      name: 'vendor',
      test: module => /(react|react-dom|react-router-dom|mobx|mobx-react)/.test(module.context),
      chunks: 'initial',
      priority: 11
    },
    libs: {
      name: 'libs',
      test: module => /(moment|antd|lodash)/.test(module.context),
      chunks: 'all',
      priority: 14
    },
    common: {
      chunks: 'async',
      test: /[\\/]node_modules[\\/]/,
      name: 'common',
      minChunks: 3,
      maxAsyncRequests: 5,
      maxSize: 1000000,
      priority: 10
    }
  }
}

Conclusion:

  • Volume before optimization: 56MB (with sourceMap removed)
  • Volume after optimization: 36MB (with sourceMap kept; removing sourceMap saves about 7.625MB)

glob and purgecss-webpack-plugin remove useless CSS

npm i glob purgecss-webpack-plugin -D
const glob = require('glob');
const PurgecssWebpackPlugin = require('purgecss-webpack-plugin');

// Add to plugins: [] in webpack.config.js.
// Note that paths must be absolute. If some styles (e.g. antd's) live outside
// the src directory, pass an array of paths, otherwise those styles will be stripped.
new PurgecssWebpackPlugin({
  paths: glob.sync(`${paths.appSrc}/**/*`, { nodir: true })
})

Conclusion: CSS resources are significantly reduced.

Results of these operations

  • Resources are now down to about 7.25MB;
  • Packaging time dropped from 7.5 minutes to 2.5 minutes, greatly improving efficiency.

picture

webp

webp is a newer image format that provides both lossless and lossy compression while maintaining quality.
webp is an essential optimization for image-heavy sites, and CDNs commonly provide webp conversion services.

  • Taobao, for example, uses it at scale

webp

  • Advantages: at equal quality, lossless webp images are about 26% smaller than png, and lossy webp images are 25-34% smaller than jpeg.
  • Disadvantages: some browsers do not support it, so a fallback is needed. The following is the official detection snippet:
// check_webp_feature:
//   'feature' can be one of 'lossy', 'lossless', 'alpha' or 'animation'.
//   'callback(feature, result)' will be passed back the detection result (in an asynchronous way!)
function check_webp_feature(feature, callback) {
    var kTestImages = {
        lossy: "UklGRiIAAABXRUJQVlA4IBYAAAAwAQCdASoBAAEADsD+JaQAA3AAAAAA",
        lossless: "UklGRhoAAABXRUJQVlA4TA0AAAAvAAAAEAcQERGIiP4HAA==",
        alpha: "UklGRkoAAABXRUJQVlA4WAoAAAAQAAAAAAAAAAAAQUxQSAwAAAARBxAR/Q9ERP8DAABWUDggGAAAABQBAJ0BKgEAAQAAAP4AAA3AAP7mtQAAAA==",
        animation: "UklGRlIAAABXRUJQVlA4WAoAAAASAAAAAAAAAAAAQU5JTQYAAAD/AABBTk1GJgAAAAAAAAAAAAAAAAAAAGQAAABWUDhMDQAAAC8AAAAQBxAREYiI/gcA"
    };
    var img = new Image();
    img.onload = function () {
        var result = (img.width > 0) && (img.height > 0);
        callback(feature, result);
    };
    img.onerror = function () {
        callback(feature, false);
    };
    img.src = "data:image/webp;base64," + kTestImages[feature];
}
  • Based on the accept: image/webp request header reported by the client, the server can also decide automatically: return webp if it is supported, otherwise return the original image.
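Building on the detection snippet above, a minimal sketch of switching image URLs once support is known. The helper name and the `?format=webp` CDN query parameter are illustrative assumptions, not from the source:

```javascript
// Illustrative helper (the query parameter is an assumption): request webp
// output from a CDN only when the browser supports it.
function pickImageUrl(baseUrl, supportsWebp) {
  return supportsWebp ? baseUrl + '?format=webp' : baseUrl;
}

// Browser-only wiring around the official check_webp_feature snippet above.
if (typeof Image !== 'undefined' && typeof document !== 'undefined') {
  check_webp_feature('lossy', function (feature, supported) {
    document.querySelectorAll('img[data-src]').forEach(function (img) {
      img.src = pickImageUrl(img.dataset.src, supported);
    });
  });
}
```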

Sprite Figure

Multiple images are combined into one image, and background-position is used to show the right part.
This reduces the number of http requests.

Note that HTTP2's multiplexing solves the head-of-line blocking problem, which makes this technique less critical.

iconfont

svg version sprite map

base64

// from a string
window.btoa('str');
// from a canvas
canvas.toDataURL();
// from an image file
const reader = new FileReader();
reader.readAsDataURL(file); // the result is available in reader.result after onload
// at webpack build time
// url-loader
  • Advantages: easy to inline in html or js, reducing the number of http requests.
  • Disadvantages: increases file size by around 30%.

It is suitable for scenarios with a small number of small images.
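The size overhead can be checked with a quick sketch. In Node, `Buffer` stands in for the browser's `btoa`:

```javascript
// Base64 encodes every 3 input bytes as 4 output characters, so the
// encoded string is about 4/3 of the original size (~33% larger).
function base64Overhead(str) {
  const encoded = Buffer.from(str).toString('base64');
  return encoded.length / str.length;
}

const sample = 'x'.repeat(3000);      // 3000 bytes of input
console.log(base64Overhead(sample));  // ≈ 1.33
```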

cache

DNS cache

Lookup process

  1. The browser's own DNS cache is checked first
  2. Then the hosts file for a domain-to-IP mapping (if you know the real IP of a site whose DNS is polluted but whose IP is not blocked, you can reach it by editing the hosts file)
  3. Then the local DNS resolver (router) cache
  4. Finally the root DNS server -> top-level .com server -> second-level domain server xx.com -> host www.xx.com
  • Clearly DNS resolution can take considerable time, so it is worth optimizing

DNS

  • dns-prefetch: for example, when visiting the Taobao homepage, the page guesses which domains you will visit next and resolves them in advance, saving DNS lookup time.
    However, a large amount of unnecessary prefetching wastes public DNS resources.
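The hint itself is just a link tag. A minimal sketch that generates and injects the tags; the hostnames are placeholders, not from the source:

```javascript
// Build the markup for a dns-prefetch hint; kept pure so it is testable.
function dnsPrefetchTag(hostname) {
  return '<link rel="dns-prefetch" href="//' + hostname + '">';
}

// Browser-only: inject hints for domains the page will likely hit next
// (placeholder hostnames).
if (typeof document !== 'undefined') {
  ['img.example-cdn.com', 'api.example.com'].forEach(function (host) {
    document.head.insertAdjacentHTML('beforeend', dnsPrefetchTag(host));
  });
}
```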

dns-prefetch

  • Optimized

dns-better

http cache

http cache

Briefly: in current SPA projects, static resources are generally placed on a CDN. Frequently changing entry files such as index.html are given Cache-Control: no-cache so they are always revalidated (or are not cached at all), while the other hash-named resources get a long cache (a max-age of roughly a year).
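That policy can be expressed as a tiny helper (an illustrative sketch, not a complete server; the regexes and values are assumptions):

```javascript
// Map a file name to the caching policy described above: entry HTML must
// always revalidate; hash-named assets can be cached for a long time.
function cacheControlFor(filename) {
  if (/\.html$/.test(filename)) return 'no-cache';                   // revalidate (304) every time
  if (/\.[0-9a-f]{8,}\./.test(filename)) return 'max-age=31536000';  // hashed name: cache ~1 year
  return 'no-cache';                                                 // safe default for the rest
}

console.log(cacheControlFor('index.html'));       // "no-cache"
console.log(cacheControlFor('main.3f9a1c2b.js')); // "max-age=31536000"
```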

The specific details are explained in another article, linked at the end of this one.

CDN (Content Delivery Network) content distribution network

advantage:

  1. Resource files are replicated in multiple locations; based on the proximity principle, the server closest to the user serves the request, giving high speed and availability;
  2. Bandwidth is expensive; without a CDN, a surge of visitors can make the site freeze or crash.

Local cache localStorage, sessionStorage, cookie

  • storage
  1. localStorage persists in the browser until deleted by the user or evicted by the browser's cache policy
  2. sessionStorage disappears when the page is closed

Advantage: can store larger data (about 5MB per origin in Chrome)

  • cookie

Compared with storage:

  1. Advantage: an expiration time can be set
  2. Disadvantages: small storage capacity, and under http1.x cookies are sent to the server with every request, wasting bandwidth.

Suggestion: use cookies as little as possible, set http-only on security-sensitive server cookies, and use them only for session state and user identification with the server.
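One gap noted above is that storage, unlike cookies, has no built-in expiry. A common workaround is to store a timestamp next to the value; sketched here with an injectable store and clock so it runs outside the browser (pass window.localStorage and Date.now in real use):

```javascript
// Wrap a storage-like object (localStorage in the browser, a stub in tests)
// with a time-to-live, emulating cookie expiration.
function createExpiringStore(storage, now) {
  return {
    set(key, value, ttlMs) {
      storage.setItem(key, JSON.stringify({ value, expires: now() + ttlMs }));
    },
    get(key) {
      const raw = storage.getItem(key);
      if (!raw) return null;
      const { value, expires } = JSON.parse(raw);
      if (now() > expires) { storage.removeItem(key); return null; } // expired: evict
      return value;
    }
  };
}
```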

browser rendering

CSS

  • Reduce the use of @import. While parsing html, the browser sniffs ahead and fetches CSS files concurrently; a file referenced via @import can only start downloading after the current CSS file has been downloaded and parsed.
  • CSS has high priority and should be downloaded and parsed first.

Script attributes defer, async

Both are valid only for external scripts. As is well known, script execution blocks DOM parsing; these two attributes exist to solve that problem.

  • defer downloads without blocking HTML parsing into the DOM. After downloading, the script waits for DOM construction to finish and executes before the DOMContentLoaded event fires. Multiple defer scripts are guaranteed to execute in order.

  • async downloads without blocking HTML parsing into the DOM and runs the script as soon as possible after download. The execution time is therefore unpredictable: the earlier the download finishes, the earlier it runs, and the script may still block DOM construction if the DOM is not yet complete.

  • Taobao, for example, uses async extensively in the head

async

  • One thing I don't quite understand: in principle, async should be used for scripts that depend neither on DOM construction nor on script order, and an async script that downloads quickly may still block DOM construction. defer feels more appropriate here.

Debounce, throttling

There are detailed descriptions in other articles, so I won't repeat them here, please see the reference index.
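The reference articles cover these in depth; as a quick reminder, a minimal throttle sketch. The clock is injected so the behaviour can be verified without real timers (use the Date.now default in practice):

```javascript
// Throttle: run fn at most once per waitMs window, dropping calls in between.
function throttle(fn, waitMs, now = Date.now) {
  let last = -Infinity;
  return function (...args) {
    if (now() - last >= waitMs) {
      last = now();
      return fn.apply(this, args);
    }
  };
}
```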

Prevent Forced Layout

  • Mainly: avoid alternating style reads and writes in a loop, which forces a synchronous layout on every read
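The usual remedy is to batch all reads before all writes; a minimal sketch with plain callback queues (libraries such as fastdom apply the same idea tied to requestAnimationFrame):

```javascript
// Queue style reads and writes separately, then flush all reads first and
// all writes second, so no write invalidates layout before a later read.
const reads = [];
const writes = [];

function measure(fn) { reads.push(fn); }
function mutate(fn) { writes.push(fn); }

function flush() {
  const order = [];
  reads.splice(0).forEach(fn => { order.push('read'); fn(); });
  writes.splice(0).forEach(fn => { order.push('write'); fn(); });
  return order; // returned only to make the ordering observable
}
```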

FSL

GPU acceleration

  1. will-change: transform
  2. transform: translateZ(0)

These promote the element onto its own layer rendered by the GPU, which suits animations.

requestAnimationFrame, requestIdleCallback

  • Google's documentation discusses these newer APIs extensively.
  • Facebook's latest React fiber scheduler uses requestAnimationFrame- and requestIdleCallback-style scheduling to time-slice long tasks, avoiding the long pauses and jank that deep DOM-tree updates used to cause.
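The idea behind such time slicing can be sketched with a generator plus a deadline check. In the browser requestIdleCallback would supply the deadline; here both the budget and the scheduler are injected (all names are illustrative):

```javascript
// Yield control between items so one long task becomes many short slices.
function* sliceWork(items, handle) {
  for (const item of items) {
    handle(item);
    yield; // a resumption point between items
  }
}

// Keep working while timeLeft() reports budget, then hand the rest of the
// iterator to the scheduler (e.g. requestIdleCallback in the browser).
function runSliced(iterator, timeLeft, schedule) {
  let step = iterator.next();
  while (!step.done && timeLeft() > 0) step = iterator.next();
  if (!step.done) schedule(() => runSliced(iterator, timeLeft, schedule));
}
```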

main-thread

web worker

Work that requires heavy computation and would occupy the main rendering thread is well suited to execution in a web worker.
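A minimal sketch of the offloading pattern; the worker file name worker.js and the message shape are assumptions:

```javascript
// Heavy, pure computation that would otherwise block the main thread.
function heavySum(n) {
  let total = 0;
  for (let i = 1; i <= n; i++) total += i;
  return total;
}

// Browser-only wiring: run the computation inside a worker and await the
// result (worker.js is assumed to call heavySum on each message).
if (typeof Worker !== 'undefined') {
  const worker = new Worker('worker.js');
  worker.onmessage = e => console.log('sum:', e.data);
  worker.postMessage(1e8);
}
```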

server

http2

Compared with http1.0, the main improvements of http1.1 are

  1. Better cache handling, e.g. Etag
  2. The range header and the 206 (Partial Content) response code, supporting resumable transfers
  3. The host header, so multiple domain names can be bound to one IP
  4. The Connection: keep-alive response header for long-lived connections, so the client does not repeat the TCP three-way handshake for every request to the same host

https vs. http

  1. https requires applying for a CA certificate, which costs money; https adds a security layer, SSL/TLS, on top of http;
  2. The client verifies the CA certificate; establishing the connection uses asymmetric encryption (time-consuming), while the transmitted data uses symmetric encryption;
  3. https prevents operators from hijacking http traffic and inserting ads. http uses port 80, https port 443.

http2 vs. http1.x

  1. Header compression: under http1.x every request carries largely identical header information; http2 compresses headers and avoids retransmitting repeated ones;
  2. A new binary format: http1.x parses a text protocol, while http2 works on binary frames, which is more convenient and robust;
  3. Multiplexing, not to be confused with keep-alive. http2 shares one connection: each request has a unique stream id so responses can be matched to requests, and multiple requests and responses can be interleaved at the same time.
    keep-alive merely reduces handshakes by keeping the connection open (which ties up server resources); requests are still first-in-first-out, so the next one must wait for the previous to be sent, causing head-of-line blocking, and browsers limit the number of concurrent http connections a page may open to each server;
  4. Server push: an http1.x server can only send resources in response to requests, while http2 can push them proactively.

http2 improves transmission efficiency; nginx needs to handle the http2 upgrade and fallback.

gzip

When the server enables compression, text files need far less network transfer.
Compression is especially effective when files are large and highly repetitive.

As shown, the index.html file shrinks by (383-230)/383 = 39.95%.

gzip

  1. Chrome's request headers tell the server which compression algorithms it supports: Accept-Encoding: gzip, deflate, br
  2. The server's response headers state the algorithm actually used: Content-Encoding: gzip
  • Enabling it in nginx
gzip on;
# Minimum size threshold: only files larger than 1K are compressed; rarely needs changing
gzip_min_length 1k;
# Compression level, 1-9: higher levels compress better but take longer
gzip_comp_level 2;
# File types to compress
gzip_types text/plain application/javascript application/x-javascript text/css application/xml text/javascript application/x-httpd-php image/jpeg image/gif image/png;
# Poor compatibility with old IE, so disable it there
gzip_disable "MSIE [1-6]\.";

compression-webpack-plugin with gzip

npm i compression-webpack-plugin -D
const CompressionWebpackPlugin = require('compression-webpack-plugin');

// Add to plugins: [] in webpack.config.js.
new CompressionWebpackPlugin({
  asset: '[path].gz[query]',  // target file name
  algorithm: 'gzip',          // compress with gzip
  test: /\.(js|css)$/,        // compress js and css
  threshold: 10240,           // only compress assets larger than 10240B = 10kB
  minRatio: 0.8               // only compress when the ratio reaches 0.8
})
  • Advantage
    When nginx has gzip enabled and a pre-compressed .gz file already exists, it is served directly, saving server CPU.
  • Disadvantages
    Packaging time grows. Nowadays static resources are generally uploaded to a CDN, and gzip is a basic service of CDN servers.
    Is it worth lengthening the build to save server resources you have already paid for?

ES6 and dynamically loaded polyfills

webpack supports tree-shaking of ES6 modules by default, shaking out unused code, and the new APIs perform very well.
Full use of ES6 is recommended.

// returns different content depending on the UA
https://polyfill.io/v3/polyfill.min.js

Comparison of solutions:

  • babel-polyfill — advantage: officially recommended by React; drawback: about 200kb in size
  • babel-plugin-transform-runtime — advantage: small size; drawback: cannot polyfill methods on prototypes
  • polyfill-service — advantage: loads dynamically according to the UA; drawback: compatibility problems with some domestic browsers

Conclusion: this can reduce resource size, but depending on an external service (or self-hosting one) is troublesome, so we gave it up.

reference

  1. From Optimization to Interview Pretending Guide - Web Series

browser rendering

  1. Debounce and throttle
  2. How to build a 60FPS application
  3. Anatomy of a frame
  4. Performance optimization - critical path rendering optimization

webpack

  1. Hands-on: using webpack 4 the right way (part 2)

http-related

  1. HTTP cache
  2. TCP's three-way handshake and four-way teardown (detailed + animated)
  3. The difference between HTTP1.0, HTTP1.1 and HTTP2.0
  4. nginx complete configuration
  5. DNS recursive query and iterative query

iconfont

  1. Instructions
  2. related articles


Origin blog.csdn.net/guduyibeizi/article/details/104030345