Article directory
Front-end extreme performance optimization manual
Essential knowledge points on the road to front-end optimization.
- What happens between typing a URL in the browser and the page being displayed, and which steps the front end can optimize
- DNS recursive query resolution and DNS prefetching (dns-prefetch);
- TCP three-way handshake and four-way teardown; the differences between http1.0/1.1/2.0, and between http and https;
- http caching: 304 responses and CDNs;
- Browser rendering: why CSS and JS order matters, the cost of @import, debounce, throttle, reflow, repaint, GPU acceleration, etc.;
- Keeping the JS main thread free: web workers, time slicing
- …
- Image optimization: sprites, webp, svg;
- Build optimization with webpack and similar bundlers
- Operations basics: nginx
This article summarizes the basic points of front-end performance optimization in a certain order. You can check your projects against it step by step to find performance bottlenecks. If there are errors or omissions, corrections are welcome.
Some of the underlying principles and details are covered in the reference articles, which are well worth reading.
webpack
By default, webpack4 ships with many good internal optimizations, but they cannot cover every business scenario.
If builds are slow during development or the bundle is too large, it is time to review the configuration.
Bundle analysis with webpack-bundle-analyzer
npm i webpack-bundle-analyzer -D
- Modify webpack.config.js:

```javascript
// add at the top of the file
const { BundleAnalyzerPlugin } = require('webpack-bundle-analyzer');

// add the following to the plugins: [] array
new BundleAnalyzerPlugin({
  analyzerMode: 'server',
  analyzerHost: '127.0.0.1',
  analyzerPort: 8000,
  reportFilename: 'report.html',
  defaultSizes: 'parsed',
  openAnalyzer: true,
  generateStatsFile: false,
  statsFilename: 'stats.json',
  statsOptions: null,
  logLevel: 'info'
})
```
- Start the local development server and open http://127.0.0.1:8000 in the browser
- webpack4's default code-splitting strategy creates a new chunk when:
  - the chunk would be shared between entries, or its modules come from node_modules
  - the chunk is larger than 30kb before compression
  - the number of parallel requests for on-demand chunks stays at or below 5
  - the number of parallel requests at initial page load stays at or below 3
For example, antd UI components are imported frequently, but each module is smaller than 30kb, so none of them gets split into its own chunk, and the same code ends up duplicated across many chunks.
In that case the default configuration no longer fits; adjust the strategy according to the actual business, for example into:
- a `vendor` chunk (core framework dependencies)
- a `libs` chunk (large third-party libraries)
- a `common` chunk for other shared modules from node_modules
```javascript
// default configuration
splitChunks: {
  chunks: 'all',
  name: false,
}

// modified configuration
splitChunks: {
  chunks: 'all',
  name: false,
  cacheGroups: {
    vendor: {
      name: 'vendor',
      test: (module) => /(react|react-dom|react-router-dom|mobx|mobx-react)/.test(module.context),
      chunks: 'initial',
      priority: 11
    },
    libs: {
      name: 'libs',
      test: (module) => /(moment|antd|lodash)/.test(module.context),
      chunks: 'all',
      priority: 14
    },
    common: {
      chunks: 'async',
      test: /[\\/]node_modules[\\/]/,
      name: 'common',
      minChunks: 3,
      maxAsyncRequests: 5,
      maxSize: 1000000,
      priority: 10
    }
  }
}
```
Conclusion:
- volume before optimization: 56MB (without sourceMap)
- volume after optimization: 36MB (with sourceMap; the sourceMap accounts for about 7.625MB)
Removing unused CSS with glob and purgecss-webpack-plugin
npm i glob purgecss-webpack-plugin -D
```javascript
// add to plugins: [] in webpack.config.js
// note: paths must be absolute; if some styles (e.g. antd's) live outside src,
// pass an array of globs, otherwise those styles will be stripped out
new PurgecssWebpackPlugin({
  paths: glob.sync(`${paths.appSrc}/**/*`, { nodir: true })
})
```
Conclusion: the CSS output shrinks considerably. After these steps:
- the bundle is down to about 7.25MB;
- build time dropped from 7.5 minutes to 2.5 minutes, a big efficiency win.
Images
webp
webp is a modern image format that provides both lossless and lossy compression while maintaining quality. For sites with many images it is an essential optimization, and CDNs provide conversion services.

- Taobao uses it at large scale.
- Advantages: at the same quality, lossless webp is about 26% smaller than png, and lossy webp is 25-34% smaller than jpeg.
- Disadvantages: some browsers do not support it, so a fallback is needed. Below is the official feature-detection snippet:
```javascript
// check_webp_feature:
// 'feature' can be one of 'lossy', 'lossless', 'alpha' or 'animation'.
// 'callback(feature, result)' will be passed back the detection result (in an asynchronous way!)
function check_webp_feature(feature, callback) {
  var kTestImages = {
    lossy: "UklGRiIAAABXRUJQVlA4IBYAAAAwAQCdASoBAAEADsD+JaQAA3AAAAAA",
    lossless: "UklGRhoAAABXRUJQVlA4TA0AAAAvAAAAEAcQERGIiP4HAA==",
    alpha: "UklGRkoAAABXRUJQVlA4WAoAAAAQAAAAAAAAAAAAQUxQSAwAAAARBxAR/Q9ERP8DAABWUDggGAAAABQBAJ0BKgEAAQAAAP4AAA3AAP7mtQAAAA==",
    animation: "UklGRlIAAABXRUJQVlA4WAoAAAASAAAAAAAAAAAAQU5JTQYAAAD/AABBTk1GJgAAAAAAAAAAAAAAAAAAAGQAAABWUDhMDQAAAC8AAAAQBxAREYiI/gcA"
  };
  var img = new Image();
  img.onload = function () {
    var result = (img.width > 0) && (img.height > 0);
    callback(feature, result);
  };
  img.onerror = function () {
    callback(feature, false);
  };
  img.src = "data:image/webp;base64," + kTestImages[feature];
}
```
- Alternatively, the server can decide automatically from the client's `accept: image/webp` request header: return webp if it is supported, otherwise the original image.
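The server-side negotiation can be sketched as a simple check on the Accept header. `pickImageFormat` is a hypothetical helper for illustration, not part of any real server framework:

```javascript
// Sketch: serve webp only when the client advertises support in its Accept header.
function pickImageFormat(acceptHeader, originalExt) {
  return /image\/webp/.test(acceptHeader || '') ? 'webp' : originalExt;
}
```

In a real setup this decision is usually made by the CDN or an nginx `map` on `$http_accept`, not by application code.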
Sprites
Multiple small images are combined into one, and each is shown by shifting the sheet with background-position.
This reduces the number of http requests. (Under HTTP2, multiplexing solves head-of-line blocking, so the benefit is smaller.)

Alternatives:
- iconfont
- svg sprites
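The background-position arithmetic can be sketched for a vertical sprite strip. The icon index and fixed icon height here are assumptions for illustration:

```javascript
// Sketch: background-position for the n-th icon in a vertical sprite strip.
// Assumes all icons share the same height and are stacked top to bottom.
function spritePosition(index, iconHeight) {
  return `0 -${index * iconHeight}px`; // shift the strip up to reveal icon n
}
```

In practice the offsets are normally baked into CSS classes by the sprite-generation tool rather than computed at runtime.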
base64
```javascript
// string to base64
window.btoa('str');
// canvas to base64
canvas.toDataURL();
// image file to base64 (the result arrives asynchronously via onload)
const reader = new FileReader();
reader.onload = () => { const imgUrlBase64 = reader.result; };
reader.readAsDataURL(file);
// base64 inlining at build time: url-loader in webpack
```
- Advantages: easy to inline in html or js, reducing the number of http requests.
- Disadvantages: increases file size by around 30%.

Suited to scenarios with a small number of small images.
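The ~30% overhead follows from base64 mapping every 3 input bytes to 4 output characters. A quick check with Node's Buffer (Node-only here, since window.btoa is a browser API):

```javascript
// base64 encodes 3 bytes as 4 characters, so output is ~33% larger.
const raw = Buffer.alloc(300, 0xff);          // 300 bytes of binary data
const encoded = raw.toString('base64');       // 400 characters
const overhead = encoded.length / raw.length; // 4 / 3
```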
cache
DNS cache
Lookup process
- The browser's own DNS cache is checked first
- Then the hosts file's domain-to-IP mappings (for a site blocked by DNS pollution but whose IP is still reachable, adding a hosts entry restores access)
- Then the local DNS resolver's cache (e.g. the router)
- Finally recursive resolution: root DNS server -> top-level .com server -> second-level xx.com server -> host www.xx.com
It is clear that DNS resolution can take significant time, so optimizing with dns-prefetch is worthwhile: Taobao's homepage, for instance, guesses which domains you are likely to visit next and resolves them in advance, saving later DNS lookups.
Be restrained, though: large numbers of unnecessary prefetches waste public network resources.
http cache
Briefly, for current SPA projects, static resources generally sit on a CDN. The frequently changing entry file index.html is given Cache-Control: no-cache (forced revalidation) or is not cached at all; other resources carry a content hash in their file names and get a long max-age (on the order of a year).
The details are covered in another article, linked at the end.
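That policy can be sketched as a small routing helper. The hash pattern and header values below are illustrative assumptions, not any library's API:

```javascript
// Sketch: map a file name to a Cache-Control policy.
// Entry html revalidates on every load; content-hashed assets cache for a year.
function cacheControlFor(filename) {
  if (/\.html$/.test(filename)) return 'no-cache';
  if (/\.[0-9a-f]{8,}\./.test(filename)) return 'max-age=31536000'; // hashed name
  return 'max-age=3600'; // everything else: a modest default
}
```

The key idea: because a content-hashed file name changes whenever its contents change, a long cache can never serve stale code.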
CDN (Content Delivery Network) content distribution network
Advantages:
- Resource files are replicated in multiple locations; based on the principle of proximity, the server closest to the user serves them, which is fast and reliable;
- Bandwidth is expensive: without a CDN, a large number of visitors hitting the origin directly would freeze or crash the site.
Local cache localStorage, sessionStorage, cookie
- storage
  - localStorage persists in the browser until deleted by the user or evicted by the browser's cache policy
  - sessionStorage disappears when the page is closed
  - Advantage over cookies: larger capacity (about 5MB per origin in Chrome)
- cookie
  - Advantage: an expiration time can be set
  - Disadvantages: small capacity, and under http1.x it is sent to the server with every request, wasting bandwidth

Suggestion: set http-only on security-sensitive server data, use cookies as little as possible, and only for session state and user identification with the server.
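A common pattern is a small JSON wrapper over localStorage. The storage object is injected here so the sketch also runs outside a browser; `makeStore` is a hypothetical helper:

```javascript
// Sketch: JSON (de)serialization over a Storage-like object.
// Pass window.localStorage in the browser; any setItem/getItem pair works.
function makeStore(storage) {
  return {
    set(key, value) { storage.setItem(key, JSON.stringify(value)); },
    get(key) {
      const raw = storage.getItem(key);
      return raw == null ? null : JSON.parse(raw);
    },
  };
}
```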
browser rendering
CSS
- Reduce the use of @import. While parsing html, the browser sniffs ahead and fetches stylesheets concurrently; with @import, the current CSS file must be downloaded and parsed before the imported file can even start downloading.
- CSS has high priority and should be downloaded and parsed first.
Script defer and async

These attributes only apply to external scripts. As is well known, script execution blocks DOM parsing; both attributes exist to mitigate that.
- defer: downloading does not block HTML parsing into the DOM. After downloading, the script waits for DOM construction to finish and executes before the DOMContentLoaded event fires. Multiple defer scripts are guaranteed to run in order.
- async: downloading does not block HTML parsing either, but the script runs as soon as possible after it downloads. Execution time is therefore unpredictable (download early, execute early), and the script may block DOM construction if the DOM is not complete yet.
- For example, Taobao uses async heavily in the page head.
- Though I don't quite understand this: in principle async suits scripts that depend neither on DOM construction nor on script order, and a fast download can still block DOM construction, so defer feels more appropriate.
Debounce and throttle
Described in detail in other articles, so not repeated here; see the reference index.
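For completeness, a minimal throttle sketch (timestamp-based, leading edge only; production implementations usually also support a trailing call):

```javascript
// Sketch: run fn at most once per `wait` milliseconds (leading edge only).
function throttle(fn, wait) {
  let last = 0; // timestamp of the last accepted call
  return function (...args) {
    const now = Date.now();
    if (now - last >= wait) {
      last = now;
      fn.apply(this, args);
    }
  };
}
```

Typical use: wrapping scroll or resize handlers so they fire at a bounded rate.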
Prevent Forced Layout
- The main rule: avoid interleaving style reads and writes inside a loop
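The rule can be shown with a read-then-write batch. The element objects below are plain mocks standing in for DOM nodes:

```javascript
// Sketch: batch all layout reads before all writes.
// Interleaving el.offsetWidth reads with style writes in a single loop
// would force a synchronous layout on every iteration.
function halveWidths(elements) {
  const widths = elements.map((el) => el.offsetWidth); // phase 1: read
  elements.forEach((el, i) => {                        // phase 2: write
    el.style.width = `${widths[i] / 2}px`;
  });
}
```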
GPU acceleration
will-change: transform
transform: translateZ(0)

These promote the element onto its own compositor layer rendered by the GPU, which suits animations.
requestAnimationFrame, requestIdleCallback
Google's documentation discusses these APIs extensively for detecting idle time. Facebook's React uses the same idea: the latest fiber scheduler relies on requestAnimationFrame / requestIdleCallback-style scheduling to time-slice long tasks, avoiding the long pauses and jank that deep DOM-tree updates used to cause.
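The slicing idea can be sketched with a generator that yields fixed-size chunks; in a browser, each chunk would be scheduled via requestIdleCallback, which is left as a comment so the sketch stays runnable anywhere:

```javascript
// Sketch: split a long list into chunks so the main thread can breathe
// between them.
function* chunks(items, size) {
  for (let i = 0; i < items.length; i += size) {
    yield items.slice(i, i + size);
  }
}

function processInSlices(items, size, work) {
  const out = [];
  for (const chunk of chunks(items, size)) {
    out.push(...chunk.map(work));
    // In a browser, schedule the next chunk with requestIdleCallback
    // instead of looping synchronously, so rendering can happen in between.
  }
  return out;
}
```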
web worker
Work that requires heavy computation and would otherwise occupy the main rendering thread is a good fit for a web worker.
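A sketch of the split: keep the heavy function pure so it can live in a worker file. The worker wiring is browser-only and therefore shown as comments (`worker.js` is a hypothetical file name):

```javascript
// Sketch: a pure, CPU-heavy function suitable for moving off the main thread.
function heavySum(numbers) {
  let total = 0;
  for (const n of numbers) total += n; // stand-in for expensive work
  return total;
}

// In the browser (hypothetical worker.js):
//   self.onmessage = (e) => self.postMessage(heavySum(e.data));
// On the main thread:
//   const w = new Worker('worker.js');
//   w.postMessage([1, 2, 3]);
//   w.onmessage = (e) => { /* use e.data without ever blocking rendering */ };
```

Workers communicate only by message passing, so data is copied (or transferred) rather than shared.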
server
http2
Compared with http1.0, the main improvements of http1.1 are:
- Enhanced cache handling, e.g. Etag
- The range request header and the 206 (Partial Content) response code, enabling resumable transfers
- The host header, so multiple domain names can be bound to one IP
- The Connection: keep-alive response header: a long-lived connection means the client does not need a new TCP three-way handshake for each request to the same host
https vs. http

- https requires applying for a CA certificate, which costs money; https adds an SSL/TLS security layer on top of http;
- The client verifies the CA certificate; establishing the connection uses asymmetric encryption (time-consuming), after which data is transmitted with symmetric encryption;
- https prevents operators from hijacking http traffic, injecting ads, and so on. http uses port 80, https port 443.
http2 vs. http1.x
- Header compression: under http1.x every request carries many identical headers; http2 compresses headers and avoids retransmitting repeats;
- New binary framing: http1.x parses a text-based protocol, while http2 works on binary frames, which is simpler and more robust;
- Multiplexing, not to be confused with keep-alive: http2 shares one connection, each request carries a unique stream id, and many requests can be interleaved at the same time. keep-alive merely reduces handshakes by holding the connection open; requests remain first-in-first-out, each waiting for the previous one to finish (head-of-line blocking), and browsers cap the number of parallel http connections a page may open per server;
- Server push: an http1.x server can only send resources in response to requests, whereas http2 can push them proactively.

http2 improves transmission efficiency; have nginx handle the http2 upgrade and downgrade negotiation.
gzip
The server enables compression so text files take less network transfer.
The effect is most pronounced for large text files with a high repetition rate.
For example, compressing index.html here saved (383 - 230)/383 ≈ 39.95%.
- The browser's request header tells the server which compression algorithms it supports: Accept-Encoding: gzip, deflate, br
- The server's response header states the algorithm actually used: Content-Encoding: gzip
- Enabling it in nginx:

```nginx
gzip on;
# compression threshold: only files larger than 1k are compressed; rarely needs changing
gzip_min_length 1k;
# compression level 1-9: higher compresses smaller but takes longer
gzip_comp_level 2;
# file types to compress
gzip_types text/plain application/javascript application/x-javascript text/css application/xml text/javascript application/x-httpd-php image/jpeg image/gif image/png;
# poor compatibility with old IE, so disable it there
gzip_disable "MSIE [1-6]\.";
```
compression-webpack-plugin with gzip
npm i compression-webpack-plugin -D
```javascript
const CompressionWebpackPlugin = require('compression-webpack-plugin');

// add to plugins: [] in webpack.config.js
new CompressionWebpackPlugin({
  asset: '[path].gz[query]', // target file name
  algorithm: 'gzip', // compress with gzip
  test: /\.(js|css)$/, // compress js and css
  threshold: 10240, // only assets larger than 10240B = 10kB are compressed
  minRatio: 0.8 // only emitted if the compression ratio reaches at least 0.8
})
```
- Advantage: when nginx has gzip on and a pre-compressed .gz file already exists, it is served directly, reducing CPU use on the server.
- Disadvantages: the build takes longer. Static resources are usually uploaded to a CDN nowadays, and gzip is a basic service of CDN servers. Is it worth lengthening the build to save resources on servers you have already paid for?
es6 and dynamically served polyfills
webpack's tree-shaking works on es6 modules by default and can shake out unused code, and the new APIs perform very well.
Using es6 throughout is recommended.
```
// returns different content depending on the user agent
https://polyfill.io/v3/polyfill.min.js
```
| Plan | Advantage | Shortcoming |
| --- | --- | --- |
| babel-polyfill | Officially recommended by React | around 200kb |
| babel-plugin-transform-runtime | Small | cannot polyfill methods on prototypes |
| polyfill-service | Loads dynamically based on the UA | compatibility issues with some domestic browsers |
Conclusion: resource size can be reduced, but relying on an external service is fragile and self-hosting one is a hassle, so it was abandoned.
reference
browser rendering
- Debounce and throttle
- How to build a 60FPS application
- Anatomy of a frame
- Performance optimization - critical path rendering optimization
webpack
http-related
- HTTP cache
- TCP's three-way handshake and four-way wave (detailed + animation)
- The difference between HTTP1.0, HTTP1.1 and HTTP2.0
- nginx complete configuration
- DNS recursive query and iterative query