Front-end performance analysis and improvement plan

How to evaluate the performance of a web page?

The first thing that comes to mind is speed: fast enough to open within a second or so.
But there is actually a second dimension: stability.

What does website stability mean?
If a page gradually freezes, or even crashes, after you have operated it for a long time, that is a symptom of poor stability.

In other words, page performance is acceptable when the page opens fast enough, responds fast enough during interaction, and can run stably for a long time without freezing.

The "fastness" of the page

In 1993, Jakob Nielsen proposed three response-time limits:

  • A load time under 0.1 seconds (100 milliseconds) feels instantaneous
  • 1 second is about the limit for the user's flow of thought to stay uninterrupted; the delay is noticed, but acceptable
  • 10 seconds is about the limit for holding the user's attention; most users leave a site after waiting that long

Of course, these numbers refer to public Internet pages. For C-end (consumer-facing) or B-end (business-facing) applications, customers have to use them anyway, so opening within a second is rare; 2-3 seconds is typical. Why?
The reason is that in many projects, performance is not considered at all during development: all the time goes into business code.
So what affects how fast a webpage loads?
At this point, many front-end developers will recall a classic interview question, "What are the ways to improve page performance?", and answer:

  1. Reduce the number of HTTP requests
  2. Compress images; use sprite sheets
  3. Minify JS and CSS
  4. Enable gzip
  5. Use a DNS caching strategy
  6. Use a CDN cache (if the cached resources are shared with other sites, everyone benefits)
  7. Lazy loading
    ...

A few front-end performance testing tools

Lighthouse, built into Chrome's DevTools


Pingdom Website Speed Test

A fairly ordinary online tool.
Address: https://tools.pingdom.com/

WebPagetest

Also online, more powerful, with multiple test modes to choose from.
Address: https://www.webpagetest.org/

GTmetrix

Offers some optimization suggestions.
Address: https://gtmetrix.com/

Keycdn Tool

Breaks down the loading time and HTTP headers of each resource.
Address: https://tools.keycdn.com/speed

GiftOfSpeed

Ordinary.
Address: https://www.giftofspeed.com/

Pagelocity

Claims to let you compare against competitors' pages, but in practice this is not very useful.
Address: https://pagelocity.com/

Sucuri Load Time Tester

Only measures load speed, which is of limited use.
Address: https://performance.sucuri.net/

Geekflare

Tests a website's DNS, security, performance, network, and SEO issues.
Address: https://gf.dev/

Dotcom-monitor

Can also test screen compatibility along the way.
Address: https://www.dotcom-tools.com/website-speed-test

Dareboost

Compares performance across different browsers.
Address: https://www.dareboost.com/en

What are performance analysis tools for?

The real reason for listing these tools is to illustrate the specific page-performance metrics the industry cares about:

  1. Opening speed
  2. Page file size
  3. Request speed
  4. Blocking time
  5. Number of requests
  6. DNS resolution time
  7. Other enhancements

Apart from Lighthouse, the other online tools can help locate problems, but they are of limited use, especially now that most pages are single-page applications.
Don't pay too much attention to them; their main value is serving as test reports that some customers will accept.
Personally, I recommend pressing F12 to open the Network panel, sorting by file size and loading time, and analyzing things yourself. How to do that analysis is covered below.

What to pay attention to in the Network panel


  1. See how many resources are loaded, which of them are unnecessary, and which files are too large
  2. File loading time is affected by the network and server bandwidth; in practice, total time is usually dominated by API requests
  3. Check the response headers for the cache configuration, and whether the expiration times are set reasonably
  4. In the screenshot above, gzip was not enabled
  5. API endpoints can be merged or split. Some endpoints are slow only because of a few pieces of data; ask the backend to split those out so the normal endpoints stay fast. Conversely, endpoints that always travel together for one business action can be merged, since "reduce HTTP requests" includes merging API calls
  6. If a file is too large, analyze it with webpack (see below)
  7. For caching strategy, see these two articles:
  • https://blog.csdn.net/qq_38217940/article/details/125360427
  • https://blog.csdn.net/qq_38217940/article/details/125349105
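Point 5 above, firing related requests together, can be sketched as follows. `fetchJson` here is a hypothetical stand-in for a real HTTP client (e.g. `fetch(url).then(r => r.json())`):

```javascript
// fetchJson is a stub simulating a network call (assumption, not a real API)
async function fetchJson(url) {
  return { url, data: `payload for ${url}` };
}

// Naive: three sequential awaits means three round trips, one after another
async function loadDashboardSlow() {
  const user = await fetchJson('/api/user');
  const stats = await fetchJson('/api/stats');
  const news = await fetchJson('/api/news');
  return { user, stats, news };
}

// Better: fire the independent requests in parallel so total latency is
// roughly the slowest single request rather than the sum of all three
async function loadDashboardParallel() {
  const [user, stats, news] = await Promise.all([
    fetchJson('/api/user'),
    fetchJson('/api/stats'),
    fetchJson('/api/news'),
  ]);
  return { user, stats, news };
}
```

Going one step further, merging them into one backend endpoint (e.g. a single `/api/dashboard`) removes the extra round trips entirely, which is what "interface merging" means above.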

Use webpack's webpack-bundle-analyzer to analyze the packaged files

For usage, see npm: https://www.npmjs.com/package/webpack-bundle-analyzer

How to split JS?
1. Lazy loading of routes

import( /* webpackChunkName: "login" */ '@/layout')

2. Lazy loading of components (dynamic components)

<template>
    <div class="full-container container-wrap">
        <component v-bind:is="currentComponent" @change-page="changePage">
        </component>
    </div>
</template>

<script>
const Home = () => import(/* webpackChunkName: "organize-conpenents-home" */ "./components/home");
export default {
    data() {
        return {
            currentComponent: Home,
            currentComponentName: 'Home',
        };
    },
}
</script>

3. js dynamic loading

// Lazy-load a JS file by injecting a <script> tag
var loadScript = (url) => {
    return new Promise((resolve, reject) => {
        const script = document.createElement('script');
        script.src = url;
        script.onload = resolve;
        script.onerror = reject;
        const head = document.getElementsByTagName('head')[0];
        head.appendChild(script);
    });
};
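In practice you usually want each URL loaded at most once, even if several callers request it. A small memoizing wrapper (a sketch; `load` is any `(url) => Promise`, such as the loadScript above, and the counting stub below stands in for real DOM loading):

```javascript
// Wrap any async loader so repeated calls for the same URL share one Promise
const memoizeLoader = (load) => {
  const cache = new Map();
  return (url) => {
    if (!cache.has(url)) {
      cache.set(url, load(url)); // store the in-flight Promise itself
    }
    return cache.get(url);
  };
};

// Demo with a counting stub instead of real script injection
let calls = 0;
const loadOnce = memoizeLoader(async (url) => {
  calls += 1;
  return `loaded ${url}`;
});
```

Caching the Promise (not the result) also deduplicates concurrent calls made while the first load is still in flight.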

4. Turn off Vue's prefetching (turn this off when you are inspecting loading and caching, otherwise prefetching everything will distort your observations)
Prefetching looks like this (the prefetch attribute):

<link href=./css/app.02a07d15.css rel=prefetch>

To turn it off:

   chainWebpack: config => {
        // remove the prefetch plugin
        config.plugins.delete('prefetch');
        // remove the preload plugin
        config.plugins.delete('preload');
    },

5. Merging JS chunks with webpack
Write the same webpackChunkName for the imports you want merged:

import( /* webpackChunkName: "login" */ '@/login')
import( /* webpackChunkName: "login" */ '@/layout')

Use webpack's speed-measure-webpack-plugin to measure build speed

For usage, see npm: https://www.npmjs.com/package/speed-measure-webpack-plugin

How to improve build speed? (Only the approaches are listed here; search for the details of each.)

  1. Switch to Vite (webpack can actually be configured to similar effect; also, Vite mainly improves the development experience, and chunk splitting still has to be configured yourself)
  2. Build on demand based on input parameters: if page=login is passed in, only the login route is built; see multi-page configurations for reference
  3. Use caching, i.e. cache-loader, but it has pitfalls and sometimes causes modified pages not to update
  4. Specify entries explicitly; webpack's entry resolution can take a long time
  5. Remove unnecessary loaders. For example, local development does not need Babel to transpile to ES5, since recent browsers support ES6 natively
  6. Keep static files out of the build. Copying hundreds of megabytes of static files can take several seconds, even though a command-line copy is much faster. For local startup they can also be excluded from the build, which requires pointing devServer at the static files; this is slightly more involved
  7. Turn off ESLint during the build and rely on an editor plugin (e.g. in VS Code) for checking; auto-formatting works well as long as the whole team uses the same setup
  8. Multi-process builds, like the classic HappyPack

Image processing

CDN

You need to rent this from a cloud vendor, such as Alibaba Cloud:
https://www.aliyun.com/?utm_content=se_1011952919

Compression

  1. Use https://tinypng.com/ to compress images online
  2. Tune webpack's url-loader as needed: images below its limit are inlined as base64, and you can lower the threshold (e.g. from 10k to 5k):
module: {
  rules: [
    {
      test: /\.(jpeg|jpg|png|svg|gif)$/,
      use: {
        loader: 'url-loader', // emits ES modules by default
        options: {
          esModule: false, // use the CommonJS module convention
          outputPath: 'images', // output directory for the files
          name: 'images/[contenthash:4].[ext]',
          limit: 20 * 1024 // inline files smaller than 20k as base64
        }
      }
    }
  ]
}
  3. Use background-image with caution, and only for things that must display immediately, because it blocks loading; worst of all it may be converted to base64 and bundled into the JS. A large image can instead be placed behind content using positioning + z-index with an img tag, which loads asynchronously (note: this applies to older browsers; in recent browsers background-image also loads asynchronously)
  4. Lazy-load images outside the visible viewport
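The core of image lazy loading is a "is this element near the viewport yet?" check. Modern code delegates this to IntersectionObserver (or the `loading="lazy"` attribute), but the underlying geometry is simple. A minimal sketch, where `rect` stands for the `{ top, bottom }` of `getBoundingClientRect()`:

```javascript
// Returns true when the element's rect overlaps the viewport, optionally
// expanded by `margin` pixels so images start loading slightly early
function isNearViewport(rect, viewportHeight, margin = 0) {
  return rect.bottom >= -margin && rect.top <= viewportHeight + margin;
}
```

A scroll handler would run this check per image and swap a `data-src` into `src` once it returns true; IntersectionObserver does the same job without firing on every scroll event.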

The "stability" of the page

What causes a page to freeze or crash?
Many people will blurt out: memory leaks.
That answer is superficial. A memory leak is only one of the causes of page freezes; the full list includes:

  • Too many DOM nodes
  • Frequent page reflows (with many DOM nodes, reflow causes jank, e.g. dashboard-style pages with large numbers of elements)
  • Too many event listeners on the page; if they are not removed in time they keep consuming CPU, so clean up any that are no longer in use
  • Too many timers and too many requests, causing frequent JS computation
  • Memory leaks; in practice leaked memory does eventually get forcibly reclaimed, so the impact is smaller than people assume. Freezes are more often caused by garbage not being collected in time, for example too many objects hanging on window: objects attached to window are never treated as garbage, even if you no longer use them
  • Computation over datasets of 100,000+ records; on a low-end machine this can crash the page
  • Large amounts of data stored on window; this overlaps with the previous point. It also includes JSON.parse and JSON.stringify on very large objects, which can crash the page (though some browser versions just throw an error)

Based on the above, here are a few simple ways to avoid performance degradation:

  1. Use inline style attributes on elements sparingly; prefer classes or other selectors
  2. Animate with position changes that avoid reflow
  3. Fixed widths and heights are fine; don't make everything adaptive with flex
  4. Try to keep CSS selectors no more than three levels deep, because reflow re-parses the CSS to rebuild the render tree; the deeply nested selectors people often write in Less are actually a mistake
  5. When there are many animations, use GPU acceleration: using the Z axis with transform enables the GPU. For example, use translate3d instead of translate; translate uses the CPU, translate3d uses the GPU
  6. If there are too many DOM nodes, consider rendering with canvas, or remove DOM nodes that are not visible
  7. Listeners come in pairs: every on needs an off, e.g. in Vue's beforeDestroy
  8. Assign null to things on window once you are done with them
  9. Hand heavy computation to the backend, or otherwise put it in a Web Worker
  10. The same goes for timers: every set must have a matching clear
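Items 7 and 10 above are easier to get right if every registration immediately records its own cleanup. A small sketch of that pattern (the `DisposerBag` name is made up for illustration; `target` is anything with an addEventListener/removeEventListener pair):

```javascript
// Collect a "dispose" function for every listener/timer as it is created,
// then flush them all in one place, e.g. Vue's beforeDestroy hook
class DisposerBag {
  constructor() { this.disposers = []; }

  add(dispose) { this.disposers.push(dispose); }

  // Pair addEventListener with its removeEventListener up front
  listen(target, event, handler) {
    target.addEventListener(event, handler);
    this.add(() => target.removeEventListener(event, handler));
  }

  // Run every recorded cleanup exactly once
  disposeAll() {
    this.disposers.forEach((dispose) => dispose());
    this.disposers = [];
  }
}
```

The same `add()` call works for timers (`this.add(() => clearInterval(id))`), so one `disposeAll()` in the destroy hook guarantees nothing is left running.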

Use Performance to observe page changes


  1. The Nodes curve shows how the page's DOM count changes; if it keeps growing, the page will eventually freeze because DOM nodes are not being reclaimed in time
  2. The Listeners curve is analyzed the same way as Nodes
  3. The JS heap curve: if it repeatedly climbs and then plunges, but this sawtooth keeps recurring with a rising trend, you can judge it to be a memory leak (see Ruan Yifeng's blog)
  4. The other indicators are fairly detailed and take practice to understand, so they are not covered here

Use Memory to observe page memory usage

This one is genuinely useful. When a page crashes, or freezes after running for a long time, it is almost always because too much is being kept alive on the page: DOM nodes, listeners, and large objects that are not cleaned up in time. The most common and most serious case is a large JSON object, e.g. a 100 MB JSON stored directly on the outermost window, which may crash the page outright.

This article will continue to be supplemented and updated.

Detailed lazy loading strategies, chunk-splitting strategies, and so on will be written up later.


Origin: blog.csdn.net/qq_38217940/article/details/126494506