Notes on a failed attempt to optimize jQuery

I often complain that jQuery's DOM manipulation performance is poor, and I regularly try various ways to optimize it. But the more I optimize, the more frustrated I am to find that jQuery already does things quite well, and the optimizations available from a user's point of view are very limited (this is not to say jQuery's performance is excellent, only that it is a relatively closed library that cannot be optimized from the outside). This article records one such failed optimization attempt.

The optimization idea

This time the optimization idea came from databases. When talking about database optimization, we often say: batch a large number of operations and commit them together in one transaction to improve efficiency. I don't know databases well, so I can't say why this works, but the word "transaction" pointed me in a direction (a wrong one, as it turned out...).

So I tried to introduce the concept of a "transaction" into jQuery: by "opening" and then "committing" a transaction, jQuery could be optimized somewhat from the outside, the key point being to reduce the number of loops each function performs.

As we all know, jQuery's DOM manipulation follows the "get first, set all" convention: when setting a DOM attribute or style, almost every function iterates over all selected elements. The jQuery.access function is one of the core pieces of this machinery, and the code responsible for the loop is as follows:

// Setting one attribute
if ( value !== undefined ) {
    // Optionally, function values get executed if exec is true
    exec = !pass && exec && jQuery.isFunction(value);

    for ( var i = 0; i < length; i++ ) {
        fn(
            elems[i], 
            key, 
            exec ? value.call(elems[i], i, fn(elems[i], key)) : value, 
            pass
        );
    }

    return elems;
}

For example, jQuery.fn.css is implemented like this:

jQuery.fn.css = function( name, value ) {
    // Setting 'undefined' is a no-op
    if ( arguments.length === 2 && value === undefined ) {
        return this;
    }

    return jQuery.access( this, name, value, true, function( elem, name, value ) {
        return value !== undefined ?
            jQuery.style( elem, name, value ) :
            jQuery.css( elem, name );
    });
};

Thus, assuming the selector below matches 5000 div elements, the following code performs 10,000 node accesses:

jQuery('div').css('height', 300).css('width', 200);

In my scenario, a "transaction" would behave like its database counterpart: all operations are saved up and applied together when the transaction is "committed", so the 10,000 node accesses shrink to 5000, which should halve the loop cost.
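To make the counting concrete, here is a minimal sketch in plain JavaScript (the `access` stub and the placeholder elements are my own, standing in for jQuery's internal loop) showing why two chained setter calls over 5000 elements touch 10,000 nodes, while one batched pass touches only 5000:

```javascript
// Stub standing in for jQuery's per-setter loop: every setter call
// walks the whole collection once.
var accessCount = 0;

function access(elems, setter) {
    for (var i = 0; i < elems.length; i++) {
        accessCount++;        // one node access per element, per call
        setter(elems[i], i);
    }
    return elems;
}

var elems = new Array(5000); // 5000 placeholder "elements"

// Chained style: .css('height', ...).css('width', ...) loops twice.
access(elems, function () { /* set height */ });
access(elems, function () { /* set width  */ });
var chainedAccesses = accessCount;   // 10000

// Batched style: one loop applies both operations to each element.
accessCount = 0;
access(elems, function () { /* set height and width together */ });
var batchedAccesses = accessCount;   // 5000

console.log(chainedAccesses, batchedAccesses); // 10000 5000
```

The hope was that a "transaction" could turn the chained style into the batched style without changing the calling code's appearance.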

Simple implementation

Expressed in jQuery terms, the "transaction" scheme provides two functions:

  • begin: opens a "transaction" and returns a transaction object. The object carries all of jQuery's functions, but calling them has no immediate effect; the calls only take effect once the transaction is "committed".
  • commit: commits a "transaction", making all function calls issued since begin take effect, and returns the original jQuery object.

The implementation is also very easy:

  1. Create a "transaction object" and copy every function on jQuery.fn onto it.
  2. Whenever a function is called, push the function name and its arguments onto a pre-prepared "queue".
  3. When the transaction is committed, traverse the selected elements once, and for each element apply every function in the "queue".

The code is roughly as follows:

var slice = Array.prototype.slice;
jQuery.fn.begin = function() {
    var proxy = {
            _core: this,
            _queue: []
        },
        key,
        func;
    // Copy the functions from jQuery.fn
    for (key in jQuery.fn) {
        func = jQuery.fn[key];
        if (typeof func == 'function') {
            // The for loop would leave key stuck at its final value,
            // so a closure is needed to capture the current key
            (function(key) {
                proxy[key] = function() {
                    // Put the call into the queue
                    this._queue.push([key, slice.call(arguments, 0)]);
                    return this;
                };
            })(key);
        }
    }
    // Make sure commit itself is not intercepted
    proxy.commit = jQuery.fn.commit;
    return proxy;
};

jQuery.fn.commit = function() {
    var core = this._core,
        queue = this._queue;
    // A single each loop
    core.each(function() {
        var i = 0,
            item,
            jq = jQuery(this);
        // Apply every queued call
        for (; item = queue[i]; i++) {
            jq[item[0]].apply(jq, item[1]);
        }
    });
    return core;
};
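The queueing mechanism can be exercised outside the browser by substituting a tiny jQuery-like mock (the `FakeQuery` class and the `log` array are mine, purely illustrative; the per-element each loop is collapsed into a single replay to keep the sketch short):

```javascript
// Minimal stand-in for a jQuery collection: css() just records the call.
var slice = Array.prototype.slice;
var log = [];

function FakeQuery(elems) { this.elems = elems; }
FakeQuery.prototype.css = function (name, value) {
    log.push(name + '=' + value);
    return this;
};
FakeQuery.prototype.begin = function () {
    var proxy = { _core: this, _queue: [] };
    // Intercepted css(): queue the call instead of executing it.
    proxy.css = function () {
        this._queue.push(['css', slice.call(arguments, 0)]);
        return this;
    };
    proxy.commit = FakeQuery.prototype.commit;
    return proxy;
};
FakeQuery.prototype.commit = function () {
    var core = this._core,
        queue = this._queue;
    // Replay every queued call against the real object.
    for (var i = 0; i < queue.length; i++) {
        core[queue[i][0]].apply(core, queue[i][1]);
    }
    return core;
};

var q = new FakeQuery(['a', 'b']).begin();
q.css('width', 200).css('height', 300);
console.log(log.length);    // 0 -- nothing has run yet
q.commit();
console.log(log.join(',')); // width=200,height=300
```

Nothing is logged before commit, confirming that calls are deferred exactly as the real begin/commit pair intends.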

Test environment

The tests are set up as follows:

  • 5000 divs sit inside a container (<div id="container"></div>).
  • The 5000 divs are selected with $('#container>div').
  • Each div is given a random background color (via a randomColor function) and a random width below 800px (via a randomWidth function).

Four calling styles are tested:

  • Normal method:

    $('#container>div')
        .css('background-color', randomColor)
        .css('width', randomWidth);
  • Single-loop method:

    $('#container>div').each(function() {
        $(this).css('background-color', randomColor).css('width', randomWidth);
    });
  • Transaction method:

    $('#container>div')
        .begin()
            .css('background-color', randomColor)
            .css('width', randomWidth)
        .commit();
  • Object-assignment method:

    $('#container>div').css({
        'background-color': randomColor,
        'width': randomWidth
    });

The browser chosen for the tests is Chrome 8 (IE simply freezes on this test).

Disappointing results

The original prediction was: the single-loop method would be much more efficient than the normal method; the transaction method, while somewhat slower than the single loop, should still beat the normal method; and the object-assignment method, which uses jQuery's internal single loop, should be the fastest of all.

Unfortunately, the results were as follows:

  • Normal method: 18435ms
  • Single-loop method: 18233ms
  • Transaction method: 18918ms
  • Object-assignment method: 17748ms

From these numbers, the transaction method turns out to be the slowest of the lot. At the same time, the single loop shows no obvious advantage over the normal method, and even the object-assignment method, backed by jQuery's internal single loop, fails to open up a real gap.

Operating on 5000 elements is already a very large loop; if even a loop this large fails to widen the performance gap, then the everyday case of roughly 10 elements is even less likely to show an advantage, and may well come out behind.

As for the reason: the single-loop approach itself brings no significant improvement, and the transaction method is just a single loop built on the outside, which additionally has to create the transaction object, save the function queue, and traverse that queue. That it loses to the normal method is therefore quite reasonable.

At this point the attempt to imitate database "transactions" can be declared a failure. But the results are worth analyzing a bit further.

Where did the performance go?

First, analyzing from the code: comparing the normal method with the fastest one, the object-assignment method, the only difference between the two is the number of element iterations (leaving jQuery internals aside; the rather poor implementation of jQuery.access is in fact also suspected of dragging down the object-assignment method, though fortunately not badly). The normal method iterates over 10,000 elements, the object-assignment method over 5000. So we can roughly conclude that 18435 - 17748 = 687ms is the cost of iterating over 5000 elements, about 3.5% of the total execution time. That is nowhere near the backbone of the execution, so there was never any real need to optimize it.

So where did the other 96.5% go? Remember Douglas Crockford's remark: "JavaScript isn't slow; DOM manipulation is slow." Of that remaining 96.5%, once the basic overhead of function calls and the like is removed, at least 95% of the time is spent re-rendering styles after the DOM elements are modified.

Once this fact is discovered, a more accurate optimization direction appears, and it is one of the basic principles of front-end performance: when modifying a large number of child elements, first take the parent node out of the DOM tree. So the test is run again with the following code:

// Not reusing $('#container') here is already bad enough
$('#container').detach().find('div')
    .css('background-color', randomColor)
    .css('width', randomWidth);
$('#container').appendTo(document.body);

The test result now stays at around 900ms, not even in the same order of magnitude as the earlier numbers: a genuinely successful optimization.
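Why detaching helps can be modeled outside the browser with a deliberately simplified sketch (real browsers batch reflows, so this stub, where every style write on an attached node costs one "render", illustrates the principle rather than the actual cost model; all names here are mine):

```javascript
// Toy model: a style write on an attached node triggers a "render";
// detached nodes can be mutated freely, then reattached with one render.
var renders = 0;

function Node() { this.attached = true; }
Node.prototype.setStyle = function () {
    if (this.attached) { renders++; } // stand-in for a browser re-render
};

var nodes = [];
for (var i = 0; i < 5000; i++) { nodes.push(new Node()); }

// Naive: two style writes per attached node.
nodes.forEach(function (n) { n.setStyle(); n.setStyle(); });
var naiveRenders = renders;          // 10000

// Detach first, mutate, reattach: one render for the whole subtree.
renders = 0;
nodes.forEach(function (n) { n.attached = false; });
nodes.forEach(function (n) { n.setStyle(); n.setStyle(); });
nodes.forEach(function (n) { n.attached = true; });
renders += 1;                        // the single re-render on reattach
var detachedRenders = renders;       // 1

console.log(naiveRenders, detachedRenders); // 10000 1
```

As for the wrinkle flagged in the comment above, it is easily fixed by keeping the container reference: var container = $('#container').detach(); then container.appendTo(document.body); once the styles are set.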

Lessons and summary

  • Be sure to find the real performance bottleneck before optimizing; blind guessing and dogmatism only lead down the wrong path.
  • Let the data do the talking; in front of the data, nobody else gets to talk!
  • I still don't think the "transaction" direction itself is wrong. If jQuery supported the concept of a "transaction" natively, might there be other optimization points? For example, a transaction could automatically take the parent element out of the DOM tree...

Reproduced from: https://www.cnblogs.com/GrayZhang/archive/2011/02/05/a-failure-in-jquery-optimization.html
