Interpretation of React's source code and principles (10): updateQueue and processUpdateQueue

Written at the beginning of the column (pre-emptive disclaimers)

  1. The author is not an expert in front-end technology, just a front-end novice who likes to learn new things, and who is digging into the source code mainly to cope with a shrinking front-end job market and the need to find work. This column records the author's own thinking and experience while studying, and many parts draw on other tutorials. If you find errors or problems, please point them out to the author. The author cannot guarantee that the content is 100% correct, so please do not treat this column as a reference answer.

  2. Reading this column requires some foundation in React, JavaScript and front-end engineering. The author will not explain many basic knowledge points, such as what Babel is or what JSX syntax looks like; please consult other material when necessary.

  3. Many parts of this column draw heavily on other tutorials; if anything looks similar, the author copied it. For that reason this tutorial is completely open source, and you are welcome to do as the author did: integrate and summarize the various tutorials and add your own understanding.

Contents of this section

This chapter is a supplement. While adding lanes I suddenly realized that updateQueue had been mentioned before but never properly explained, because it involves a lot of lanes knowledge. So before moving on to hooks, we add an explanation of updateQueue and its related operations. Now that we have covered the **lanes** system, this article should give you a deeper understanding of how our updates rely on it. A very classic interview question will also come up along the way:

updateQueue

Let's first review what updateQueue is, going back to our fiber structure. Each fiber node carries an update queue, updateQueue. Let's look at the data structure of this update queue; in the source code it lives at packages/react-reconciler/src/ReactUpdateQueue.old.js:

export type UpdateQueue<State> = {
  baseState: State,                        // current state
  firstBaseUpdate: Update<State> | null,   // head of the list left over from the last render
  lastBaseUpdate: Update<State> | null,    // tail of the list left over from the last render
  shared: SharedQueue<State>,              // the updates to process in this render
  effects: Array<Update<State>> | null,    // updates that carry a callback
};

export function initializeUpdateQueue<State>(fiber: Fiber): void {
  const queue: UpdateQueue<State> = {
    baseState: fiber.memoizedState, // state computed by the previous update
    firstBaseUpdate: null,
    lastBaseUpdate: null,
    shared: {
      pending: null,     // circular linked list of pending updates
      interleaved: null,
      lanes: NoLanes,
    },
    effects: null,
  };
  fiber.updateQueue = queue;
}

Let's first look at the parts that are easy to understand:

  • baseState stores the value of the state before this render
  • shared is the data structure that stores this round's updates; we compute the new state from the updates in shared together with baseState
  • effects stores the updates that have a callback function, because after the update completes we need to trigger those callbacks
  • Finally, firstBaseUpdate and lastBaseUpdate, which is where our knowledge of lanes comes in. Each time we schedule, we check whether the current task has enough priority to run; if not, it is stored back in this linked list to be rescheduled in the next render, so when we schedule a new task we must first deal with these leftover tasks before starting the new one

When we create fiber nodes, we can call initializeUpdateQueue to create the UpdateQueue, using the memoizedState property as the initial baseState.
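For context, here is a trimmed sketch of one place this happens, adapted from the class-component mount path (mountClassInstance); treat it as an approximation rather than the exact source:

// Sketch: when a class instance mounts, its fiber receives a fresh updateQueue
function mountClassInstance(workInProgress, ctor, newProps, renderLanes) {
  const instance = workInProgress.stateNode;
  instance.props = newProps;
  instance.state = workInProgress.memoizedState;
  // baseState is initialized from whatever memoizedState currently holds
  initializeUpdateQueue(workInProgress);
  // ...process initial updates, call lifecycles, etc.
}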

update

Now let's look at a single update, which is also an old friend that has come up many times before. Because our UpdateQueue is a linked list, each single Update needs a next pointer to point at the next node:

const update: Update<*> = {
  eventTime,          // time of the current operation
  lane,               // priority
  tag: UpdateState,   // the kind of operation to perform
  payload: null,
  callback: null,
  next: null,         // next pointer
};

We mainly focus on these parameters:

  • eventTime: The task's time, in milliseconds, obtained through performance.now().

  • lane: Priority-related field.

  • tag: The type of update, one of UpdateState | ReplaceState | ForceUpdate | CaptureUpdate.

  • payload: The data attached to the update. Different component types attach different data: for a ClassComponent, payload is the first parameter of this.setState; for HostRoot, payload is the first parameter passed to ReactDOM.render (see the sketch after this list).

  • callback: The update's callback function, i.e. the second parameter of setState.

  • next: Links to other Updates to form a linked list. For example, if multiple setState calls are triggered at the same time, multiple Updates are created and connected through next.
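To make this concrete, here is a small hedged example of the two call sites described above; the component name and state shape are just for illustration:

import React from 'react';
import ReactDOM from 'react-dom';

class Counter extends React.Component {
  state = { count: 0 };

  increment = () => {
    // ClassComponent: the object literal becomes update.payload,
    // the arrow function becomes update.callback
    this.setState({ count: this.state.count + 1 }, () => {
      console.log('committed:', this.state.count);
    });
  };

  render() {
    return <button onClick={this.increment}>{this.state.count}</button>;
  }
}

// HostRoot: the element passed here is wrapped as update.payload = { element }
ReactDOM.render(<Counter />, document.getElementById('root'));

Both calls eventually reach enqueueUpdate below, only with different payload shapes.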

These parameters were already covered when we walked through update creation, so we won't elaborate here; let's continue with the related operations:

enqueueUpdate

This function adds an update to the updateQueue; let's take a brief look at its logic:

  • First we get the fiber's updateQueue
  • Then we get shared.pending from the updateQueue; this property is where the pending updates hang
  • An interleaved update is an update that occurs while a render is already in progress. The update queue therefore has two circular singly linked lists: pending and interleaved. When an interleaved update is scheduled it is stored on the interleaved property, and the whole shared queue is pushed onto a global array; after the current render finishes, that global array is traversed and the interleaved updates are transferred into the pending queue
  • When we insert an update and the list is not empty, pending points to the last node and pending.next points to the first node. We make the new update's next point to the first node, make the old last node's next point to the new update, and finally point pending at the new update, so it becomes the new last node of the list

export function enqueueUpdate<State>(fiber: Fiber, update: Update<State>, lane: Lane) {
  const updateQueue = fiber.updateQueue;
  if (updateQueue === null) {
    // Only happens if the fiber has already been unmounted
    return;
  }
  const sharedQueue: SharedQueue<State> = (updateQueue: any).shared;
  // Interleaved update
  if (isInterleavedUpdate(fiber, lane)) {
    const interleaved = sharedQueue.interleaved;
    if (interleaved === null) {
      // First node: create a circular list of one node and push this
      // shared queue onto the global interleaved queue
      update.next = update;
      pushInterleavedQueue(sharedQueue);
    } else {
      update.next = interleaved.next;
      interleaved.next = update;
    }
    sharedQueue.interleaved = update;
  } else {
    const pending = sharedQueue.pending;
    if (pending === null) {
      // First node: the update points at itself
      update.next = update;
    } else {
      // The list already has nodes: splice the new update in after the last node
      update.next = pending.next;
      pending.next = update;
    }
    sharedQueue.pending = update;
  }
}

We can see that the pending queue in sharedQueue is a circular linked list. The reasons for using this data structure are:

  • To merge queues we only need to operate on the head and tail of the update queue and then traverse it, and a linked list is enough to provide all of that

  • A linked list lets us insert a new update object very quickly without wasting space

  • With a circular linked list a single pointer gives us both the last node and the first node, whereas an ordinary linked list would have to maintain a head pointer and a tail pointer at the same time, otherwise merging would be very inefficient
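Here is a minimal standalone sketch (plain JavaScript, independent of React's types) showing why a single pending pointer is enough to reach both ends of the circular list:

// Minimal sketch of the circular pending list used by enqueueUpdate.
// `pending` always points at the LAST node; pending.next is therefore the FIRST node.
function enqueue(queue, update) {
  const pending = queue.pending;
  if (pending === null) {
    update.next = update;        // single node pointing at itself
  } else {
    update.next = pending.next;  // new node points at the first node
    pending.next = update;       // old last node points at the new node
  }
  queue.pending = update;        // the new node becomes the last node
}

const queue = { pending: null };
enqueue(queue, { payload: 'A', next: null });
enqueue(queue, { payload: 'B', next: null });
enqueue(queue, { payload: 'C', next: null });

console.log(queue.pending.payload);       // 'C'  (last node)
console.log(queue.pending.next.payload);  // 'A'  (first node)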

processUpdateQueue

Now for the main event. We can already create updates and add them to the queue, so how do those updates actually act on our elements? That happens in the processUpdateQueue function. Let's look at its logic:

To summarize first, the processUpdateQueue function does the following things:

  • First we break apart the circular pending list and splice it onto the end of the list formed by firstBaseUpdate and lastBaseUpdate (leftovers first, then the new updates), forming one large linear update queue with firstBaseUpdate and lastBaseUpdate as its start and end
  • Synchronize this queue from the workInProgress node to the current node
  • Traverse the queue and, for each update, check whether its lane satisfies the priority of this render (as discussed in the previous article)
  • If an update does not meet the priority condition, store it in a new linked list whose start and end are newFirstBaseUpdate and newLastBaseUpdate. The first time an update is deferred, the newState computed so far is saved once, and the update's lane is merged into newLanes, the lanes to be processed next time, so that it will be executed when a later traversal reaches it
  • If an update does meet the priority condition, first check whether any earlier update has been deferred; if so, all later updates must also enter the deferred queue, because an earlier update may affect the result of later ones. Such a cloned Update has its lane set to NoLane, which always passes the priority check, so it is guaranteed to run on the next pass
  • Then compute the new state with getStateFromUpdate and store it in newState; save setState's callback (the second argument) by pushing the update into the effects array and marking the current fiber's flags with Callback
  • If no update was deferred, assign newState to queue.baseState
  • If some updates were deferred, mark which lanes were skipped, and only update memoizedState on workInProgress (the new node)
  • After this round of processing ends, we can start a new round of scheduling and run the next batch of updates, as mentioned in the previous tutorial

export function processUpdateQueue<State>(
  workInProgress: Fiber,
  props: any,
  instance: any,
  renderLanes: Lanes,
): void {
  const queue: UpdateQueue<State> = (workInProgress.updateQueue: any);

  // Get the queue left over from the last render: firstBaseUpdate and lastBaseUpdate
  let firstBaseUpdate = queue.firstBaseUpdate;
  let lastBaseUpdate = queue.lastBaseUpdate;

  // Get the queue of the current render
  let pendingQueue = queue.shared.pending;

  // Splice pendingQueue onto the end of the queue.firstBaseUpdate list:
  // we handle the leftovers first, then the current updates
  if (pendingQueue !== null) {
    queue.shared.pending = null;
    const lastPendingUpdate = pendingQueue;
    const firstPendingUpdate = lastPendingUpdate.next;
    lastPendingUpdate.next = null;
    if (lastBaseUpdate === null) {
      firstBaseUpdate = firstPendingUpdate;
    } else {
      lastBaseUpdate.next = firstPendingUpdate;
    }
    lastBaseUpdate = lastPendingUpdate;

    // If the fiber in the current tree that corresponds to this workInProgress node exists,
    const current = workInProgress.alternate;
    if (current !== null) {
      const currentQueue: UpdateQueue<State> = (current.updateQueue: any);
      const currentLastBaseUpdate = currentQueue.lastBaseUpdate;

      // If the last node of current's update list differs from the last node of the list
      // we are about to process, splice the pending list into current as well
      if (currentLastBaseUpdate !== lastBaseUpdate) {
        if (currentLastBaseUpdate === null) {
          currentQueue.firstBaseUpdate = firstPendingUpdate;
        } else {
          currentLastBaseUpdate.next = firstPendingUpdate;
        }
        currentQueue.lastBaseUpdate = lastPendingUpdate;
      }
    }
  }

  if (firstBaseUpdate !== null) {
    // The new state
    let newState = queue.baseState;
    // The new lane priorities
    let newLanes = NoLanes;
    let newBaseState = null;
    let newFirstBaseUpdate = null;
    let newLastBaseUpdate = null;
    // The first update
    let update = firstBaseUpdate;

    // Loop over the queue
    do {
      const updateLane = update.lane;
      const updateEventTime = update.eventTime;
      if (!isSubsetOfLanes(renderLanes, updateLane)) {
        // This update's lane does not meet the render priority,
        // so clone it and store it in the skipped list
        const clone: Update<State> = {
          eventTime: updateEventTime,
          lane: updateLane,
          tag: update.tag,
          payload: update.payload,
          callback: update.callback,
          next: null,
        };
        if (newLastBaseUpdate === null) {
          newFirstBaseUpdate = newLastBaseUpdate = clone;
          // The first time an update is deferred, save the newState computed so far
          newBaseState = newState;
        } else {
          newLastBaseUpdate = newLastBaseUpdate.next = clone;
        }
        // Merge the update's lane into newLanes so it can run on a later traversal
        newLanes = mergeLanes(newLanes, updateLane);
      } else {
        // This update meets the priority condition
        if (newLastBaseUpdate !== null) {
          // If an earlier update was deferred, every later update must also
          // go into the deferred queue
          const clone: Update<State> = {
            eventTime: updateEventTime,
            lane: NoLane,
            tag: update.tag,
            payload: update.payload,
            callback: update.callback,
            next: null,
          };
          newLastBaseUpdate = newLastBaseUpdate.next = clone;
        }

        // Compute the new state
        newState = getStateFromUpdate(
          workInProgress,
          queue,
          update,
          newState,
          props,
          instance,
        );
        // Save setState's callback, i.e. its second argument
        const callback = update.callback;
        if (callback !== null) {
          // Mark that the current fiber has a callback
          workInProgress.flags |= Callback;
          const effects = queue.effects;
          if (effects === null) {
            queue.effects = [update];
          } else {
            effects.push(update);
          }
        }
      }
      // The next update object
      update = update.next;
      if (update === null) {
        pendingQueue = queue.shared.pending;
        if (pendingQueue === null) {
          // Done with the loop
          break;
        } else {
          // After the current queue is processed, check whether queue.shared.pending
          // received new updates in the meantime; if so, pull them in and keep going
          const lastPendingUpdate = pendingQueue;
          const firstPendingUpdate = ((lastPendingUpdate.next: any): Update<State>);
          lastPendingUpdate.next = null;
          update = firstPendingUpdate;
          queue.lastBaseUpdate = lastPendingUpdate;
          queue.shared.pending = null;
        }
      }
    } while (true);

    if (newLastBaseUpdate === null) {
      // If no update was deferred, assign the final computed result to newBaseState
      newBaseState = newState;
    }
    // Hand newBaseState over to queue.baseState
    queue.baseState = ((newBaseState: any): State);

    // Save the deferred updates
    queue.firstBaseUpdate = newFirstBaseUpdate;
    queue.lastBaseUpdate = newLastBaseUpdate;

    // Interleaved updates
    const lastInterleaved = queue.shared.interleaved;
    if (lastInterleaved !== null) {
      let interleaved = lastInterleaved;
      do {
        newLanes = mergeLanes(newLanes, interleaved.lane);
        interleaved = ((interleaved: any).next: Update<State>);
      } while (interleaved !== lastInterleaved);
    } else if (firstBaseUpdate === null) {
      queue.shared.lanes = NoLanes;
    }

    // Mark which lanes' updates were skipped
    markSkippedUpdateLanes(newLanes);
    workInProgress.lanes = newLanes;
    // Update memoizedState on workInProgress (the new node)
    workInProgress.memoizedState = newState;
  }
}
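To see what this skip-and-rebase logic means in practice, here is a hedged toy simulation (plain JavaScript, made-up lane numbers, not React's real constants). It mirrors the behaviour described above: a skipped update freezes baseState at that point, later applied updates are also kept for replaying, and after the second pass the final state is the same as if every update had been processed in order:

// Toy simulation of the skip/rebase behaviour (not React's real code).
// Each update appends its letter; lane 1 = "high", lane 2 = "low" (made-up values).
const updates = [
  { lane: 1, apply: s => s + 'A' },
  { lane: 2, apply: s => s + 'B' },
  { lane: 1, apply: s => s + 'C' },
  { lane: 2, apply: s => s + 'D' },
];

function process(baseState, queue, renderLanes) {
  let newState = baseState;
  let newBaseState = null;
  const skipped = [];
  for (const u of queue) {
    if ((renderLanes & u.lane) !== u.lane) {
      // Not enough priority: remember where we were and defer this update
      if (skipped.length === 0) newBaseState = newState;
      skipped.push(u);
    } else {
      // Once something was skipped, later updates are applied now AND re-applied later (lane 0 = NoLane)
      if (skipped.length > 0) skipped.push({ ...u, lane: 0 });
      newState = u.apply(newState);
    }
  }
  if (skipped.length === 0) newBaseState = newState;
  return { memoizedState: newState, baseState: newBaseState, baseQueue: skipped };
}

// First render only handles lane 1: B and D are skipped, C is applied but also kept for rebasing
const first = process('', updates, 1);
console.log(first.memoizedState); // 'AC'   -> what the screen shows first
console.log(first.baseState);     // 'A'    -> state saved before the first skipped update

// Second render handles all lanes, replaying the deferred queue on top of baseState
const second = process(first.baseState, first.baseQueue, 3);
console.log(second.memoizedState); // 'ABCD' -> as if nothing had been skipped

This is why memoizedState can temporarily run ahead of baseState: the screen shows the high-priority result first, while the deferred queue guarantees the final state is eventually consistent.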

Let's take a look at the getStateFromUpdate function:

  • First, decide what to do based on the tag that was set on the update
  • If it is ReplaceState, i.e. we called replaceState, we want to discard the old state and replace it directly with the new one. If the payload is a function, we call it with the previous state to get the next state; otherwise we simply return the payload
  • If it is UpdateState, i.e. what setState creates, we need to merge the old and new states: we first compute the new partial state and then merge it with the previous state data
  • If it is ForceUpdate, i.e. the update was created by forceUpdate, we just set hasForceUpdate to true and still return the old state. This hasForceUpdate flag is used later when deciding whether the component needs to re-render, for example in updateClassComponent; you can look back at the earlier code for the details

function getStateFromUpdate<State>(
  workInProgress: Fiber,
  queue: UpdateQueue<State>,
  update: Update<State>,
  prevState: State,
  nextProps: any,
  instance: any,
): any {
  /**
   * Several cases are distinguished below:
   *  ReplaceState: discard the old state and replace it directly with the new one;
   *  UpdateState: merge the new state into the old state and return the result;
   *  ForceUpdate: only set hasForceUpdate to true, but still return the old state;
   */
  switch (update.tag) {
    case ReplaceState: {
      const payload = update.payload;
      if (typeof payload === 'function') {
        // If payload is a function, call it with prevState as the argument
        const nextState = payload.call(instance, prevState, nextProps);
        return nextState;
      }
      return payload;
    }
    case CaptureUpdate: {
      workInProgress.flags = (workInProgress.flags & ~ShouldCapture) | DidCapture;
    }
    // Intentional fallthrough into UpdateState
    case UpdateState: {
      const payload = update.payload;
      let partialState; // holds the newly computed partial state
      if (typeof payload === 'function') {
        // If payload is a function, call it with prevState as the argument
        partialState = payload.call(instance, prevState, nextProps);
      } else {
        // If payload is a plain value, use it directly
        partialState = payload;
      }
      if (partialState === null || partialState === undefined) {
        // If the result is null or undefined, return the previous state unchanged
        return prevState;
      }
      // Merge with the previous state
      return assign({}, prevState, partialState);
    }
    case ForceUpdate: {
      hasForceUpdate = true;
      return prevState;
    }
  }
  return prevState;
}
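As a plain-JavaScript illustration of the UpdateState branch above (the state shape is just an example), the merge is a shallow Object.assign of the partial state onto the previous state:

// Plain-JS illustration of how UpdateState merges payloads,
// mirroring assign({}, prevState, partialState):
const prevState = { a: 1, b: 2 };

// Object payload: shallow-merged on top of the previous state
const objectPayload = { b: 3 };
console.log(Object.assign({}, prevState, objectPayload)); // { a: 1, b: 3 }

// Function payload: first called with the previous state (and props), then merged the same way
const fnPayload = (prev, props) => ({ b: prev.b + 1 });
console.log(Object.assign({}, prevState, fnPayload(prevState, {}))); // { a: 1, b: 3 }

// A ReplaceState update would skip the merge entirely and use the payload as the whole new state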

Summary and Expansion & Classic Interview Questions

We have just walked through the related operations of updateQueue; in summary:

  • We use updateQueue to store our updates; each update may be created by APIs such as setState, replaceState or forceUpdate, and is appended to the queue by enqueueUpdate
  • When an update starts, processUpdateQueue iterates over the updateQueue and computes our new state
  • Because of lane priorities, during scheduling the update tasks are split into two groups: those with sufficient priority and those without
  • Tasks with insufficient priority, together with every task that follows the first skipped one, are stored temporarily and processed in the next round of scheduling
  • For tasks with sufficient priority, getStateFromUpdate computes the updated data, and depending on which API was called the two states are either merged or replaced
  • If forced update mode is set (an update created with the ForceUpdate tag), we set the hasForceUpdate flag and the component is then updated in the render phase
  • If some updates were deferred, the result of this round is stored temporarily without being committed to baseState, and scheduling continues; only when all tasks have run and no deferred tasks remain is baseState fully caught up

After all of the above, you should definitely be able to answer this interview question:

Is setState asynchronous or synchronous and why?

According to the previous explanation, the update created by setState is not executed at the moment it is created; a whole batch of updates is processed at once, so its execution is asynchronous, and the update you create may not take effect immediately.

So React provides two ways to deal with this problem. One is forceUpdate, which marks the update as a forced update.

The other is the callback function. As we saw, if an update has a callback we put it in the effects queue and mark the fiber with the Callback flag; the callback is executed later in the commit phase, by which time the state has already been updated, so inside the callback we can read the correct state.
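A small hedged example of what this means inside a React event handler; the component name is just for illustration, and the logged values assume the batching behaviour described in this article:

import React from 'react';

class BatchDemo extends React.Component {
  state = { count: 0 };

  handleClick = () => {
    // Both object payloads read the same stale this.state.count,
    // so after merging they only increase count by 1 in total
    this.setState({ count: this.state.count + 1 });
    this.setState({ count: this.state.count + 1 });

    // The updates are still sitting in shared.pending; nothing has been processed yet
    console.log(this.state.count); // logs the old value

    // A functional payload is evaluated inside processUpdateQueue with the
    // latest computed state, so it does not read a stale this.state.
    // Its second argument goes into the effects queue and runs after commit,
    // when memoizedState has already been updated.
    this.setState(
      prev => ({ count: prev.count + 1 }),
      () => console.log('after commit:', this.state.count),
    );
  };

  render() {
    return <button onClick={this.handleClick}>{this.state.count}</button>;
  }
}

export default BatchDemo;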

OK, this supplementary tutorial was written over the May Day holiday; next, as originally planned, we will spend a few articles talking about Hooks.

By the way, after the React source code tutorial is finished, we will start a React SSR related project.

Origin blog.csdn.net/weixin_46463785/article/details/130451530