React source code analysis 16: Concurrent mode

Concurrent mode

React 17 supports Concurrent mode. The fundamental purpose of this mode is to keep the application responsive on both the CPU and IO fronts. It is a set of new capabilities, including Fiber, Scheduler, and Lane, that lets the application adapt its responsiveness to the user's hardware performance and network conditions. The core is to realize asynchronous, interruptible updates. Concurrent mode is also the main direction of React's future iterations.

  • CPU: let the time-consuming reconcile process yield the JS execution right to higher-priority tasks, such as user input
  • IO: rely on Suspense


Fiber was introduced earlier; let's look at what fiber means for Concurrent mode. Before React 15, reconcile was executed synchronously: when there are many components and the reconcile computation is heavy, the page stutters. To solve this, we need a set of asynchronous, interruptible updates, so that time-consuming calculations can yield the JS execution right to high-priority tasks and resume when the browser is idle. This requires a data structure that describes the real DOM and the pending update information, so that the reconcile process running in memory can be interrupted at an appropriate time. That data structure is fiber.
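As a sketch, a fiber node can be modeled like this (simplified and hand-written for illustration; the real `FiberNode` constructor in the React source carries more fields):

```javascript
// Simplified sketch of a fiber node (the real FiberNode has more fields).
function createFiberNode(tag, pendingProps, key) {
  return {
    // Identity: what kind of node this is (function component, host component, ...)
    tag,
    key,
    type: null,        // the component function/class, or a host tag like 'div'
    stateNode: null,   // the real DOM node (or class instance) this fiber describes

    // Links that form the fiber tree; they make traversal interruptible and resumable
    return: null,      // parent fiber
    child: null,       // first child fiber
    sibling: null,     // next sibling fiber

    // Update information
    pendingProps,
    memoizedProps: null,
    memoizedState: null,
    updateQueue: null,
    lanes: 0,          // priority lanes of pending updates on this fiber
    flags: 0,          // side effects to apply in the commit phase

    // Double buffering: points to this node's twin in the other (current/workInProgress) tree
    alternate: null,
  };
}
```

Because every fiber keeps `return`/`child`/`sibling` pointers, the reconcile loop can stop after processing any single fiber and later pick up exactly where it left off.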


The Scheduler is independent of React itself and is effectively a separate package. Its significance: when the CPU workload is large, we compute the length of one frame from the device's fps and run work within that frame budget. When a task runs longer than one frame, its execution is suspended to give the browser time to reflow and repaint, and the task continues at an appropriate time.
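The frame-budget idea can be sketched as a toy work loop (illustrative names only; `frameLength` here stands in for the Scheduler's internal yield interval and is not its real API):

```javascript
// Toy interruptible work loop: run units of work until the frame budget
// is used up, then yield so the browser can reflow and repaint.
const frameLength = 5; // ms budget per slice (illustrative value)

function workLoop(tasks, getCurrentTime = Date.now) {
  const deadline = getCurrentTime() + frameLength;
  while (tasks.length > 0 && getCurrentTime() < deadline) {
    const task = tasks.shift();
    task(); // perform one unit of work
  }
  // true means work remains: the caller should schedule another slice
  return tasks.length > 0;
}
```

When the loop returns `true`, the real Scheduler posts a message to continue the remaining work in the next time slice instead of blocking the main thread.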

In JS, we know that a generator can also pause and resume tasks, but we additionally need to prioritize tasks, which a generator cannot do by itself. The Scheduler implements time slicing with MessageChannel and orders tasks by priority with a min-heap, achieving asynchronous, interruptible updates.
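A minimal min-heap keyed by `sortIndex` looks like this (a hand-rolled sketch in the spirit of the Scheduler's task queue, where `peek` always returns the most urgent task):

```javascript
// Min-heap keyed by sortIndex: peek() returns the task with the
// smallest sortIndex, i.e. the most urgent one.
function push(heap, node) {
  heap.push(node);
  let i = heap.length - 1;
  while (i > 0) {
    const parent = (i - 1) >> 1;
    if (heap[parent].sortIndex <= heap[i].sortIndex) break;
    [heap[parent], heap[i]] = [heap[i], heap[parent]]; // sift up
    i = parent;
  }
}

function peek(heap) {
  return heap.length === 0 ? null : heap[0];
}

function pop(heap) {
  if (heap.length === 0) return null;
  const top = heap[0];
  const last = heap.pop();
  if (heap.length > 0) {
    heap[0] = last;
    let i = 0;
    for (;;) { // sift down
      const left = 2 * i + 1;
      const right = 2 * i + 2;
      let smallest = i;
      if (left < heap.length && heap[left].sortIndex < heap[smallest].sortIndex) smallest = left;
      if (right < heap.length && heap[right].sortIndex < heap[smallest].sortIndex) smallest = right;
      if (smallest === i) break;
      [heap[smallest], heap[i]] = [heap[i], heap[smallest]];
      i = smallest;
    }
  }
  return top;
}
```

The Scheduler pops tasks off such a heap inside its time-sliced loop, so the highest-priority task is always picked first regardless of insertion order.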

The Scheduler uses expiration time to represent priority.

The higher the priority, the shorter the timeout and the closer the expiration time is to now, so the task will be executed soon.

The lower the priority, the longer the timeout and the farther the expiration time is from now, so the task may wait a long time before it executes.
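The mapping from priority level to expiration time can be sketched as follows (the timeout values mirror those in the Scheduler source as I understand them; treat the exact numbers as illustrative):

```javascript
// Priority level → timeout; a task's expiration time = startTime + timeout,
// so higher priority expires sooner and runs sooner.
const ImmediatePriority = 1;
const UserBlockingPriority = 2;
const NormalPriority = 3;
const LowPriority = 4;
const IdlePriority = 5;

function timeoutForPriority(priorityLevel) {
  switch (priorityLevel) {
    case ImmediatePriority: return -1;       // already expired: run as soon as possible
    case UserBlockingPriority: return 250;   // e.g. responding to user input
    case LowPriority: return 10000;
    case IdlePriority: return 1073741823;    // effectively never expires on its own
    case NormalPriority:
    default: return 5000;
  }
}

function expirationTime(startTime, priorityLevel) {
  return startTime + timeoutForPriority(priorityLevel);
}
```

Sorting tasks by this expiration time in the min-heap is exactly what makes "shorter expiration = runs earlier" fall out naturally.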


Lane uses binary bits to represent task priority, which makes priority computation convenient. Different priorities occupy 'tracks' at different bit positions, and there is a concept of batching: the lower the priority, the more 'tracks' it gets. High priority interrupting low priority, and assigning a priority to each new task, are the problems Lane needs to solve.
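A toy version of the lane bit operations looks like this (the lane names and bit layout below are illustrative, in the style of React's lane model; the exact constants differ between React versions):

```javascript
// Lanes as binary bit positions. Lower bit = higher priority.
const NoLanes = 0b0000;
const SyncLane = 0b0001;            // highest priority
const InputContinuousLane = 0b0010;
const DefaultLane = 0b0100;
const TransitionLane = 0b1000;      // lower-priority groups get more 'tracks' in the real model

// The lowest set bit is the highest-priority pending lane: lanes & -lanes
// (two's-complement trick to isolate the least significant set bit).
function getHighestPriorityLane(lanes) {
  return lanes & -lanes;
}

// Merging lane sets is just a bitwise OR, which is why bits make batching cheap.
function mergeLanes(a, b) {
  return a | b;
}

function includesLane(set, lane) {
  return (set & lane) !== NoLanes;
}
```

With bit masks, "does this batch contain a sync update?" or "what is the most urgent pending work?" each cost a single bitwise operation.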


Simply put, when multiple updates are triggered in the same context, they are merged into one update. For example:

onClick() {
  this.setState({ count: this.state.count + 1 });
  this.setState({ count: this.state.count + 1 });
}

In previous React versions, updates outside the current context were not merged, for example multiple setState calls placed inside setTimeout. The reason: for multiple setState calls in the same context, executionContext contains BatchedContext, and setState calls made under BatchedContext are merged. When executionContext equals NoContext, the tasks in SyncCallbackQueue are executed synchronously, so multiple setState calls inside setTimeout are not merged and each executes synchronously.

onClick() {
  setTimeout(() => {
    this.setState({ count: this.state.count + 1 });
    this.setState({ count: this.state.count + 1 });
  });
}

export function batchedUpdates<A, R>(fn: A => R, a: A): R {
  const prevExecutionContext = executionContext;
  executionContext |= BatchedContext;
  try {
    return fn(a);
  } finally {
    executionContext = prevExecutionContext;
    if (executionContext === NoContext) {
      // When executionContext is NoContext, tasks in SyncCallbackQueue are flushed synchronously
      flushSyncCallbackQueue();
    }
  }
}
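The behavior of this flag-based batching can be simulated with a toy model (hand-written simplification of the executionContext / BatchedContext idea; `scheduleSyncCallback` and `flushCount` are names invented for this sketch):

```javascript
// Toy model of context-based batching.
const NoContext = 0b00;
const BatchedContext = 0b01;

let executionContext = NoContext;
const syncCallbackQueue = [];
let flushCount = 0; // counts how many times the queue was flushed

function flushSyncCallbackQueue() {
  flushCount++;
  while (syncCallbackQueue.length) syncCallbackQueue.shift()();
}

function scheduleSyncCallback(cb) {
  syncCallbackQueue.push(cb);
  // Outside any batch (like setState inside setTimeout in legacy mode),
  // flush immediately and synchronously.
  if (executionContext === NoContext) flushSyncCallbackQueue();
}

function batchedUpdates(fn) {
  const prev = executionContext;
  executionContext |= BatchedContext;
  try {
    return fn();
  } finally {
    executionContext = prev;
    // Only flush once the outermost batch exits.
    if (executionContext === NoContext) flushSyncCallbackQueue();
  }
}
```

Inside `batchedUpdates`, many scheduled callbacks produce one flush; outside it, every scheduled callback flushes on its own, which is exactly the legacy-mode setTimeout behavior described above.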

In Concurrent mode, the examples above are also merged into one update. The root cause is in the simplified source below: when setState schedules multiple callbacks, their priorities are compared, and if the new callback's priority equals the existing one, the function returns early and no additional render phase is scheduled.

function ensureRootIsScheduled(root: FiberRoot, currentTime: number) {
  const existingCallbackNode = root.callbackNode; // callback scheduled by an earlier setState
  if (existingCallbackNode !== null) {
    const existingCallbackPriority = root.callbackPriority;
    // If the new setState callback has the same priority as the previous one,
    // reuse the existing callback: this is the batchedUpdate path
    if (existingCallbackPriority === newCallbackPriority) {
      return;
    }
    cancelCallback(existingCallbackNode);
  }
  // Entry point for scheduling the render phase
  newCallbackNode = scheduleCallback(
    schedulerPriorityLevel,
    performConcurrentWorkOnRoot.bind(null, root),
  );
  root.callbackPriority = newCallbackPriority;
  root.callbackNode = newCallbackNode;
}

In Concurrent mode, why do multiple setState calls inside a setTimeout callback have the same priority? Because in requestUpdateLane, the function that obtains the lane, only the first setState satisfies currentEventWipLanes === NoLanes, so all of them end up with the same currentEventWipLanes; and inside findUpdateLane the schedulerLanePriority argument is also the same (the scheduling priority is the same), so the returned lane is the same.

export function requestUpdateLane(fiber: Fiber): Lane {
  // ...
  if (currentEventWipLanes === NoLanes) {
    // Only the first setState satisfies currentEventWipLanes === NoLanes
    currentEventWipLanes = workInProgressRootIncludedLanes;
  }
  // Inside setTimeout, schedulerLanePriority and currentEventWipLanes are the same
  // for every setState, so the returned lane is also the same
  const lane = findUpdateLane(schedulerLanePriority, currentEventWipLanes);
  // ...
  return lane;
}


Suspense can show a pending state while data is being requested and show the data once the request succeeds. The reason is that the priority of the components inside Suspense is very low, while the priority of the Offscreen fallback component is high. When the component inside Suspense resolves, the render phase is rescheduled; this process happens in the updateSuspenseComponent function. See the Suspense debugging video for details.
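The mechanism can be simulated with a toy model (the helpers `createResource` and `renderWithBoundary` are hypothetical names for this sketch, not React APIs): a component throws a thenable while its data is pending, the boundary catches it and renders the fallback, and after the thenable resolves a re-render shows the data.

```javascript
// Toy simulation of the Suspense throw-a-thenable mechanism.
function createResource() {
  let status = 'pending';
  let value;
  const listeners = [];
  return {
    read() {
      if (status === 'pending') {
        // While pending, throw a thenable, as suspending components do
        throw { then(onResolve) { listeners.push(onResolve); } };
      }
      return value;
    },
    resolve(v) {
      status = 'resolved';
      value = v;
      // In React, this is where the render phase gets rescheduled
      listeners.forEach((l) => l());
    },
  };
}

function renderWithBoundary(component, fallback) {
  try {
    return component(); // attempt to render the low-priority content
  } catch (thrown) {
    if (thrown !== null && typeof thrown.then === 'function') {
      return fallback; // a thenable was thrown: show the fallback instead
    }
    throw thrown; // real errors propagate (to error boundaries in React)
  }
}
```

The first render hits the pending resource and yields the fallback; after `resolve`, rendering the same component returns the data.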


Fiber provides data-level support for the concurrent architecture.

Scheduler provides the time-slicing scheduling guarantee for concurrency.

The Lane model provides the update strategy for concurrency.

On top of these, the upper layer implements batchedUpdates and Suspense.

Keywords: React

Added by jokerfool on Wed, 09 Feb 2022 22:11:29 +0200