From the perspective of Flutter and the front end: how to ensure UI fluency under a single-threaded model

The topic of this article is "How to Ensure UI Fluency under a Single-Threaded Model". It is aimed at Flutter's performance principles, but Dart and JavaScript share many concepts and mechanisms, so little here is Dart-specific. JS is also a single-threaded model, and it is similar to Dart in how it handles interface display and IO. I will therefore discuss the topic with comparisons between the two, which makes it easier to grasp the theme of this article and to expand the knowledge horizontally.

From the front-end perspective, we first analyze the event loop and event queue models; then, from the Flutter layer, we discuss the relationship between Dart's event queue and synchronous and asynchronous tasks.

Part 1: The design of the single-threaded model

1. The most basic single-threaded processing of simple tasks

Suppose there are several tasks:

  • Task 1: "Name:" + "Hangcheng Xiaoliu"
  • Task 2: "Birthday:" + "1995" + "02" + "20"
  • Task 3: "Age:" + (2021 - 1995 + 1)
  • Task 4: Print the results of tasks 1, 2, and 3

Executed in a single thread, the code might look like this:

// C++
#include <cstdio>
#include <string>
using std::string;

void mainThread() {
  string name = "Name: Hangcheng Xiaoliu";
  string birthday = string("Birthday: ") + "1995" + "02" + "20";
  int age = 2021 - 1995 + 1;
  printf("Personal info: %s, %s, age: %d\n", name.c_str(), birthday.c_str(), age);
}

The thread starts executing the tasks. As required, the single thread executes each task in turn and exits immediately after all of them are completed.

2. How to deal with new tasks when the thread is running?

The threading model in question 1 is too simple and idealized: we cannot know all n tasks up front, and in most cases m new tasks arrive while the thread is running, so the design in section 1 cannot meet this requirement. **To accept and execute new tasks while the thread is running, an event loop mechanism is required.** The most basic event loop can be thought of as a loop.

// C++
#include <cstdio>
#include <iostream>
using namespace std;

int getInput() {
  int input = 0;
  cout << "Please enter a number: ";
  cin >> input;
  return input;
}

void mainThread() {
  while (true) {
    int input1 = getInput();
    int input2 = getInput();
    int sum = input1 + input2;
    printf("The sum of the two numbers is: %d\n", sum);
  }
}

Compared with the first version of the thread design, this version has made the following improvements:

  • A loop mechanism is introduced, so the thread does not exit immediately after finishing its work.
  • Events are introduced. The thread first waits for user input and is suspended while waiting; when input completes, the thread gets the input and is woken up, performs the addition, and outputs the result. It then goes back to waiting for the next input, endlessly alternating between waiting for input and computing output.

3. Handle tasks from other threads

The threading model in a real environment is far more complex. For example, in a browser the thread may be drawing when it receives a mouse-click event from the user, or a notification that a CSS resource has finished loading from the network. Although the second version introduces an event loop and can accept new event tasks, notice that all of those tasks come from within the thread itself; the design cannot accept tasks from other threads.

As the figure above shows, the main rendering thread frequently receives event tasks from the IO thread: when it receives the message that a resource has finished loading, it starts DOM parsing; when it receives a mouse-click message, it executes the bound mouse-click event script (JS) to process the event.

So we need a suitable data structure to store and retrieve messages sent by other threads.

Everyone has heard of the message queue, and in GUI systems the event queue is the standard solution.

A message queue (event queue) is exactly such a data structure: tasks to be executed are added at the tail of the queue, and tasks are taken from the head of the queue for execution.

With a message queue, the threading model is upgraded as follows:

It can be seen that the transformation is divided into 3 steps:

  • Build a message queue
  • New tasks generated by the IO thread will be added to the tail of the message queue
  • The rendering main thread will cyclically read tasks from the head of the message queue and execute the tasks

Pseudocode for the queue interface:

class TaskQueue {
 public:
  Task fetchTask();          // take one task from the head of the queue
  void addTask(Task task);   // insert a task at the tail of the queue
};

The reworked main thread:

TaskQueue taskQueue;
void processTask(Task task);
void mainThread() {
  while (true) {
    Task task = taskQueue.fetchTask();
    processTask(task);
  }
}

IO thread

void handleIOTask () {
  Task clickTask;
  taskQueue.addTask(clickTask);
}

Tip: the event queue is accessed by multiple threads, so it must be protected by a lock.

4. Handle tasks from other processes

In the browser environment, the rendering process often receives tasks from other processes. The IO thread is dedicated to receiving messages from them, and IPC handles the cross-process communication.

5. Task types in message queues

There are many message types in the message queue. Internal messages include mouse scrolling, clicks, mouse movement, macro tasks, micro tasks, file reads and writes, timers, and so on.

There are also a large number of page-related events in the message queue, such as JS execution, DOM parsing, style calculation, layout calculation, CSS animation, etc.

All of the above events are executed on the main rendering thread, so when coding you should take care to minimize the time these events occupy.

6. How to exit safely

In Chrome's design, when the current page is to be closed, the page's main thread sets an exit-flag variable and checks the flag after each task is executed. If the flag is set, the current task is interrupted and the thread exits.
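A minimal sketch of this exit-flag idea (the names are illustrative, not Chrome's actual implementation): the loop checks the flag after every task and stops even if tasks remain queued.

```javascript
// Simulated event loop with an exit flag: a task may request exit, and the
// loop stops before running the remaining queued tasks.
function runLoop(taskQueue) {
  const executed = [];
  let exitFlag = false;
  while (taskQueue.length > 0 && !exitFlag) {
    const task = taskQueue.shift();         // take a task from the head
    const result = task();                  // execute it
    executed.push(result);
    if (result === "quit") exitFlag = true; // a task requested page exit
  }
  return executed;
}

const queue = [() => "parse DOM", () => "quit", () => "run JS"];
const done = runLoop(queue);
// "run JS" is never executed because the exit flag was set before it ran
```

Checking the flag between tasks (rather than killing the thread mid-task) is what makes the exit safe: each task either completes fully or never starts.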

7. Disadvantages of Single Thread

The event queue is first-in, first-out: a later task may be blocked because an earlier task takes too long to execute, and the later task can only run after the earlier one completes. This causes two problems.

  • How to handle high priority tasks

    Suppose you want to monitor changes to DOM nodes (insertions, deletions, modifications to innerHTML) and then trigger the corresponding logic. The most basic approach is to design a set of listener interfaces that the rendering engine calls synchronously whenever the DOM changes. But there is a big problem with this: the DOM changes frequently, and if the corresponding JS interface is triggered on every change, tasks take a long time to execute and execution efficiency drops.

    If these DOM changes are instead treated as asynchronous messages and placed in the message queue, the current DOM message may not be handled in time because earlier tasks are still executing, which hurts the real-time responsiveness of the monitoring.

    So how do we balance efficiency and real-time responsiveness? Microtasks solve this class of problem.

    Usually, we call a task in the message queue a macro task, and each macro task contains its own micro task queue. If the DOM changes during the execution of a macro task, the change is added to that macro task's micro task queue; this solves the efficiency problem.

    When the main function of the macro task finishes, the rendering engine executes the micro tasks in its micro task queue; this solves the real-time problem.

  • How to solve the problem that a single task takes too long to execute

As we have seen, if JS computation or animation painting takes too long, the page freezes. To avoid this problem, the browser adopts a callback design, that is, deferring the execution of JS tasks.
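The macro task / micro task split described above can be sketched as a synchronous JavaScript simulation (the structure and names are hypothetical, not a real rendering engine): DOM changes that occur inside a macro task are queued as microtasks and flushed right after the macro task's main function, before the next macro task runs.

```javascript
// Each macro task runs to completion, then its accumulated microtasks are
// flushed before the loop moves on to the next macro task.
const log = [];
const macroQueue = [];
let microQueue = [];

function queueMicro(fn) { microQueue.push(fn); }

macroQueue.push(() => {
  log.push("macro task 1");
  // a DOM change during the macro task is batched as a microtask,
  // not handled synchronously
  queueMicro(() => log.push("DOM change observed"));
});
macroQueue.push(() => log.push("macro task 2"));

while (macroQueue.length > 0) {
  macroQueue.shift()();            // run the macro task's main function
  while (microQueue.length > 0) {
    microQueue.shift()();          // then flush its microtask queue
  }
}
// log: ["macro task 1", "DOM change observed", "macro task 2"]
```

Batching gives efficiency (one callback per macro task instead of one per DOM change), while flushing before the next macro task gives real-time behavior.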

Part 2: The single-threaded model in Flutter

1. Event loop mechanism

Dart is single-threaded, which means code executes in order. In addition, as the development language of Flutter, a GUI framework, Dart must support asynchrony.

A Flutter application contains one or more isolates; by default, methods execute in the main isolate. An isolate contains one Event Loop and one Task Queue, where the Task Queue includes an Event Queue and a MicroTask Queue, as follows:

Why is async needed? Because in most scenarios the application is not always computing. For example, it may be waiting for user input and only compute after the input arrives; this is an IO scenario. A single thread can therefore do other things while it waits and handle the operation only when it actually needs processing. So although there is only one thread, it feels as if many things are being done at the same time (the thread does other work in its idle time).

If a task involves IO or asynchrony, the main thread first moves on to other things that need computation, and this switching is driven by the event loop. As in JS, the structure that stores event tasks in Dart is the event queue.

The event queue stores task events that need to be executed, such as DB reads.

There are two queues in Dart, a Microtask Queue and an Event Queue.

The event loop polls continuously: it first checks whether the microtask queue is empty, taking tasks from its head to execute. Only when the microtask queue is empty does it check the event queue; if that is not empty, it takes an event (keyboard, IO, network, etc.) from the head and executes its callback on the main thread, as follows:
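The polling order just described can be sketched as a synchronous JavaScript simulation (the queue names are illustrative, not Dart's real internals): the loop drains the microtask queue completely before taking a single task from the event queue.

```javascript
// Simulated Dart-style event loop: microtasks always run first; events are
// taken one at a time only when no microtasks are pending.
const order = [];
const microtaskQueue = [];
const eventQueue = [];

eventQueue.push(() => order.push("event 1"));
eventQueue.push(() => {
  order.push("event 2");
  // a microtask scheduled during an event jumps ahead of remaining events
  microtaskQueue.push(() => order.push("microtask scheduled by event 2"));
});
microtaskQueue.push(() => order.push("microtask 1"));

while (microtaskQueue.length > 0 || eventQueue.length > 0) {
  if (microtaskQueue.length > 0) {
    microtaskQueue.shift()();   // microtasks have the highest priority
  } else {
    eventQueue.shift()();       // then one event per loop iteration
  }
}
// order: microtask 1, event 1, event 2, microtask scheduled by event 2
```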

2. Asynchronous tasks

Microtasks are asynchronous tasks that complete in a short time. They have the highest priority in the event loop: as long as the microtask queue is not empty, the event loop keeps executing microtasks, and the tasks in the event queue must wait. Microtasks can be created with scheduleMicrotask.

Typically, microtasks are needed in relatively few scenarios. Internally, Flutter uses microtasks in scenarios that require high-priority execution, such as gesture recognition, text input, scroll views, and page rendering effects.

Therefore, for ordinary needs we use the lower-priority Event Queue for asynchronous tasks. IO, drawing, timers, and so on all drive the main thread through the event queue.

Dart provides a layer of encapsulation over Event Queue tasks, called Future. Putting a function body into a Future wraps a synchronous task into an asynchronous one (similar to submitting a task to a queue via GCD in iOS). A Future supports chained calls, executing further tasks (functions) after the asynchronous task completes.

Look at a specific code:

void main() {
  print('normal task 1');
  Future(() => print('Task1 Future 1'));
  print('normal task 2');
  Future(() => print('Task1 Future 2'))
      .then((value) => print("subTask 1"))
      .then((value) => print("subTask 2"));
}
// output:
lbp@MBP  ~/Desktop  dart index.dart
normal task 1
normal task 2
Task1 Future 1
Task1 Future 2
subTask 1
subTask 2

In the main method, an ordinary synchronous task runs first, then an asynchronous task is added via Future. Dart adds the asynchronous task to the event queue and returns immediately, and subsequent code continues to execute synchronously. Another ordinary synchronous task runs, and then another asynchronous task is added via Future, so there are now 2 asynchronous tasks in the event queue. Dart takes tasks from the head of the event queue and executes them in order (first in, first out); after a Future's body completes, its subsequent then callbacks execute.

A Future and its then callbacks share one event-loop task; if there are multiple thens, they execute in order.
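A toy model of this behavior (not Dart's real Future implementation): the Future body and its registered then callbacks execute as a single event-queue task, with the thens running in registration order.

```javascript
// Toy Future: the body plus all chained thens run as ONE event-queue task.
const output = [];
const eventQueue = [];

class ToyFuture {
  constructor(fn) {
    this.callbacks = [];
    eventQueue.push(() => {
      let value = fn();                               // run the body
      for (const cb of this.callbacks) value = cb(value); // then the thens, in order
    });
  }
  then(cb) { this.callbacks.push(cb); return this; }
}

new ToyFuture(() => output.push("Task1 Future 1"));
new ToyFuture(() => output.push("Task1 Future 2"))
  .then(() => output.push("subTask 1"))
  .then(() => output.push("subTask 2"));

while (eventQueue.length > 0) eventQueue.shift()(); // drain the event loop
// output: Task1 Future 1, Task1 Future 2, subTask 1, subTask 2
```

This reproduces the ordering of the Dart example above: the second Future's thens run immediately after its body, not as separate queue entries.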

Example 2:

void main() {
  Future(() => print('Task1 Future 1'));
  Future(() => print('Task1 Future 2'));

  Future(() => print('Task1 Future 3'))
      .then((_) => print('subTask 1 in Future 3'));

  Future(() => null).then((_) => print('subTask 1 in empty Future'));
}
lbp@MBP  ~/Desktop  dart index.dart
Task1 Future 1
Task1 Future 2
Task1 Future 3
subTask 1 in Future 3
subTask 1 in empty Future

In the main method, Future 1's task is added to the Event Queue by Dart, then Future 2's task, then Future 3's task. subTask 1 of Future 3 runs in the same event-queue task as Future 3's body. The body of the fourth Future is empty, so the code in its then is added to the Microtask Queue and executed in the next round of the event loop.

Comprehensive example

import 'dart:async';

void main() {
  Future(() => print('Task1 Future 1'));
  Future fx = Future(() => null);
  Future(() => print("Task1 Future 3")).then((value) {
    print("subTask 1 Future 3");
    scheduleMicrotask(() => print("Microtask 1"));
  }).then((value) => print("subTask 3 Future 3"));

  Future(() => print("Task1 Future 4"))
      .then((value) => Future(() => print("sub subTask 1 Future 4")))
      .then((value) => print("sub subTask 2 Future 4"));

  Future(() => print("Task1 Future 5"));

  fx.then((value) => print("Task1 Future 2"));

  scheduleMicrotask(() => print("Microtask 2"));

  print("normal Task");
}
lbp@MBP  ~/Desktop  dart index.dart
normal Task
Microtask 2
Task1 Future 1
Task1 Future 2
Task1 Future 3
subTask 1 Future 3
subTask 3 Future 3
Microtask 1
Task1 Future 4
Task1 Future 5
sub subTask 1 Future 4
sub subTask 2 Future 4

Explanation:

  • The Event Loop first executes the synchronous tasks of the main method, then the microtasks, and finally the asynchronous tasks in the Event Queue, so normal Task executes first
  • For the same reason, Microtask 2 executes next
  • Next, since the Event Queue is FIFO, Task1 Future 1 executes
  • The body of fx is empty, so the content of its then is added to the microtask queue; microtasks have the highest priority, so Task1 Future 2 executes
  • Next, Task1 Future 3 executes. It has 2 thens: subTask 1 Future 3 in the first then runs first, and the microtask it schedules (Microtask 1) is added to the microtask queue to wait for the next event-loop iteration. Then subTask 3 Future 3 in the second then runs. When the next event-loop iteration arrives, Microtask 1 executes
  • Next, Task1 Future 4 executes. The task in its first then is wrapped by Future into an asynchronous task and added to the Event Queue, and the content of the second then is added to the Event Queue after it
  • Next, Task1 Future 5 executes, and this event-loop iteration ends
  • In the following rounds, sub subTask 1 Future 4 and sub subTask 2 Future 4 are printed from the queue

3. Asynchronous functions

The result of an asynchronous function is not available until some time in the future, so a Future object is returned for the caller to use. The caller decides whether to register a then on the Future and continue asynchronously after it completes, or to wait synchronously until the Future finishes. To wait synchronously, add await at the call site, and the function containing the call must be marked with the async keyword.

await is not a synchronous wait but an asynchronous one. The Event Loop also treats the function containing the await as asynchronous, and adds the context of the waiting statement as a whole to the Event Queue. Once the awaited Future returns, the Event Loop takes that context out of the Event Queue and the code after the await continues to execute.

await blocks the execution of subsequent code in the current context, but it cannot block subsequent code higher up in its call stack.

void main() {
  Future(() => print('Task1 Future 1'))
      .then((_) async => await Future(() => print("subTask 1 Future 2")))
      .then((_) => print("subTask 2 Future 2"));
  Future(() => print('Task1 Future 2'));
}
lbp@MBP  ~/Desktop  dart index.dart
Task1 Future 1
Task1 Future 2
subTask 1 Future 2
subTask 2 Future 2

Analysis:

  • Future 1's Task1 is added to the Event Queue. Next, the first then is encountered: its body is another asynchronous task wrapped in a Future, so Future(() => print("subTask 1 Future 2")) is added to the Event Queue, and the awaiting context is queued behind it; the second then is also added to the Event Queue
  • 'Task1 Future 2' in the second Future is not blocked by the await, because await is an asynchronous wait (queued in the Event Queue). So 'Task1 Future 2' executes first, then "subTask 1 Future 2", and then the awaited context resumes and subTask 2 Future 2 executes
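The claim that await suspends only its own context can be sketched with an explicit event queue (a simplified model of the scheduling, with hypothetical names), mirroring the output order of the Dart example above:

```javascript
// An "async function" modeled as two parts split at its await point: the
// part before the await runs now; the continuation is queued behind the
// awaited task, while the caller keeps running.
const trace = [];
const eventQueue = [];

function asyncBody() {
  trace.push("before await");
  eventQueue.push(() => trace.push("awaited Future")); // the Future being awaited
  eventQueue.push(() => trace.push("after await"));    // continuation, queued behind it
}

asyncBody();
trace.push("caller continues"); // the caller is NOT blocked by the await

while (eventQueue.length > 0) eventQueue.shift()(); // drain the event loop
// trace: before await, caller continues, awaited Future, after await
```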

4. Isolate

To utilize multi-core CPUs, Dart handles CPU-intensive computation with its multi-threading mechanism, the Isolate. Each Isolate has isolated resources and its own Event Loop, Event Queue, and Microtask Queue. Isolates share no resources and communicate through message passing (like processes).

Using one is very simple: you pass an entry function and a parameter when creating it.

import 'dart:isolate';

void coding(String language) {
  print("hello " + language);
}

void main() {
  Isolate.spawn(coding, "Dart");
}
lbp@MBP  ~/Desktop  dart index.dart
hello Dart

Most of the time, more than just concurrent execution is needed: a sub-Isolate may have to report its result back to the main Isolate when its work completes. Message communication is achieved through the Isolate's pipe (SendPort): the main Isolate passes a send port as a parameter to the sub-Isolate, and when the sub-Isolate finishes, it uses that port to send the result back to the main Isolate.

import 'dart:isolate';

void coding(SendPort port) {
  const sum = 1 + 2;
  // send the result back to the caller
  port.send(sum);
}

void main() {
  testIsolate();
}

Future<void> testIsolate() async {
  ReceivePort receivePort = ReceivePort(); // create the pipe
  Isolate? isolate = await Isolate.spawn(coding, receivePort.sendPort); // create the Isolate, passing the send port as a parameter
  // listen for messages
  receivePort.listen((message) {
    print("data: $message");
    receivePort.close();
    isolate?.kill(priority: Isolate.immediate);
    isolate = null;
  });
}
lbp@MBP  ~/Desktop  dart index.dart
data: 3

In addition, Flutter provides a shortcut for concurrent computation tasks: the compute function, which internally encapsulates Isolate creation and two-way communication.

In practice there are few scenarios in business development that need compute; JSON encoding and decoding is one example.

Calculate the factorial:

// requires: import 'package:flutter/foundation.dart';
Future<int> testCompute() async {
  return await compute(syncCalcuateFactorial, 100);
}

int syncCalcuateFactorial(int upperBounds) => upperBounds < 2
    ? upperBounds
    : upperBounds * syncCalcuateFactorial(upperBounds - 1);

To summarize:

  • Dart is single-threaded, but it achieves asynchrony through the event loop
  • Future is the encapsulation of an asynchronous task. With await and async, we get non-blocking synchronous-style waiting through the event loop
  • Isolate is Dart's multi-threading mechanism for concurrency. Each Isolate has its own event loop and queues and its own isolated resources. Isolates communicate through message passing, and the messages drive each other's asynchronous processing through each Isolate's event loop
  • Flutter provides the compute method for CPU-intensive work, which internally encapsulates the Isolate and the communication between Isolates
  • The concepts of the event queue and the event loop are very important in GUI systems; they exist in the front end, Flutter, iOS, Android, and even NodeJS.