Flutter: rendering video on all platforms with FFI + CustomPainter

Flutter video rendering series

Chapter 1 Android uses Texture to render video
Chapter 2 Windows uses Texture to render video
Chapter 3 Linux uses Texture to render video
Chapter 4 All platforms use FFI + CustomPainter to render video (this chapter)



Foreword

The previous chapters showed how Flutter renders video with textures, but that approach has a drawback: a separate set of native texture-creation code must be written for each platform, which makes maintenance harder. Ideally one codebase should run on every platform. The idea explored here is to implement cross-platform video capture in C++, hand the frame data to Dart through FFI, and draw the image with a canvas widget. Testing showed that FFI combined with CustomPainter is a workable way to render video, and it lets a single codebase run on all platforms (except the web).


1. Implementation

1. Capture video frames in C/C++

(1) Write the C++ code

A player is one form of video capture; the code below is a simple definition of a player. An example ffplay.h follows:

// Prototype of the render callback
typedef void (*DisplayEventHandler)(void* play, unsigned char* data[8], int linesize[8], int width, int height, AVPixelFormat format);
// Create a player
void* play_create();
// Destroy a player
void play_destory(void*);
// Set the render callback
void play_setDisplayCallback(void*, DisplayEventHandler callback);
// Start playback (asynchronous)
void play_start(void*, const char*);
// Start playback (synchronous, blocking)
void play_exec(void*, const char*);
// Stop playback
void play_stop(void*);

(2) Write the CMakeLists.txt

Each platform needs its own CMakeLists.txt.

  • CMakeLists.txt for Windows and Linux (partial)
# Project-level configuration.
set(PROJECT_NAME "ffplay_plugin")
project(${PROJECT_NAME} LANGUAGES CXX)

# This value is used when generating builds using this plugin, so it must
# not be changed.
set(PLUGIN_NAME "ffplay_plugin_plugin")

# Define the plugin library target. Its name must not be changed (see comment
# on PLUGIN_NAME above).
#
# Any new source files that you add to the plugin should be added here.
add_library(${PLUGIN_NAME} SHARED
  "ffplay_plugin.cc"
  "../ffi/ffplay.cpp"
  "../ffi/DllImportUtils.cpp"
)
target_link_libraries(${PLUGIN_NAME} PRIVATE flutter)
  • Android's JNI CMakeLists.txt (partial)
add_library( # Sets the name of the library.
        ffplay_plugin_plugin
        # Sets the library as a shared library.
        SHARED
        # Provides a relative path to your source file(s).
        ../../../../ffi/ffplay.cpp
        ../../../../ffi/DllImportUtils.cpp
        )
target_link_libraries( # Specifies the target library.
                       ffplay_plugin_plugin
                       # Links the target library to the log library
                       # included in the NDK.
                       ${log-lib}
                       android
                       )

2. Import the C/C++ functions with FFI

(1) Dependencies

import 'dart:ffi'; // For FFI
import 'package:ffi/ffi.dart';
import 'dart:io'; // For Platform.isX

(2) Load the dynamic library

Load the dynamic library according to the platform: on Windows it is a .dll, on the other platforms a .so. The library name is determined by the CMakeLists.txt above.

final DynamicLibrary nativeLib = Platform.isWindows
    ? DynamicLibrary.open("ffplay_plugin_plugin.dll")
    : DynamicLibrary.open("libffplay_plugin_plugin.so");

(3) Define the functions

The functions in ffplay.h correspond to the following Dart definitions:
main.dart

// Prototype of the render callback
typedef display_callback = Void Function(Pointer<Void>, Pointer<Pointer<Uint8>>,
    Pointer<Int32>, Int32, Int32, Int32);
// Create a player
final Pointer<Void> Function() play_create = nativeLib
    .lookup<NativeFunction<Pointer<Void> Function()>>('play_create')
    .asFunction();
// Destroy a player
final void Function(Pointer<Void>) play_destory = nativeLib
    .lookup<NativeFunction<Void Function(Pointer<Void>)>>('play_destory')
    .asFunction();
// Set the render callback
final void Function(Pointer<Void>, Pointer<NativeFunction<display_callback>>)
    play_setDisplayCallback = nativeLib
        .lookup<
                NativeFunction<
                    Void Function(Pointer<Void>,
                        Pointer<NativeFunction<display_callback>>)>>(
            'play_setDisplayCallback')
        .asFunction();
// Start playback (asynchronous)
final void Function(Pointer<Void>, Pointer<Int8>) play_start = nativeLib
    .lookup<NativeFunction<Void Function(Pointer<Void>, Pointer<Int8>)>>(
        'play_start')
    .asFunction();
// Start playback (synchronous, blocking)
final void Function(Pointer<Void>, Pointer<Int8>) play_exec = nativeLib
    .lookup<NativeFunction<Void Function(Pointer<Void>, Pointer<Int8>)>>(
        'play_exec')
    .asFunction();
// Stop playback
final void Function(Pointer<Void>) play_stop = nativeLib
    .lookup<NativeFunction<Void Function(Pointer<Void>)>>('play_stop')
    .asFunction();
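One subtlety: play_start and play_exec take a const char*, so a Dart String must first be converted to a native UTF-8 string with toNativeUtf8 from package:ffi. A minimal sketch (the file path is hypothetical, and it assumes the C side copies the string when play_start is used asynchronously):

```dart
import 'dart:ffi';
import 'package:ffi/ffi.dart';

void startPlayback() {
  final player = play_create();
  // Convert the Dart string to a heap-allocated native UTF-8 string.
  final curl = "test.mp4".toNativeUtf8(); // hypothetical path
  play_start(player, curl.cast<Int8>());
  // Freeing immediately is only safe if the C side copies the string;
  // otherwise free it after play_stop.
  malloc.free(curl);
}
```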

3. Run the capture thread in an Isolate

Since Flutter's isolate model does not allow threads to share data, and Dart global variables are thread-local (TLS), a thread created in C/C++ cannot hand the playback data directly to the main thread for rendering. Instead, create an Isolate in Dart, run the C/C++ player on it, and send the data to the main isolate through a SendPort.

(1) Define the entry method

The entry method serves as the child-thread function.
main.dart

// Isolate communication port
SendPort? m_sendPort;
// Isolate entry method
  static isolateEntry(SendPort sendPort) async {
    // Save the sendPort
    m_sendPort = sendPort;
    // Playback logic. It must block here: the simplest way is to block inside
    // the playback call itself; alternatively, add a C/C++ message queue to
    // schedule communication with the stream threads.
    // With the blocking approach, the render callback runs on this thread and
    // uses sendPort to send the video data to the main isolate; the callback
    // must be invoked on this thread.

    // Notify the main isolate that playback has ended
    sendPort.send([1]);
  }
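As a concrete sketch (not the author's exact code), the entry method can register a render callback created with Pointer.fromFunction and then block in play_exec; the callback copies each frame out of native memory and posts it to the main isolate. It assumes the FFI bindings from section 2 plus dart:typed_data; the [0, pixels, linesize, width, height] message layout and the file path are assumptions of this sketch:

```dart
// Render callback: runs on this isolate's thread while play_exec blocks.
void onDisplay(Pointer<Void> play, Pointer<Pointer<Uint8>> data,
    Pointer<Int32> linesize, int width, int height, int format) {
  // Copy the RGBA plane: the native pointer is only valid inside the callback.
  final pixels = Uint8List.fromList(data[0].asTypedList(linesize[0] * height));
  // Message type 0: a video frame (assumed layout).
  m_sendPort?.send([0, pixels, linesize[0], width, height]);
}

static isolateEntry(SendPort sendPort) async {
  m_sendPort = sendPort;
  final player = play_create();
  play_setDisplayCallback(
      player, Pointer.fromFunction<display_callback>(onDisplay));
  final curl = "test.mp4".toNativeUtf8(); // hypothetical path
  play_exec(player, curl.cast<Int8>()); // blocks until playback stops
  malloc.free(curl);
  play_destory(player);
  // Message type 1: playback finished.
  sendPort.send([1]);
}
```

Note that Pointer.fromFunction only accepts static or top-level functions, and the resulting callback must be invoked on the isolate that created it, which is exactly why the blocking play_exec design is used here.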

(2) Create the Isolate

With the entry method in place, the Isolate can be created as follows:
main.dart

  startPlay() async {
    ReceivePort receivePort = ReceivePort();
    // Creating an Isolate is comparable to creating a worker thread
    await Isolate.spawn(isolateEntry, receivePort.sendPort);
    // Listen for messages from the Isolate on the port
    await for (var msg in receivePort) {
      // Handle the video data sent over by the Isolate
      int type = msg[0];
      if (type == 1) {
        // Playback ended
        break;
      }
    }
  }

4. Draw with CustomPainter

(1) Custom painting

Custom painting requires subclassing CustomPainter and overriding the paint method, which draws a ui.Image. The ui.Image is obtained by decoding the RGBA frame data.
main.dart

import 'dart:ui' as ui;
// The image to render
ui.Image? image;
// Notifies the widget to repaint
ChangeNotifier notifier = ChangeNotifier();
// Custom painter
class MyCustomPainter extends CustomPainter {
  // Notifier that triggers repainting
  ChangeNotifier flag;
  MyCustomPainter(this.flag) : super(repaint: flag);

  @override
  void paint(Canvas canvas, ui.Size size) {
    // Draw the image
    if (image != null) canvas.drawImage(image!, Offset(0, 0), Paint());
  }

  @override
  bool shouldRepaint(MyCustomPainter oldDelegate) => true;
}

(2) Lay out the interface

Use the custom painter in the widget tree and pass in a ChangeNotifier object to trigger repaints.
main.dart

  
  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(
        title: Text(widget.title),
      ),
      // Widget layout
      body: Center(
        child: Row(
          mainAxisAlignment: MainAxisAlignment.center,
          children: <Widget>[
            Container(
              width: 640,
              height: 360,
              child: Center(
                child: CustomPaint(
                  foregroundPainter: MyCustomPainter(notifier),
                  child: Container(
                    width: 640,
                    height: 360,
                    color: Color(0x5a00C800),
                  ),
                ),
              ),
            )
          ],
        ),
      ),
      floatingActionButton: FloatingActionButton(
        onPressed: onClick,
        tooltip: 'play or stop',
        child: Icon(Icons.add),
      ),
    );
  }

(3) Draw the video frame

After the playback data arrives on the main isolate, the RGBA data must be converted into a ui.Image object; ui.decodeImageFromPixels does this directly.
main.dart

ui.decodeImageFromPixels(pixels, width, height, ui.PixelFormat.rgba8888,
    (result) {
  image = result;
  // Notify the painter to redraw
  notifier.notifyListeners();
}, rowBytes: linesize, targetWidth: 640, targetHeight: 360);
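Tying sections 3 and 4 together, the main isolate's receive loop can feed each frame message straight into ui.decodeImageFromPixels. This sketch assumes the Isolate sends frames as [0, pixels, linesize, width, height], which is an assumed message layout for illustration rather than a fixed protocol:

```dart
await for (var msg in receivePort) {
  int type = msg[0];
  if (type == 1) break; // playback finished
  if (type == 0) {
    // Assumed layout: [0, Uint8List pixels, linesize, width, height]
    ui.decodeImageFromPixels(
        msg[1], msg[3], msg[4], ui.PixelFormat.rgba8888, (result) {
      image = result;
      notifier.notifyListeners(); // trigger a repaint
    }, rowBytes: msg[2], targetWidth: 640, targetHeight: 360);
  }
}
```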

2. Effect preview

A basic running effect


3. Performance comparison

In fact, during exploration the RawImage approach was also tried and did display the picture, but its CPU usage was so high that it could not be used for real development. The method in this article also turns out to be only moderately performant: there is still a gap compared with Texture rendering, but it is usable.
Test platform: Windows 11
Test device: i7-8750H, integrated graphics
Data recording: average of 5 samples taken within 30 seconds

Rendering method in this article

video              control display size   cpu usage (%)   gpu usage (%)
h264 320p 30fps    320p                   1.82            4.56
h264 1080p 30fps   360p                   13.4            4.84
h264 1080p 30fps   1080p                  13.04           15.14

Texture rendering

video              control display size   cpu usage (%)   gpu usage (%)
h264 320p 30fps    320p                   1.28            5.06
h264 1080p 30fps   360p                   4.26            12.66
h264 1080p 30fps   1080p                  4.78            14.72

As the tables show, this article's method performs acceptably at small resolutions. At higher resolutions CPU usage rises considerably, while GPU usage is affected mainly by the control's display size. The Texture method performs better and fluctuates less.


4. Complete code

https://download.csdn.net/download/u013113678/87121930
Note: the performance of this implementation is not especially good; download according to your needs.
The Flutter project containing the complete code runs successfully on Flutter 3.0.4 and 3.3.8; iOS and macOS implementations are not yet included.


Summary

That covers today's topic. Using FFI + CustomPainter to render video is a method the author explored; the principle is not complicated, and the performance is only barely adequate, so it is best suited to rendering small pictures. Publishing it as an article marks a checkpoint from which to keep optimizing. Overall, it is a good example and a solution worth exploring.


Origin blog.csdn.net/u013113678/article/details/127990764