GStreamer Explained

Basics

Background

Historically, Linux has lagged far behind other operating systems in multimedia. Microsoft's Windows and Apple's macOS have long had good support for multimedia devices and for media creation, playback, and real-time processing. Linux's contribution to multimedia applications, by contrast, has been relatively small, which has also made it difficult for Linux to compete with Windows and macOS in professional-grade software. GStreamer is designed to solve these problems for Linux multimedia.

GStreamer is a very powerful and versatile streaming-media application framework. It is not limited to audio and video processing; it can handle any type of data stream. Its main advantage is that its pluggable components can be connected into arbitrary pipelines, which makes it possible to use GStreamer to write general-purpose, editable audio/video applications.

The GStreamer framework is based on plugins. All plugins can be linked into any defined data pipeline.

Official website: https://gstreamer.freedesktop.org/

Summary

Advantages of GStreamer
1. Clear and powerful structure
GStreamer provides a set of clear interfaces; both application programmers and plug-in programmers who build media pipelines can use these APIs conveniently.
2. Object-oriented design
GStreamer is built on the GLib 2.0 object model and uses its mechanism of signals and object properties.
3. Flexibility and extensibility
All GStreamer objects can be extended through GObject inheritance.
All plug-ins are loaded dynamically and can be extended or upgraded independently.
4. Separation of core and plug-ins (core/plugins)
All media-processing functionality is supplied to the core by external plug-ins, which tell the core how to handle specific media types.


Element

The element is the most important concept in GStreamer.

You create a series of elements and connect them so that data can flow between the connected elements.

Several elements can be connected together to create a pipeline that accomplishes a specific task, for example media playback or recording.

For programmers, one of the most important concepts in GStreamer is the GstElement object. Elements are the basic building blocks of a media pipeline, and each element corresponds to a GstElement. Any decoder, encoder, demuxer, or video/audio output element is in fact a GstElement object.

The figure below is an example of a pipeline that reads a file, decodes the audio and video, and plays each separately.

[Figure: example pipeline: file source -> demuxer -> audio/video decoders -> audio/video outputs]
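
A hedged command-line sketch of such a pipeline (movie.ogg is a placeholder path; oggdemux/theoradec/vorbisdec assume an Ogg file with Theora video and Vorbis audio):

gst-launch-1.0 filesrc location=movie.ogg ! oggdemux name=demux \
    demux. ! queue ! theoradec ! videoconvert ! autovideosink \
    demux. ! queue ! vorbisdec ! audioconvert ! autoaudiosink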

Source element

A source element produces data for the pipeline, for example by reading from a disk or a sound card. The image below visualizes a source element; we always draw the **source pad** at the right end of the element.

[Figure: source element with a single source pad on its right]

A source element does not receive data, it only generates data. You can see this in the image above: there is only a source pad (at the right end).

Filter and filter-like elements

Filters and filter-like elements have both input and output pads. They operate on data received on their input pads and feed the result to their output pads. A volume element (filter), a video converter, an Ogg demuxer, and a Vorbis decoder are all elements of this type.

Filter-like elements can have any number of source and sink pads. A decoder, for example, has only one sink pad and one source pad, while a video demuxer may have one sink pad and several source pads, one for each elementary stream contained in the file.

[Figure: filter-like elements: a filter with one sink and one source pad, and a demuxer with one sink pad and multiple source pads]

Sink element

A sink element is the end point of a media pipeline; it receives data but does not produce any. Writing to disk, playing audio through a sound card, and video output are all implemented by sink elements. The figure below shows a sink element.

[Figure: sink element with a single sink pad]

Linking elements together

You build a media pipeline by linking a source element, zero or more filter-like elements, and a sink element; data then flows through these elements. This is the basic way media is handled in GStreamer.

[Figure: a complete pipeline: source element -> filter element -> sink element]
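
A minimal runnable sketch of this source -> filter -> sink pattern (autovideosink picks a suitable display sink for the platform):

gst-launch-1.0 videotestsrc ! videoconvert ! autovideosink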

Pads

Pads are used in GStreamer to link elements together and let data flow through those links. A pad is an element's external interface and can be thought of as a socket or port of the element; links between elements are made through pads. Data flows from the source pad of one element to the sink pad of another. A pad's capabilities (caps) determine the types of media an element can handle.

Pads restrict the type of data that flows through them: a link between two pads succeeds only if the data types the two pads allow are compatible. This process is called negotiation.

Pad information can be viewed through gst-inspect-1.0

Pad Templates:
  SINK template: 'sink'                    ------> sink pad: data flows in
    Availability: Always                   ------> pad availability: always (permanent)
    Capabilities:                          ------> caps supported by the pad
      video/quicktime
      video/mj2
      audio/x-m4a
      application/x-3gp

  SRC template: 'video_%u'                 ------> src pad: data flows out
    Availability: Sometimes                ------> pad availability: sometimes
    Capabilities:
      ANY

  SRC template: 'audio_%u'
    Availability: Sometimes
    Capabilities:
      ANY

  SRC template: 'subtitle_%u'
    Availability: Sometimes
    Capabilities:
      ANY

As you can see above, each pad has the following attributes: padname, direction, presence, caps.

  • padname: the pad's name

  • direction: the pad's data direction, either src or sink

  • presence: the pad's availability. There are permanent (GST_PAD_ALWAYS), sometimes (GST_PAD_SOMETIMES), and request (GST_PAD_REQUEST) pads. A request pad is created only when the application explicitly calls gst_element_request_pad(), and a sometimes pad comes and goes depending on the input data. The meanings are just as the names suggest: always pads exist permanently, sometimes pads exist only under certain conditions (pads that disappear again also count as sometimes pads), and request pads appear only when explicitly requested by the application.

  • caps: the capabilities the pad supports

References:
https://blog.csdn.net/houxiaoni01/article/details/98509594

Object orientation in GStreamer

GStreamer simulates object orientation in C, based mainly on the data types of the GObject library within GLib. GObject is a library that lets us write object-oriented programs in C.

Many people have been taught that writing object-oriented programs requires an object-oriented language such as C++, Java, or C#, while C is for structured programming. In fact, object orientation is a programming idea, not a programming language; in other words, it is the rules of a game rather than the game itself. GObject shows that object-oriented ideas can be applied when programming in C.

In the GObject world, a class is the combination of two structures: an instance structure and a class structure. For example, MyObject is an instance structure and MyObjectClass is a class structure; together they are called the MyObject class.

#include <glib-object.h>

/* instance structure */
typedef struct _MyObject {
        GObject parent_instance;
} MyObject;

/* class structure */
typedef struct _MyObjectClass {
        GObjectClass parent_class;
} MyObjectClass;

/* let the GObject type system know you have defined this class;
 * G_DEFINE_TYPE expects my_object_class_init() and my_object_init() */
G_DEFINE_TYPE(MyObject, my_object, G_TYPE_OBJECT)

static void my_object_class_init(MyObjectClass *klass) { }
static void my_object_init(MyObject *self) { }

Object creation in GObject follows these principles:

When the first instance of a class is created, the class structure is allocated first and then the instance structure; for all later instances, only the instance structure is allocated. In other words, everything in the class structure is shared by all instances of the class, while each object gets its own dedicated instance structure when it is instantiated.

You may notice that the first member of the MyObject instance structure is a GObject structure, and the first member of the MyObjectClass class structure is a GObjectClass structure. In fact, GObject and GObjectClass are the instance structure and class structure of the GObject class. Placing them as the first members of MyObject's instance and class structures is what makes the MyObject class inherit from the GObject class.

Every class must be defined as two structures: its class structure and its instance structure. The first member of every class structure must be a GTypeClass structure, and the first member of every instance structure must be a GTypeInstance structure.
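
A minimal sketch of instantiating the class defined above (my_object_get_type() is generated by the G_DEFINE_TYPE macro; the assertion only illustrates the shared class structure):

int main(void)
{
    /* the first g_object_new() allocates MyObjectClass as well;
     * the second only allocates another instance structure */
    MyObject *a = g_object_new(my_object_get_type(), NULL);
    MyObject *b = g_object_new(my_object_get_type(), NULL);

    /* both instances point at the same shared class structure */
    g_assert(G_OBJECT_GET_CLASS(a) == G_OBJECT_GET_CLASS(b));

    g_object_unref(a);
    g_object_unref(b);
    return 0;
}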

GStreamer multithreading

GStreamer is a framework that supports multithreading, and it is thread-safe.

Multithreading in a GStreamer pipeline is introduced with the queue element: the parts of the pipeline before and after a queue execute in separate threads.

[Figure: a queue element splitting a pipeline into two threads]
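
A minimal sketch: in the pipeline below, everything downstream of the queue runs in a separate thread from the source.

gst-launch-1.0 videotestsrc ! queue ! videoconvert ! autovideosink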

Utilities

gst-inspect is used to view the information of a plug-in; under GStreamer 1.0 it is gst-inspect-1.0.

gst-launch is used to start a pipeline; under GStreamer 1.0 it is gst-launch-1.0.

gst-launch is mainly used to try out or debug a pipeline temporarily. Once a pipeline design is finalized, it should be built with the C API provided by GStreamer instead.

So that our own plug-ins can be detected by GStreamer, the GST_PLUGIN_PATH environment variable can be set to add extra plug-in search paths.
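
For example (the directory is a placeholder for wherever your plug-in's .so files are installed):

export GST_PLUGIN_PATH=/path/to/my/plugins:$GST_PLUGIN_PATH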

Before gst-launch starts, it first scans the .so library of every plug-in it needs. If a library is found but cannot be loaded successfully, the plug-in is added to a blacklist. To inspect the blacklist, just run gst-inspect-1.0 -b.

A broken plug-in reports the reason for being blacklisted only the first time it is scanned; subsequent scans do not report it. To see the reason again, delete the corresponding cache and rescan:

rm ~/.cache/gstreamer-1.0/registry.x86_64.bin
gst-inspect-1.0 -b

GStreamer's runtime debugging level is set through the GST_DEBUG environment variable; different levels print different amounts of information.

There are six levels in total, [0, 5]:
0: print nothing
1: print GST_ERROR() messages
2: print GST_ERROR() and GST_WARNING() messages
3: print GST_ERROR(), GST_WARNING(), and GST_INFO() messages
4: print GST_ERROR(), GST_WARNING(), GST_INFO(), and GST_DEBUG() messages
5: print GST_ERROR(), GST_WARNING(), GST_INFO(), GST_DEBUG(), and GST_LOG() messages
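
For example, to run a throwaway pipeline with errors and warnings printed:

GST_DEBUG=2 gst-launch-1.0 videotestsrc num-buffers=10 ! fakesink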

Introduction to common GStreamer plug-ins

GStreamer has a number of plug-ins commonly used to build pipelines; here is an introduction to some of them.

v4l2src (Video for Linux 2 source)

Video for Linux 2 is the unified interface the kernel provides for applications to access video capture drivers, and v4l2src is the element that exposes it to GStreamer. It belongs to the video4linux2 plugin, whose dynamic library file is libgstvideo4linux2.so.


Commonly used v4l2src attributes:

device: specifies the Linux device file, e.g. /dev/video0

num-buffers: specifies how many frames to read from the device before stopping.

The output caps of v4l2src can be specified externally (with a caps filter).

All of this can be discovered with gst-inspect-1.0 v4l2src.


v4l2src example:

gst-launch-1.0 v4l2src device=/dev/video0 num-buffers=60 ! video/x-raw,width=1280,height=720 ! fakesink

Here fakesink is a universal sink element used to terminate a pipeline while debugging.

Official documentation:

https://gstreamer.freedesktop.org/documentation/video4linux2/v4l2src.html?gi-language=c

Its content is similar to running gst-inspect-1.0 v4l2src directly.

filesrc

filesrc is one of GStreamer's core plug-ins; it reads a file from the file system.

Common attributes:

location: the file path

num-buffers: specifies how many buffers to read before stopping.

filesink

Used to store a stream into a file.

Common attributes:

location: file location
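
A trivial sketch combining filesrc and filesink, copying a file byte-for-byte through a pipeline (both paths are placeholders):

gst-launch-1.0 filesrc location=in.mp4 ! filesink location=out.mp4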

qtdemux (QuickTime demuxer)

Used to separate the audio and video contained in a file. In general this element comes right after the file is read, although some video files do not need it; that depends on the file itself.

Example: gst-launch-1.0 filesrc location=flower_groundtruth.mp4 ! qtdemux ! h264parse ! omxh264dec ! omxh264enc ! filesink location=output.mp4

Function: read file -> extract video -> parse H.264 stream -> H.264 decode -> H.264 encode -> write file

videotestsrc

The test video source that comes with GStreamer.

Common attributes:

pattern: the test pattern to display

[Figure: available videotestsrc pattern values]

Example: gst-launch-1.0 videotestsrc pattern=2 ! ximagesink

Here ximagesink displays the video in an X11 window.

fakesrc and fakesink

The universal src and sink elements that come with GStreamer; fakesrc produces dummy data and fakesink discards whatever it receives.

videocrop

Crops video; caps can be specified externally, as with v4l2src.

Common attributes:

[Figure: videocrop properties (top, bottom, left, right)]
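
A minimal sketch using videocrop's top/bottom/left/right properties (values in pixels):

gst-launch-1.0 videotestsrc ! videocrop top=8 bottom=8 left=8 right=8 ! ximagesink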

videoscale

Changes the video size; caps can be specified externally, as with v4l2src.

Common attributes:

method: interpolation method

n-threads: number of threads
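
A minimal sketch that scales the test source to 640x480 (the caps filter after videoscale selects the output size):

gst-launch-1.0 videotestsrc ! videoscale ! video/x-raw,width=640,height=480 ! ximagesink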

videoconvert

Changes the video format. It is usually placed between two plug-ins whose formats do not match, acting as an intermediary that reconciles them; caps can be specified externally, as with v4l2src.
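
A minimal sketch that converts the test source to NV12 before the sink (the caps filter selects the target format):

gst-launch-1.0 videotestsrc ! videoconvert ! video/x-raw,format=NV12 ! fakesink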

kmssink

Used to send video to the HDMI display output.

Common attributes:

sync: synchronization. When true, if processing is too slow, some frames are dropped so that the newest frame can be processed; when false, no frames are dropped, and an internal buffer holds the frames not yet handled.

bus-id: the device corresponding to the HDMI output, for example a0007000.v_mix; check dmesg for the exact name.

[Figure: dmesg output showing the display device bus-id]

jpegdec

JPEG decoding. When using a USB camera whose output is JPEG images, the stream must be decoded before it can be processed normally.
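
A hedged sketch for such a camera (the device path is a placeholder; the image/jpeg caps select the camera's JPEG output):

gst-launch-1.0 v4l2src device=/dev/video0 ! image/jpeg,width=1280,height=720 ! jpegdec ! videoconvert ! ximagesink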

For other useful plug-ins, please refer to the official gstreamer documentation.

GStreamer project record

To integrate custom hardware under the GStreamer framework, you need to write a GStreamer plug-in.

Setting up the GStreamer x86 environment and the aarch64 cross-compilation environment

For an x86 Ubuntu environment:

apt-get install libgstreamer1.0-dev libgstreamer-plugins-base1.0-dev libgstreamer-plugins-bad1.0-dev gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly gstreamer1.0-libav gstreamer1.0-doc gstreamer1.0-tools gstreamer1.0-x gstreamer1.0-alsa gstreamer1.0-gl gstreamer1.0-gtk3 gstreamer1.0-qt5 gstreamer1.0-pulseaudio

For the aarch64 cross-compilation environment:

First install the cross-compiler toolchain, which includes the other necessary tools in addition to gcc.

sudo apt-get install gcc-aarch64-linux-gnu


Next, prepare the necessary dependency libraries. The library dependency relationships are:

[Figure: library dependency graph]

To set up the cross-compilation environment for gst-plugins-base and the gstreamer core, prepare the aforementioned build environment according to those dependencies.

Note: the version of the cross-compilation environment should match the gstreamer version on the development board, to ensure the header files and the linked dependency libraries are consistent. In particular, gstreamer versions before 1.12 and after 1.14 are not interchangeable, and neither are their plug-ins.

  • Install the orc 0.4.27 support library

    If gtkdoc-mktmpl: command not found is reported after installing gtk-doc-tools and libgtk2.0-doc, replace ./autogen.sh --prefix=/usr with the following commands:

./configure --prefix=/home/rongyitong/aarch64 --host=aarch64-linux-gnu
make 
sudo make install
  • zlib 1.2.11 cross-compilation
export CC=aarch64-linux-gnu-gcc   # zlib's configure does not support --host; the cross toolchain is specified through the CC variable
./configure --prefix=/home/rongyitong/aarch64  # the generated libraries, headers and man pages all end up under this directory
make && make install
  • libffi 3.4.2 cross-compilation

    export CC=aarch64-linux-gnu-gcc
    ./configure --prefix=/home/rongyitong/aarch64 --host=aarch64-linux-gnu
    make && sudo make install
    
  • glib 2.45.3 cross-compilation
    Create a new glib.cache file in the glib source root directory and write the following into it:

glib_cv_long_long_format=ll
glib_cv_stack_grows=no
glib_cv_have_strlcpy=no
glib_cv_have_qsort_r=yes
glib_cv_va_val_copy=yes
glib_cv_uscore=no
glib_cv_rtldglobal_broken=no
ac_cv_func_posix_getpwuid_r=yes
ac_cv_func_posix_getgrgid_r=yes
./autogen.sh --prefix=/home/rongyitong/aarch64 --host=aarch64-linux-gnu CC=aarch64-linux-gnu-gcc LIBFFI_CFLAGS="-I/home/rongyitong/aarch64/lib/libffi-3.0.13/include" LIBFFI_LIBS="-L/home/rongyitong/aarch64/lib -lffi" --cache-file=glib.cache --disable-selinux --disable-xattr --disable-libelf ZLIB_CFLAGS="-I/home/rongyitong/aarch64/include" ZLIB_LIBS="-lz -L/home/rongyitong/aarch64/lib"

If you encounter the following error:

[Figure: glib build error triggered by -Wformat-nonliteral]

it is a compiler version problem; add #pragma GCC diagnostic ignored "-Wformat-nonliteral" at the top of glib/gdate.c.

  • gstreamer 1.12 cross-compilation (the development board runs version 1.12, so the versions must match)

    Note that the PKG_CONFIG_PATH environment variable must be set beforehand: ./configure uses pkg-config to detect whether glib exists. If the x86 version of glib interferes, you can temporarily rename /usr/local/lib.

 ./configure --prefix=/home/rongyitong/aarch64 --host=aarch64-linux-gnu GLIB_LIBS="-lglib-2.0 -L/home/rongyitong/aarch64/lib" GLIB_CFLAGS="-I/home/rongyitong/aarch64/include/glib-2.0 -I/home/rongyitong/aarch64/lib/glib-2.0/include" GIO_LIBS="-lgio-2.0 -lgobject-2.0 -lglib-2.0 -L/home/rongyitong/aarch64/lib" GIO_CFLAGS="-pthread -I/home/rongyitong/aarch64/include/glib-2.0 -I/home/rongyitong/aarch64/lib/glib-2.0/include" LIBS="-lz -L/home/rongyitong/aarch64/lib"
  • Compile the gst-plugins-base library
./configure --prefix=/home/rongyitong/aarch64 --host=aarch64-linux-gnu --disable-ogg --disable-vorbis

The pkg-config path was already configured here; if not, set the environment variables as for gstreamer.

Supplementary knowledge about dynamic link libraries (shared object libraries)

GStreamer plug-in libraries take the form of dynamic link libraries, so it is well worth understanding how they work.

Software libraries are generally divided into static link libraries and dynamic link libraries. A static library has the extension .a (Linux) or .lib (Windows); a dynamic library has the extension .so (Linux) or .dll (Windows).

The basic idea of dynamic linking is to split a program into relatively independent modules and link them together into a complete program at run time, instead of linking all modules into a single executable in advance.

Dynamic libraries under Linux are usually named libxxx.so; to link one with gcc, the abbreviation -lxxx suffices.
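
For example (libfoo.so and the paths are placeholders):

gcc main.c -L/opt/mylibs -lfoo -o app                 # link against libfoo.so
export LD_LIBRARY_PATH=/opt/mylibs:$LD_LIBRARY_PATH   # let the loader find it at run time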

A static library is merged into the target program at link time. With a dynamic library, the target program looks a function up in the corresponding .so file only when it is needed at run time, which is why dynamic libraries often have dependency problems.

There are usually two types of dependency problems with dynamic link libraries:

  1. The corresponding .so file cannot be found, or its format does not match the target program. This error is usually reported at link time.

  2. The required function cannot be found in the corresponding .so file. Linking succeeds, but an error is reported when the target program runs.
    The error is usually an undefined-symbol problem, such as undefined symbol: pthread_create.

    In both dynamic and static libraries, functions and variables are collectively called symbols, a function or variable name is a symbol name (Symbol Name), and each library has a symbol table.

    Utilities for dynamic link libraries:

    readelf

    Common usage:

    readelf -h libxxx.so

    Option -h (ELF header): displays the file header information at the start of the ELF file.

    [Figure: readelf -h output]

    readelf -s libxxx.so

    Option -s: displays the entries in the symbol table section (if the Name column is not shown in full, add the -W option for complete output).

    [Figure: readelf -s symbol table output]

    Ndx indicates the section where a symbol resides; UND means undefined, i.e. the symbol is only referenced in this file and is defined in another file.

    ldd (list dynamic dependencies) tool

    Used to view the libraries that a library or an executable depends on.

    [Figure: ldd output listing dependencies]

Sometimes the ldd tool reports an error:

[Figure: ldd error message]

This is usually because the file's architecture does not match the platform running the command, for example when an aarch64 binary is inspected on an x86 platform. As a solution, you can port the aarch64 version of the ldd tool to the aarch64 platform and run it there.

The ldd tool is essentially just a shell script, so it only needs a small edit:

1. Make sure the #!/bin/bash line matches the platform's interpreter;
2. Change the RTLDLIST variable to point to the dynamic linker (ld-linux-xx.so) of the aarch64 platform, usually located under /lib.

[Figure: the RTLDLIST variable inside the ldd script]

Write a plugin starting from a template

git clone https://gitlab.freedesktop.org/gstreamer/gst-template.git

After entering the git directory, switch to the 1.18 branch; the template on the master branch has some problems. Then use the make_element tool to replace the template's element name, as sketched below.

Taking the videocrop plugin as an example, refactor based on the actual plugin.
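
A hedged sketch of those steps (assuming gst-template's usual layout, where make_element lives under gst-plugin/tools; MyFilter is a placeholder element name):

cd gst-template
git checkout 1.18
cd gst-plugin/src
../tools/make_element MyFilter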

Plugin Inheritance

GObject
    ╰──GInitiallyUnowned
        ╰──GstObject
            ╰──GstElement
                ╰──GstBaseTransform
                    ╰──GstVideoFilter
                        ╰──videocrop

Plug-in refactoring approach: gstvideofilter.c and gstbasetransform.c are compiled into libgstvideo-1.0.so and libgstbase-1.0.so respectively. Unless it is really necessary, do not modify the source of these two files; just link the corresponding dynamic libraries directly. If you do modify the source, the two .c files pull in a series of header files and other dependent libraries, some of which are generated while compiling the gstreamer core; that header environment is quite complicated, so this approach is not recommended.

How images are stored in memory

When a video image is stored in memory, the end of each row may contain some padding. This padding affects only how the image is stored in memory, not how it is displayed.

The stride, also called the pitch, is the width of a row as it is stored in memory, including that padding. If there is padding at the end of each row of pixels, the stride value is larger than the image width, as shown in the figure below:

[Figure: image rows in memory with stride padding at the end of each row]

Two buffers can contain video frames of the same dimensions (width and height) yet have different stride values; any per-frame processing must take the stride into account.
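
A minimal sketch of stride-aware pixel addressing (assuming 8-bit single-channel data; the names are illustrative):

#include <stddef.h>
#include <stdint.h>

/* Address of pixel (x, y) in a top-down buffer whose rows are `stride`
 * bytes apart; stride >= width whenever rows carry padding. */
static inline uint8_t *pixel_at(uint8_t *base, size_t stride,
                                size_t x, size_t y)
{
    return base + y * stride + x;
}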

In addition, an image can be stored in memory in two different row orders. In a top-down image, the top row of pixels is stored at the start of the buffer; in a bottom-up image, the last row is stored first. The diagram below shows both cases:

[Figure: top-down vs. bottom-up image storage]

YUV images are always represented top-down, while RGB images stored in system memory are usually bottom-up.

Supplementary knowledge about YUV

YUV (YCbCr) is a pixel format that separates the luminance component Y from the chrominance components U/V; it is mainly used to optimize the transmission of color video signals.

The human retina has more rod cells than cone cells; rods sense brightness while cones sense color, so our eyes distinguish light and dark better than they distinguish colors. In other words, we are more sensitive to luminance than to chrominance, so when storing image information we do not need to keep all of the chroma information.

The YUV pixel format is derived from the RGB pixel format, and the three YUV components can be converted back to RGB by formula.
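
For reference, one common variant of the conversion formulas (BT.601, one of several in use):

Y = 0.299 R + 0.587 G + 0.114 B
U = 0.492 (B - Y)
V = 0.877 (R - Y)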

YUV storage format

There are two categories of YUV formats: planar and packed.

  • For the YUV format of planar, the Y of all pixels is stored continuously, followed by the U of all pixels, and then the V of all pixels.
  • For the packed YUV format, Y, U, and V of each pixel are continuously interleaved.

YUV sampling format

The storage format of a YUV stream is closely tied to its sampling scheme. There are three mainstream sampling schemes: YUV 4:4:4, YUV 4:2:2, and YUV 4:2:0.

The three diagrams below visualize the sampling: each pixel's Y component is drawn as a black dot and its UV components as a hollow circle.

[Figure: sampling diagrams for YUV 4:4:4, 4:2:2, and 4:2:0]

  • YUV 4:4:4 sampling, each Y corresponds to a set of UV components.
  • YUV 4:2:2 sampling, every two Y share a set of UV components.
  • YUV 4:2:0 sampling, every four Y shares a set of UV components.
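
The sampling scheme determines the storage size: for an 8-bit 1280x720 frame (921,600 pixels), 4:4:4 takes 921,600 x 3 = 2,764,800 bytes, 4:2:2 takes 921,600 x 2 = 1,843,200 bytes, and 4:2:0 takes 921,600 x 1.5 = 1,382,400 bytes.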

Examples of storage methods:

<1>YUV422 storage type

<1-1>YUYV format (sampling format is YUV422, storage format is packed)

[Figure: YUYV memory layout]

YUYV is one of the storage formats for YUV 4:2:2 sampling: two adjacent Y samples share the adjacent Cb (U) and Cr (V) samples. For pixels Y'00 and Y'01, the Cb and Cr values are Cb00 and Cr00, and the YUV values of the other pixels follow by analogy.

<1-2>UYVY format (sampling format is YUV422, storage format is packed)

[Figure: UYVY memory layout]

<1-3>YUV422P (sampling format is YUV422, storage format is planar)

[Figure: YUV422P memory layout]

YUV422P is a planar mode: instead of interleaving the YUV data as above, it stores all Y components first, then all U (Cb) components, and finally all V (Cr) components. Each pixel's YUV values are extracted following the basic YUV 4:2:2 rule, i.e. two Y samples share one pair of UV samples; for example, for pixels Y'00 and Y'01, the Cb and Cr values are Cb00 and Cr00.

<2>YUV420 storage type

The formats based on YUV 4:2:0 sampling mainly include YUV 420P and YUV 420SP, and each type in turn covers several specific formats:

  • YUV 420P type
    • YU12 format
    • YV12 format
  • YUV 420SP type
    • NV12 format

    • NV21 format

Both YUV 420P and YUV 420SP are stored in planar fashion. After all Y components are stored, the 420P types store all U components followed by all V components (or vice versa), while the 420SP types store U and V interleaved, in UV or VU order.

<2-1>YUV420SP (sampling format YUV 4:2:0; semi-planar storage with a Y plane and a UV plane, where the UV plane is packed)

[Figure: NV12 and NV21 memory layouts]

NV21 and NV12 both belong to the YUV 4:2:0 family. They use a two-plane mode: Y and UV occupy separate planes, with UV (CbCr) stored interleaved rather than split into three planes. Extraction works like the format above: Y'00, Y'01, Y'10, and Y'11 share Cr00 and Cb00.

<2-2>YUV420P (sampling format YUV 4:2:0; planar storage with separate Y, U, and V planes)

[Figure: YU12 (I420) and YV12 memory layouts]

YU12 (also known as I420) and YV12 belong to the YUV 4:2:0 family and are fully planar: the Y, U, and V components are packed separately and stored one plane after another. Each pixel's YUV data follows the YUV 4:2:0 extraction rule, i.e. four Y samples share one pair of UV samples; as in the figure above, Y'00, Y'01, Y'10, and Y'11 share Cr00 and Cb00, and so on.

Note that the difference between YU12 and YV12 is whether U or V comes first: YU12 stores the planes in YUV (YCbCr) order, while YV12 stores them in YVU (YCrCb) order.
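
A minimal sketch of locating the Y, U, and V samples of pixel (x, y) in a YU12/I420 buffer, following the layout just described (assuming even width and height and no row padding):

#include <stddef.h>
#include <stdint.h>

/* I420 / YU12: a w*h Y plane, then a (w/2)*(h/2) U plane,
 * then a (w/2)*(h/2) V plane. */
typedef struct { const uint8_t *y, *u, *v; } I420Pixel;

static I420Pixel i420_pixel(const uint8_t *buf, size_t w, size_t h,
                            size_t x, size_t y)
{
    const uint8_t *yp = buf;
    const uint8_t *up = yp + w * h;
    const uint8_t *vp = up + (w / 2) * (h / 2);
    I420Pixel p = {
        yp + y * w + x,                   /* one Y per pixel         */
        up + (y / 2) * (w / 2) + (x / 2), /* four pixels share one U */
        vp + (y / 2) * (w / 2) + (x / 2), /* four pixels share one V */
    };
    return p;
}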

Source: https://blog.csdn.net/qq_37117214/article/details/126781955