Foreword
Let's map out the knowledge points to learn on the path from beginner to advanced C++ audio and video development.
C++ audio and video development skill tree (beginner level)
Stage 1: Audio and video basics
01. How sound is captured: the principle of analog-to-digital conversion
02. Why high-quality audio uses a sampling rate >= 44.1 kHz
04. How many bits per sample
05. Are sample values stored as integers or floating-point numbers?
06. The close relationship between volume and sample values
07. How many samples make up one frame of data
08. How left- and right-channel samples are arranged
09. What is PCM (Pulse Code Modulation)
02. Why the YUV format is needed
04. Resolution, frame rate, bit rate
05. Differences between YUV storage formats
06. The YUV memory alignment problem
07. Why the screen turns green (the green-screen problem)
09. The relationship between H264 I/P/B frames
01. What is demultiplexing (taking MP4 as an example)
02. Why different container formats (MP4/FLV/TS) are needed
03. Common container formats: MP4/FLV/TS
FFmpeg development environment setup
01. Three platforms: Windows, Ubuntu, and macOS
03. FFmpeg command-line environment
06. Visual Studio 2019 installation (Windows)
Common tools for audio and video development
01. MediaInfo - analyze media files
03. EasyICE - analyze TS streams
06. Audacity - analyze PCM audio
07. Elecard StreamEye - analyze H264
08. Hikvision YUVPlayer - analyze YUV
Stage 2: FFmpeg in practice
01. Extracting PCM/AAC audio files
02. Extracting YUV/H264 video files
03. Demultiplexing and multiplexing
05. Video cropping and merging
08. Watermark / picture-in-picture / nine-grid (3x3) filters
SDL cross-platform multimedia library in practice
01. SDL environment setup
04. Rendering YUV video to the screen
The cornerstone of FFmpeg
02. FFmpeg memory reference-counting model
03. Demultiplexing-related APIs: AVFormatXXX, etc.
04. Codec-related APIs: AVCodecXXX, etc.
07. FFmpeg's object-oriented design
08. Packet/Frame zero-copy
FFmpeg filters
01. The FFmpeg filter-chain framework
03. The video filter framework
04. Multi-channel audio mixing with amix
06. Video area cropping and flipping
FFmpeg audio and video demultiplexing + decoding
04. FLV container format analysis
05. MP4 container format analysis
06. How FLV and MP4 differ in seeking
07. Why the FLV format works for live streaming
08. Why MP4 is unsuitable for live streaming
09. Can MP4 be used for video on demand?
13. Audio resampling in practice
14. Does playback duration stay the same after resampling?
15. How PTS is represented after resampling
16. The YUV memory alignment problem after video decoding
17. PCM layout issues after audio decoding
18. Hardware decoding: dxva2/nvdec/cuvid/qsv
19. Transferring decoded data from GPU to CPU
The ffplay player
01. The significance of mastering ffplay.c
07. Frame rendering intervals
09. Picture size and format conversion
10. Audio, video, and external clocks: their synchronization differences
11. Audio resampling compensation when video is the master clock
12. How muting and volume adjustment really work
13. Audio/video packet queue size limits
14. Audio/video packet queue thread safety
15. Audio/video frame queue size limits
16. Audio/video frame queue thread safety
17. How pause and resume are implemented
18. Playback freezes caused by seeking
19. Handling data queues and sync clocks during seek
20. How to step through playback frame by frame
21. Key points of the player shutdown process
FFmpeg audio and video encoding + multiplexing to compose video
03. Multiplexing PCM+YUV into MP4/FLV
05. The difference between IDR frames and I frames
06. Dynamically changing the encoding bitrate
07. Reference values for the GOP interval
08. Audio/video out-of-sync problems when composing MP4
09. Timebase issues in encoding and multiplexing
10. Composed MP4 won't play on iOS
11. How PTS is represented after resampling
12. The YUV memory alignment problem in video encoding
13. Hardware encoding: dxva2/nvenc/cuvid/qsv
15. Converting between H264 and H265
The ffmpeg multimedia processing tool
01. The significance of mastering ffmpeg.c
08. Splicing audio and video files
10. The command-line parsing flow
11. MP4-to-FLV remuxing (no re-encoding) logic
12. MP4-to-FLV re-encoding logic
01. QMPlay2 open-source player walkthrough
07. Audio and video synchronization
08. CUVID/D3D11VA hardware decoding
12. Picture brightness and saturation adjustment
14. Bitstream information analysis
Analysis of the OBS streaming and recording source code
01. Building OBS with VS2019 + Qt 5.15.2
02. Audio configuration and initialization analysis
03. Audio thread module: capture and encoding analysis
04. Video configuration and initialization analysis
05. Video thread module: capture and encoding analysis
06. OBS initialization flow analysis
07. Recording flow analysis
08. Microphone capture analysis
09. Desktop capture analysis
11. System sound + microphone mixing
12. Streaming (push) module analysis
Stage 1: Audio and video basics
Audio Basics
01. How sound is captured: the principle of analog-to-digital conversion
02. Why high-quality audio uses a sampling rate >= 44.1 kHz
03. What is PCM
04. How many bits per sample
05. Are sample values stored as integers or floating-point numbers?
06. The close relationship between volume and sample values
07. How many samples make up one frame of data
08. How left- and right-channel samples are arranged
09. What is PCM (Pulse Code Modulation)
10. Principles of audio coding
Video Basics
01. The principle of RGB color
02. Why the YUV format is needed
03. What is a pixel
04. Resolution, frame rate, bit rate
05. Differences between YUV storage formats
06. The YUV memory alignment problem
07. Why the screen turns green (the green-screen problem)
08. The principle of H264 encoding
09. The relationship between H264 I/P/B frames
Note: for the specific H264 encoding format, see the FFmpeg chapter
Demultiplexing Basics
01. What is demultiplexing (taking MP4 as an example)
02. Why different container formats (MP4/FLV/TS) are needed
03. Common container formats: MP4/FLV/TS
Note: for details on the specific container formats, see the FFmpeg chapter
FFmpeg development environment setup
01. Three platforms: Windows, Ubuntu, and macOS
02. Qt installation
03. FFmpeg command-line environment
04. FFmpeg API environment
05. FFmpeg compilation
06. Visual Studio 2019 installation (Windows)
Common tools for audio and video development
01. MediaInfo - analyze media files
02. VLC player - playback testing
03. EasyICE - analyze TS streams
04. flvAnalyser - analyze FLV
05. MP4Box - analyze MP4
06. Audacity - analyze PCM audio
07. Elecard StreamEye - analyze H264
08. Hikvision YUVPlayer - analyze YUV
Stage 2: FFmpeg in practice
FFmpeg commands
01. Extracting PCM/AAC audio files
02. Extracting YUV/H264 video files
03. Demultiplexing and multiplexing
04. Audio and video recording
05. Video cropping and merging
06. Image/video conversion
07. Live stream pushing and pulling
08. Watermark / picture-in-picture / nine-grid (3x3) filters
Note: the purpose of mastering the FFmpeg commands: 1) quickly grasp what FFmpeg can do; 2) deepen your understanding of audio and video
SDL cross-platform multimedia library in practice
01. SDL environment setup
02. SDL event handling
03. SDL thread handling
04. Rendering YUV video to the screen
05. Outputting PCM audio
Note: SDL works on Windows, Ubuntu, and macOS, and is mainly used for video display and audio output in the later projects
The cornerstone of FFmpeg
01. The FFmpeg framework
02. FFmpeg memory reference-counting model
03. Demultiplexing-related APIs: AVFormatXXX, etc.
04. Codec-related APIs: AVCodecXXX, etc.
05. Compressed data: AVPacket
06. Uncompressed data: AVFrame
07. FFmpeg's object-oriented design
08. Packet/Frame zero-copy
Note: the goal is to become familiar with FFmpeg's common structures and function interfaces
FFmpeg filters
01. The FFmpeg filter-chain framework
02. The audio filter framework
03. The video filter framework
04. Multi-channel audio mixing with amix
05. Video watermarking
06. Video area cropping and flipping
07. Adding a logo to video
FFmpeg audio and video demultiplexing + decoding
01. The demultiplexing process
02. The audio decoding process
03. The video decoding process
04. FLV container format analysis
05. MP4 container format analysis
06. How FLV and MP4 differ in seeking
07. Why the FLV format works for live streaming
08. Why MP4 is unsuitable for live streaming
09. Can MP4 be used for video on demand?
10. AAC ADTS analysis
11. H264 NALU analysis
12. AVIO in-memory input mode
13. Audio resampling in practice
14. Does playback duration stay the same after resampling?
15. How PTS is represented after resampling
16. The YUV memory alignment problem after video decoding
17. PCM layout issues after audio decoding
18. Hardware decoding: dxva2/nvdec/cuvid/qsv
19. Transferring decoded data from GPU to CPU
20. H265 decoding
Note: FFmpeg API learning path: demultiplex video -> decode -> encode -> multiplex to compose video
The ffplay player
01. The significance of mastering ffplay.c
02. ffplay framework analysis
03. The demultiplexing thread
04. The audio decoding thread
05. The video decoding thread
06. The audio output callback
07. Frame rendering intervals
08. Audio resampling
09. Picture size and format conversion
10. Audio, video, and external clocks: their synchronization differences
11. Audio resampling compensation when video is the master clock
12. How muting and volume adjustment really work
13. Audio/video packet queue size limits
14. Audio/video packet queue thread safety
15. Audio/video frame queue size limits
16. Audio/video frame queue thread safety
17. How pause and resume are implemented
18. Playback freezes caused by seeking
19. Handling data queues and sync clocks during seek
20. How to step through playback frame by frame
21. Key points of the player shutdown process
Note: ffplay.c is the source code of the ffplay command; mastering it pays off many times over when you build your own player
FFmpeg audio and video encoding + multiplexing to compose video
01. AAC audio encoding
02. H264 video encoding
03. Multiplexing PCM+YUV into MP4/FLV
04. The principle of H264 encoding
05. The difference between IDR frames and I frames
06. Dynamically changing the encoding bitrate
07. Reference values for the GOP interval
08. Audio/video out-of-sync problems when composing MP4
09. Timebase issues in encoding and multiplexing
10. Composed MP4 won't play on iOS
11. How PTS is represented after resampling
12. The YUV memory alignment problem in video encoding
13. Hardware encoding: dxva2/nvenc/cuvid/qsv
14. The principle of H265 encoding
15. Converting between H264 and H265
The ffmpeg multimedia processing tool
01. The significance of mastering ffmpeg.c
02. ffmpeg framework analysis
03. Audio and video encoding
04. Container format conversion
05. Extracting audio
06. Extracting video
07. Logo overlay
08. Splicing audio and video files
09. The filter mechanism
10. The command-line parsing flow
11. MP4-to-FLV remuxing (no re-encoding) logic
12. MP4-to-FLV re-encoding logic
13. MP4-to-FLV timebase handling
14. MP4-to-FLV scaling
Note: ffmpeg.c is the source code of the ffmpeg command. When you know how to do something with the ffmpeg command line but not how to call the FFmpeg API (for example, trimming a video to a given length), the logic in ffmpeg.c shows the way, so mastering its overall framework is extremely helpful.