A brief look at the format conversions a media file goes through during FFmpeg playback

For example, what does an MP4 file have to go through before it can be played?

An MP4 file contains a compressed video stream (such as H.264 or H.265) and a compressed audio stream (such as AAC or MP3).

First, both streams need to be decoded into raw data: raw video data is YUV, and raw audio data is PCM.

Then the raw data is converted into whatever format each platform requires for playback.


Take Android as an example:

Android phone screens display RGB, so the YUV data must be converted to RGB before it can be shown.
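As an illustration, the per-pixel math behind such a conversion can be sketched like this. This uses the BT.601 "limited range" coefficients, which is one common convention (others exist, e.g. BT.709); the function name is made up for the example:

```python
def yuv_to_rgb(y, u, v):
    """Convert one BT.601 limited-range YUV pixel to 8-bit RGB (a sketch)."""
    c = y - 16      # luma is offset by 16 in limited range
    d = u - 128     # chroma channels are centered on 128
    e = v - 128
    r = 1.164 * c + 1.596 * e
    g = 1.164 * c - 0.392 * d - 0.813 * e
    b = 1.164 * c + 2.017 * d
    clamp = lambda x: max(0, min(255, int(round(x))))
    return clamp(r), clamp(g), clamp(b)

print(yuv_to_rgb(16, 128, 128))   # → (0, 0, 0), limited-range black
print(yuv_to_rgb(235, 128, 128))  # → (255, 255, 255), limited-range white
```

In practice this is done for you by FFmpeg's swscale library (or by a GPU shader), but the idea is the same: a linear matrix transform plus clamping, applied per pixel.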

Android audio output typically expects 16-bit PCM, while the decoded PCM may be 32-bit (for example 32-bit float), so it has to be converted before the audio can play normally.


Both the decoding and these conversions can be done with FFmpeg, which is very convenient.

Decode the video stream to raw YUV data:

ffmpeg -i input.mp4 -c:v rawvideo -pix_fmt yuv420p output.yuv
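The resulting .yuv file is just bare frames with no header, so its size is frames × frame size. For yuv420p each pixel takes 1.5 bytes: a full-resolution Y plane plus quarter-resolution U and V planes. A quick sanity check (1920x1080 is just an example resolution):

```python
def yuv420p_frame_bytes(width, height):
    # Y plane: width*height bytes; U and V planes: (width/2)*(height/2) bytes each,
    # so the total is width*height*1.5
    return width * height * 3 // 2

print(yuv420p_frame_bytes(1920, 1080))  # → 3110400
```

Dividing the .yuv file size by this number tells you how many frames it holds, which is a handy way to verify a raw dump.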

Decode the audio stream to PCM data:

ffmpeg -i input.mp4 -c:a pcm_s16le output.wav

Convert YUV to RGB. Since a raw .yuv file has no header, the frame size and pixel format must be supplied on the command line (replace WIDTHxHEIGHT with the actual resolution, e.g. 1920x1080):

ffmpeg -f rawvideo -pix_fmt yuv420p -s:v WIDTHxHEIGHT -i input.yuv -c:v rawvideo -pix_fmt rgb24 output.rgb

Convert 32-bit PCM audio to 16-bit:

ffmpeg -i input.wav -c:a pcm_s16le output.wav
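The essence of that conversion is rescaling each sample into the 16-bit range and clipping anything that falls outside it. A sketch assuming the common case of 32-bit float samples in [-1.0, 1.0] (scaling conventions vary slightly between implementations; 32767 is used here):

```python
def float32_to_s16(samples):
    """Convert float PCM in [-1.0, 1.0] to signed 16-bit integers, clipping out-of-range values."""
    out = []
    for s in samples:
        v = int(round(s * 32767))
        out.append(max(-32768, min(32767, v)))  # clip to the s16 range
    return out

print(float32_to_s16([0.0, -1.0, 1.5]))  # → [0, -32767, 32767]
```

Real resamplers (such as FFmpeg's swresample) also apply dithering when reducing bit depth to avoid quantization artifacts, but the scale-and-clip step above is the core of it.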

Origin blog.csdn.net/weixin_47592544/article/details/130750469