FFmpeg command practice (Part 1)

1. FFmpeg command line environment construction

1. Go to FFmpeg's GitHub releases page and download the build you need. Here we take Windows as an example.
Two archives are offered: ffmpeg-master-latest-win64-gpl-shared additionally contains FFmpeg's library files, so it can be used for secondary development against FFmpeg; ffmpeg-master-latest-win64-gpl contains only the command-line tools.
We use the shared version here. After downloading and extracting it, open the bin directory: it holds three executables — ffmpeg, ffplay and ffprobe — together with the dynamic libraries (DLLs) they need at run time.
Insert image description here
Put the three executables somewhere on the PATH (the original article uses C:\Windows) so they can be invoked directly from the command line. The DLLs must also be found by the loader; note that on 64-bit Windows the SysWOW64 folder holds 32-bit libraries, so 64-bit DLLs do not belong there. The simplest option is to keep the DLLs next to the executables, or just add the extracted bin directory itself to the PATH.
Then open a command prompt (cmd) and run ffmpeg, ffplay and ffprobe in turn; each should print its version banner, confirming the installation is complete.
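If everything is in place, each tool should answer from any shell; a quick sanity check (assuming the binaries are on the PATH — ffplay works the same way with `ffplay -version`):

```shell
# Each command prints a version banner plus the build configuration;
# an error such as "not recognized" means the PATH step above failed.
ffmpeg -version
ffprobe -version
```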

2. The difference between ffmpeg, ffplay and ffprobe

ffmpeg is a transcoding and editing tool, ffplay is a player, and ffprobe is a multimedia stream analyzer.
To view ffmpeg's help documentation:

Basic information: ffmpeg -h
Advanced information: ffmpeg -h long
All information: ffmpeg -h full
Write output information to a file: ffmpeg -h > filename.log

To view ffplay's help documentation:

All information: ffplay -h

To view ffprobe's help documentation:

All information: ffprobe -h

On Windows, the command for searching text in output is findstr, similar to grep on Linux:

ffmpeg -h full | findstr 264
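On Linux or macOS the grep equivalent looks like this; for instance, listing every line of the full help that mentions 264 (the exact matches depend on which encoders the build includes):

```shell
# Pipe the full help text through grep; -hide_banner suppresses the
# version banner so only matching help lines are printed.
ffmpeg -hide_banner -h full | grep 264
```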

3. ffmpeg processing flow

A single command illustrates the whole flow:

ffmpeg -i test_1920x1080.mp4 -acodec copy -vcodec libx265 -s 1280x720 test_1280x720.mkv

  • -i specifies the input file. The demuxer reads it and produces encoded packets, separating the video and audio streams.
  • The packets are then decoded into raw frames. This is the key step for processing the content — changing resolution or frame rate, adding a watermark, and so on. Here the resolution is scaled from 1920x1080 to 1280x720 (-s 1280x720).
  • After processing, the frames are encoded back into packets. -acodec selects the audio codec and -vcodec the video codec. -acodec copy means the original audio packets are passed through unchanged, so the audio is never decoded or re-encoded; -vcodec libx265 re-encodes the video stream with the libx265 (HEVC) encoder.
  • Finally, the packets are muxed together into the output file.
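The pipeline can be exercised without a real recording by letting FFmpeg synthesize its own input. The sketch below generates a short 1920x1080 clip first (testsrc video plus a sine-tone AAC track), then runs the command from the text; libx264 stands in for libx265 here, since not every build ships the x265 encoder:

```shell
# 1) Synthesize a one-second test input: lavfi produces frames directly,
#    which are then encoded (libx264 video, aac audio) and muxed to MP4.
ffmpeg -y -f lavfi -i testsrc=duration=1:size=1920x1080:rate=25 \
       -f lavfi -i sine=frequency=440:duration=1 \
       -c:v libx264 -c:a aac -shortest test_1920x1080.mp4

# 2) The flow described above: audio packets are copied untouched,
#    video is decoded, scaled to 1280x720, and re-encoded.
ffmpeg -y -i test_1920x1080.mp4 -acodec copy -vcodec libx264 \
       -s 1280x720 test_1280x720.mkv
```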


4. ffmpeg command classification query


  • ffmpeg -formats lists the container formats, flagged with D and E: D means FFmpeg has a demuxer for the format (it can read it), E means it has a muxer (it can write it).
  • ffmpeg -devices lists all input and output devices supported by FFmpeg.
  • ffmpeg -bsfs lists all bitstream filters supported by FFmpeg.
    Bitstream filters (bsfs) operate on already-encoded packets, modifying them without decoding the underlying audio or video. Common examples include H.264 stream-format conversion (h264_mp4toannexb), H.264 SEI data insertion, and AAC ADTS conversion (aac_adtstoasc).
  • The relationship between bsfs and codecs: a codec implements a compression standard such as H.264, turning raw frames into compressed packets (encoding) or back again (decoding); a bitstream filter never touches the picture or sound data itself — it only rewrites how the compressed packets are framed or annotated.
  • ffmpeg -protocols lists all I/O protocols supported by FFmpeg, i.e. the ways it can read and write data: for example, the http protocol accesses network resources over HTTP, the file protocol accesses the local file system, and the pipe protocol transfers data between processes.
  • ffmpeg -filters lists all available filters supported by FFmpeg.
    In FFmpeg, a filter operates on decoded audio and video data. Filters can crop or scale video, resample audio, adjust the frame rate, add watermarks, burn in subtitles, and so on.
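The codec-versus-bitstream-filter distinction is easy to see with h264_mp4toannexb. In the sketch below the H.264 stream is copied (-c:v copy), so no encoding or decoding happens; the bitstream filter merely rewrites each packet from MP4's length-prefixed layout to Annex-B start codes (assuming libx264 is available to create the sample):

```shell
# Create a small H.264-in-MP4 sample to work on.
ffmpeg -y -f lavfi -i testsrc=duration=1:size=320x240:rate=10 \
       -c:v libx264 sample.mp4

# Remux to a raw .h264 stream: -c:v copy leaves the compressed data
# alone; the bsf only reframes the NAL units with Annex-B start codes.
ffmpeg -y -i sample.mp4 -c:v copy -bsf:v h264_mp4toannexb sample.h264
```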

5. ffplay playback control

The player window responds to the following keys (ffplay's built-in key bindings):

  • q, ESC: quit
  • f: toggle full screen
  • p, SPC: pause/resume
  • m: toggle mute
  • 9, 0: decrease / increase volume
  • s: step to the next frame (and pause)
  • a / v / t: cycle the audio / video / subtitle stream
  • c: cycle program
  • w: cycle video filters or show modes
  • left / right: seek backward / forward 10 seconds
  • down / up: seek backward / forward 1 minute
  • mouse right-click: seek to the fraction of the file corresponding to the click position
  • mouse left double-click: toggle full screen

6. ffplay command options

  1. -x width sets the width of the player window

  2. -y height sets the height of the player window

  3. -video_size size sets the frame size (WxH format); needed only for raw video such as YUV, which carries no frame-size information of its own.
    For example:
    ffplay -pixel_format yuv420p -video_size 320x240 -framerate 5 yuv420p_320x240.yuv

  4. -pixel_format format sets the pixel format (likewise required for raw video).

  5. -fs opens in full screen

  6. -an do not play sound

  7. -vn do not play video

  8. -sn do not display subtitles

  9. -ss pos starts playback at position pos. Accepted formats for pos: 5 (5 seconds), 5:5 (5 minutes 5 seconds), 5:5:5 (5 hours, 5 minutes and 5 seconds)

  10. -t duration plays only the given length of video/audio; the time format is the same as for -ss

  11. -bytes seeks by bytes rather than by time (0=off 1=on -1=auto).

  12. -seek_interval interval customizes the left/right arrow-key seek step in seconds; the default is 10 seconds (the original author notes the implementation is not visible in the source code)

  13. -nodisp disables the graphical display window; no video is shown

  14. -noborder opens a borderless window

  15. -volume vol sets the startup volume, in the range [0, 100]

  16. -f fmt forces the input to be parsed with the given format, e.g. -f s16le

  17. -window_title title sets the window title (default is the input file name)

  18. -loop number sets the number of playback loops

  19. -showmode mode sets the display mode: 0 shows the video, 1 shows the audio waveform, 2 shows the audio spectrum. The default is 0; if there is no video stream, 2 is selected automatically.

  20. -vf filtergraph Set video filter options

  21. -af filtergraph Set audio filter options

  22. -stats prints playback statistics, including the stream duration, codec parameters, current position in the stream, and the audio/video sync delta. Enabled by default; disable it explicitly with -nostats.

  23. -fast enables non-spec-compliant optimizations for compatibility with broken media.

  24. -genpts generates missing presentation timestamps (pts).

  25. -sync type sets the master clock to audio (type=audio), video (type=video) or an external clock (type=ext). Audio is the default master clock.

  26. -ast audio_stream_specifier specifies the audio stream index, such as -ast 3, plays the audio stream with stream index 3

  27. -vst video_stream_specifier specifies the video stream index, such as -vst 4, plays the video stream with stream index 4

  28. -sst subtitle_stream_specifier specifies the subtitle stream index, such as -sst 5, plays the subtitle stream with stream index 5

  29. -autoexit exits after playback finishes.

  30. -exitonkeydown exits playback when any keyboard key is pressed

  31. -exitonmousedown exits playback when any mouse button is pressed

  32. -codec:media_specifier codec_name forces the given codec; media_specifier can be a (audio), v (video) or s (subtitle). For example -codec:v h264_qsv forces video decoding with h264_qsv

  33. -acodec codec_name forces the use of the set audio codec for audio decoding

  34. -vcodec codec_name forces the use of the set video codec for video decoding

  35. -scodec codec_name forces the use of the set subtitle decoder for subtitle decoding

  36. -autorotate Automatically rotate videos based on file metadata. Value is 0 or 1, default is 1.

  37. -framedrop drops video frames when the video falls out of sync. Enabled by default when the master clock is not the video clock; disable with -noframedrop

  38. -infbuf removes the limit on the input buffer size, reading as much data as possible from the input as soon as possible. Enabled by default when playing live streams, where data may be discarded if it is not read in time; disable with -noinfbuf
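Options 3 and 4 above come together when playing raw video. The sketch below first uses ffmpeg to produce a headerless yuv420p file (raw video carries no size, pixel-format, or frame-rate metadata, which is why ffplay must be told all three):

```shell
# Write 5 raw yuv420p frames at 320x240: no container, no header,
# just 320*240*1.5 = 115200 bytes per frame.
ffmpeg -y -f lavfi -i testsrc=duration=1:size=320x240:rate=5 \
       -pix_fmt yuv420p -f rawvideo yuv420p_320x240.yuv
```

The file can then be played back with `ffplay -pixel_format yuv420p -video_size 320x240 -framerate 5 yuv420p_320x240.yuv`, exactly as in option 3; adding `-autoexit` closes the window when the clip ends.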


Origin blog.csdn.net/m0_60565784/article/details/130776330