(Audio and video study notes): FFmpeg filter

Table of Contents

FFmpeg filter

Video crop

Text watermark

Picture watermark

FFmpeg generates picture-in-picture

FFmpeg video multi-grid processing

FFmpeg filter

Video crop

  • Video filter (filter): crop

  • Crop the input video frame to the specified width and height, starting from the position given by the x and y values.
  • x and y are the coordinates of the upper left corner of the output; the origin of the coordinate system is the upper left corner of the input video frame.
    • x (the number of pixels offset horizontally from the upper left corner) and y (the number of pixels offset vertically) are evaluated for every frame. The default value of x is (iw-ow)/2 and the default value of y is (ih-oh)/2, i.e. the crop is centered.
  • If the optional keep_aspect parameter is used, the output SAR (sample aspect ratio) is changed to compensate for the new DAR (display aspect ratio).
  • The value of ow can be derived from oh and vice versa, but neither can be derived from x or y, because x and y are evaluated after ow and oh.
  • The value of x can be derived from the value of y and vice versa.
    • For example, to keep the left third, middle third, or right third of the input frame, we can use the commands:
ffmpeg -i input -vf crop=iw/3:ih:0:0 output

ffmpeg -i input -vf crop=iw/3:ih:iw/3:0 output

ffmpeg -i input -vf crop=iw/3:ih:iw/3*2:0 output
(1) Crop a 100x100 region starting at (12,34).
crop=100:100:12:34
Equivalent form:
crop=w=100:h=100:x=12:y=34

(2) Crop a 100x100 region from the center
crop=100:100

(3) Crop the central region, 2/3 the size of the input video
crop=2/3*in_w:2/3*in_h

(4) Crop a centered square whose side equals the height of the input video
crop=out_w=in_h
crop=in_h

(5) Crop with a 100-pixel offset from the upper left corner
crop=in_w-100:in_h-100:100:100

(6) Crop 10 pixels off the left and right edges and 20 pixels off the top and bottom
crop=in_w-2*10:in_h-2*20

(7) Crop the lower right quadrant
crop=in_w/2:in_h/2:in_w/2:in_h/2
  • Tip: use ffplay to preview the cropping effect, then use ffmpeg for the actual crop (an example ffmpeg command follows the preview command)
ffplay -i input -vf crop=iw/3:ih:0:0
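  • A minimal sketch of the matching ffmpeg command (the output name output_left.mp4 and copying the audio are assumptions, not from the original notes):
ffmpeg -i input -vf crop=iw/3:ih:0:0 -c:a copy output_left.mp4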
  • When using FFmpeg filters, you can use the filters' time-related built-in variables (a sketch using them follows this list)
    • Basic built-in variables of FFmpeg filters
      • t     the timestamp in seconds, NAN if the input timestamp is unknown
      • n     the sequence number of the input frame, starting from 0
      • pos   the position of the input frame in the input file, NAN if unknown
      • w     the width of the input video frame
      • h     the height of the input video frame
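  • For example, a minimal sketch that uses the time variable t in a crop expression (the file name and the 10-pixels-per-second pan speed are assumptions): a half-width crop window that pans from left to right as the video plays
ffplay -i input.mp4 -vf "crop=w=iw/2:h=ih:x=min(10*t\,iw-ow):y=0"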

Text watermark

  • Several prerequisites must be met to add a text watermark to a video:
    • The font files to be used must be available.
    • FFmpeg must be compiled with FreeType, FontConfig, and iconv support.
    • A suitable font library must be installed on the system.
  • Adding a plain text watermark in FFmpeg is handled by the drawtext filter. Let's take a look at the drawtext filter parameters.
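  • The drawtext options exercised in the examples below (a partial list, not the filter's full parameter table): fontfile (path to the font file), fontsize, text, fontcolor, x and y (text position), alpha (transparency), box and boxcolor (background box behind the text), and enable (expression controlling when the text is drawn).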

  • Add a text watermark to the upper left corner of the video
ffplay -i input.mp4 -vf "drawtext=fontsize=100:fontfile=FreeSerif.ttf:text='hello world':x=20:y=20"
  • Set the color of the font to green: fontcolor=green
ffplay -i input.mp4 -vf "drawtext=fontsize=100:fontfile=FreeSerif.ttf:text='hello world':fontcolor=green"
  • To adjust the display position of the text watermark, change the values of the x and y parameters: x=400:y=200
ffplay -i input.mp4 -vf "drawtext=fontsize=100:fontfile=FreeSerif.ttf:text='hello world':fontcolor=green:x=400:y=200"
  • Modify transparency: alpha
ffplay -i input.mp4 -vf "drawtext=fontsize=100:fontfile=FreeSerif.ttf:text='hello world':fontcolor=green:x=400:y=200:alpha=0.5"
  • You can also add a box to the text watermark, and then add a background color to the box: box=1:boxcolor=yellow
ffplay -i input.mp4 -vf "drawtext=fontsize=100:fontfile=FreeSerif.ttf:text='hello world':fontcolor=green:box=1:boxcolor=yellow"
  • If you want the text watermark to use the local time as its content, this can be done with some special drawtext usage: the current local time is drawn as text in year-month-day hour-minute-second format.
    • text='%{localtime\:%Y\-%m\-%d %H-%M-%S}'
ffplay -i input.mp4 -vf "drawtext=fontsize=60:fontfile=FreeSerif.ttf:text='%{localtime\:%Y\-%m\-%d %H-%M-%S}':fontcolor=green:box=1:boxcolor=yellow"
  • When using ffmpeg to transcode and save to a file, you need to add -re (read the input at its native frame rate); otherwise the displayed time is wrong
ffmpeg -re -i input.mp4 -vf "drawtext=fontsize=60:fontfile=FreeSerif.ttf:text='%{localtime\:%Y\-%m\-%d %H-%M-%S}':fontcolor=green:box=1:boxcolor=yellow" out.mp4
  • In some scenarios the watermark needs to be shown and hidden periodically: enable=lt(mod(t\,3)\,1)
    • This can also be handled with the drawtext filter, using drawtext together with enable. For example, to show the text watermark for 1 second out of every 3 seconds:
ffplay -i input.mp4 -vf "drawtext=fontsize=60:fontfile=FreeSerif.ttf:text='test':fontcolor=green:box=1:boxcolor=yellow:enable=lt(mod(t\,3)\,1)"
  • Expression reference: http://www.ffmpeg.org/ffmpeg-utils.html, Section 3 "Expression Evaluation" (a worked example follows this list)
    • lt(x, y): return 1 if x is less than y, 0 otherwise.
    • mod(x, y): compute the remainder of the division of x by y.
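    • Worked example for enable=lt(mod(t\,3)\,1): at t=0.5, mod(0.5,3)=0.5 and lt(0.5,1)=1, so the text is drawn; at t=1.7, mod(1.7,3)=1.7 and lt(1.7,1)=0, so it is hidden; at t=3.2, mod(3.2,3)=0.2, so it is drawn again. The watermark is therefore visible during the first second of every 3-second cycle.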
  • Marquee effect
    • x=mod(100*t\,w):y=abs(sin(t))*h*0.7
ffplay -i input.mp4 -vf "drawtext=fontsize=100:fontfile=FreeSerif.ttf:text='helloworld':x=mod(100*t\,w):y=abs(sin(t))*h*0.7"
  • Modify the font transparency and font color, combined with the periodic display: alpha=0.5:fontcolor=white:enable=lt(mod(t\,3)\,1)
ffplay -i input.mp4 -vf "drawtext=fontsize=40:fontfile=FreeSerif.ttf:text='liaoqingfu':x=mod(50*t\,w):y=abs(sin(t))*h*0.7:alpha=0.5:fontcolor=white:enable=lt(mod(t\,3)\,1)"

Picture watermark

  • To add a picture watermark to a video, you can use the movie filter
  • FFmpeg movie filter parameters
parameter           type        description
filename            string      the input file name; can be a file, protocol, or device
format_name, f      string      the input container format
stream_index, si    integer     the index of the input stream
seek_point, sp      float       the time position to seek to in the input stream
streams, s          string      the streams to read when the input has multiple streams
loop                integer     the number of times to loop the input
discontinuity       duration    the timestamp discontinuity (jitter) to tolerate
  • Example:
ffmpeg -i input.mp4 -vf "movie=logo.png[watermark];[in][watermark]overlay=x=10:y=10[out]" output.mp4
  • Original video file path: input.mp4
  • Watermark image path: logo.png
  • Watermark position: (x,y)=(10,10), i.e. 10 pixels from the left edge and 10 pixels from the top;
  • Output file path: output.mp4
    • main_w      the width of the main video frame
    • main_h      the height of the main video frame
    • overlay_w   the width of the watermark (overlay) image
    • overlay_h   the height of the watermark (overlay) image
  • Correspondingly, the overlay value can be set as follows to change the position of the watermark image (a centered variant is sketched after this table):
    • Watermark image location    overlay value
    • Upper left corner           10:10
    • Upper right corner          main_w-overlay_w-10:10
    • Lower left corner           10:main_h-overlay_h-10
    • Lower right corner          main_w-overlay_w-10:main_h-overlay_h-10
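  • For example, a centered watermark can be written with the same variables (a sketch reusing the logo.png watermark from the examples below):
ffplay -i input.mp4 -vf "movie=logo.png[watermark];[in][watermark]overlay=(main_w-overlay_w)/2:(main_h-overlay_h)/2[out]"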

  • There are two ways to add an image watermark in FFmpeg:
    • One is to specify the watermark file path through movie.
    • The other is to read a stream of an input file through a filter and use it as the watermark; here we read an image file with the movie filter and use it as the watermark.
  • The picture logo.png is overlaid onto the input.mp4 video and displayed at x coordinate 50 and y coordinate 10
ffplay -i input.mp4 -vf "movie=logo.png[logo];[in][logo]overlay=50:10[out]"
  • Since the background of the logo.png picture is white, it looks rather abrupt. If the watermark image has a transparent background, the effect is better, so let's find a picture with a transparent background and try it.
ffplay -i input.mp4 -vf "movie=logo2.png[watermark];[in][watermark]overlay=50:10[out]"
  • Display position
ffplay -i input.mp4 -vf "movie=logo.png[watermark];[in][watermark]overlay=10:10[out]"

ffplay -i input.mp4 -vf "movie=logo.png[watermark];[in][watermark]overlay=main_w-overlay_w-10:10[out]"

ffplay -i input.mp4 -vf "movie=logo.png[watermark];[in][watermark]overlay=10:main_h-overlay_h-10[out]"

ffplay -i input.mp4 -vf "movie=logo.png[watermark];[in][watermark]overlay=main_w-overlay_w-10:main_h-overlay_h-10[out]"
  • Marquee effect : overlay=x=mod(50*t\,main_w):y=abs(sin(t))*h*0.7[out]
ffplay -i input.mp4 -vf "movie=logo.png[watermark];[in][watermark]overlay=x=mod(50*t\,main_w):y=abs(sin(t))*h*0.7[out]"

FFmpeg generates picture-in-picture

  • When using FFmpeg to process streaming media files, you sometimes need a picture-in-picture effect.
  • In FFmpeg, multiple video streams, multiple capture devices, and multiple video files can be combined into one frame through overlay to produce a picture-in-picture effect.
  • In the filters used so far, and in those that follow, most video-composition work is done in conjunction with the overlay filter, especially in layer processing and merging scenarios. Let's take a look at the overlay parameters:
parameter     type       description
x             string     the x coordinate of the overlaid video
y             string     the y coordinate of the overlaid video
eof_action    integer    the action to take when EOF is encountered on a secondary input; the default is repeat
    • repeat (value 0): repeat the last frame
    • endall (value 1): end both streams
    • pass (value 2): pass the main layer through unchanged
shortest      boolean    terminate all streams when the shortest one terminates (default false)
format        integer    set the pixel format of the output; the default is yuv420
    • yuv420 (value 0)
    • yuv422 (value 1)
    • yuv444 (value 2)
    • rgb (value 3)
  • As the parameter list shows, there are not many main parameters, but in practice the overlay filter can be used with many parameter combinations and with internal variables such as the width, height, and coordinates of the overlay layer (a sketch combining some of these options follows).
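  • A minimal sketch combining these options (the sub_320x240.mp4 file name follows the examples below; eof_action=pass and format=yuv422 are illustrative choices, not from the original notes):
ffplay -i input.mp4 -vf "movie=sub_320x240.mp4[sub];[in][sub]overlay=x=20:y=20:eof_action=pass:format=yuv422[out]"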

  • Show picture-in-picture effect
ffplay -i input.mp4 -vf "movie=sub_320x240.mp4[sub];[in][sub]overlay=x=20:y=20[out]"

ffplay -i input.mp4 -vf "movie=sub_320x240.mp4[sub];[in][sub]overlay=x=20:y=20:eof_action=1[out]"

ffplay -i input.mp4 -vf "movie=sub_320x240.mp4[sub];[in][sub]overlay=x=20:y=20:shortest=1[out]"
  • Scale the sub-picture: scale=640x480
ffplay -i input.mp4 -vf "movie=sub_320x240.mp4,scale=640x480[sub];[in][sub]overlay=x=20:y=20[out]"
  • Marquee
ffplay -i input.mp4 -vf "movie=sub_320x240.mp4[test];[in][test]overlay=x=mod(50*t\,main_w):y=abs(sin(t))*main_h*0.7[out]"

FFmpeg video multi-grid processing

  • In addition to picture-in-picture display, video is often presented in a multi-grid (split-screen) layout. Besides video files, the inputs can also be video streams, capture devices, and so on.
  • As seen above, the overlay filter is the key to composing video images onto a canvas. You can create a canvas with FFmpeg, or use the default canvas.
  • To display multiple grids, create a canvas that is large enough yourself. Let's take a look at an example of a multi-grid display:
ffmpeg -i 1.mp4 -i 2.mp4 -i 3.mp4 -i 4.mp4 -filter_complex "nullsrc=size=640x480[base];
[0:v]setpts=PTS-STARTPTS,scale=320x240[upperleft];
[1:v]setpts=PTS-STARTPTS,scale=320x240[upperright];
[2:v]setpts=PTS-STARTPTS,scale=320x240[lowerleft];
[3:v]setpts=PTS-STARTPTS,scale=320x240[lowerright];
[base][upperleft]overlay=shortest=1[tmp1];
[tmp1][upperright]overlay=shortest=1:x=320[tmp2];
[tmp2][lowerleft]overlay=shortest=1:y=240[tmp3];
[tmp3][lowerright]overlay=shortest=1:x=320:y=240" out.mp4
  • 1.mp4, 2.mp4, 3.mp4, and 4.mp4 are the input file paths and out.mp4 is the output file path. An overlay canvas of size 640x480 is created with nullsrc; [0:v][1:v][2:v][3:v] take the four input video streams, each is scaled separately, and then the videos are tiled onto the canvas generated by nullsrc. In the command, the custom labels upperleft, upperright, lowerleft, and lowerright tile them at different positions.

  • A command overlaying only the upper left and upper right:
ffmpeg -i 1.mp4 -i 2.mp4 -i 3.mp4 -i 4.mp4 -filter_complex "nullsrc=size=640x480[base];[0:v]setpts=PTS-STARTPTS,scale=320x240[upperleft];[1:v]setpts=PTS-STARTPTS,scale=320x240[upperright];[base][upperleft]overlay=shortest=1[tmp1];[tmp1][upperright]overlay=shortest=1:x=320" out2.mp4

 

Source: https://blog.csdn.net/baidu_41388533/article/details/112402753