Compiling FFmpeg


Foreword

Recently I wanted to learn more about the FFmpeg open-source library, so I decided to reinvent the wheel: build it from scratch and then set everything up according to my own ideas.


1. What is FFmpeg?

I won't explain this in depth: FFmpeg is a very powerful audio/video codec library, and that much is enough for beginners; I plan to explore a lot more of it in later notes.
Below, some of its codec functions are used to operate on video.

2. How to use

1. Environment preparation

Since the compiled FFmpeg is meant to run on Android, it is built with the NDK cross-compilation toolchain. I am using a Linux build environment, and the Android device runs Android 9.

1.1 Source code download

Here is a link to the official website: http://www.ffmpeg.org/olddownload.html , where you can choose the version you want.
Note that different versions need matching NDK environments, otherwise the compilation will report errors: versions above 4.0 require NDK r17, while my 2.7 compiles with NDK r11.

1.2 NDK environment download

The NDK environment used here is android-ndk-r11. For installing the NDK you can refer to articles on the Internet; I did not install it myself because a ready-made one was available. =. =
Almost forgot: there is also the excellent openh264 codec library; remember to download it, or an error will be reported.

PS: Under different NDK environments, compiling FFmpeg may fail with errors caused by version differences, and these have to be resolved by hand. To avoid the pitfalls, it is recommended to use a version pairing that is known to compile successfully, unless you absolutely need certain features.

2. Source code compilation

2.1 NDK compilation

First you need to compile FFmpeg, configuring it according to the build environment, the target machine architecture, and the NDK. FFmpeg ships a configure script that adapts the build to different environments; the usual flow is configure, then compile and install. The script below does it all in one go.

#!/bin/bash
NDK=/opt/android_build/android-ndk-r11
SYSROOT=$NDK/platforms/android-19/arch-arm/
TOOLCHAIN=$NDK/toolchains/arm-linux-androideabi-4.9/prebuilt/linux-x86_64
CFLAGS="-O3 -Wall -DANDROID -DNDEBUG -nostdlib"
EXTRA_CFLAGS="-march=armv7-a -mfpu=neon \
              -mfloat-abi=softfp "
OPTIMIZE_CFLAGS="-marm"
CPU=arm
PREFIX=****/FFmpeg-2.7.2-use-zip/android-21

#ADDI_CFLAGS="-I/home/ndk/arm/x264/include"
#ADDI_LDFLAGS="-L/home/ndk/arm/x264/lib"
ADDI_CFLAGS="-I/home/hzg/mypath/simpleexample_ffmpeg/openh264/include"
ADDI_LDFLAGS="-L/home/hzg/mypath/simpleexample_ffmpeg/openh264/lib"
#--disable-demuxers
#--disable-decoders
#--disable-devices
#--disable-filters
#--enable-decoder=h264
#--enable-decoder=mp3
#--enable-demuxer=mpegts

function build_one
{
#make distclean
./configure \
--prefix=$PREFIX \
--enable-shared \
--enable-nonfree \
--enable-gpl \
--enable-swscale \
--enable-asm \
--enable-stripping \
--disable-libx264 \
--enable-libopenh264 \
--enable-demuxers \
--enable-decoders \
--disable-yasm \
--disable-devices \
--disable-filters \
--disable-programs \
--disable-doc \
--disable-ffmpeg \
--disable-ffplay \
--disable-ffprobe \
--disable-ffserver \
--disable-avdevice \
--disable-symver \
--disable-debug \
--disable-network \
--disable-hwaccels \
--disable-indevs \
--disable-outdevs \
--disable-iconv \
--enable-fast-unaligned \
--cross-prefix=$TOOLCHAIN/bin/arm-linux-androideabi- \
--target-os=linux \
--arch=arm \
--cpu=armv7-a \
--enable-cross-compile \
--sysroot=$SYSROOT  \
--extra-cflags="-march=armv7-a -mfpu=neon -mfloat-abi=softfp -DFF_OPT_ZIP -DFF_OPT_LESSDIRTY -DWITH_CHANGE_BY_ZTE -DRECTIFY_YUV422 -DPAL_ENCODE -O0 $ADDI_CFLAGS" \
--extra-ldflags="$ADDI_LDFLAGS"
#$ADDITIONAL_CONFIGURE_FLAG

#--extra-cflags="-Os -fpic -mfpu noen\
#--extra-cflags="-Os -fpic -mfpu neon\
#--extra-cflags="-Os -mfpu neon -fPIC -DANDROID -mfpu=neon -mfloat-abi=softfp ADDI_CFLAGS "
make clean
make -j32
make DESTDIR=/home/hzg/mypath/simpleexample_ffmpeg/FFmpeg-2.7.2-use-zip/android-21 install
}
build_one

Then configure the following library naming rules in the configure file. (This renames the installed libraries from the default libxxx.so.NN pattern to libxxx-NN.so; the Android linker cannot load versioned .so.NN names.)

SLIBNAME_WITH_MAJOR='$(SLIBPREF)$(FULLNAME)-$(LIBMAJOR)$(SLIBSUF)'
LIB_INSTALL_EXTRA_CMD='$$(RANLIB) "$(LIBDIR)/$(LIBNAME)"'
SLIB_INSTALL_NAME='$(SLIBNAME_WITH_MAJOR)'
SLIB_INSTALL_LINKS='$(SLIBNAME)'

After FFmpeg is compiled, the corresponding libraries, header files, and executables are generated in the target directory.
Header files:

./build/include/libavutil/log.h
./build/include/libavutil/downmix_info.h
./build/include/libavutil/error.h
./build/include/libavutil/ripemd.h
./build/include/libavutil/fifo.h
./build/include/libavutil/imgutils.h
./build/include/libavutil/cpu.h
./build/include/libavutil/base64.h
./build/include/libavutil/random_seed.h
./build/include/libavutil/intreadwrite.h

Libraries:

./android-21/lib/libavcodec-56.so
./android-21/lib/libswresample.so
./android-21/lib/libpostproc-53.so
./android-21/lib/libavcodec.so
./android-21/lib/libavutil-54.so
./android-21/lib/libavfilter.so
./android-21/lib/libpostproc.so
./android-21/lib/libswscale.so
./android-21/lib/libavformat.so
./android-21/lib/libavfilter-5.so
./android-21/lib/libavformat-56.so
./android-21/lib/libswresample-1.so
./android-21/lib/libswscale-3.so
./android-21/lib/libavutil.so

Executables:

ffmpeg
ffprobe

2.2 GCC compilation

Create a new script build_linux.sh. As before, configure first; because of the build server I need to disable x86asm. Then build the shared libraries, adding an installation prefix.

./configure \
        --disable-x86asm \
        --enable-shared \
        --prefix=./linux-build/
make install -j32

This likewise generates config.h , which records the build configuration and is the result of running ./configure. The second step, make, compiles and installs the programs/libraries/header files.

-rw-rw-r--  1 hzg hzg   84419 Apr 25 15:47 config.h
drwxrwxrwx  6 hzg hzg    4096 Apr 25 15:47 linux-build
drwxrwxr-x 12 hzg hzg   12288 Apr 26 15:45 libavutil
drwxrwxr-x  5 hzg hzg    4096 Apr 26 15:45 doc
drwxrwxr-x  6 hzg hzg    4096 Apr 26 15:45 libswresample
drwxrwxr-x  7 hzg hzg    4096 Apr 26 15:45 libswscale
drwxrwxr-x  2 hzg hzg    4096 Apr 26 15:45 fftools
drwxrwxr-x 14 hzg hzg   98304 Apr 26 15:45 libavcodec
drwxrwxr-x  3 hzg hzg   45056 Apr 26 15:45 libavformat
drwxrwxr-x  8 hzg hzg   40960 Apr 26 15:45 libavfilter
drwxrwxr-x  3 hzg hzg    4096 Apr 26 15:45 libavdevice
-rwxrwxr-x  1 hzg hzg  632840 Apr 26 15:45 ffprobe_g
-rwxrwxr-x  1 hzg hzg  158024 Apr 26 15:45 ffprobe
-rwxrwxr-x  1 hzg hzg 1096360 Apr 26 15:45 ffmpeg_g
-rwxrwxr-x  1 hzg hzg  284832 Apr 26 15:45 ffmpeg

Then, to use it, just reference the header files and libraries:

gcc -o 264toyuv test.c -I./ffmpeg/linux-build/include -L./ffmpeg/linux-build/lib -lavformat -lavdevice -lavutil -lavcodec -lswresample -lswscale
export LD_LIBRARY_PATH=/home_0421/hzg/mypath/ffmpeg/linux-build/lib:$LD_LIBRARY_PATH

Because this is built under an ordinary user account, the loader will not by default search the directory where the self-compiled .so libraries were generated, so running the program inevitably reports a load error. If the .so directories are added to the LD_LIBRARY_PATH variable, the loader will search those paths every time the program runs; hence the export of the installed lib directory shown above.
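As a quick sanity check of the headers and the freshly linked libraries, here is a minimal test program (my own sketch, not part of the FFmpeg sources) that just prints the libavformat version; compile it with the same -I/-L flags as above plus -lavformat -lavutil:

#include <stdio.h>
#include "libavformat/avformat.h"   /* from ./ffmpeg/linux-build/include */

int main(void)
{
    /* avformat_version() packs major/minor/micro into one integer */
    unsigned v = avformat_version();
    printf("libavformat %u.%u.%u\n", v >> 16, (v >> 8) & 0xff, v & 0xff);
    return 0;
}

If this runs and prints the version, the include path, library path, and LD_LIBRARY_PATH are all set up correctly.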
Here I want to test PSNR: download a YUV test sequence, use ffmpeg to encode it into an H.264 compressed stream, then decode that back into YUV through the FFmpeg libraries, to measure the loss introduced by the encode/decode round trip (i.e., the encoded image quality). Decoding demo:

#include <stdio.h>      /* printf, NULL */
#include <stdlib.h>     /* strtod */
#include <errno.h>      /* err catch */
#include "libavformat/avformat.h"
#include "libswscale/swscale.h"
#include "libavcodec/avcodec.h"
#include "libavdevice/avdevice.h"
#include "libavutil/avutil.h"
#include "libavutil/frame.h"


int h264_to_yuv420p(char* input_file, char* output_file)
{
    if (input_file == NULL || output_file == NULL)
    {
        return -1;
    }
    char* in_file = input_file;
    char* out_file = output_file;
    AVFormatContext *fmt_ctx = NULL;
    AVCodecContext *cod_ctx = NULL;
    AVCodec *cod = NULL;
    struct SwsContext *img_convert_ctx = NULL;
    // Declared up front so the __ERROR cleanup never sees them uninitialized
    FILE *out_fb = NULL;
    AVFrame *frame = NULL;
    AVFrame *yuv_frame = NULL;
    int ret = 0;
    AVPacket packet;

    // Step 1: create the input AVFormatContext
    fmt_ctx = avformat_alloc_context();
    if (fmt_ctx == NULL)
    {
        ret = -1;
        printf("alloc fail");
        goto __ERROR;
    }
    if (avformat_open_input(&fmt_ctx, in_file, NULL, NULL) != 0)
    {
        ret = -1;
        printf("open fail");
        goto __ERROR;
    }

    // Step 2: probe the file's streams and fill in the stream info in AVFormatContext
    if (avformat_find_stream_info(fmt_ctx, NULL) < 0)
    {
        ret = -1;
        printf("find stream fail");
        goto __ERROR;
    }

    av_dump_format(fmt_ctx, 0, in_file, 0);

    // Step 3: find the video stream index and its decoder
    int stream_index = av_find_best_stream(fmt_ctx, AVMEDIA_TYPE_VIDEO, -1, -1, &cod, 0);
    if (stream_index < 0)
    {
        ret = -1;
        printf("find stream fail");
        goto __ERROR;
    }

    // Step 4: set up the decoder context from the stream parameters and open the decoder
    AVCodecParameters *codecpar = fmt_ctx->streams[stream_index]->codecpar;
    if (!cod)
    {
        ret = -1;
        printf("find codec fail");
        goto __ERROR;
    }
    cod_ctx = avcodec_alloc_context3(cod);
    avcodec_parameters_to_context(cod_ctx, codecpar);
    ret = avcodec_open2(cod_ctx, cod, NULL);
    if (ret < 0)
    {
        printf("can't open codec");
        goto __ERROR;
    }

    // Step 5: open the output file
    out_fb = fopen(out_file, "wb");
    if (!out_fb)
    {
        ret = -1;
        printf("can't open file");
        goto __ERROR;
    }

    // Create a packet to hold the compressed (pre-decode) data
    av_init_packet(&packet);

    // Step 6: create frames to hold the decoded data
    frame = av_frame_alloc();
    frame->width = codecpar->width;
    frame->height = codecpar->height;
    frame->format = codecpar->format;
    av_frame_get_buffer(frame, 32);

    yuv_frame = av_frame_alloc();
    yuv_frame->width = codecpar->width;
    yuv_frame->height = codecpar->height;
    yuv_frame->format = AV_PIX_FMT_YUV420P;
    av_frame_get_buffer(yuv_frame, 32);

    // Step 7: initialize the software scaler that converts decoded frames to YUV420P

    img_convert_ctx = sws_getContext(codecpar->width,
                                    codecpar->height,
                                    codecpar->format,
                                    codecpar->width,
                                    codecpar->height,
                                    AV_PIX_FMT_YUV420P,
                                    SWS_BICUBIC,
                                    NULL, NULL, NULL);

    // Step 8: loop: read one packet at a time, decode it, convert, and save
    int count = 0;
    while (av_read_frame(fmt_ctx, &packet) >= 0)
    {
        if (packet.stream_index != stream_index)
        {
            av_packet_unref(&packet);
            continue;
        }

        ret = avcodec_send_packet(cod_ctx, &packet);
        if (ret < 0)
        {
            ret = -1;
            printf("decode error");
            goto __ERROR;
        }

        while (avcodec_receive_frame(cod_ctx, frame) >= 0)
        {
            printf("decode frame count = %d\n", count++);
            sws_scale(img_convert_ctx,
                      (const uint8_t **)frame->data,
                      frame->linesize,
                      0,
                      codecpar->height,
                      yuv_frame->data,
                      yuv_frame->linesize);
            // Write plane by plane, row by row: av_frame_get_buffer(..., 32)
            // pads each row to the alignment, so linesize may be larger than
            // width and the planes must not be dumped as one contiguous block.
            int row;
            for (row = 0; row < cod_ctx->height; row++)
                fwrite(yuv_frame->data[0] + row * yuv_frame->linesize[0], 1, cod_ctx->width, out_fb);
            for (row = 0; row < cod_ctx->height / 2; row++)
                fwrite(yuv_frame->data[1] + row * yuv_frame->linesize[1], 1, cod_ctx->width / 2, out_fb);
            for (row = 0; row < cod_ctx->height / 2; row++)
                fwrite(yuv_frame->data[2] + row * yuv_frame->linesize[2], 1, cod_ctx->width / 2, out_fb);
        }

        av_packet_unref(&packet);
    }
    // Note: a complete implementation would also drain the decoder here by
    // sending a NULL packet and receiving the remaining frames.

__ERROR:
    if (fmt_ctx)
    {
        // avformat_close_input() also frees the context and NULLs the pointer
        avformat_close_input(&fmt_ctx);
    }

    if (cod_ctx)
    {
        avcodec_close(cod_ctx);
        avcodec_free_context(&cod_ctx);
    }

    if (out_fb)
    {
        fclose(out_fb);
    }

    if (frame)
    {
        av_frame_free(&frame);
    }

    if (yuv_frame)
    {
        av_frame_free(&yuv_frame);
    }

    if (img_convert_ctx)
    {
        sws_freeContext(img_convert_ctx);
    }
    return ret;
}

int main()
{
    h264_to_yuv420p("carphone_qcif.h264", "out.yuv");
    return 0;
}

PSNR test

./psnr 176 144 420 ../../old/imagedata/carphone_qcif.yuv ../../old/imagedata/carphone_qcif_h264.yuv >carphone_qcif.txt

Test Results

psnr:   380 frames (CPU: 0 s) mean: 10.69 stdv: 0.86
carphone_qcif.txt
10.018
10.017
9.993
10.000
10.010
10.030
10.019
10.010
...
...
...
10.204
10.176
10.153
10.158
10.153
10.165
10.262
10.147
10.138
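For reference, the core of such a psnr tool is simple: compare the two YUV sequences frame by frame, compute the mean squared error of each plane, and convert it to decibels. Below is a minimal sketch of that computation for the luma plane only (my own illustration; the actual ./psnr tool used above takes an extra "420" format argument and may differ in plane handling and output format):

#include <stdio.h>
#include <stdlib.h>
#include <math.h>

/* PSNR of one pair of 8-bit planes: 10 * log10(255^2 / MSE) */
static double plane_psnr(const unsigned char *a, const unsigned char *b, int n)
{
    double mse = 0.0;
    int i;
    for (i = 0; i < n; i++) {
        double d = (double)a[i] - (double)b[i];
        mse += d * d;
    }
    mse /= n;
    if (mse == 0.0)
        return INFINITY; /* identical planes */
    return 10.0 * log10(255.0 * 255.0 / mse);
}

int main(int argc, char *argv[])
{
    if (argc < 5) {
        fprintf(stderr, "usage: %s width height ref.yuv test.yuv\n", argv[0]);
        return 1;
    }
    int w = atoi(argv[1]), h = atoi(argv[2]);
    int frame_size = w * h * 3 / 2;            /* YUV420: Y + U/4 + V/4 */
    unsigned char *ref = malloc(frame_size), *tst = malloc(frame_size);
    FILE *fr = fopen(argv[3], "rb"), *ft = fopen(argv[4], "rb");
    if (!ref || !tst || !fr || !ft)
        return 1;
    int frame = 0;
    while (fread(ref, 1, frame_size, fr) == (size_t)frame_size &&
           fread(tst, 1, frame_size, ft) == (size_t)frame_size) {
        /* one luma PSNR value per frame, like the per-frame lines above */
        printf("%d %.3f\n", frame++, plane_psnr(ref, tst, w * h));
    }
    fclose(fr); fclose(ft); free(ref); free(tst);
    return 0;
}

Compile it with gcc -o psnr psnr.c -lm.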

3. Referencing the library

Android.mk

LOCAL_PATH:= $(call my-dir)
include $(CLEAR_VARS)
#include Prebuild_Android.mk
LOCAL_SRC_FILES:= \
        main.c
#       simpl.c \

BUILD_PATH:=$(LOCAL_PATH)/FFmpeg-2.7.2-use-zip/android-21

LOCAL_C_INCLUDES := \
        $(LOCAL_PATH)/FFmpeg-2.7.2-use-zip/build/include \
        $(LOCAL_PATH)/openh264/include \

LOCAL_CFLAGS :=
LOCAL_CFLAGS += -fPIC -Wformat-security
LOCAL_CFLAGS += $(CFLAGS) -g -O2 -DENABLE_FFMPEG=1

LOCAL_LDFLAGS :=
LOCAL_LDFLAGS += $(BUILD_PATH)/lib/libavformat.a
LOCAL_LDFLAGS += $(BUILD_PATH)/lib/libavfilter.a
LOCAL_LDFLAGS += $(BUILD_PATH)/lib/libpostproc.a
LOCAL_LDFLAGS += $(BUILD_PATH)/lib/libswscale.a
LOCAL_LDFLAGS += $(LOCAL_PATH)/FFmpeg-2.7.2-use-zip/build/lib/libavdevice.a
LOCAL_LDFLAGS += $(BUILD_PATH)/lib/libavcodec.a
LOCAL_LDFLAGS += $(BUILD_PATH)/lib/libavformat.a
LOCAL_LDFLAGS += $(BUILD_PATH)/lib/libavutil.a
LOCAL_LDFLAGS += $(BUILD_PATH)/lib/libswresample.a
LOCAL_LDFLAGS += $(LOCAL_PATH)/openh264/libopenh264.a

# depends on the math library and the libz compression library
#LOCAL_LDFLAGS += -lswresample
LOCAL_LDFLAGS += -lm -lz

LOCAL_STATIC_LIBRARIES :=
LOCAL_SHARED_LIBRARIES :=
LOCAL_MODULE:= test_ffmpeg
include $(BUILD_EXECUTABLE)
include Prebuild_Android.mk

Summary

Errors encountered along the way are recorded here.
Link errors:

./FFmpeg-2.7.2-use-zip/build/lib/libavformat.a(id3v2.o):id3v2.c:function id3v2_parse: error: undefined reference to 'uncompress'
./FFmpeg-2.7.2-use-zip/build/lib/libavcodec.a(zip_decoder.o):zip_decoder.c:function zip_decompress.isra.3: error: undefined reference to 'inflateInit_'
./FFmpeg-2.7.2-use-zip/build/lib/libavcodec.a(zip_decoder.o):zip_decoder.c:function zip_decompress.isra.3: error: undefined reference to 'inflate'
./FFmpeg-2.7.2-use-zip/build/lib/libavcodec.a(zip_decoder.o):zip_decoder.c:function zip_decompress.isra.3: error: undefined reference to 'inflateEnd'
./FFmpeg-2.7.2-use-zip/build/lib/libavcodec.a(zip_decoder.o):zip_decoder.c:function zip_decompress.isra.3: error: undefined reference to 'inflateEnd'
./FFmpeg-2.7.2-use-zip/build/lib/libavcodec.a(zip_decoder.o):zip_decoder.c:function zip_decoder_decode_begin: error: undefined reference to 'zlibVersion'
./FFmpeg-2.7.2-use-zip/build/lib/libavcodec.a(zip_decoder.o):zip_decoder.c:function zip_decoder_decode_begin: error: undefined reference to 'zlibVersion'
./FFmpeg-2.7.2-use-zip/build/lib/libavcodec.a(zip_decoder.o):zip_decoder.c:function zip_decoder_decode_begin: error: undefined reference to 'zlibVersion'
./FFmpeg-2.7.2-use-zip/build/lib/libavcodec.a(zip_decoder.o):zip_decoder.c:function zip_decoder_decode_begin: error: undefined reference to 'zlibVersion'

Solution

libavcodec uses some math and zlib functions, so you must also link against the corresponding libraries (hence the -lm -lz in the Android.mk above).

Runtime error

CANNOT LINK EXECUTABLE DEPENDENCIES: library "libavformat.so.56" not found

Solution

It reports that libavformat.so.56 cannot be found, where .56 is the library's major version number. One known approach from earlier writers is to modify the compiled library names in the configure file (the SLIBNAME settings shown earlier); the usual explanation is that the Android linker cannot load versioned names like libavformat.so.56. Even so, .a, .so, and .so.56 files can all appear; in my case referencing the dynamic libraries still reported this error, while referencing the static libraries worked fine. This is only a workaround that happened to succeed; if you understand the root cause in depth, feel free to discuss it.

Demo compilation

Compile commands

g++ -o main list_codecinfo.cpp -I /home/hzg/mypath/ffmpeg/Dev/ffmpeg-win64-dev/ffmpeg-20200106-1e3f4b5-win64-dev/include/libavformat -I /home/hzg/mypath/ffmpeg/Dev/ffmpeg-win64-dev/ffmpeg-20200106-1e3f4b5-win64-dev/include -L/home/hzg/mypath/ffmpeg/Dev/ffmpeg-win64-dev/ffmpeg-20200106-1e3f4b5-win64-dev/lib -lavcodec.dll -lavformat.dll
gcc test.c -I ffmpeg/ -Lffmpeg/libavformat -Lffmpeg/libavdevice/ -static -lavformat -lavdevice -o 264toyuv

Commands

ffplay https://magiclen.org/ffmpeg-h265/

ffplay -video_size 3840x2160 -i "C:\Users\10257818\Desktop\imagedata\编码视频\kaoya2\ky_683.yuv"		[play a YUV file at a specified resolution]

The commands to output visually lossless H.265/HEVC video with FFmpeg are as follows:

ffmpeg -s 3840x2160 -i C:\Users\10257818\Desktop\imagedata\yuv\ky_685.yuv -vcodec libx265 -crf 18 C:\Users\10257818\Desktop\imagedata\yuv\aaa.h265
ffmpeg -s 3840x2160 -i C:\Users\10257818\Desktop\imagedata\yuv\ky_685.yuv -vcodec libx265  C:\Users\10257818\Desktop\imagedata\yuv\bbb.h265

To output visually lossless H.265/HEVC video at the highest compression ratio, the command is as follows:

ffmpeg -i <input media file> -vcodec libx265 -crf 20 -preset placebo <output media file>

Decode H265/H264 code stream into YUV data

ffmpeg -i C:\Users\10257818\Desktop\imagedata\yuv\bbb.h265 C:\Users\10257818\Desktop\imagedata\yuv\ccc.yuv
ffmpeg -i C:\Users\10257818\Desktop\imagedata\yuv\carphone_qcif.h264 C:\Users\10257818\Desktop\imagedata\yuv\carphone_qcif_h264.yuv

Encode YUV data into H265/H264 stream

ffmpeg -s 3840x2160 -i C:\Users\10257818\Desktop\imagedata\yuv\ky_685.yuv -vcodec libx265  C:\Users\10257818\Desktop\imagedata\yuv\bbb.h265
ffmpeg -s 3840x2160 -i C:\Users\10257818\Desktop\imagedata\yuv\ky_685.yuv -vcodec libx265 -crf 18 C:\Users\10257818\Desktop\imagedata\yuv\aaa.h265
ffmpeg -s 176x144 -i C:\Users\10257818\Desktop\imagedata\yuv\carphone_qcif.yuv -vcodec libx265 -crf 18  C:\Users\10257818\Desktop\imagedata\yuv\carphone_qcif.h265
ffmpeg -s 176x144 -i C:\Users\10257818\Desktop\imagedata\yuv\carphone_qcif.yuv -vcodec libx264 -crf 18  C:\Users\10257818\Desktop\imagedata\yuv\carphone_qcif.h264

Lossless compression

ffmpeg -i <input media file> -vcodec libx265 -x265-params lossless=1 -preset placebo <output media file>

psnr

./psnr 176 144 420 ../../old/imagedata/carphone_qcif.yuv ../../old/imagedata/carphone_qcif_h264.yuv >carphone_qcif.txt

Encode a YUV420 video sequence with ffmpeg

ffmpeg -s 1280x720 -r 50 -i 720p50_parkrun_ter.yuv 720p50_parkrun_ter.h264

The following command reads data from the camera, encodes it to H.264, and saves it as mycamera.mkv.

ffmpeg -f dshow -i video="HD 720P Webcam" -vcodec libx264 mycamera.mkv

Use ffplay to play the camera feed directly; the command is as follows:

ffplay -f dshow -i video="HD 720P Webcam"

Record the screen, together with the microphone input:

ffmpeg -f dshow -i video="screen-capture-recorder" -f dshow -i audio="内装麦克风 (Conexant 20672 SmartAudi" -r 5 -vcodec libx264 -preset:v ultrafast -tune:v zerolatency -acodec libmp3lame MyDesktop.mkv

Record the screen, together with the system audio played to the headphones:

ffmpeg -f dshow -i video="screen-capture-recorder" -f dshow -i audio="virtual-audio-capturer" -r 5 -vcodec libx264 -preset:v ultrafast -tune:v zerolatency -acodec libmp3lame MyDesktop.mkv

Screen capture

ffmpeg -f gdigrab -i desktop out.mpg

Capture a 640x480 region starting at point (10,20) of the screen, at a frame rate of 5:

ffmpeg -f gdigrab -framerate 5 -offset_x 10 -offset_y 20 -video_size 640x480 -i desktop out.mpg

Concatenate multiple WAV files into one

ffmpeg -i 1.wav -i 2.wav -i 3.wav ...... -i {n}.wav -filter_complex '[0:0][1:0]......[{n-1}:0]concat=n={n}:v=0:a=1[out]' -map '[out]' final.wav
ffmpeg -i input.avi output.mp4	[container conversion; common containers: .avi .mp4 .ts .flv .rmvb .mkv]
ffmpeg -i 晓松奇谈.mp4 -acodec copy -vn output.aac	[extract the audio]
ffmpeg -i input.mp4 -vcodec copy -an output.mp4		[extract the video]
ffmpeg -ss 00:00:15 -t 00:00:05 -i input.mp4 -vcodec copy -acodec copy output.mp4	[cut video: -ss is the start time and -t the duration; starting at 00:00:15, take 5 seconds]
ffmpeg -i input.mp4 -b:v 2000k -bufsize 2000k -maxrate 2500k output.mp4	[bitrate control: -b:v sets the average bitrate and is used together with -bufsize; -maxrate and -minrate bound the fluctuation]
ffmpeg -i input.mp4 -vcodec h264 output.mp4		[transcode: convert MPEG-4 video to H.264]
ffmpeg -i input.mp4 -c:v libx265 output.mp4		[transcode: ffmpeg was compiled with the external x265]
ffmpeg -i input.mp4 -vcodec copy -an -f m4v output.h264	[extract the video elementary stream only]
ffmpeg -i 1.ts -vcodec copy -an -f rawvideo es.raw		[extract the video ES only, TS to ES]
ffmpeg -i 1.mp4 -vcodec copy -an -f rawvideo -vbsf h264_mp4toannexb es.raw	[extract the video ES only, MP4 to ES]
ffmpeg -i input.mp4 -vf scale=960:540 output.mp4	[filters: scale the 1920x1082 input down to 960x540]
./ffmpeg -i input.mp4 -i iQIYI_logo.png -filter_complex overlay output.mp4	[filters: overlay a logo on the video, top-left by default; -vf can replace -filter_complex here]
ffmpeg -i F:\VideoEncDec\ffmpeg\VideoSimple\wuxiannizhuan.mp4 -i C:\Users\10257818\Desktop\ZTE.png -filter_complex "[1:v][0:v]scale2ref=(W/H)*ih/8/sar:ih/8[wm][base];[base][wm]overlay=10:10" -pix_fmt yuv420p -c:a copy C:\Users\10257818\Desktop\filterZTEzishiying.mp4	[filters: scale the watermark relative to the video, then overlay it]
./ffmpeg -i input.mp4 -i logo.png -filter_complex overlay=W-w output.mp4	[filters: overlay a logo, top-right]
./ffmpeg -i input.mp4 -i logo.png -filter_complex overlay=0:H-h output.mp4	[filters: overlay a logo, bottom-left]
./ffmpeg -i input.mp4 -i logo.png -filter_complex overlay=W-w:H-h output.mp4	[filters: overlay a logo, bottom-right]
ffmpeg -i input.mp4 -vf delogo=0:0:220:90:100:1 output.mp4	[filters: remove a logo from the video]

Syntax: -vf delogo=x:y:w:h[:t[:show]]
x:y  coordinates of the logo's upper-left corner
w:h  width and height of the logo
t    thickness of the rectangle edge, default 4
show if set to 1, draws a green rectangle; default 0

ffmpeg -i input.mp4 -r 1 -q:v 2 -f image2 pic-%03d.jpeg				[grab frames from the video and save them as JPEG images]

Syntax: -r sets how many frames to grab per second
-q:v sets the stored JPEG quality; 2 generally means high quality

ffmpeg -i input.mp4 -ss 00:00:20 -t 10 -r 1 -q:v 2 -f image2 pic-%03d.jpeg	[grab frames as JPEG images, with a start time and duration]

Syntax: -ss is the start time
-t is the total duration

ffmpeg -i input.mp4 output.yuv	[output raw YUV420 data, playable with RawPlayer]
ffmpeg -i input.mp4 -ss 00:00:20 -t 10 -r 1 -q:v 2 -f image2 pic-%03d.jpeg    +     ffmpeg -i pic-001.jpeg -s 1440x1440 -pix_fmt yuv420p xxx3.yuv	[extract one frame of YUV data: first grab a JPEG frame from the video, then convert the JPEG to YUV]
ffmpeg -i input -vf "trim=start_frame=0:end_frame=1" out.yuv	[an alternative suggested in the comments]
ffmpeg -i input.mp4 -profile:v baseline -level 3.0 output.mp4	[control profile & level to suit different devices, balancing decoding capability against file size: baseline, main, high, extended]
ffmpeg -i input.mp4 -c:v libx264 -x264-params "profile=high:level=3.0" output.mp4	[control profile & level; ffmpeg was compiled with the external libx264]
ffmpeg -i input.mp4 -c:v libx265 -x265-params "profile=high:level=3.0" output.mp4	[H.265 (HEVC) profile & level control]
ffmpeg -list_devices true -f dshow -i dummy	[list the current audio/video devices]

Open a cmd console, change to FFmpeg's bin directory, and enter the following command:

ffmpeg -list_devices true -f dshow -i dummy  
[dshow @ 0000022eb0b2a540] DirectShow video devices (some may be both video and audio devices)
[dshow @ 0000022eb0b2a540]  "HD 720P Webcam"
[dshow @ 0000022eb0b2a540]     Alternative name "@device_pnp_\\?\usb#vid_0c45&pid_6340&mi_00#6&17bbfbbc&0&0000#{65e8773d-8f56-11d0-a3b9-00a0c9223196}\global"
[dshow @ 0000022eb0b2a540] DirectShow audio devices
[dshow @ 0000022eb0b2a540]  "楹﹀厠椋?(2- USB Microphone)"[dshow @ 0000022eb0b2a540]     Alternative name "@device_cm_{33D9A762-90C8-11D0-BD43-00A0C911CE86}\wave_{54896F77-1473-4AB0-8A17-109275DBF87B}"

Two devices are listed in the command-line window above: one video capture device and one audio capture device. Notice that the audio device's name is garbled because it contains Chinese characters; we will come back to this problem when we discuss capturing data through the API. Next, enter another command:

ffmpeg -list_options true -f dshow -i video="HD 720P Webcam"

This command asks the specified video capture device for the resolutions, frame rates, and pixel formats it supports, and returns a list:

[dshow @ 0000018e1c16b540]   pixel_format=yuyv422  min s=1184x656 fps=10 max s=1184x656 fps=10
[dshow @ 0000018e1c16b540]   pixel_format=yuyv422  min s=1184x656 fps=10 max s=1184x656 fps=10
[dshow @ 0000018e1c16b540]   vcodec=mjpeg  min s=1280x720 fps=15 max s=1280x720 fps=33
[dshow @ 0000018e1c16b540]   vcodec=mjpeg  min s=1280x720 fps=15 max s=1280x720 fps=33

Next, we run another command that saves the camera image and the microphone recording into a file:

ffmpeg -f dshow -i video="HD 720P Webcam" -f dshow -i audio="麦克风 (2- USB Microphone)" -vcodec libx264 -acodec aac -strict -2 mycamera.mkv

The above command line uses video= to specify the video device and audio= to specify the audio device; the parameters that follow set the encoders' formats and properties, and the output is a file named mycamera.mkv. While the command runs, the console prints FFmpeg's log; press the "Q" key to stop it. Some readers may ask: the capture device supports multiple resolutions, so how do we choose which one to capture at? The answer is the -s parameter: adding "-s 720x576" to the command line above makes FFmpeg capture at 720x576; if it is not set, the device's default resolution is used.
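The same capture can also be driven from the libavdevice API instead of the command line. Below is a minimal sketch (my own illustration, assuming the device name "HD 720P Webcam" listed above and a build with dshow support, such as the win64 dev package mentioned earlier) that opens the dshow video device and reads a few packets; a real program would go on to decode them as in the earlier demo:

#include <stdio.h>
#include "libavdevice/avdevice.h"
#include "libavformat/avformat.h"
#include "libavutil/dict.h"

int main(void)
{
    avdevice_register_all();                  /* register dshow and the other devices */

    AVInputFormat *dshow = av_find_input_format("dshow");
    AVFormatContext *ctx = NULL;
    AVDictionary *opts = NULL;

    /* equivalent of -s / -framerate on the command line */
    av_dict_set(&opts, "video_size", "1280x720", 0);
    av_dict_set(&opts, "framerate", "15", 0);

    if (avformat_open_input(&ctx, "video=HD 720P Webcam", dshow, &opts) < 0) {
        fprintf(stderr, "cannot open capture device\n");
        return 1;
    }

    AVPacket pkt;
    av_init_packet(&pkt);
    int i;
    for (i = 0; i < 100; i++) {               /* grab 100 packets, then stop */
        if (av_read_frame(ctx, &pkt) < 0)
            break;
        printf("packet %d: %d bytes\n", i, pkt.size);
        av_packet_unref(&pkt);
    }

    avformat_close_input(&ctx);
    av_dict_free(&opts);
    return 0;
}

Passing the device options through an AVDictionary mirrors the CLI flags one-to-one, which makes it easy to translate any of the commands above into API calls.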

Origin blog.csdn.net/qq_38750519/article/details/119940016