Compilation and use of ffmpeg-Android

I won't go into detail here about what ffmpeg can do. Almost anything related to audio and video is possible; if you can think of it, ffmpeg can probably do it.


Compiler Environment:

Ubuntu 16.04, 64-bit

ffmpeg-3.0.5

fdkaac-0.1.5

libx264-20161204 

android-ndk-r10d

Eclipse Mars


ffmpeg download address: http://www.ffmpeg.org/download.html

fdkaac download address: https://sourceforge.net/projects/opencore-amr/files/fdk-aac/

libx264 download address: http://www.videolan.org/developers/x264.html


Compiling the libraries (on Ubuntu)

1. Go to the above website to download the source code of ffmpeg, fdkaac and libx264 respectively;

(1) After downloading, extract all three into the same folder. Mine is named ffmpeg_x264_fdkaac, which will make the build scripts below easier to follow;
(2) Create a subfolder named out under ffmpeg_x264_fdkaac to hold the compiled libraries.

2. Download the NDK and put it in a directory of your choice;


3. Compile fdkaac and libx264 first, then compile ffmpeg. The scripts I use are pasted below;

In principle you can store the scripts wherever you like. But if you want to make minimal changes to mine, create a new folder under ffmpeg_x264_fdkaac and put all three scripts in it, because each script runs "cd .." to find its source directory before configure.

4. Changes you need to make to the script:

(1) The NDK path in each script. Mine is export NDK=/home/sandy/android-ndk-r10d; change it to yours;
(2) For the fdkaac and libx264 scripts, if your folder layout matches mine you only need to change the NDK path. After compiling, the built libraries appear under the out directory;
(3) For the ffmpeg script, adjust configure to your actual needs. The options are easy to read; for example, --enable-encoder=libx264 turns on the H.264 encoder. For the rest, run ./configure --help in the ffmpeg directory and read carefully which options you need;
(4) Each script exports PREFIX=/home/sandy/ffmpeg_x264_fdkaac/out/..., and configure is given --prefix=$PREFIX. This declares where the compiled library is installed; if you don't declare it, the library is installed under the system root, so setting it makes the output easy to find later.

build_faac.sh
export NDK=/home/sandy/android-ndk-r10d
export ANDROID_ROOT=$NDK/platforms/android-14/arch-arm
export ANDROID_BIN=$NDK/toolchains/arm-linux-androideabi-4.8/prebuilt/linux-x86_64/bin
export PREFIX=/home/sandy/ffmpeg_x264_fdkaac/out/fdkaac

export CFLAGS="-DANDROID -fPIC -ffunction-sections -funwind-tables -fstack-protector -march=armv7-a -mfloat-abi=softfp -mfpu=vfpv3-d16 -fomit-frame-pointer -fstrict-aliasing -funswitch-loops -finline-limit=300"

export LDFLAGS="-Wl,--fix-cortex-a8"
export CC="arm-linux-androideabi-gcc --sysroot=$ANDROID_ROOT"
export CXX="arm-linux-androideabi-g++ --sysroot=$ANDROID_ROOT"

export PATH=$ANDROID_BIN:$PATH

cd ../fdk-aac-0.1.5
./configure --host=arm-linux-androideabi  --with-sysroot="$ANDROID_ROOT" --enable-static --disable-shared --prefix=$PREFIX
make clean
make -j4
make install

build_x264.sh
export NDK=/home/sandy/android-ndk-r10d
export PREBUILT=$NDK/toolchains/arm-linux-androideabi-4.8/prebuilt
export PLATFORM=$NDK/platforms/android-14/arch-arm
export PREFIX=/home/sandy/ffmpeg_x264_fdkaac/out/x264
cd ../x264
./configure --prefix=$PREFIX \
--enable-static \
--disable-shared \
--enable-pic \
--disable-asm \
--disable-cli \
--host=arm-linux \
--cross-prefix=$PREBUILT/linux-x86_64/bin/arm-linux-androideabi- \
--sysroot=$PLATFORM
make
make install


build_ffmpeg.sh
#!/bin/bash
export NDK=/home/sandy/android-ndk-r10d
SYSROOT=$NDK/platforms/android-14/arch-arm/
TOOLCHAIN=$NDK/toolchains/arm-linux-androideabi-4.8/prebuilt/linux-x86_64
CPU=arm 
PREFIX=/home/sandy/ffmpeg_x264_fdkaac/out/ffmpeg
ADDI_CFLAGS="-marm"
export outfaac=/home/sandy/ffmpeg_x264_fdkaac/out/fdkaac
export outx264=/home/sandy/ffmpeg_x264_fdkaac/out/x264


function build_one    
{    
cd ../ffmpeg-3.0.5
./configure \
--enable-nonfree \
--enable-version3 \
--prefix=$PREFIX \
--enable-static \
--enable-cross-compile \
--enable-gpl \
--disable-shared \
--disable-doc \
--disable-ffserver \
--disable-ffprobe \
--disable-devices \
--disable-avdevice \
--disable-encoders \
--disable-decoders \
--disable-protocols \
--disable-muxers \
--disable-demuxers \
--disable-bsfs \
--disable-network \
--enable-libx264 \
--enable-encoder=libx264 \
--enable-libfdk_aac \
--enable-decoder=pcm_alaw \
--enable-encoder=pcm_alaw \
--enable-decoder=pcm_mulaw \
--enable-encoder=pcm_mulaw \
--enable-decoder=h264 \
--enable-encoder=aac \
--enable-decoder=aac \
--enable-protocol=file \
--enable-protocol=rtsp \
--enable-muxer=mp4 \
--enable-muxer=mov \
--enable-demuxer=mp4 \
--enable-demuxer=mov \
--enable-demuxer=flv \
--enable-demuxer=avi \
--enable-bsf=h264_mp4toannexb \
--enable-bsf=aac_adtstoasc \
--cross-prefix=$TOOLCHAIN/bin/arm-linux-androideabi- \
--target-os=linux \
--arch=arm \
--sysroot=$SYSROOT \
--extra-cflags="-I$outx264/include -I$outfaac/include -fPIC -DANDROID -D__thumb__ -mthumb -Wfatal-errors -Wno-deprecated -mfloat-abi=softfp -mfpu=vfpv3-d16 -marm -march=armv7-a" \
--extra-ldflags="-L$outx264/lib -L$outfaac/lib" \
$ADDITIONAL_CONFIGURE_FLAG
}
  
build_one
make clean
make -j4
make install


Using the library


Here I take usage in a new project as an example.

1. After creating the project, right-click the project name and choose Android Tools -> Add Native Support, enter the name for the .so library, and confirm. The project then gains a jni folder containing an Android.mk file and a .cpp file named after the .so library;

2. Import the libraries. Under the jni folder create two folders, include and prebuilt (the names are up to you): one holds the header files, the other the static libraries. Copy the headers and static libraries you just compiled into them;

3. Edit the Java file. Create a new class; taking H.264 encoding as the example, the code follows;
public class FFmpeg {
	
	static {
		try {
			System.loadLibrary("ckffmpegutil");
		} catch (Exception e) {
			// TODO: handle exception
			Log.d("ckdebug", "can not load ckffmpegutil");
		}
	}
	
	public static native int InitH264Encoder(int width, int height, int framerate);
	public static native void ReleaseH264Encoder();

}


4. Use the javah command to generate the header file. Steps: cd into the project directory, then run javah -classpath bin/classes -d jni com.example.ckffmpeg.FFmpeg (the fully qualified name of the Java class created above). The header com_example_ckffmpeg_FFmpeg.h is generated in the jni folder. Then create the matching source file com_example_ckffmpeg_FFmpeg.c (delete the generated .cpp file) and edit it. The key parts of the ffmpeg encoding code are pasted below;
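For reference, the generated header looks roughly like the sketch below (declarations only; the exact file is produced by javah, and the symbol names assume the package com.example.ckffmpeg and class FFmpeg from step 3):

```c
/* com_example_ckffmpeg_FFmpeg.h -- approximately what javah generates
 * for the FFmpeg class above; the function names are derived from the
 * fully qualified Java class name. */
#include <jni.h>

#ifdef __cplusplus
extern "C" {
#endif

/* Java: public static native int InitH264Encoder(int width, int height, int framerate); */
JNIEXPORT jint JNICALL Java_com_example_ckffmpeg_FFmpeg_InitH264Encoder
  (JNIEnv *, jclass, jint, jint, jint);

/* Java: public static native void ReleaseH264Encoder(); */
JNIEXPORT void JNICALL Java_com_example_ckffmpeg_FFmpeg_ReleaseH264Encoder
  (JNIEnv *, jclass);

#ifdef __cplusplus
}
#endif
```

Your com_example_ckffmpeg_FFmpeg.c then implements these two functions by calling the encoder code shown next.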

#include <stdio.h>

#include "H264Encoder.h"
#include "com_example_ckffmpeg_FFmpeg.h"

#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>

AVCodecContext *h264_enc_ctx;
AVFrame *h264_enc_frame;
unsigned char I420Buffer[1280 * 720 * 3]; // scratch buffer, sized generously (YUV420 actually needs width*height*3/2)
long long frameNumber;
int frame_rate;

void yv12_2_yuv420p(unsigned char *yv12_buf, unsigned char *I420_buf, int width, int height)
{
	unsigned int wh = width*height;

	// copy the Y plane
	int i = 0, j = 0;
	for (i=0; i<height; ++i)
	{
		for (j=0; j<width; ++j)
		{
			I420_buf[i*width + j] = yv12_buf[i*width + j];
		}
	}

	// swap the chroma planes: YV12 stores V then U, I420 stores U then V.
	// Each chroma plane holds width*height/4 bytes, walked here as height/4 rows of full width.
	int half_height = height/4;
	for (i=0; i<half_height; i++)
	{
		for (j=0; j<width; j++)
		{
			I420_buf[wh + i*width + j] = yv12_buf[wh + half_height*width + i*width + j];// copy the U plane
			I420_buf[wh + half_height*width + i*width + j] = yv12_buf[wh + i*width + j];// copy the V plane
		}
	}
		}
	}
}

/* print the pixel formats supported by the encoder */
void check_pixel_fmt(AVCodec *codec)
{
    const enum AVPixelFormat *p = codec->pix_fmts;
    // the list is terminated by AV_PIX_FMT_NONE (-1)
    while (*p != AV_PIX_FMT_NONE) {
        LOGD("supported pixel format: %d\n", *p);
        p++;
    }
}
}

int initH264Encoder(int width, int height, long bitrate, int framerate)
{
	av_register_all();
	avcodec_register_all();

	AVCodec *pCodec = avcodec_find_encoder(AV_CODEC_ID_H264);
	if (pCodec == NULL) {
		LOGD("could not find H264 encoder!");
		return -1;
	}

	if (h264_enc_ctx) {
		avcodec_free_context(&h264_enc_ctx);
	}

	if (h264_enc_frame) {
		av_frame_free(&h264_enc_frame);
	}

	h264_enc_ctx = avcodec_alloc_context3(pCodec);
	if (h264_enc_ctx == NULL) {
		LOGD("could not init H264 encode context!");
		return -2;
	}

	h264_enc_ctx->bit_rate = bitrate;
	h264_enc_ctx->time_base = (AVRational){1, framerate};
	h264_enc_ctx->width = width;
	h264_enc_ctx->height = height;
	h264_enc_ctx->gop_size = framerate; // one keyframe per second
	h264_enc_ctx->pix_fmt = AV_PIX_FMT_YUV420P;
	h264_enc_ctx->max_b_frames = 0;
	h264_enc_ctx->color_range = AVCOL_RANGE_MPEG;
	h264_enc_ctx->qmin = 10;
	h264_enc_ctx->qmax = 51;

	AVDictionary *options = NULL;
	av_dict_set(&options, "preset", "fast", 0);// encoding speed preset; a slower preset gives better quality
	av_dict_set(&options, "tune", "zerolatency", 0);// zero-latency tuning for streaming

	if (avcodec_open2(h264_enc_ctx, pCodec, &options) < 0) {
		LOGD("could not open H264 encoder!\n");
		avcodec_free_context(&h264_enc_ctx);
		return -3;
	}
	av_dict_free(&options);

	h264_enc_frame = av_frame_alloc();
	if (h264_enc_frame == NULL) {
		LOGD("could not allocate video frame!\n");
		avcodec_free_context(&h264_enc_ctx);
		return -4;
	}

	int ret = av_image_fill_arrays(h264_enc_frame->data, h264_enc_frame->linesize, NULL, AV_PIX_FMT_YUV420P, width, height, 1);
	if (ret < 0) {
		LOGD("could not fill video frame array\n");
		releaseH264Encoder();
		return -5;
	}
	LOGD("required video frame src bytes: %d\n", ret);

	check_pixel_fmt(pCodec);
	frameNumber = 0;
	frame_rate = framerate;
	return 0;
}

void releaseH264Encoder()
{
	if (h264_enc_ctx) {
		avcodec_free_context(&h264_enc_ctx);
	}

	if (h264_enc_frame) {
		av_frame_free(&h264_enc_frame);
	}
}

int H264Encode(unsigned char *inbuf, unsigned char *outbuf)
{
	if (!h264_enc_ctx || !h264_enc_frame) {
		LOGD("the H.264 encoder has not been inited!");
		return -1;
	}

	AVPacket pkt = {0};
	av_init_packet(&pkt);
	pkt.data = NULL;
	pkt.size = 0;

	h264_enc_frame->width = h264_enc_ctx->width;
	h264_enc_frame->height = h264_enc_ctx->height;
	h264_enc_frame->format = h264_enc_ctx->pix_fmt;
	h264_enc_frame->color_range = h264_enc_ctx->color_range;
	h264_enc_frame->pts = frameNumber++; // pts in time_base units (time_base = 1/framerate), i.e. the frame index

	memset(I420Buffer, 0, 1280*720*3);
	yv12_2_yuv420p(inbuf, I420Buffer, h264_enc_ctx->width, h264_enc_ctx->height);

	h264_enc_frame->data[0] = I420Buffer;
	h264_enc_frame->data[1] = I420Buffer + h264_enc_ctx->width * h264_enc_ctx->height;
	h264_enc_frame->data[2] = I420Buffer + h264_enc_ctx->width * h264_enc_ctx->height * 5/4;

	int got_pic;
	int ret = avcodec_encode_video2(h264_enc_ctx, &pkt, h264_enc_frame, &got_pic);
	if (ret < 0) {
		LOGD("encode video failed!");
		av_packet_unref(&pkt);
		return -2;
	}

	if (got_pic) {
		memcpy(outbuf, pkt.data, pkt.size);
		ret = pkt.size;
		av_packet_unref(&pkt);
		return ret;
	}

	LOGD("encode succeeded but no packet was produced yet");
	return 0;
}


5. Edit the Android.mk file

LOCAL_PATH := $(call my-dir)

include $(CLEAR_VARS)


LOCAL_C_INCLUDES +=	$(LOCAL_PATH)/include
LOCAL_MODULE    := libckffmpegutil
LOCAL_SRC_FILES := com_example_ckffmpeg_FFmpeg.c

LOCAL_LDLIBS := -llog -ljnigraphics -lz -landroid -lm -pthread

# link the prebuilt static libraries; keep libx264 and libfdk-aac after libavcodec, which references them
LOCAL_LDFLAGS+=$(LOCAL_PATH)/prebuilt/libavformat.a
LOCAL_LDFLAGS+=$(LOCAL_PATH)/prebuilt/libavfilter.a
LOCAL_LDFLAGS+=$(LOCAL_PATH)/prebuilt/libavcodec.a
LOCAL_LDFLAGS+=$(LOCAL_PATH)/prebuilt/libswscale.a
LOCAL_LDFLAGS+=$(LOCAL_PATH)/prebuilt/libavutil.a
LOCAL_LDFLAGS+=$(LOCAL_PATH)/prebuilt/libswresample.a
LOCAL_LDFLAGS+=$(LOCAL_PATH)/prebuilt/libpostproc.a
LOCAL_LDFLAGS+=$(LOCAL_PATH)/prebuilt/libx264.a
LOCAL_LDFLAGS+=$(LOCAL_PATH)/prebuilt/libfdk-aac.a

include $(BUILD_SHARED_LIBRARY)
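The static libraries above were built for armeabi-v7a against android-14, so the NDK build should target the same ABI and platform. A minimal jni/Application.mk to go with the Android.mk above (not part of the original post; the values simply mirror the build scripts) might look like:

```makefile
# jni/Application.mk -- match the ABI and platform the static libs were built for
APP_ABI := armeabi-v7a
APP_PLATFORM := android-14
APP_OPTIM := release
```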



PS: In actual use I found the H.264 encoding posted here to be rather inefficient: encoding one 640x480 frame takes nearly 200 ms. The project no longer needed it, so I didn't improve it further. Audio/video encoding and decoding is deep water; it's not something you get good at in just a year or two...


This post is mainly a record of the ffmpeg build process; if it's useful to you, feel free to give it a like :P
