There is no need to repeat what FFmpeg can do: it handles almost anything related to audio and video; the only real limit is what you can think of.
Build environment:
Ubuntu 16.04 64-bit
ffmpeg-3.0.5
fdk-aac-0.1.5
libx264-20161204
android-ndk-r10d
Eclipse Mars
ffmpeg download: http://www.ffmpeg.org/download.html
fdk-aac download: https://sourceforge.net/projects/opencore-amr/files/fdk-aac/
libx264 download: http://www.videolan.org/developers/x264.html
Building the libraries (on Ubuntu)
1. Download the ffmpeg, fdk-aac, and libx264 sources from the URLs above. After extracting, put them all in one folder (mine is named ffmpeg_x264_fdkaac), which will make the build scripts below easier to follow;
2. Download the NDK and put it in a directory of your choice;
3. Build fdk-aac and libx264 first (in either order), then build ffmpeg. The scripts I used are given below;
4. Changes you need to make to the scripts: (1) the NDK path in each script. Mine is export NDK=/home/sandy/android-ndk-r10d; change it to yours;
(2) the fdk-aac and libx264 scripts: if your folder layout matches mine, nothing besides the NDK path needs changing. After the build you will find the compiled libraries in the out folder;
(3) the ffmpeg script: adapt the configure options to your own needs. They are easy to read; for example, --enable-encoder=libx264 turns on the H.264 encoder. For the rest, run ./configure --help in the ffmpeg directory and study the available options carefully;
(4) each script runs export PREFIX=/home/sandy/ffmpeg_x264_fdkaac/out/... and passes --prefix=$PREFIX to configure. This sets where the compiled libraries are installed; without it they would end up under the system root, so setting it makes them easy to find later.
```shell
export NDK=/home/sandy/android-ndk-r10d
export ANDROID_ROOT=$NDK/platforms/android-14/arch-arm
export ANDROID_BIN=$NDK/toolchains/arm-linux-androideabi-4.8/prebuilt/linux-x86_64/bin
export PREFIX=/home/sandy/ffmpeg_x264_fdkaac/out/fdkaac
export CFLAGS="-DANDROID -fPIC -ffunction-sections -funwind-tables -fstack-protector -march=armv7-a -mfloat-abi=softfp -mfpu=vfpv3-d16 -fomit-frame-pointer -fstrict-aliasing -funswitch-loops -finline-limit=300"
export LDFLAGS="-Wl,--fix-cortex-a8"
export CC="arm-linux-androideabi-gcc --sysroot=$ANDROID_ROOT"
export CXX="arm-linux-androideabi-g++ --sysroot=$ANDROID_ROOT"
export PATH=$ANDROID_BIN:$PATH
cd ../fdk-aac-0.1.5
./configure --host=arm-linux-androideabi --with-sysroot="$ANDROID_ROOT" \
    --enable-static --disable-shared --prefix=$PREFIX
make clean
make -j4
make install
```
```shell
export NDK=/home/sandy/android-ndk-r10d
export PREBUILT=$NDK/toolchains/arm-linux-androideabi-4.8/prebuilt
export PLATFORM=$NDK/platforms/android-14/arch-arm
export PREFIX=/home/sandy/ffmpeg_x264_fdkaac/out/x264
cd ../x264
./configure --prefix=$PREFIX \
    --enable-static \
    --disable-shared \
    --enable-pic \
    --disable-asm \
    --disable-cli \
    --host=arm-linux \
    --cross-prefix=$PREBUILT/linux-x86_64/bin/arm-linux-androideabi- \
    --sysroot=$PLATFORM
make
make install
```
```shell
#!/bin/bash
export NDK=/home/sandy/android-ndk-r10d
SYSROOT=$NDK/platforms/android-14/arch-arm/
TOOLCHAIN=$NDK/toolchains/arm-linux-androideabi-4.8/prebuilt/linux-x86_64
CPU=arm
PREFIX=/home/sandy/ffmpeg_x264_fdkaac/out/ffmpeg
ADDI_CFLAGS="-marm"
export outfaac=/home/sandy/ffmpeg_x264_fdkaac/out/fdkaac
export outx264=/home/sandy/ffmpeg_x264_fdkaac/out/x264
function build_one
{
cd ../ffmpeg-3.0.5
./configure \
    --enable-nonfree \
    --enable-version3 \
    --prefix=$PREFIX \
    --enable-static \
    --enable-cross-compile \
    --enable-gpl \
    --disable-shared \
    --disable-doc \
    --disable-ffserver \
    --disable-ffprobe \
    --disable-devices \
    --disable-avdevice \
    --disable-encoders \
    --disable-decoders \
    --disable-protocols \
    --disable-muxers \
    --disable-demuxers \
    --disable-bsfs \
    --disable-network \
    --enable-libx264 \
    --enable-encoder=libx264 \
    --enable-libfdk_aac \
    --enable-decoder=pcm_alaw \
    --enable-encoder=pcm_alaw \
    --enable-decoder=pcm_mulaw \
    --enable-encoder=pcm_mulaw \
    --enable-decoder=h264 \
    --enable-encoder=aac \
    --enable-decoder=aac \
    --enable-protocol=file \
    --enable-protocol=rtsp \
    --enable-muxer=mp4 \
    --enable-muxer=mov \
    --enable-demuxer=mp4 \
    --enable-demuxer=mov \
    --enable-demuxer=flv \
    --enable-demuxer=avi \
    --enable-bsf=h264_mp4toannexb \
    --enable-bsf=aac_adtstoasc \
    --cross-prefix=$TOOLCHAIN/bin/arm-linux-androideabi- \
    --target-os=linux \
    --arch=arm \
    --sysroot=$SYSROOT \
    --extra-cflags="-I$outx264/include -I$outfaac/include -fPIC -DANDROID -D__thumb__ -mthumb -Wfatal-errors -Wno-deprecated -mfloat-abi=softfp -mfpu=vfpv3-d16 -marm -march=armv7-a" \
    --extra-ldflags="-L$outx264/lib -L$outfaac/lib" \
    $ADDITIONAL_CONFIGURE_FLAG
}
build_one
make clean
make -j4
make install
```
Using the libraries
1. After creating the project, right-click the project name and choose Android Tools -> Add Native Support, enter the name for the .so library, and confirm. The project now gains a jni folder containing an Android.mk file and a .cpp file named after the library;
2. Import the libraries. Under the jni folder create two folders, include and prebuilt (the names are up to you): one for the header files and one for the static libraries. Copy the headers and static libraries you just built into them;
3. Write the Java side. Create a new class; taking H.264 encoding as the example, here is the code;
```java
import android.util.Log;

public class FFmpeg {
    static {
        try {
            System.loadLibrary("ckffmpegutil");
        } catch (UnsatisfiedLinkError e) {
            // loadLibrary throws an Error, not an Exception, on failure
            Log.d("ckdebug", "can not load ckffmpegutil");
        }
    }

    public static native int InitH264Encoder(int width, int height, int framerate);

    public static native void ReleaseH264Encoder();
}
```
4. Use the javah command to generate the header file. Concretely: cd into the project directory, then run javah -classpath bin/classes -d jni com.example.ckffmpeg.FFmpeg (the fully qualified name of the Java class created above). This generates com_example_ckffmpeg_FFmpeg.h under the jni folder. Then create the matching source file com_example_ckffmpeg_FFmpeg.c (delete the generated .cpp file) and edit it; the key parts of the FFmpeg encoding code are included below;
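The article does not show the JNI glue itself, only the encoder internals, so here is a hedged sketch of what the wrapper functions in com_example_ckffmpeg_FFmpeg.c might look like. It assumes the encoder helpers below (initH264Encoder, releaseH264Encoder) live in the same translation unit; the function names must match exactly what javah generated, and the bitrate value is a made-up example, since the Java API above does not expose one.

```c
/* Hypothetical JNI glue: a sketch only, not from the original article. */
#include <jni.h>

/* forward declarations of the encoder helpers shown below */
int initH264Encoder(int width, int height, long bitrate, int framerate);
void releaseH264Encoder(void);

JNIEXPORT jint JNICALL Java_com_example_ckffmpeg_FFmpeg_InitH264Encoder
        (JNIEnv *env, jclass clazz, jint width, jint height, jint framerate)
{
    /* 2 Mbps is an arbitrary example bitrate, chosen for illustration */
    return initH264Encoder(width, height, 2000000, framerate);
}

JNIEXPORT void JNICALL Java_com_example_ckffmpeg_FFmpeg_ReleaseH264Encoder
        (JNIEnv *env, jclass clazz)
{
    releaseH264Encoder();
}
```

Since the Java methods are static, the second parameter is a jclass rather than a jobject; the Java_<package>_<class>_<method> naming follows the standard JNI name-mangling rules.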
```c
#include <stdio.h>
#include <string.h>
#include <android/log.h>
#include "H264Encoder.h"
#include "com_example_ckffmpeg_FFmpeg.h"
#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>
#include <libavutil/imgutils.h>

#define LOGD(...) __android_log_print(ANDROID_LOG_DEBUG, "ckdebug", __VA_ARGS__)

AVCodecContext *h264_enc_ctx;
AVFrame *h264_enc_frame;
unsigned char I420Buffer[1280 * 720 * 3];
long long frameNumber;
int frame_rate;

void yv12_2_yuv420p(unsigned char *yv12_buf, unsigned char *I420_buf, int width, int height)
{
    unsigned int wh = width * height;
    // copy the Y plane
    int i = 0, j = 0;
    for (i = 0; i < height; ++i) {
        for (j = 0; j < width; ++j) {
            I420_buf[i * width + j] = yv12_buf[i * width + j];
        }
    }
    // swap the chroma planes: YV12 stores V before U, I420 stores U before V.
    // Each chroma plane holds width*height/4 bytes, addressed here as
    // width columns by height/4 rows.
    int quarter_height = height / 4;
    for (i = 0; i < quarter_height; i++) {
        for (j = 0; j < width; j++) {
            I420_buf[wh + i * width + j] = yv12_buf[wh + quarter_height * width + i * width + j]; // U plane
            I420_buf[wh + quarter_height * width + i * width + j] = yv12_buf[wh + i * width + j]; // V plane
        }
    }
}

/* print the pixel formats the encoder supports */
void check_pixel_fmt(AVCodec *codec)
{
    const enum AVPixelFormat *p = codec->pix_fmts;
    while (*p != AV_PIX_FMT_NONE) {
        LOGD("supported pixel format: %d\n", *p);
        p++;
    }
}

int initH264Encoder(int width, int height, long bitrate, int framerate)
{
    av_register_all();
    avcodec_register_all();
    AVCodec *pCodec = avcodec_find_encoder(AV_CODEC_ID_H264);
    if (pCodec == NULL) {
        LOGD("could not find H264 encoder!");
        return -1;
    }
    if (h264_enc_ctx) {
        avcodec_free_context(&h264_enc_ctx);
    }
    if (h264_enc_frame) {
        av_frame_free(&h264_enc_frame);
    }
    h264_enc_ctx = avcodec_alloc_context3(pCodec);
    if (h264_enc_ctx == NULL) {
        LOGD("could not init H264 encode context!");
        return -2;
    }
    h264_enc_ctx->bit_rate = bitrate;
    h264_enc_ctx->time_base = (AVRational){1, framerate};
    h264_enc_ctx->width = width;
    h264_enc_ctx->height = height;
    h264_enc_ctx->gop_size = framerate;   // one keyframe per second
    h264_enc_ctx->pix_fmt = AV_PIX_FMT_YUV420P;
    h264_enc_ctx->max_b_frames = 0;
    h264_enc_ctx->color_range = AVCOL_RANGE_MPEG;
    h264_enc_ctx->qmin = 10;
    h264_enc_ctx->qmax = 51;

    AVDictionary *options = NULL;
    av_dict_set(&options, "preset", "fast", 0);       // "fast" trades quality for speed; "slow" looks better
    av_dict_set(&options, "tune", "zerolatency", 0);  // zero latency
    if (avcodec_open2(h264_enc_ctx, pCodec, &options) < 0) {
        LOGD("could not open H264 encoder!\n");
        avcodec_free_context(&h264_enc_ctx);
        return -3;
    }
    av_dict_free(&options);

    h264_enc_frame = av_frame_alloc();
    if (h264_enc_frame == NULL) {
        LOGD("could not allocate video frame!\n");
        avcodec_free_context(&h264_enc_ctx);
        return -4;
    }
    int ret = av_image_fill_arrays(h264_enc_frame->data, h264_enc_frame->linesize, NULL,
                                   AV_PIX_FMT_YUV420P, width, height, 1);
    if (ret < 0) {
        LOGD("could not fill video frame array\n");
        releaseH264Encoder();
        return -5;
    }
    LOGD("required video frame src bytes: %d\n", ret);
    check_pixel_fmt(pCodec);
    frameNumber = 0;
    frame_rate = framerate;
    return 0;
}

void releaseH264Encoder()
{
    if (h264_enc_ctx) {
        avcodec_free_context(&h264_enc_ctx);
    }
    if (h264_enc_frame) {
        av_frame_free(&h264_enc_frame);
    }
}

int H264Encode(unsigned char *inbuf, unsigned char *outbuf)
{
    if (!h264_enc_ctx || !h264_enc_frame) {
        LOGD("the H.264 encoder has not been inited!");
        return -1;
    }
    AVPacket pkt = {0};
    av_init_packet(&pkt);
    pkt.data = NULL;
    pkt.size = 0;
    h264_enc_frame->width = h264_enc_ctx->width;
    h264_enc_frame->height = h264_enc_ctx->height;
    h264_enc_frame->format = h264_enc_ctx->pix_fmt;
    h264_enc_frame->color_range = h264_enc_ctx->color_range;
    // pts is counted in time_base units; with time_base = 1/framerate
    // it simply advances by one per frame
    h264_enc_frame->pts = frameNumber++;
    memset(I420Buffer, 0, sizeof(I420Buffer));
    yv12_2_yuv420p(inbuf, I420Buffer, h264_enc_ctx->width, h264_enc_ctx->height);
    h264_enc_frame->data[0] = I420Buffer;
    h264_enc_frame->data[1] = I420Buffer + h264_enc_ctx->width * h264_enc_ctx->height;
    h264_enc_frame->data[2] = I420Buffer + h264_enc_ctx->width * h264_enc_ctx->height * 5 / 4;
    int got_pic;
    int ret = avcodec_encode_video2(h264_enc_ctx, &pkt, h264_enc_frame, &got_pic);
    if (ret < 0) {
        LOGD("encode video failed!");
        av_packet_unref(&pkt);
        return -2;
    }
    if (got_pic) {
        memcpy(outbuf, pkt.data, pkt.size);
        ret = pkt.size;
        av_packet_unref(&pkt);
        return ret;
    }
    LOGD("encode success but has no picture");
    return 0;
}
```
5. Edit the Android.mk file
```makefile
LOCAL_PATH := $(call my-dir)

include $(CLEAR_VARS)
LOCAL_C_INCLUDES += $(LOCAL_PATH)/include
LOCAL_MODULE := libckffmpegutil
LOCAL_SRC_FILES := com_example_ckffmpeg_FFmpeg.c
LOCAL_LDLIBS := -llog -ljnigraphics -lz -landroid -lm -pthread
LOCAL_LDFLAGS += $(LOCAL_PATH)/prebuilt/libavformat.a
LOCAL_LDFLAGS += $(LOCAL_PATH)/prebuilt/libavfilter.a
LOCAL_LDFLAGS += $(LOCAL_PATH)/prebuilt/libavcodec.a
LOCAL_LDFLAGS += $(LOCAL_PATH)/prebuilt/libswscale.a
LOCAL_LDFLAGS += $(LOCAL_PATH)/prebuilt/libavutil.a
LOCAL_LDFLAGS += $(LOCAL_PATH)/prebuilt/libswresample.a
LOCAL_LDFLAGS += $(LOCAL_PATH)/prebuilt/libpostproc.a
LOCAL_LDFLAGS += $(LOCAL_PATH)/prebuilt/libx264.a
LOCAL_LDFLAGS += $(LOCAL_PATH)/prebuilt/libfdk-aac.a
include $(BUILD_SHARED_LIBRARY)
```
PS: in practice I found the H.264 encoding shown here rather slow: encoding a single 640x480 frame took close to 200 ms. The project no longer needed it, so I never got around to improving it. Audio and video codecs are deep water; it takes more than a year or two to get anywhere...
This post is mainly a record of how to build FFmpeg; if it helps you, feel free to give it a like.