Android MediaCodec: Google's official demos

Official source

https://github.com/PhilLab/Android-MediaCodec-Examples/blob/master/ExtractMpegFramesTest.java

Overview article on Juejin

https://juejin.im/entry/586289ea1b69e6006cea8f41

https://bigflake.com/mediacodec/

Android MediaCodec encoding/decoding explained, with demos

2016-12-27

Original article: www.jianshu.com

Original source

Android MediaCodec stuff

This article is about the MediaCodec family of classes, which is used to encode and decode audio and video data. It includes a collection of source-code examples and answers to frequently asked questions.
As of API 23 the official documentation is already quite thorough. The information here should help you understand codec concepts. For compatibility, most of the code here runs on API 18 and above; if you are targeting Lollipop and later you have more options, which are not covered here.

Overview

MediaCodec first became available in Android 4.1 (API 16). It was added to provide direct access to the device's media codecs, and it exposes a fairly raw interface. The MediaCodec class exists in both the Java and C++ layers, but only the former is public API.
In Android 4.3 (API 18), MediaCodec was extended to accept input through a Surface (via the createInputSurface method), which allows the input to come from a camera preview or from OpenGL ES rendering. Android 4.3 was also the first release in which MediaCodec was covered by CTS tests (the Compatibility Test Suite is Google's device-compatibility test suite, intended to guarantee a consistent user experience across devices; Google also publishes a compatibility definition document, the CDD).
Android 4.3 also introduced MediaMuxer, which allows the output of an AVC codec (a raw H.264 elementary stream) to be converted to .MP4 format, either together with an audio stream or on its own.
Android 5.0 (API 21) introduced an "asynchronous mode" that lets the application supply a callback which runs as buffers become available. None of the code linked from this article uses it, because compatibility is kept at API 18+; a minimal sketch is shown below for reference.
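
For reference, here is a minimal sketch of that asynchronous mode (API 21+); it is not used by any of the samples in this article. The helpers fillAccessUnit() and nextPtsUs() are hypothetical placeholders for your own input handling, and "format"/"surface" are supplied by the caller.

  // Minimal sketch of MediaCodec asynchronous mode (API 21+). fillAccessUnit()
  // and nextPtsUs() are hypothetical helpers, not real framework APIs.
  void startAsyncDecoder(MediaFormat format, Surface surface) throws IOException {
      MediaCodec codec = MediaCodec.createDecoderByType(format.getString(MediaFormat.KEY_MIME));
      codec.setCallback(new MediaCodec.Callback() {
          @Override
          public void onInputBufferAvailable(MediaCodec mc, int index) {
              ByteBuffer input = mc.getInputBuffer(index);
              int size = fillAccessUnit(input);          // copy one access unit in
              int flags = (size < 0) ? MediaCodec.BUFFER_FLAG_END_OF_STREAM : 0;
              mc.queueInputBuffer(index, 0, Math.max(size, 0), nextPtsUs(), flags);
          }
          @Override
          public void onOutputBufferAvailable(MediaCodec mc, int index, MediaCodec.BufferInfo info) {
              mc.releaseOutputBuffer(index, true);       // render to the configured Surface
          }
          @Override
          public void onOutputFormatChanged(MediaCodec mc, MediaFormat newFormat) { }
          @Override
          public void onError(MediaCodec mc, MediaCodec.CodecException e) { }
      });
      codec.configure(format, surface, null, 0);         // setCallback() goes before configure()
      codec.start();
  }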

Basic usage

All uses of the synchronous MediaCodec API follow the same basic pattern (a minimal sketch follows the list below):

  • Create and configure a MediaCodec object
  • Loop until done:
    If an input buffer is ready, read a chunk of input and copy it into the input buffer
    If an output buffer is ready, copy the data out of the output buffer
  • Release the MediaCodec object
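
A minimal sketch of that loop, using the synchronous API and the pre-API-21 buffer arrays, might look like the following. The helpers readAccessUnit(), consumeOutput() and nextPtsUs() are hypothetical placeholders for your own I/O, and the MediaFormat is assumed to come from MediaExtractor.

  // Sketch of the synchronous decode loop (API 16+ buffer arrays).
  void decodeLoop(MediaFormat format, String mime) throws IOException {
      final long TIMEOUT_US = 10000;
      MediaCodec codec = MediaCodec.createDecoderByType(mime);
      codec.configure(format, null, null, 0);
      codec.start();
      ByteBuffer[] inputBuffers = codec.getInputBuffers();
      ByteBuffer[] outputBuffers = codec.getOutputBuffers();
      MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
      boolean outputDone = false;
      while (!outputDone) {
          int inIndex = codec.dequeueInputBuffer(TIMEOUT_US);
          if (inIndex >= 0) {
              ByteBuffer inBuf = inputBuffers[inIndex];
              inBuf.clear();                                 // reset position/limit first
              int size = readAccessUnit(inBuf);              // copy one access unit in
              int flags = (size < 0) ? MediaCodec.BUFFER_FLAG_END_OF_STREAM : 0;
              codec.queueInputBuffer(inIndex, 0, Math.max(size, 0), nextPtsUs(), flags);
          }
          int outIndex = codec.dequeueOutputBuffer(info, TIMEOUT_US);
          if (outIndex >= 0) {
              consumeOutput(outputBuffers[outIndex], info);  // copy the output data out
              codec.releaseOutputBuffer(outIndex, false);
              outputDone = (info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0;
          } else if (outIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
              outputBuffers = codec.getOutputBuffers();      // the buffer array was replaced
          }   // INFO_TRY_AGAIN_LATER / INFO_OUTPUT_FORMAT_CHANGED: just loop again
      }
      codec.stop();
      codec.release();
  }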

An instance of MediaCodec handles one specific type of data (for example MP3 audio or H.264 video) and either encodes or decodes it. It operates on raw data, so any file headers, such as ID3 tags (typically a few bytes at the start or end of an MP3 file carrying artist, title, album, year, genre and similar information), must be stripped off. It does not talk to any higher-level system components: it will not play audio through the speaker or receive a video stream over the network. It is just a middle layer that takes data from buffers and hands data back in buffers.
Some codecs are particular about their buffers: they may need a particular memory alignment, or have specific minimum or maximum sizes. To accommodate the wide range of possibilities, buffer allocation is done by the codec itself rather than by the application. You do not hand MediaCodec a buffer that already contains your data; you ask it for a buffer and copy your data into it.
This may seem to run counter to "zero copy" principles, but in most cases the extra copies are unlikely, because the codec does not need to copy or rearrange the data to satisfy its requirements, and in many cases you can use the buffer directly, for example reading data from disk or the network straight into it, with no copy needed.
Input to MediaCodec must be submitted in "access units". For H.264 video that means one frame when encoding and one NAL unit when decoding. It nonetheless behaves more like a stream: you cannot submit a single chunk and expect the output to appear shortly after; in practice the codec may queue up several buffers before producing any output.
It is strongly recommended to start from the sample code below rather than from the official documentation.

Examples

EncodeAndMuxTest.java (requires 4.3, API 18)
Generates a movie with OpenGL ES, encodes it as H.264 with MediaCodec, and uses MediaMuxer to turn the stream into a .MP4 file. This was written as a CTS test, but it can be adapted to run in other environments.

CameraToMpegTest.java (requires 4.3, API 18)
Records video from the camera preview and encodes it into an MP4 file, again using MediaCodec to encode H.264 and MediaMuxer to turn the stream into a .MP4 file. As an extension, it also modifies the video while recording through a GLES fragment shader. Also a CTS test that can be adapted to other environments.

Android Breakout game recorder patch (requires 4.3, API 18)
A patch against Android Breakout v1.0.2 that adds game recording. The game runs full screen at 60fps, and a 720p 30fps recording is made with the AVC codec. The recording is saved in the app's private storage, e.g. /data/data/com.faddensoft.breakout/files/video.mp4. It is essentially the same as EncodeAndMuxTest.java, except that it is a real application rather than a CTS test. One key difference is in the EGL setup, which here allows the display and video contexts to share textures.

EncodeDecodeTest.java (requires 4.3, API 18)
A CTS test with three tests that do the same thing in different ways. Each one:
generates video frames, encodes them with AVC, decodes the stream, and checks that it matches the original.
Generation, encoding, decoding, and checking run roughly in parallel: as frames are generated they are passed to the encoder, the encoder output is passed to the decoder, and the result is verified. The three variants are:
Buffer to buffer. Buffers are software-generated YUV frames. This is the slowest approach, but it lets the application examine and modify the YUV data.
Buffer to surface. Encoding is the same, but decoding goes to a Surface and is verified via OpenGL ES glReadPixels().
Surface to surface. Frames are generated with OpenGL ES and decoded to a Surface. This is the fastest approach, but it requires converting the YUV data to RGB.

DecodeEditEncodeTest.java (requires 4.3, API 18)
A CTS test that generates a series of video frames, encodes them with AVC while holding the encoded stream in memory, decodes it with MediaCodec, edits the frames with an OpenGL ES fragment shader (swapping the green/blue color channels), re-encodes them, then decodes the edited stream and verifies the output.

ExtractMpegFramesTest.java (requires 4.1, API 16)
ExtractMpegFramesTest.java (requires 4.2, API 17)
Extracts the first 10 frames from a .mp4 video file and saves them as PNG files on the SD card. It uses MediaExtractor to retrieve the CSD data and feed individual access units to a MediaCodec decoder. Frames are decoded to a Surface backed by a SurfaceTexture, rendered off-screen, read back with glReadPixels(), and saved as PNG files with Bitmap#compress().

Frequently asked questions

Q1: How do I play the "video/avc" video stream created by MediaCodec?
A1: The stream that is produced is raw H.264 elementary stream data. The Totem Movie Player on Linux can play it, but most other players cannot. You can use MediaMuxer to convert it to an MP4 file; see the EncodeAndMuxTest example above, and the sketch below.
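
A rough sketch of the muxing step, assuming an already-running H.264 encoder and a writable "outputPath"; the drain loop itself is elided:

  // Sketch: wrap raw H.264 encoder output into an .mp4 container with MediaMuxer.
  MediaMuxer muxer = new MediaMuxer(outputPath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
  int videoTrack = -1;
  // ... while draining the encoder, when dequeueOutputBuffer() returns
  //     INFO_OUTPUT_FORMAT_CHANGED, add the track and start the muxer:
  videoTrack = muxer.addTrack(encoder.getOutputFormat());
  muxer.start();
  // ... for each encoded buffer (skip buffers flagged BUFFER_FLAG_CODEC_CONFIG):
  muxer.writeSampleData(videoTrack, encodedData, bufferInfo);
  // ... once the end-of-stream buffer has been written:
  muxer.stop();
  muxer.release();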

Q2: Why does MediaCodec's configure() fail with an IllegalStateException when I create an encoder?
A2: This is usually because you did not specify all of the format keys the encoder requires; see this stackoverflow item for an example, and the sketch below.
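
As a rough illustration, here is a hedged sketch of the keys an AVC encoder typically needs; the resolution, bit rate and frame rate values are arbitrary placeholders. Leaving out KEY_COLOR_FORMAT, KEY_BIT_RATE, KEY_FRAME_RATE or KEY_I_FRAME_INTERVAL is a typical cause of this failure.

  // Sketch: configure an AVC encoder with the commonly required format keys.
  MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
  format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
          MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);
  format.setInteger(MediaFormat.KEY_BIT_RATE, 2000000);
  format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
  format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
  MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
  encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
  encoder.start();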

Q3: My video decoder is configured but won't accept data. What's wrong?
A3: A common mistake is neglecting to set the Codec-Specific Data (CSD), which the documentation mentions only briefly. There are two keys, "csd-0" and "csd-1", which carry metadata such as the H.264 sequence and picture parameter sets. All you need to know is that this data is generated when MediaCodec encodes and is required when MediaCodec decodes.
If you feed the encoder output straight into the decoder, you will find that the first packet has the BUFFER_FLAG_CODEC_CONFIG flag set. Make sure that packet reaches the decoder, and it will start accepting data. Alternatively, you can set the CSD data on a MediaFormat and pass it to the decoder through configure(); see the EncodeDecodeTest sample, and the sketch after this answer.
If all else fails, you can use MediaExtractor, which takes care of all of this for you.
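
A hedged sketch of setting the CSD by hand; "sps" and "pps" are placeholder byte arrays holding the H.264 sequence and picture parameter sets (for example, split out of the BUFFER_FLAG_CODEC_CONFIG packet):

  // Sketch: hand the codec-specific data to an AVC decoder explicitly.
  MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
  format.setByteBuffer("csd-0", ByteBuffer.wrap(sps));   // SPS
  format.setByteBuffer("csd-1", ByteBuffer.wrap(pps));   // PPS
  decoder.configure(format, outputSurface, null, 0);
  decoder.start();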

Q4: Can I stream data into the decoder?
A4: Yes and no. The decoder expects a stream of "access units", which is not necessarily a stream of bytes. For a video decoder this means preserving the "packet boundaries" established by the encoder (for example H.264 NAL units). See how the DecodeEditEncodeTest sample handles this; in general you cannot read arbitrary chunks of data and feed them to the decoder.

Q5: I'm encoding YUV data obtained from the camera preview. Why do the colors look wrong?
A5: The color formats the camera outputs and the color formats MediaCodec accepts as encoder input are not the same. The camera supports YV12 (planar YUV 4:2:0) and NV21 (semi-planar YUV 4:2:0), while MediaCodec supports one or more of the following:
  • #19 COLOR_FormatYUV420Planar (I420)
  • #20 COLOR_FormatYUV420PackedPlanar (also I420)
  • #21 COLOR_FormatYUV420SemiPlanar (NV12)
  • #39 COLOR_FormatYUV420PackedSemiPlanar (also NV12)
  • #0x7f000100 COLOR_TI_FormatYUV420PackedSemiPlanar (also also NV12)
I420 has the same general data layout as YV12, but with the Cr and Cb planes swapped, just like NV12 versus NV21. So if you try to process YV12 data from the camera as one of the other formats, you may see strange color effects, such as in these images. As of Android 4.4 there is still no common input format; for example the Nexus 7 (2012) and Nexus 10 use COLOR_FormatYUV420Planar, the Nexus 4, Nexus 5, and Nexus 7 (2013) use COLOR_FormatYUV420SemiPlanar, and the Galaxy Nexus uses COLOR_TI_FormatYUV420PackedSemiPlanar.
A more portable, and more efficient, approach is to use the API 18 Surface input API, demonstrated in the CameraToMpegTest sample. The downside is that you then have to operate on RGB rather than YUV data, which is an image-processing problem; if you can implement the image manipulation in a fragment shader, you can let the GPU handle those conversions and calculations. If you do need buffer input, a sketch of querying a device's supported encoder color formats is shown below.
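
Here is a hedged sketch of listing the input color formats the device's AVC encoders actually accept, using the pre-API-21 MediaCodecList calls (deprecated but still present):

  // Sketch: enumerate the color formats supported by the device's AVC encoders.
  for (int i = 0; i < MediaCodecList.getCodecCount(); i++) {
      MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
      if (!info.isEncoder()) continue;
      for (String type : info.getSupportedTypes()) {
          if (!type.equalsIgnoreCase("video/avc")) continue;
          MediaCodecInfo.CodecCapabilities caps = info.getCapabilitiesForType(type);
          for (int colorFormat : caps.colorFormats) {
              Log.d("ColorFormats", info.getName() + " supports " + colorFormat);
          }
      }
  }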

Q6: What is the EGL_RECORDABLE_ANDROID flag for?
A6: It tells EGL that the surface it creates must be compatible with the video codec. Without this flag, EGL might use a format that MediaCodec cannot understand; see the sketch below.
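
As an illustration, a hedged sketch of requesting it in an EGL14 config attribute list; the bigflake samples define the constant locally as 0x3142, since it comes from the EGL_ANDROID_recordable extension:

  // Sketch: ask for a "recordable" EGL config when the surface feeds a video encoder.
  private static final int EGL_RECORDABLE_ANDROID = 0x3142;

  int[] attribList = {
          EGL14.EGL_RED_SIZE, 8,
          EGL14.EGL_GREEN_SIZE, 8,
          EGL14.EGL_BLUE_SIZE, 8,
          EGL14.EGL_RENDERABLE_TYPE, EGL14.EGL_OPENGL_ES2_BIT,
          EGL_RECORDABLE_ANDROID, 1,
          EGL14.EGL_NONE
  };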

Q7: Do I have to set a presentation time stamp (pts) when encoding?
A7: Yes. Some devices will drop frames or encode them at low quality if the presentation times are not set to reasonable values.
One thing to note is that MediaCodec expects times in microseconds, while most Java code works in milliseconds or nanoseconds; a tiny conversion helper is sketched below.
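
For generated content, a conversion helper might look like this (the frame rate being whatever rate you are generating at):

  // Sketch: presentation time in microseconds for generated frame "frameIndex".
  static long computePresentationTimeUs(int frameIndex, int frameRate) {
      return frameIndex * 1000000L / frameRate;   // MediaCodec wants microseconds
  }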

Q8: Why is my output sometimes garbled (all zeroes, too short, and so on)?
A8: The most common mistake is failing to adjust the ByteBuffer position and limit; MediaCodec does not do this for you automatically.
You need to add something like this by hand:

  int bufIndex = codec.dequeueOutputBuffer(info, TIMEOUT);
  ByteBuffer outputData = outputBuffers[bufIndex];
  if (info.size != 0) {
      outputData.position(info.offset);
      outputData.limit(info.offset + info.size);
  }

On the input side, you need to call clear() on the buffer before copying data into it, as sketched below.
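
A sketch of that input side, consistent with the snippet above; "frameData" and "presentationTimeUs" are placeholders for one access unit of input and its timestamp:

  // Input side: clear() resets position and limit before the copy.
  int inIndex = codec.dequeueInputBuffer(TIMEOUT);
  if (inIndex >= 0) {
      ByteBuffer inputData = inputBuffers[inIndex];
      inputData.clear();
      inputData.put(frameData);
      codec.queueInputBuffer(inIndex, 0, frameData.length, presentationTimeUs, 0);
  }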

Q9: Sometimes storeMetaDataInBuffers prints error messages in the log. Is that a problem?
A9: It happens; on a Nexus 5, for example, it looks like this:

E OMXNodeInstance: OMX_SetParameter() failed for StoreMetaDataInBuffers: 0x8000101a
E ACodec  : [OMX.qcom.video.encoder.avc] storeMetaDataInBuffers (output) failed w/ err -2147483648

These can be ignored; they do not cause any problems.

Android-MediaCodec-Examples/ExtractMpegFramesTest.java

/*
* Copyright 2013 The Android Open Source Project
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
 
package android.media.cts;
 
import android.graphics.Bitmap;
import android.graphics.SurfaceTexture;
import android.media.MediaCodec;
import android.media.MediaExtractor;
import android.media.MediaFormat;
import android.opengl.GLES11Ext;
import android.opengl.GLES20;
import android.opengl.Matrix;
import android.os.Environment;
import android.test.AndroidTestCase;
import android.util.Log;
import android.view.Surface;
 
import java.io.BufferedOutputStream;
import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;
 
import javax.microedition.khronos.egl.EGL10;
import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.egl.EGLContext;
import javax.microedition.khronos.egl.EGLDisplay;
import javax.microedition.khronos.egl.EGLSurface;
 
//20131122: minor tweaks to saveFrame() I/O
//20131205: add alpha to EGLConfig (huge glReadPixels speedup); pre-allocate pixel buffers;
// log time to run saveFrame()
//20131210: switch from EGL14 to EGL10 for API 16 compatibility
//20140123: correct error checks on glGet*Location() and program creation (they don't set error)
//20140212: eliminate byte swap
 
/**
* Extract frames from an MP4 using MediaExtractor, MediaCodec, and GLES. Put a .mp4 file
* in "/sdcard/source.mp4" and look for output files named "/sdcard/frame-XX.png".
* <p>
* This uses various features first available in Android "Jellybean" 4.1 (API 16).
* <p>
* (This was derived from bits and pieces of CTS tests, and is packaged as such, but is not
* currently part of CTS.)
*/
public class ExtractMpegFramesTest extends AndroidTestCase {
private static final String TAG = "ExtractMpegFramesTest";
private static final boolean VERBOSE = false; // lots of logging
 
// where to find files (note: requires WRITE_EXTERNAL_STORAGE permission)
private static final File FILES_DIR = Environment.getExternalStorageDirectory();
private static final String INPUT_FILE = "source.mp4";
private static final int MAX_FRAMES = 10; // stop extracting after this many
 
/** test entry point */
public void testExtractMpegFrames() throws Throwable {
ExtractMpegFramesWrapper.runTest(this);
}
 
/**
* Wraps extractMpegFrames(). This is necessary because SurfaceTexture will try to use
* the looper in the current thread if one exists, and the CTS tests create one on the
* test thread.
*
* The wrapper propagates exceptions thrown by the worker thread back to the caller.
*/
private static class ExtractMpegFramesWrapper implements Runnable {
private Throwable mThrowable;
private ExtractMpegFramesTest mTest;
 
private ExtractMpegFramesWrapper(ExtractMpegFramesTest test) {
mTest = test;
}
 
@Override
public void run() {
try {
mTest.extractMpegFrames();
} catch (Throwable th) {
mThrowable = th;
}
}
 
/** Entry point. */
public static void runTest(ExtractMpegFramesTest obj) throws Throwable {
ExtractMpegFramesWrapper wrapper = new ExtractMpegFramesWrapper(obj);
Thread th = new Thread(wrapper, "codec test");
th.start();
th.join();
if (wrapper.mThrowable != null) {
throw wrapper.mThrowable;
}
}
}
 
/**
* Tests extraction from an MP4 to a series of PNG files.
* <p>
* We scale the video to 640x480 for the PNG just to demonstrate that we can scale the
* video with the GPU. If the input video has a different aspect ratio, we could preserve
* it by adjusting the GL viewport to get letterboxing or pillarboxing, but generally if
* you're extracting frames you don't want black bars.
*/
private void extractMpegFrames() throws IOException {
MediaCodec decoder = null;
CodecOutputSurface outputSurface = null;
MediaExtractor extractor = null;
int saveWidth = 640;
int saveHeight = 480;
 
try {
File inputFile = new File(FILES_DIR, INPUT_FILE); // must be an absolute path
// The MediaExtractor error messages aren't very useful. Check to see if the input
// file exists so we can throw a better one if it's not there.
if (!inputFile.canRead()) {
throw new FileNotFoundException("Unable to read " + inputFile);
}
 
extractor = new MediaExtractor();
extractor.setDataSource(inputFile.toString());
int trackIndex = selectTrack(extractor);
if (trackIndex < 0) {
throw new RuntimeException("No video track found in " + inputFile);
}
extractor.selectTrack(trackIndex);
 
MediaFormat format = extractor.getTrackFormat(trackIndex);
if (VERBOSE) {
Log.d(TAG, "Video size is " + format.getInteger(MediaFormat.KEY_WIDTH) + "x" +
format.getInteger(MediaFormat.KEY_HEIGHT));
}
 
// Could use width/height from the MediaFormat to get full-size frames.
outputSurface = new CodecOutputSurface(saveWidth, saveHeight);
 
// Create a MediaCodec decoder, and configure it with the MediaFormat from the
// extractor. It's very important to use the format from the extractor because
// it contains a copy of the CSD-0/CSD-1 codec-specific data chunks.
String mime = format.getString(MediaFormat.KEY_MIME);
decoder = MediaCodec.createDecoderByType(mime);
decoder.configure(format, outputSurface.getSurface(), null, 0);
decoder.start();
 
doExtract(extractor, trackIndex, decoder, outputSurface);
} finally {
// release everything we grabbed
if (outputSurface != null) {
outputSurface.release();
outputSurface = null;
}
if (decoder != null) {
decoder.stop();
decoder.release();
decoder = null;
}
if (extractor != null) {
extractor.release();
extractor = null;
}
}
}
 
/**
* Selects the video track, if any.
*
* @return the track index, or -1 if no video track is found.
*/
private int selectTrack(MediaExtractor extractor) {
// Select the first video track we find, ignore the rest.
int numTracks = extractor.getTrackCount();
for (int i = 0; i < numTracks; i++) {
MediaFormat format = extractor.getTrackFormat(i);
String mime = format.getString(MediaFormat.KEY_MIME);
if (mime.startsWith("video/")) {
if (VERBOSE) {
Log.d(TAG, "Extractor selected track " + i + " (" + mime + "): " + format);
}
return i;
}
}
 
return -1;
}
 
/**
* Work loop.
*/
static void doExtract(MediaExtractor extractor, int trackIndex, MediaCodec decoder,
CodecOutputSurface outputSurface) throws IOException {
final int TIMEOUT_USEC = 10000;
ByteBuffer[] decoderInputBuffers = decoder.getInputBuffers();
MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
int inputChunk = 0;
int decodeCount = 0;
long frameSaveTime = 0;
 
boolean outputDone = false;
boolean inputDone = false;
while (!outputDone) {
if (VERBOSE) Log.d(TAG, "loop");
 
// Feed more data to the decoder.
if (!inputDone) {
int inputBufIndex = decoder.dequeueInputBuffer(TIMEOUT_USEC);
if (inputBufIndex >= 0) {
ByteBuffer inputBuf = decoderInputBuffers[inputBufIndex];
// Read the sample data into the ByteBuffer. This neither respects nor
// updates inputBuf's position, limit, etc.
int chunkSize = extractor.readSampleData(inputBuf, 0);
if (chunkSize < 0) {
// End of stream -- send empty frame with EOS flag set.
decoder.queueInputBuffer(inputBufIndex, 0, 0, 0L,
MediaCodec.BUFFER_FLAG_END_OF_STREAM);
inputDone = true;
if (VERBOSE) Log.d(TAG, "sent input EOS");
} else {
if (extractor.getSampleTrackIndex() != trackIndex) {
Log.w(TAG, "WEIRD: got sample from track " +
extractor.getSampleTrackIndex() + ", expected " + trackIndex);
}
long presentationTimeUs = extractor.getSampleTime();
decoder.queueInputBuffer(inputBufIndex, 0, chunkSize,
presentationTimeUs, 0 /*flags*/);
if (VERBOSE) {
Log.d(TAG, "submitted frame " + inputChunk + " to dec, size=" +
chunkSize);
}
inputChunk++;
extractor.advance();
}
} else {
if (VERBOSE) Log.d(TAG, "input buffer not available");
}
}
 
if (!outputDone) {
int decoderStatus = decoder.dequeueOutputBuffer(info, TIMEOUT_USEC);
if (decoderStatus == MediaCodec.INFO_TRY_AGAIN_LATER) {
// no output available yet
if (VERBOSE) Log.d(TAG, "no output from decoder available");
} else if (decoderStatus == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
// not important for us, since we're using Surface
if (VERBOSE) Log.d(TAG, "decoder output buffers changed");
} else if (decoderStatus == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
MediaFormat newFormat = decoder.getOutputFormat();
if (VERBOSE) Log.d(TAG, "decoder output format changed: " + newFormat);
} else if (decoderStatus < 0) {
fail("unexpected result from decoder.dequeueOutputBuffer: " + decoderStatus);
} else { // decoderStatus >= 0
if (VERBOSE) Log.d(TAG, "surface decoder given buffer " + decoderStatus +
" (size=" + info.size + ")");
if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
if (VERBOSE) Log.d(TAG, "output EOS");
outputDone = true;
}
 
boolean doRender = (info.size != 0);
 
// As soon as we call releaseOutputBuffer, the buffer will be forwarded
// to SurfaceTexture to convert to a texture. The API doesn't guarantee
// that the texture will be available before the call returns, so we
// need to wait for the onFrameAvailable callback to fire.
decoder.releaseOutputBuffer(decoderStatus, doRender);
if (doRender) {
if (VERBOSE) Log.d(TAG, "awaiting decode of frame " + decodeCount);
outputSurface.awaitNewImage();
outputSurface.drawImage(true);
 
if (decodeCount < MAX_FRAMES) {
File outputFile = new File(FILES_DIR,
String.format("frame-%02d.png", decodeCount));
long startWhen = System.nanoTime();
outputSurface.saveFrame(outputFile.toString());
frameSaveTime += System.nanoTime() - startWhen;
}
decodeCount++;
}
}
}
}
 
int numSaved = (MAX_FRAMES < decodeCount) ? MAX_FRAMES : decodeCount;
Log.d(TAG, "Saving " + numSaved + " frames took " +
(frameSaveTime / numSaved / 1000) + " us per frame");
}
 
 
/**
* Holds state associated with a Surface used for MediaCodec decoder output.
* <p>
* The constructor for this class will prepare GL, create a SurfaceTexture,
* and then create a Surface for that SurfaceTexture. The Surface can be passed to
* MediaCodec.configure() to receive decoder output. When a frame arrives, we latch the
* texture with updateTexImage(), then render the texture with GL to a pbuffer.
* <p>
* By default, the Surface will be using a BufferQueue in asynchronous mode, so we
* can potentially drop frames.
*/
private static class CodecOutputSurface
implements SurfaceTexture.OnFrameAvailableListener {
private ExtractMpegFramesTest.STextureRender mTextureRender;
private SurfaceTexture mSurfaceTexture;
private Surface mSurface;
private EGL10 mEgl;
 
private EGLDisplay mEGLDisplay = EGL10.EGL_NO_DISPLAY;
private EGLContext mEGLContext = EGL10.EGL_NO_CONTEXT;
private EGLSurface mEGLSurface = EGL10.EGL_NO_SURFACE;
int mWidth;
int mHeight;
 
private Object mFrameSyncObject = new Object(); // guards mFrameAvailable
private boolean mFrameAvailable;
 
private ByteBuffer mPixelBuf; // used by saveFrame()
 
/**
* Creates a CodecOutputSurface backed by a pbuffer with the specified dimensions. The
* new EGL context and surface will be made current. Creates a Surface that can be passed
* to MediaCodec.configure().
*/
public CodecOutputSurface(int width, int height) {
if (width <= 0 || height <= 0) {
throw new IllegalArgumentException();
}
mEgl = (EGL10) EGLContext.getEGL();
mWidth = width;
mHeight = height;
 
eglSetup();
makeCurrent();
setup();
}
 
/**
* Creates interconnected instances of TextureRender, SurfaceTexture, and Surface.
*/
private void setup() {
mTextureRender = new ExtractMpegFramesTest.STextureRender();
mTextureRender.surfaceCreated();
 
if (VERBOSE) Log.d(TAG, "textureID=" + mTextureRender.getTextureId());
mSurfaceTexture = new SurfaceTexture(mTextureRender.getTextureId());
 
// This doesn't work if this object is created on the thread that CTS started for
// these test cases.
//
// The CTS-created thread has a Looper, and the SurfaceTexture constructor will
// create a Handler that uses it. The "frame available" message is delivered
// there, but since we're not a Looper-based thread we'll never see it. For
// this to do anything useful, CodecOutputSurface must be created on a thread without
// a Looper, so that SurfaceTexture uses the main application Looper instead.
//
// Java language note: passing "this" out of a constructor is generally unwise,
// but we should be able to get away with it here.
mSurfaceTexture.setOnFrameAvailableListener(this);
 
mSurface = new Surface(mSurfaceTexture);
 
mPixelBuf = ByteBuffer.allocateDirect(mWidth * mHeight * 4);
mPixelBuf.order(ByteOrder.LITTLE_ENDIAN);
}
 
/**
* Prepares EGL. We want a GLES 2.0 context and a surface that supports pbuffer.
*/
private void eglSetup() {
final int EGL_OPENGL_ES2_BIT = 0x0004;
final int EGL_CONTEXT_CLIENT_VERSION = 0x3098;
 
mEGLDisplay = mEgl.eglGetDisplay(EGL10.EGL_DEFAULT_DISPLAY);
if (mEGLDisplay == EGL10.EGL_NO_DISPLAY) {
throw new RuntimeException("unable to get EGL14 display");
}
int[] version = new int[2];
if (!mEgl.eglInitialize(mEGLDisplay, version)) {
mEGLDisplay = null;
throw new RuntimeException("unable to initialize EGL14");
}
 
// Configure EGL for pbuffer and OpenGL ES 2.0, 24-bit RGB.
int[] attribList = {
EGL10.EGL_RED_SIZE, 8,
EGL10.EGL_GREEN_SIZE, 8,
EGL10.EGL_BLUE_SIZE, 8,
EGL10.EGL_ALPHA_SIZE, 8,
EGL10.EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
EGL10.EGL_SURFACE_TYPE, EGL10.EGL_PBUFFER_BIT,
EGL10.EGL_NONE
};
EGLConfig[] configs = new EGLConfig[1];
int[] numConfigs = new int[1];
if (!mEgl.eglChooseConfig(mEGLDisplay, attribList, configs, configs.length,
numConfigs)) {
throw new RuntimeException("unable to find RGB888+recordable ES2 EGL config");
}
 
// Configure context for OpenGL ES 2.0.
int[] attrib_list = {
EGL_CONTEXT_CLIENT_VERSION, 2,
EGL10.EGL_NONE
};
mEGLContext = mEgl.eglCreateContext(mEGLDisplay, configs[0], EGL10.EGL_NO_CONTEXT,
attrib_list);
checkEglError("eglCreateContext");
if (mEGLContext == null) {
throw new RuntimeException("null context");
}
 
// Create a pbuffer surface.
int[] surfaceAttribs = {
EGL10.EGL_WIDTH, mWidth,
EGL10.EGL_HEIGHT, mHeight,
EGL10.EGL_NONE
};
mEGLSurface = mEgl.eglCreatePbufferSurface(mEGLDisplay, configs[0], surfaceAttribs);
checkEglError("eglCreatePbufferSurface");
if (mEGLSurface == null) {
throw new RuntimeException("surface was null");
}
}
 
/**
* Discard all resources held by this class, notably the EGL context.
*/
public void release() {
if (mEGLDisplay != EGL10.EGL_NO_DISPLAY) {
mEgl.eglDestroySurface(mEGLDisplay, mEGLSurface);
mEgl.eglDestroyContext(mEGLDisplay, mEGLContext);
//mEgl.eglReleaseThread();
mEgl.eglMakeCurrent(mEGLDisplay, EGL10.EGL_NO_SURFACE, EGL10.EGL_NO_SURFACE,
EGL10.EGL_NO_CONTEXT);
mEgl.eglTerminate(mEGLDisplay);
}
mEGLDisplay = EGL10.EGL_NO_DISPLAY;
mEGLContext = EGL10.EGL_NO_CONTEXT;
mEGLSurface = EGL10.EGL_NO_SURFACE;
 
mSurface.release();
 
// this causes a bunch of warnings that appear harmless but might confuse someone:
// W BufferQueue: [unnamed-3997-2] cancelBuffer: BufferQueue has been abandoned!
//mSurfaceTexture.release();
 
mTextureRender = null;
mSurface = null;
mSurfaceTexture = null;
}
 
/**
* Makes our EGL context and surface current.
*/
public void makeCurrent() {
if (!mEgl.eglMakeCurrent(mEGLDisplay, mEGLSurface, mEGLSurface, mEGLContext)) {
throw new RuntimeException("eglMakeCurrent failed");
}
}
 
/**
* Returns the Surface.
*/
public Surface getSurface() {
return mSurface;
}
 
/**
* Latches the next buffer into the texture. Must be called from the thread that created
* the CodecOutputSurface object. (More specifically, it must be called on the thread
* with the EGLContext that contains the GL texture object used by SurfaceTexture.)
*/
public void awaitNewImage() {
final int TIMEOUT_MS = 2500;
 
synchronized (mFrameSyncObject) {
while (!mFrameAvailable) {
try {
// Wait for onFrameAvailable() to signal us. Use a timeout to avoid
// stalling the test if it doesn't arrive.
mFrameSyncObject.wait(TIMEOUT_MS);
if (!mFrameAvailable) {
// TODO: if "spurious wakeup", continue while loop
throw new RuntimeException("frame wait timed out");
}
} catch (InterruptedException ie) {
// shouldn't happen
throw new RuntimeException(ie);
}
}
mFrameAvailable = false;
}
 
// Latch the data.
mTextureRender.checkGlError("before updateTexImage");
mSurfaceTexture.updateTexImage();
}
 
/**
* Draws the data from SurfaceTexture onto the current EGL surface.
*
* @param invert if set, render the image with Y inverted (0,0 in top left)
*/
public void drawImage(boolean invert) {
mTextureRender.drawFrame(mSurfaceTexture, invert);
}
 
// SurfaceTexture callback
@Override
public void onFrameAvailable(SurfaceTexture st) {
if (VERBOSE) Log.d(TAG, "new frame available");
synchronized (mFrameSyncObject) {
if (mFrameAvailable) {
throw new RuntimeException("mFrameAvailable already set, frame could be dropped");
}
mFrameAvailable = true;
mFrameSyncObject.notifyAll();
}
}
 
/**
* Saves the current frame to disk as a PNG image.
*/
public void saveFrame(String filename) throws IOException {
// glReadPixels gives us a ByteBuffer filled with what is essentially big-endian RGBA
// data (i.e. a byte of red, followed by a byte of green...). To use the Bitmap
// constructor that takes an int[] array with pixel data, we need an int[] filled
// with little-endian ARGB data.
//
// If we implement this as a series of buf.get() calls, we can spend 2.5 seconds just
// copying data around for a 720p frame. It's better to do a bulk get() and then
// rearrange the data in memory. (For comparison, the PNG compress takes about 500ms
// for a trivial frame.)
//
// So... we set the ByteBuffer to little-endian, which should turn the bulk IntBuffer
// get() into a straight memcpy on most Android devices. Our ints will hold ABGR data.
// Swapping B and R gives us ARGB. We need about 30ms for the bulk get(), and another
// 270ms for the color swap.
//
// We can avoid the costly B/R swap here if we do it in the fragment shader (see
// http://stackoverflow.com/questions/21634450/ ).
//
// Having said all that... it turns out that the Bitmap#copyPixelsFromBuffer()
// method wants RGBA pixels, not ARGB, so if we create an empty bitmap and then
// copy pixel data in we can avoid the swap issue entirely, and just copy straight
// into the Bitmap from the ByteBuffer.
//
// Making this even more interesting is the upside-down nature of GL, which means
// our output will look upside-down relative to what appears on screen if the
// typical GL conventions are used. (For ExtractMpegFrameTest, we avoid the issue
// by inverting the frame when we render it.)
//
// Allocating large buffers is expensive, so we really want mPixelBuf to be
// allocated ahead of time if possible. We still get some allocations from the
// Bitmap / PNG creation.
 
mPixelBuf.rewind();
GLES20.glReadPixels(0, 0, mWidth, mHeight, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE,
mPixelBuf);
 
BufferedOutputStream bos = null;
try {
bos = new BufferedOutputStream(new FileOutputStream(filename));
Bitmap bmp = Bitmap.createBitmap(mWidth, mHeight, Bitmap.Config.ARGB_8888);
mPixelBuf.rewind();
bmp.copyPixelsFromBuffer(mPixelBuf);
bmp.compress(Bitmap.CompressFormat.PNG, 90, bos);
bmp.recycle();
} finally {
if (bos != null) bos.close();
}
if (VERBOSE) {
Log.d(TAG, "Saved " + mWidth + "x" + mHeight + " frame as '" + filename + "'");
}
}
 
/**
* Checks for EGL errors.
*/
private void checkEglError(String msg) {
int error;
if ((error = mEgl.eglGetError()) != EGL10.EGL_SUCCESS) {
throw new RuntimeException(msg + ": EGL error: 0x" + Integer.toHexString(error));
}
}
}
 
 
/**
* Code for rendering a texture onto a surface using OpenGL ES 2.0.
*/
private static class STextureRender {
private static final int FLOAT_SIZE_BYTES = 4;
private static final int TRIANGLE_VERTICES_DATA_STRIDE_BYTES = 5 * FLOAT_SIZE_BYTES;
private static final int TRIANGLE_VERTICES_DATA_POS_OFFSET = 0;
private static final int TRIANGLE_VERTICES_DATA_UV_OFFSET = 3;
private final float[] mTriangleVerticesData = {
// X, Y, Z, U, V
-1.0f, -1.0f, 0, 0.f, 0.f,
1.0f, -1.0f, 0, 1.f, 0.f,
-1.0f, 1.0f, 0, 0.f, 1.f,
1.0f, 1.0f, 0, 1.f, 1.f,
};
 
private FloatBuffer mTriangleVertices;
 
private static final String VERTEX_SHADER =
"uniform mat4 uMVPMatrix;\n" +
"uniform mat4 uSTMatrix;\n" +
"attribute vec4 aPosition;\n" +
"attribute vec4 aTextureCoord;\n" +
"varying vec2 vTextureCoord;\n" +
"void main() {\n" +
" gl_Position = uMVPMatrix * aPosition;\n" +
" vTextureCoord = (uSTMatrix * aTextureCoord).xy;\n" +
"}\n";
 
private static final String FRAGMENT_SHADER =
"#extension GL_OES_EGL_image_external : require\n" +
"precision mediump float;\n" + // highp here doesn't seem to matter
"varying vec2 vTextureCoord;\n" +
"uniform samplerExternalOES sTexture;\n" +
"void main() {\n" +
" gl_FragColor = texture2D(sTexture, vTextureCoord);\n" +
"}\n";
 
private float[] mMVPMatrix = new float[16];
private float[] mSTMatrix = new float[16];
 
private int mProgram;
private int mTextureID = -12345;
private int muMVPMatrixHandle;
private int muSTMatrixHandle;
private int maPositionHandle;
private int maTextureHandle;
 
public STextureRender() {
mTriangleVertices = ByteBuffer.allocateDirect(
mTriangleVerticesData.length * FLOAT_SIZE_BYTES)
.order(ByteOrder.nativeOrder()).asFloatBuffer();
mTriangleVertices.put(mTriangleVerticesData).position(0);
 
Matrix.setIdentityM(mSTMatrix, 0);
}
 
public int getTextureId() {
return mTextureID;
}
 
/**
* Draws the external texture in SurfaceTexture onto the current EGL surface.
*/
public void drawFrame(SurfaceTexture st, boolean invert) {
checkGlError("onDrawFrame start");
st.getTransformMatrix(mSTMatrix);
if (invert) {
mSTMatrix[5] = -mSTMatrix[5];
mSTMatrix[13] = 1.0f - mSTMatrix[13];
}
 
// (optional) clear to green so we can see if we're failing to set pixels
GLES20.glClearColor(0.0f, 1.0f, 0.0f, 1.0f);
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
 
GLES20.glUseProgram(mProgram);
checkGlError("glUseProgram");
 
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, mTextureID);
 
mTriangleVertices.position(TRIANGLE_VERTICES_DATA_POS_OFFSET);
GLES20.glVertexAttribPointer(maPositionHandle, 3, GLES20.GL_FLOAT, false,
TRIANGLE_VERTICES_DATA_STRIDE_BYTES, mTriangleVertices);
checkGlError("glVertexAttribPointer maPosition");
GLES20.glEnableVertexAttribArray(maPositionHandle);
checkGlError("glEnableVertexAttribArray maPositionHandle");
 
mTriangleVertices.position(TRIANGLE_VERTICES_DATA_UV_OFFSET);
GLES20.glVertexAttribPointer(maTextureHandle, 2, GLES20.GL_FLOAT, false,
TRIANGLE_VERTICES_DATA_STRIDE_BYTES, mTriangleVertices);
checkGlError("glVertexAttribPointer maTextureHandle");
GLES20.glEnableVertexAttribArray(maTextureHandle);
checkGlError("glEnableVertexAttribArray maTextureHandle");
 
Matrix.setIdentityM(mMVPMatrix, 0);
GLES20.glUniformMatrix4fv(muMVPMatrixHandle, 1, false, mMVPMatrix, 0);
GLES20.glUniformMatrix4fv(muSTMatrixHandle, 1, false, mSTMatrix, 0);
 
GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
checkGlError("glDrawArrays");
 
GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, 0);
}
 
/**
* Initializes GL state. Call this after the EGL surface has been created and made current.
*/
public void surfaceCreated() {
mProgram = createProgram(VERTEX_SHADER, FRAGMENT_SHADER);
if (mProgram == 0) {
throw new RuntimeException("failed creating program");
}
 
maPositionHandle = GLES20.glGetAttribLocation(mProgram, "aPosition");
checkLocation(maPositionHandle, "aPosition");
maTextureHandle = GLES20.glGetAttribLocation(mProgram, "aTextureCoord");
checkLocation(maTextureHandle, "aTextureCoord");
 
muMVPMatrixHandle = GLES20.glGetUniformLocation(mProgram, "uMVPMatrix");
checkLocation(muMVPMatrixHandle, "uMVPMatrix");
muSTMatrixHandle = GLES20.glGetUniformLocation(mProgram, "uSTMatrix");
checkLocation(muSTMatrixHandle, "uSTMatrix");
 
int[] textures = new int[1];
GLES20.glGenTextures(1, textures, 0);
 
mTextureID = textures[0];
GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, mTextureID);
checkGlError("glBindTexture mTextureID");
 
GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MIN_FILTER,
GLES20.GL_NEAREST);
GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MAG_FILTER,
GLES20.GL_LINEAR);
GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_WRAP_S,
GLES20.GL_CLAMP_TO_EDGE);
GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_WRAP_T,
GLES20.GL_CLAMP_TO_EDGE);
checkGlError("glTexParameter");
}
 
/**
* Replaces the fragment shader. Pass in null to reset to default.
*/
public void changeFragmentShader(String fragmentShader) {
if (fragmentShader == null) {
fragmentShader = FRAGMENT_SHADER;
}
GLES20.glDeleteProgram(mProgram);
mProgram = createProgram(VERTEX_SHADER, fragmentShader);
if (mProgram == 0) {
throw new RuntimeException("failed creating program");
}
}
 
private int loadShader(int shaderType, String source) {
int shader = GLES20.glCreateShader(shaderType);
checkGlError("glCreateShader type=" + shaderType);
GLES20.glShaderSource(shader, source);
GLES20.glCompileShader(shader);
int[] compiled = new int[1];
GLES20.glGetShaderiv(shader, GLES20.GL_COMPILE_STATUS, compiled, 0);
if (compiled[0] == 0) {
Log.e(TAG, "Could not compile shader " + shaderType + ":");
Log.e(TAG, " " + GLES20.glGetShaderInfoLog(shader));
GLES20.glDeleteShader(shader);
shader = 0;
}
return shader;
}
 
private int createProgram(String vertexSource, String fragmentSource) {
int vertexShader = loadShader(GLES20.GL_VERTEX_SHADER, vertexSource);
if (vertexShader == 0) {
return 0;
}
int pixelShader = loadShader(GLES20.GL_FRAGMENT_SHADER, fragmentSource);
if (pixelShader == 0) {
return 0;
}
 
int program = GLES20.glCreateProgram();
if (program == 0) {
Log.e(TAG, "Could not create program");
}
GLES20.glAttachShader(program, vertexShader);
checkGlError("glAttachShader");
GLES20.glAttachShader(program, pixelShader);
checkGlError("glAttachShader");
GLES20.glLinkProgram(program);
int[] linkStatus = new int[1];
GLES20.glGetProgramiv(program, GLES20.GL_LINK_STATUS, linkStatus, 0);
if (linkStatus[0] != GLES20.GL_TRUE) {
Log.e(TAG, "Could not link program: ");
Log.e(TAG, GLES20.glGetProgramInfoLog(program));
GLES20.glDeleteProgram(program);
program = 0;
}
return program;
}
 
public void checkGlError(String op) {
int error;
while ((error = GLES20.glGetError()) != GLES20.GL_NO_ERROR) {
Log.e(TAG, op + ": glError " + error);
throw new RuntimeException(op + ": glError " + error);
}
}
 
public static void checkLocation(int location, String label) {
if (location < 0) {
throw new RuntimeException("Unable to locate '" + label + "' in program");
}
}
}

}

Reposted from blog.csdn.net/u010029439/article/details/84659771