Hardware video decoding with MediaCodec

Reprinted from http://blog.csdn.net/halleyzhang3/article/details/11473961#

Using Android's MediaCodec for hardware video decoding

This article shows how to use the standard Android API (MediaCodec) to implement hardware video encoding and decoding. The example captures video from the camera, encodes it to H.264, then decodes and displays it. I will try to keep it short and clear and omit irrelevant code.
1. Capturing video from the camera

      Video data can be obtained through the camera's preview callback.

      First create the camera and set its parameters:

cam = Camera.open();
cam.setPreviewDisplay(holder);
Camera.Parameters parameters = cam.getParameters();
parameters.setFlashMode("off"); // no flash
parameters.setWhiteBalance(Camera.Parameters.WHITE_BALANCE_AUTO);
parameters.setSceneMode(Camera.Parameters.SCENE_MODE_AUTO);
parameters.setFocusMode(Camera.Parameters.FOCUS_MODE_AUTO);
parameters.setPreviewFormat(ImageFormat.YV12);
parameters.setPictureSize(camWidth, camHeight);
parameters.setPreviewSize(camWidth, camHeight);
// If these two sizes are not supported by the actual device, an error will be reported
cam.setParameters(parameters);

The width and height must be sizes the camera supports, otherwise an error is reported. All supported sizes can be obtained with getSupportedPreviewSizes, which is not repeated here. It is said that all parameters must be set in full and that omitting one may cause an error, but I only set a few attributes and got no error. Then start the preview:


buf = new byte[camWidth * camHeight * 3 / 2];
cam.addCallbackBuffer(buf);
cam.setPreviewCallbackWithBuffer(this);
cam.startPreview();

setPreviewCallbackWithBuffer is necessary; otherwise the system re-allocates the buffer on every callback and efficiency will be very low.
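Note that the camWidth * camHeight * 3 / 2 buffer size only holds when the rows are not padded. According to the Android ImageFormat.YV12 documentation, the Y row stride is the width rounded up to a multiple of 16 and the chroma stride is half the Y stride rounded up to a multiple of 16, so a safer calculation is (a minimal plain-Java sketch):

```java
public class Yv12Buffer {
    /** Size in bytes of a YV12 preview buffer, per the ImageFormat.YV12 stride rules. */
    public static int size(int width, int height) {
        int yStride = (width + 15) / 16 * 16;        // Y rows padded to 16 bytes
        int uvStride = (yStride / 2 + 15) / 16 * 16; // chroma rows padded to 16 bytes
        int ySize = yStride * height;
        int uvSize = uvStride * height / 2;          // one chroma plane (U or V)
        return ySize + 2 * uvSize;
    }

    public static void main(String[] args) {
        // 640x480: no padding is needed, so this equals width*height*3/2
        System.out.println(size(640, 480)); // 460800
        // 176x144: the chroma stride is padded (88 -> 96), so it is larger than 3/2
        System.out.println(size(176, 144)); // 39168
    }
}
```

For common sizes like 640x480 the two formulas agree, which is why the simple version often works anyway.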


    The raw image can be obtained in onPreviewFrame (the class must, of course, implement PreviewCallback). Here we pass it on to the encoder:


public void onPreviewFrame(byte[] data, Camera camera) {
    if (frameListener != null) {
        frameListener.onFrame(data, 0, data.length, 0);
    }
    cam.addCallbackBuffer(buf);
}
2. Encoding

    First initialize the encoder:


    mediaCodec = MediaCodec.createEncoderByType("video/avc");
    MediaFormat mediaFormat = MediaFormat.createVideoFormat("video/avc", width, height);
    mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 125000);
    mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 15);
    mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar);
    mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
    mediaCodec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
    mediaCodec.start();

    Then feed it data, which here comes from the camera:


public void onFrame(byte[] buf, int offset, int length, int flag) {
    ByteBuffer[] inputBuffers = mediaCodec.getInputBuffers();
    ByteBuffer[] outputBuffers = mediaCodec.getOutputBuffers();
    int inputBufferIndex = mediaCodec.dequeueInputBuffer(-1);
    if (inputBufferIndex >= 0) {
        ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
        inputBuffer.clear();
        inputBuffer.put(buf, offset, length);
        mediaCodec.queueInputBuffer(inputBufferIndex, 0, length, 0, 0);
    }
    MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
    int outputBufferIndex = mediaCodec.dequeueOutputBuffer(bufferInfo, 0);
    while (outputBufferIndex >= 0) {
        ByteBuffer outputBuffer = outputBuffers[outputBufferIndex];
        if (frameListener != null)
            frameListener.onFrame(outputBuffer, 0, bufferInfo.size, flag);
        mediaCodec.releaseOutputBuffer(outputBufferIndex, false);
        outputBufferIndex = mediaCodec.dequeueOutputBuffer(bufferInfo, 0);
    }
}

First feed it the data from the camera, then feed the compressed data it produces to the decoder. (Note that the length of the output is bufferInfo.size, not the length of the input frame.)
3. Decoding and Display

     First initialize the decoder:


    mediaCodec = MediaCodec.createDecoderByType("video/avc");
    MediaFormat mediaFormat = MediaFormat.createVideoFormat("video/avc", width, height);
    mediaCodec.configure(mediaFormat, surface, null, 0);
    mediaCodec.start();

     By giving the decoder a surface here, it can display the picture directly.

     Then it's time to process the data:


public void onFrame(byte[] buf, int offset, int length, int flag) {
    ByteBuffer[] inputBuffers = mediaCodec.getInputBuffers();
    int inputBufferIndex = mediaCodec.dequeueInputBuffer(-1);
    if (inputBufferIndex >= 0) {
        ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
        inputBuffer.clear();
        inputBuffer.put(buf, offset, length);
        mediaCodec.queueInputBuffer(inputBufferIndex, 0, length, mCount * 1000000 / FRAME_RATE, 0);
        mCount++;
    }

    MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
    int outputBufferIndex = mediaCodec.dequeueOutputBuffer(bufferInfo, 0);
    while (outputBufferIndex >= 0) {
        mediaCodec.releaseOutputBuffer(outputBufferIndex, true);
        outputBufferIndex = mediaCodec.dequeueOutputBuffer(bufferInfo, 0);
    }
}
        The third parameter of queueInputBuffer is the presentation timestamp, in microseconds. In practice its exact value hardly matters as long as it increases linearly with time, so one is simply derived from a frame counter here. The loop that follows releases the output buffers: since the decoder renders the decoded frames directly to the surface, releasing with render = true is all that is needed, but release them we must, otherwise the decoder keeps holding them and memory will run out.
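The timestamp expression can be isolated in a tiny helper to make the microsecond arithmetic explicit. One caveat worth noting: with int arithmetic, mCount * 1000000 overflows after roughly 2147 frames (about two minutes at 15 fps), so a long counter is safer. A plain-Java sketch (FRAME_RATE matches the name used in the code above):

```java
public class Pts {
    static final int FRAME_RATE = 15;

    /** Presentation timestamp in microseconds for the n-th frame at FRAME_RATE fps. */
    public static long ptsUs(long frameIndex) {
        return frameIndex * 1_000_000L / FRAME_RATE; // long math: no int overflow
    }

    public static void main(String[] args) {
        System.out.println(ptsUs(0));  // 0
        System.out.println(ptsUs(15)); // 1000000 -> exactly one second after frame 0
    }
}
```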




OK, that's basically it. If you're lucky, you can now see video, as on my Samsung phone. However, I have tried several other platforms and most of them cannot; there are always various problems. If you want to develop a platform-independent application, there are still many problems to solve. Here are some situations I have encountered:




1. Video size

     Sizes of 176x144 and 352x288 are generally supported, but larger ones often are not, even though many cameras offer 640x480. As for why, I don't know. Of course, this size must match the camera preview size, and the supported preview sizes can be enumerated.
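Since the preview sizes can be enumerated (via getSupportedPreviewSizes on Android), a helper like the following can pick the supported size closest to a requested one. This is a plain-Java sketch working on {width, height} pairs rather than the Android Camera.Size objects:

```java
import java.util.List;

public class SizePicker {
    /** Pick the supported {width, height} pair closest in area to the requested size. */
    public static int[] closest(List<int[]> supported, int wantW, int wantH) {
        int[] best = null;
        long bestDiff = Long.MAX_VALUE;
        for (int[] s : supported) {
            long diff = Math.abs((long) s[0] * s[1] - (long) wantW * wantH);
            if (diff < bestDiff) {
                bestDiff = diff;
                best = s;
            }
        }
        return best;
    }

    public static void main(String[] args) {
        List<int[]> sizes = List.of(
                new int[]{176, 144}, new int[]{352, 288}, new int[]{640, 480});
        int[] pick = closest(sizes, 320, 240);
        System.out.println(pick[0] + "x" + pick[1]); // 352x288
    }
}
```

The same chosen size must then be used for both setPreviewSize and the encoder's MediaFormat.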

2. Color space

    According to the Android SDK documentation, these are color formats that all hardware platforms are supposed to support: the camera's preview output is YV12, and the encoder's input is COLOR_FormatYUV420Planar, as set in the code above. However, documents are documents after all, otherwise Android would not be Android.

    On some platforms these two color formats are the same, and the camera output can be fed directly to the encoder. On other platforms they differ: the former is YV12 and the latter is effectively I420, so the U and V planes of the former need to be swapped. The following code does that; it is not efficient, but is given for reference.


byte[] i420bytes = null;
private byte[] swapYV12toI420(byte[] yv12bytes, int width, int height) {
    if (i420bytes == null)
        i420bytes = new byte[yv12bytes.length];
    // Y plane is unchanged
    for (int i = 0; i < width * height; i++)
        i420bytes[i] = yv12bytes[i];
    // U plane of I420 comes from the second chroma plane of YV12
    for (int i = width * height; i < width * height + (width / 2 * height / 2); i++)
        i420bytes[i] = yv12bytes[i + (width / 2 * height / 2)];
    // V plane of I420 comes from the first chroma plane of YV12
    for (int i = width * height + (width / 2 * height / 2); i < width * height + 2 * (width / 2 * height / 2); i++)
        i420bytes[i] = yv12bytes[i - (width / 2 * height / 2)];
    return i420bytes;
}

The difficulty here is that I don't know how to tell whether this conversion is needed. It is said that Android 4.3 avoids the problem by no longer requiring you to take images from the camera preview. Here is an example; I haven't read it, but it looks very solid, so there should be no mistake (I think): http://bigflake.com/mediacodec/CameraToMpegTest.java.txt
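An equivalent but faster version of the swap can use System.arraycopy instead of per-byte loops. This is a sketch under the same assumption as the loop version above, namely unpadded planes (no row stride):

```java
public class Yv12ToI420 {
    /** Swap the chroma planes of an unpadded YV12 frame to produce I420. */
    public static byte[] swap(byte[] yv12, int width, int height) {
        byte[] i420 = new byte[yv12.length];
        int ySize = width * height;
        int cSize = ySize / 4; // one chroma plane
        System.arraycopy(yv12, 0, i420, 0, ySize);                 // Y plane unchanged
        System.arraycopy(yv12, ySize + cSize, i420, ySize, cSize); // U (second plane in YV12)
        System.arraycopy(yv12, ySize, i420, ySize + cSize, cSize); // V (first plane in YV12)
        return i420;
    }

    public static void main(String[] args) {
        // Tiny 4x2 frame: 8 Y bytes, then 2 V bytes, then 2 U bytes
        byte[] yv12 = {1, 1, 1, 1, 1, 1, 1, 1, /*V*/ 2, 2, /*U*/ 3, 3};
        byte[] i420 = swap(yv12, 4, 2);
        // I420 plane order is Y, U, V
        System.out.println(java.util.Arrays.toString(i420)); // [1, 1, 1, 1, 1, 1, 1, 1, 3, 3, 2, 2]
    }
}
```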




3. The format of the input and output buffers

    The buffer format is not specified in the SDK. In practice the H.264 output basically follows Annex B. There are, however, more distinctive implementations that do not emit the start code, i.e. the 0x000001 prefix, so the decoder cannot parse the stream by itself. Fortunately, we can add it ourselves:


ByteBuffer outputBuffer = outputBuffers[outputBufferIndex];
byte[] outData = new byte[bufferInfo.size + 3];
outputBuffer.get(outData, 3, bufferInfo.size);
if (frameListener != null) {
    if ((outData[3] == 0 && outData[4] == 0 && outData[5] == 1)
            || (outData[3] == 0 && outData[4] == 0 && outData[5] == 0 && outData[6] == 1)) {
        // The stream already starts with a start code; skip the 3 spare bytes
        frameListener.onFrame(outData, 3, outData.length - 3, bufferInfo.flags);
    } else {
        // No start code; prepend 0x000001 in the 3 spare bytes
        outData[0] = 0;
        outData[1] = 0;
        outData[2] = 1;
        frameListener.onFrame(outData, 0, outData.length, bufferInfo.flags);
    }
}
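The start-code test above can be factored into a small helper, which also makes it easy to check in isolation (a plain-Java sketch):

```java
public class StartCode {
    /** True if data[offset..] begins with an Annex B start code (0x000001 or 0x00000001). */
    public static boolean hasStartCode(byte[] data, int offset) {
        if (data.length - offset >= 3
                && data[offset] == 0 && data[offset + 1] == 0 && data[offset + 2] == 1)
            return true;
        return data.length - offset >= 4
                && data[offset] == 0 && data[offset + 1] == 0
                && data[offset + 2] == 0 && data[offset + 3] == 1;
    }

    public static void main(String[] args) {
        System.out.println(hasStartCode(new byte[]{0, 0, 1, 0x67}, 0));     // true
        System.out.println(hasStartCode(new byte[]{0, 0, 0, 1, 0x67}, 0));  // true
        System.out.println(hasStartCode(new byte[]{0x67, 0x42, 0, 10}, 0)); // false
    }
}
```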

4. Sometimes it hangs in dequeueInputBuffer(-1)

According to the SDK documentation, the parameter of dequeueInputBuffer is a timeout in microseconds: -1 means wait forever, and 0 means do not wait. Common sense says you can pass -1, but in practice it hangs on many machines, so there is no choice but to pass 0; it is better to lose frames than to hang. Of course you can also pass a specific number of microseconds, but it doesn't help much.
