Implementing Camera Preview with Camera2 and GLSurfaceView

Copyright notice: this is an original article by the author; do not repost without permission. https://blog.csdn.net/qq_36391075/article/details/81631461

I previously implemented camera preview with the old Camera API, but many things were hard to control, and the old Camera class has since been deprecated, so this project uses Camera2 instead.

First, a look at the result:

(screenshot: the live camera preview)

Camera2 is the API for talking to Android cameras.

(diagram: the Camera2 pipeline)

Camera2 borrows the idea of a pipeline connecting the Android device and the camera: the system sends capture requests to the camera, and the camera returns CameraMetadata. All of this happens inside a session called a CameraCaptureSession.

Camera2 supports RAW output and lets you control exposure, focus mode, shutter, and more.

(diagram: the core Camera2 classes)

  • CameraManager: the camera manager, used to detect and open system cameras. CameraManager.getCameraCharacteristics(String) returns the characteristics of a given camera.
  • CameraCharacteristics: the characteristics of a camera.
  • CameraDevice: the camera itself, analogous to android.hardware.Camera (the Camera1 class).
  • CameraCaptureSession: controls preview and capture; setRepeatingRequest() starts the preview and capture() takes a photo. CameraCaptureSession provides two callback interfaces, StateCallback and CaptureCallback, to monitor session creation and the capture process.
  • CaptureRequest and CaptureRequest.Builder: both preview and capture need a CaptureRequest. A CaptureRequest represents one capture request and holds its settings, such as focus mode and exposure mode. CaptureRequest.Builder is used to build CaptureRequest objects.

The basic idea of the preview

Here I use OpenGL ES: the frames captured by the camera are drawn with the OpenGL ES library.

On the camera side:

Camera2 previews into a Surface, which can be backed by a SurfaceView or by a SurfaceTexture (wrapped as new Surface(surfaceTexture)).

For the differences between the various surface types, see: https://blog.csdn.net/jinzhuojun/article/details/44062175

Main flow

  1. Obtain the camera manager mCameraManager; mCameraManager.openCamera() opens the camera.
  2. Pick the camera id to open, and create the CameraDevice.StateCallback that openCamera() requires.
  3. In the StateCallback, call createPreview(); this method uses CaptureRequest.Builder to build the CaptureRequest needed for preview and initializes the CameraCaptureSession.
  4. In the CameraCaptureSession callback, finally call setRepeatingRequest(previewRequest, null, null) to start the preview.

The camera code:

public class CameraHelper  {

    private Context mContext;
    private CameraManager mCameraManager;
    private CameraDevice mCameraDevice;
    private String mCameraId = null;
    private boolean mFlashSupport = false;
    private Surface mSurface;
    private CaptureRequest.Builder mPreviewBuilder;

    private android.os.Handler mMainHandler;

    private static final String TAG = "CameraHelper";


    public CameraHelper(Context context, SurfaceTexture surfaceTexture){

        mContext = context;
        Display display = ((WindowManager)context
                .getSystemService(Context.WINDOW_SERVICE)).getDefaultDisplay();
        DisplayMetrics metrics = new DisplayMetrics();
        display.getMetrics(metrics);
        int screenWidth = metrics.widthPixels;
        int screenHeight = metrics.heightPixels;

        //Set the default buffer size of the preview stream (derived from the screen width)
        surfaceTexture.
                setDefaultBufferSize(4*screenWidth/3,3 * screenWidth / 4);
        mSurface = new Surface(surfaceTexture);
        initCamera();
    }

    private void initCamera(){

        //Get the CameraManager
        mCameraManager = (CameraManager) mContext.getSystemService(Context.CAMERA_SERVICE);
        try {
            for(String id : mCameraManager.getCameraIdList()){

                CameraCharacteristics characteristics
                        = mCameraManager.getCameraCharacteristics(id);
                Integer front = characteristics.get(CameraCharacteristics.LENS_FACING);
                mCameraId = id;
                //Check whether a flash is supported
                Boolean available = characteristics.
                        get(CameraCharacteristics.FLASH_INFO_AVAILABLE);
                mFlashSupport = available == null? false :available;

                //Prefer the front camera; stop once one is found
                if(front!=null && front == CameraCharacteristics.LENS_FACING_FRONT){
                    break;
                }
            }

            mMainHandler = new android.os.Handler(Looper.getMainLooper());
            if(mCameraId!=null){
                //Open the camera (the CAMERA runtime permission must already be granted)
                mCameraManager.openCamera(mCameraId,mStateCallback
                        ,mMainHandler);

            }

        }catch (CameraAccessException e){
            e.printStackTrace();
        }

    }

    private CameraDevice.StateCallback mStateCallback = new CameraDevice.StateCallback() {
        @Override
        public void onOpened(@NonNull CameraDevice camera) {

            mCameraDevice = camera;
            //Start the preview
            createPreview();
        }

        @Override
        public void onDisconnected(@NonNull CameraDevice camera) {

            mCameraDevice.close();
            mCameraDevice = null;

        }

        @Override
        public void onError(@NonNull CameraDevice camera, int error) {

            mCameraDevice.close();
            mCameraDevice = null;

        }
    };

    private void createPreview(){

        List<Surface> surfaces = new ArrayList<>();
        surfaces.add(mSurface);
        try {
            //Create a CaptureRequest.Builder targeting the output Surface
           mPreviewBuilder =  mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
            mPreviewBuilder.addTarget(mSurface);

            //Create the capture session used for preview
            mCameraDevice.createCaptureSession(surfaces,mStateCallbackSession,null);

        }catch (CameraAccessException e){
            e.printStackTrace();
        }
    }

    private CameraCaptureSession.StateCallback  mStateCallbackSession = new CameraCaptureSession.StateCallback() {
        @Override
        public void onConfigured(@NonNull CameraCaptureSession session) {

            mPreviewBuilder.set(CaptureRequest.CONTROL_AF_MODE,
                    CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
            try {
                //Issue the repeating preview request
                session.setRepeatingRequest(mPreviewBuilder.build(),
                        null,null);

            }catch (CameraAccessException e){
                e.printStackTrace();
            }

        }

        @Override
        public void onConfigureFailed(@NonNull CameraCaptureSession session) {
            Log.d(TAG, "onConfigureFailed: ");
        }
    };
}
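To make the buffer-size arithmetic in the constructor concrete, here is a plain-JVM sketch (the 1080 px screen width is just an example value, not something the project uses):

```java
public class BufferSizeDemo {
    // The same arithmetic CameraHelper passes to setDefaultBufferSize().
    static int[] bufferSize(int screenWidth) {
        return new int[]{4 * screenWidth / 3, 3 * screenWidth / 4};
    }

    public static void main(String[] args) {
        int[] size = bufferSize(1080);
        // 1440x810
        System.out.println(size[0] + "x" + size[1]);
    }
}
```

Note that (4w/3) : (3w/4) reduces to 16 : 9 for any width w, so the buffer requested above is in fact a 16:9 buffer.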

One question remains: where does this surfaceTexture come from? In OpenGL we bind a texture id, and the SurfaceTexture is constructed from exactly that texture id:

public class RecordView extends GLSurfaceView {

    private RecordRender mRender;
    private  Context mContext;
    private float[] mStMatrix = new float[16];

    private static final String TAG = "RecordView";

    public RecordView(Context context) {
        this(context,null);
    }

    public RecordView(Context context, AttributeSet attrs) {
        super(context, attrs);
        mContext = context;
        setEGLContextClientVersion(2);
        mRender = new RecordRender();
        setRenderer(mRender);
    }


    private class RecordRender implements Renderer,SurfaceTexture.OnFrameAvailableListener{

        private Photo mPhoto;
        private int mTextId;
        private SurfaceTexture mSurfaceTexture;
        private CameraHelper mCamera;
        private float[] mProjMatrix = new float[16];

        @Override
        public void onSurfaceCreated(GL10 gl, EGLConfig config) {

            GLES20.glClearColor(1,1,1,1);
            GLES20.glEnable(GLES20.GL_DEPTH_TEST);
            mPhoto = new Photo();
            initTextureId();//create the texture id
            mSurfaceTexture = new SurfaceTexture(mTextId);//SurfaceTexture built on that texture id, used for preview
            mSurfaceTexture.setOnFrameAvailableListener(this);
            mCamera = new CameraHelper(mContext,mSurfaceTexture);//start the camera preview

        }

        @Override
        public void onSurfaceChanged(GL10 gl, int width, int height) {
            GLES20.glViewport(0,0,width,height);

        }

        @Override
        public void onDrawFrame(GL10 gl) {
            GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT|GLES20.GL_DEPTH_BUFFER_BIT);
            //Latch the newest camera frame into the texture
            mSurfaceTexture.updateTexImage();
            //Get the frame's texture transform matrix
            mSurfaceTexture.getTransformMatrix(mStMatrix);
            //Draw the frame
            mPhoto.draw(mTextId,mStMatrix);
        }

        private void initTextureId(){

            int[] textures = new int[1];
            GLES20.glGenTextures(1,textures,0);
            mTextId = textures[0];
            GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,mTextId);

            GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                    GLES20.GL_TEXTURE_MIN_FILTER,GLES20.GL_NEAREST);//minification filter
            GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                    GLES20.GL_TEXTURE_MAG_FILTER,GLES20.GL_LINEAR);//magnification filter
            GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                    GLES20.GL_TEXTURE_WRAP_S,GLES20.GL_CLAMP_TO_EDGE);//S-axis wrap mode
            GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                    GLES20.GL_TEXTURE_WRAP_T,GLES20.GL_CLAMP_TO_EDGE);//T-axis wrap mode

        }


        @Override
        public void onFrameAvailable(SurfaceTexture surfaceTexture) {
            //Nothing to do with the default RENDERMODE_CONTINUOUSLY; with
            //RENDERMODE_WHEN_DIRTY you would call requestRender() here.
        }

    }


}
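The mStMatrix filled in by getTransformMatrix() is a column-major 4x4 matrix that the fragment shader applies to each texture coordinate. As a plain-JVM sketch, here is the shader's (uTexMatrix * aTexCoord).xy expressed in Java, fed a typical V-flipping matrix (an assumption for illustration; the real matrix comes from the device and varies):

```java
public class TexMatrixDemo {
    // Apply a column-major 4x4 matrix to (u, v, 0, 1), mirroring the shader's
    // `vTextureCoord = (uTexMatrix * aTexCoord).xy`.
    static float[] transform(float[] m, float u, float v) {
        return new float[]{
                m[0] * u + m[4] * v + m[12],
                m[1] * u + m[5] * v + m[13]
        };
    }

    public static void main(String[] args) {
        // A common SurfaceTexture transform: identity with the V axis flipped.
        float[] flipV = {
                1,  0, 0, 0,
                0, -1, 0, 0,
                0,  0, 1, 0,
                0,  1, 0, 1
        };
        float[] out = transform(flipV, 0f, 0f);
        System.out.println(out[0] + ", " + out[1]); // 0.0, 1.0
    }
}
```

With this matrix, (u, v) maps to (u, 1 - v): the buffer's top-left origin is converted to OpenGL's bottom-left texture convention.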

Photo is a class I wrote that defines the quad's drawing coordinates, texture binding, and draw call. Note that the texture target used here must be GLES11Ext.GL_TEXTURE_EXTERNAL_OES.

public class Photo {

    private String mVertexShder =
            "attribute vec4 aPosition;" +
            "attribute vec4 aTexCoord;" +
            "varying vec2 vTextureCoord;" +
            "uniform mat4 uMvpMatrix;" +
            "uniform mat4 uTexMatrix;" +
            "void main(){" +
            "gl_Position = uMvpMatrix * aPosition;" +
            "vTextureCoord = (uTexMatrix * aTexCoord).xy;" +
            "}";

    private String mFragmentShader = "#extension GL_OES_EGL_image_external : require\n" +
            "precision highp float;" +
            "varying vec2 vTextureCoord;" +
            "uniform samplerExternalOES uSampler;" +
            "void main(){" +
            "gl_FragColor = texture2D(uSampler,vTextureCoord);" +
            "}";

    private int maPositionHandle;
    private int maTextureHandle;
    private int muMvpMatrixHandle;
    private int muTexMatrixHandle;

    private FloatBuffer mVertexBuffer;
    private FloatBuffer mTextureBuffer;
    private ByteBuffer mIndexbuffer;

    private int mProgram;

    private int mVCount = 4;
    private int mIndexCount = 6;

    private float[] mMvpMatrix = new float[16];
    private float[] mMMatrix = new float[16];

    public Photo(){
        initVertexData();
        initFragmentData();
    }

    private void initVertexData(){

        float[] vertexs = new float[]{
                -1,1,0,
                -1,-1,0,
                1,1,0,
                1,-1,0
        };

        ByteBuffer vbb = ByteBuffer.allocateDirect(vertexs.length*4);
        vbb.order(ByteOrder.nativeOrder());
        mVertexBuffer = vbb.asFloatBuffer();
        mVertexBuffer.put(vertexs);
        mVertexBuffer.position(0);

        float[] textures = new float[]{
                0,0,
                0,1,
                1,0,
                1,1
        };

        ByteBuffer cbb = ByteBuffer.allocateDirect(textures.length*4);
        cbb.order(ByteOrder.nativeOrder());
        mTextureBuffer = cbb.asFloatBuffer();
        mTextureBuffer.put(textures);
        mTextureBuffer.position(0);

        byte[] indexs = new byte[]{
          0,1,2,
          1,3,2
        };

        mIndexbuffer = ByteBuffer.allocateDirect(indexs.length);
        mIndexbuffer.order(ByteOrder.nativeOrder());
        mIndexbuffer.put(indexs);
        mIndexbuffer.position(0);

    }

    private void initFragmentData(){

        mProgram = ShaderUtil.loadProgram(mVertexShder,mFragmentShader);
        maPositionHandle = GLES20.glGetAttribLocation(mProgram,"aPosition");
        maTextureHandle = GLES20.glGetAttribLocation(mProgram,"aTexCoord");
        muMvpMatrixHandle = GLES20.glGetUniformLocation(mProgram,"uMvpMatrix");
        muTexMatrixHandle = GLES20.glGetUniformLocation(mProgram,"uTexMatrix");

    }

    public void draw(int textId,float[] sTMatrix){
        GLES20.glUseProgram(mProgram);

        GLES20.glVertexAttribPointer(maPositionHandle,3,GLES20.GL_FLOAT,
                false,3*4,mVertexBuffer);
        GLES20.glVertexAttribPointer(maTextureHandle,2,GLES20.GL_FLOAT,
                false,2*4,mTextureBuffer);
        GLES20.glEnableVertexAttribArray(maPositionHandle);
        GLES20.glEnableVertexAttribArray(maTextureHandle);

        //Texture transform matrix from SurfaceTexture (fall back to identity)
        if(sTMatrix!=null){
            GLES20.glUniformMatrix4fv(muTexMatrixHandle,
                    1,false,sTMatrix,0);
        }else {
            Matrix.setIdentityM(mMvpMatrix,0);
            GLES20.glUniformMatrix4fv(muTexMatrixHandle,
                    1,false,mMvpMatrix,0);
        }

        //Flip the quad vertically so the camera image is not upside down
        Matrix.setIdentityM(mMvpMatrix,0);
        Matrix.rotateM(mMvpMatrix,0,180,1,0,0);
        GLES20.glUniformMatrix4fv(muMvpMatrixHandle,
                1,false,mMvpMatrix,0);

        GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,textId);

        GLES20.glDrawElements(GLES20.GL_TRIANGLES,mIndexCount,
                GLES20.GL_UNSIGNED_BYTE,mIndexbuffer);
    }


}
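The index buffer above splits the quad into the triangles (0,1,2) and (1,3,2). A quick signed-area check (plain JVM, using the same vertex and index data as Photo) confirms both triangles share the same counter-clockwise winding, so the quad would survive back-face culling if it were ever enabled:

```java
public class WindingCheck {
    // Signed area of a 2D triangle: positive means counter-clockwise winding.
    static float signedArea(float[] a, float[] b, float[] c) {
        return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0]);
    }

    public static void main(String[] args) {
        float[][] v = {{-1, 1}, {-1, -1}, {1, 1}, {1, -1}}; // Photo's quad vertices (x, y)
        int[][] tris = {{0, 1, 2}, {1, 3, 2}};              // Photo's index buffer
        for (int[] t : tris) {
            System.out.println(signedArea(v[t[0]], v[t[1]], v[t[2]]) > 0); // true, true
        }
    }
}
```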

For more on OpenGL texture binding, see: https://blog.csdn.net/qq_36391075/article/details/81564454

Project source: https://github.com/vivianluomin/FunCamera

Supplement:

Of the parameters in CameraCharacteristics, the main ones used here are:

  • LENS_FACING: whether the camera is front-facing (LENS_FACING_FRONT) or back-facing (LENS_FACING_BACK).
  • SENSOR_ORIENTATION: the orientation of the camera sensor.
  • FLASH_INFO_AVAILABLE: whether a flash unit is available.
  • INFO_SUPPORTED_HARDWARE_LEVEL: the level of camera features the device supports.
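SENSOR_ORIENTATION is typically combined with the current device rotation to decide how much to rotate captured frames. Here is a plain-JVM sketch of the standard formula from the Camera2 documentation (the example values are assumptions for illustration, not taken from this project):

```java
public class OrientationDemo {
    // Clockwise rotation to apply so that frames appear upright, following the
    // formula given alongside SENSOR_ORIENTATION in the Camera2 documentation.
    // sensorOrientation and deviceRotation are multiples of 90 degrees.
    static int rotationDegrees(int sensorOrientation, int deviceRotation,
                               boolean facingFront) {
        int sign = facingFront ? -1 : 1; // the front camera image is mirrored
        return (sensorOrientation + sign * deviceRotation + 360) % 360;
    }

    public static void main(String[] args) {
        // A typical back camera mounted at 90 degrees, device held upright:
        System.out.println(rotationDegrees(90, 0, false)); // 90
    }
}
```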

The templateType parameter of createCaptureRequest() indicates the request type. There are six types:

  • TEMPLATE_PREVIEW: a request suitable for preview.
  • TEMPLATE_STILL_CAPTURE: a request suitable for still image capture; image quality is prioritized over frame rate.
  • TEMPLATE_RECORD: a request for video recording.
  • TEMPLATE_VIDEO_SNAPSHOT: a request for taking a still snapshot while recording video.
  • TEMPLATE_ZERO_SHUTTER_LAG: a request suitable for zero-shutter-lag capture; image quality is maximized without impacting the preview frame rate.
  • TEMPLATE_MANUAL: a basic request with all automatic controls (auto-exposure, auto-white-balance, auto-focus) disabled.

Reference: https://juejin.im/post/5a33a5106fb9a04525782db5

A recommended open-source video recording project: https://github.com/guoxiaoxing/phoenix
