[Repost] This may be the most detailed article introducing Android UVCCamera

Connecting an external USB camera to an Android device for basic preview, photo capture, and video recording: I believe some of you have run into a similar requirement at work.

UVC camera? Whether or not you have used or even come across one before, I believe this article will give you something useful.

This article covers the following points:

     1. What is UVC?
     2. The UVCCamera open source project
     3. Compiling and integrating the open source project
     4. A small change to the demo: getting a real-time YUV stream while recording
     5. Problems encountered and their solutions

1. What is UVC?

UVC stands for USB Video Class, a protocol standard defined specifically for USB video capture devices.

The standard was put forward jointly by Microsoft and several other device manufacturers and has become one of the USB-IF standards.

Mainstream operating systems already ship with UVC drivers, so hardware that conforms to the UVC specification works on the host without installing any extra driver. And yes, current Android versions already support UVC devices.

Summary:

By now the key idea should be clear: UVC is a protocol, and different devices may support different protocols. If a USB camera is to work on an Android device, it must be a camera that supports the UVC protocol.
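
Before touching any third-party code, you can already ask the Android framework whether a plugged-in camera reports a video-class interface. Below is a small sketch using only the plain SDK (UsbManager and UsbConstants.USB_CLASS_VIDEO; the UvcProbe class name is just for illustration):

import android.content.Context;
import android.hardware.usb.UsbConstants;
import android.hardware.usb.UsbDevice;
import android.hardware.usb.UsbManager;
import android.util.Log;

public class UvcProbe {
    private static final String TAG = "UvcProbe";

    // Logs every attached USB device that exposes a video-class (UVC) interface.
    public static void logVideoClassDevices(Context context) {
        UsbManager usbManager = (UsbManager) context.getSystemService(Context.USB_SERVICE);
        for (UsbDevice device : usbManager.getDeviceList().values()) {
            for (int i = 0; i < device.getInterfaceCount(); i++) {
                // USB_CLASS_VIDEO (0x0E) is the interface class code defined by the UVC spec
                if (device.getInterface(i).getInterfaceClass() == UsbConstants.USB_CLASS_VIDEO) {
                    Log.d(TAG, "Video-class device found: " + device.getDeviceName());
                    break;
                }
            }
        }
    }
}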

2. The UVCCamera open source project

https://github.com/saki4510t/UVCCamera

If you search the web for UVC camera articles today, it is no exaggeration to say that almost every project you can find is based on the open source project above. It is well packaged, the code logic is fairly clear, it is easy to use, and it already implements the camera's basic preview, photo, and recording functions. It is a fairly complete project.

We first clone the code locally with git and import it into Android Studio (downloading the source directly without git also works.

If GitHub is sometimes hard to reach from inside China without a proxy, you can also search for a mirror of this project on Gitee and download it from there).

[Screenshot]

The directory structure of the whole project is shown in the figure below. During the import you will run into some build errors; they are mainly caused by the Gradle version.

The import errors are explained in detail later in this article, together with how they were solved.

[Screenshot: project directory structure]

In this open source project, besides the source code of the sdk library itself, the author also provides a set of demos (usbCameraTest0 through usbCameraTest8). Their functions are as follows:

1) USBCameraTest0
            Shows how to start/stop the preview using a SurfaceView.

2) USBCameraTest
            Shows how to start/stop the preview. Almost the same as USBCameraTest0,
            but uses a custom TextureView instead of a SurfaceView to display the camera image.

3) USBCameraTest2
            Shows how to record video from a UVC camera (without audio) to an .MP4 file
            using the MediaCodec encoder. Requires API >= 18, because MediaMuxer is only
            available on API >= 18.

4) USBCameraTest3
            Shows how to record video (from the UVC camera) together with audio (from the
            internal microphone) to an .MP4 file. It also shows several ways of capturing
            still images. This sample is probably the best base project for your own
            custom application.

5) USBCameraTest4
            Shows how to access the UVC camera and save the video images from a background
            service. This is one of the more complex samples because it requires IPC via AIDL.

6) USBCameraTest5
            Almost the same as USBCameraTest3, but saves the video images through the
            IFrameCallback interface instead of the input Surface of the MediaCodec encoder.
            In most cases you should not use IFrameCallback to save images, because it is
            much slower than using a Surface. However, IFrameCallback is very useful if you
            want to get the frame data and process it yourself, or pass it as a byte buffer
            to another external library.

7) USBCameraTest6
            Shows how to split the video image onto multiple Surfaces; in this app you can
            see the video displayed side by side. It also shows how to render images with
            EGL. This sample may help if you want to display the video after adding visual
            or filter effects.

8) USBCameraTest7
            Shows how to use two cameras and display the video from each of them.
            This is still experimental and may have some issues.

9) usbCameraTest8
            Shows how to set/get UVC controls. Currently only brightness and contrast
            are supported.

The code in the demos is quite clear; pick whichever one matches your needs.

The demos cover the basic functions: preview, recording, and taking pictures. As for adjusting brightness and contrast, it may depend on the camera; when I verified it locally it had no effect. If you try it and it works for you, please leave a comment so we can compare notes.
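
For reference, the library exposes these controls through methods on UVCCamera such as getBrightness/setBrightness and getContrast/setContrast; check your copy of the library for the exact names. A minimal sketch of adjusting them on an already-opened camera (the ImageControlHelper wrapper is just for illustration), keeping in mind that on some cameras they may have no visible effect:

import android.util.Log;

import com.serenegiant.usb.UVCCamera;

public class ImageControlHelper {
    private static final String TAG = "ImageControlHelper";

    // Nudges brightness and contrast on an already-opened UVCCamera.
    public static void tweak(UVCCamera camera) {
        int brightness = camera.getBrightness();   // current values as reported through the library
        int contrast = camera.getContrast();
        Log.d(TAG, "brightness=" + brightness + " contrast=" + contrast);
        camera.setBrightness(brightness + 10);     // may silently do nothing on some cameras
        camera.setContrast(contrast + 10);
    }
}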

Note that USBCameraTest7 is the demo that supports two cameras; if you need multi-camera support, you can refer to its logic.

3. Compiling and integrating the open source project UVCCamera

The core code of UVCCamera is in libuvccamera.

[Screenshot: the libuvccamera module]

To integrate this project into our own project we need two things: the .so libraries, and the Java sdk code we can call.

From the screenshot above we can see that the code is split into two parts: jni and java. Compiling the jni part produces the .so libraries we need. The Java code can either be packaged into an aar, or the whole module can be copied into our project directory and referenced as a library.

1) Compiling the .so libraries

Compiling the .so libraries is now very convenient. As shown in the figure below, switch to the jni directory in Android Studio's Terminal and simply run ndk-build to generate the .so files we need.

[Screenshot: running ndk-build in the Android Studio Terminal]

One thing to pay attention to: whether we need 32-bit or 64-bit Android libraries is configured in Application.mk (its location is circled in the screenshot above). For 32-bit, set APP_ABI to armeabi-v7a; for 64-bit, set it to arm64-v8a, and so on for other platforms (for example, APP_ABI := armeabi-v7a arm64-v8a builds both).

2) Packaging the aar

To integrate this open source project we also need Java code that our project can call. The approach I use here is to package the core code of UVCCamera (i.e. without the demos) into aar files and then reference the packaged aar from my own project. Packaging an aar in Android Studio is also very simple; screenshots first.

[Screenshot: Gradle panel and the assembleRelease task]

[Screenshot: generated aar under the module's build output]

From the screenshots above we can see that there are two modules to package: libuvccamera and usbCameraCommon.

Follow the sequence marked 1 to 3 in the screenshot: click Gradle on the right side of the Android Studio window, then in the panel that opens double-click assembleRelease. If it finishes without errors, you will find the generated aar file under the module's build output path.

Finally, copy the generated .so files and aar files into the libs directory of our own project and reference them from there.

3) Integrating the UVCCamera sdk into our own project

Through the steps above we have compiled the .so files and the aar files. The figure below shows the generated files imported into our own project.

[Screenshot: the generated .so and aar files imported into our own project]
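
Once the .so files and aar files are referenced, calling the sdk follows the same pattern as the demos: register a USBMonitor, request permission when a device is attached, open a UVCCamera with the UsbControlBlock delivered to onConnect, and point the preview at a Surface. The sketch below shows only that skeleton; error handling, threading and lifecycle details are left out, the SimplePreviewActivity name is just for illustration, and the class and method names come from libuvccamera, so double-check them against the version you compiled:

import android.app.Activity;
import android.hardware.usb.UsbDevice;
import android.os.Bundle;
import android.view.Surface;
import android.view.SurfaceView;

import com.serenegiant.usb.USBMonitor;
import com.serenegiant.usb.USBMonitor.UsbControlBlock;
import com.serenegiant.usb.UVCCamera;

public class SimplePreviewActivity extends Activity {

    private USBMonitor usbMonitor;
    private UVCCamera uvcCamera;
    private SurfaceView surfaceView;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        surfaceView = new SurfaceView(this);
        setContentView(surfaceView);
        usbMonitor = new USBMonitor(this, deviceListener);
    }

    @Override
    protected void onStart() {
        super.onStart();
        usbMonitor.register();            // start listening for USB attach/detach events
    }

    @Override
    protected void onStop() {
        usbMonitor.unregister();
        super.onStop();
    }

    private final USBMonitor.OnDeviceConnectListener deviceListener =
            new USBMonitor.OnDeviceConnectListener() {
        @Override
        public void onAttach(UsbDevice device) {
            usbMonitor.requestPermission(device);   // triggers onConnect once access is granted
        }

        @Override
        public void onConnect(UsbDevice device, UsbControlBlock ctrlBlock, boolean createNew) {
            UVCCamera camera = new UVCCamera();
            camera.open(ctrlBlock);
            camera.setPreviewSize(UVCCamera.DEFAULT_PREVIEW_WIDTH, UVCCamera.DEFAULT_PREVIEW_HEIGHT);
            Surface surface = surfaceView.getHolder().getSurface();
            camera.setPreviewDisplay(surface);
            camera.startPreview();
            uvcCamera = camera;
        }

        @Override
        public void onDisconnect(UsbDevice device, UsbControlBlock ctrlBlock) {
            if (uvcCamera != null) {
                uvcCamera.close();        // release the camera when the device goes away
                uvcCamera = null;
            }
        }

        @Override
        public void onDettach(UsbDevice device) { /* no-op */ }

        @Override
        public void onCancel(UsbDevice device) { /* no-op */ }
    };
}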

4. A small change to the demo: getting a real-time YUV stream while recording

I have written an article about UVCCamera before; the link is below, along with the demo's address. Feel free to take a look.

In that demo, besides the basic preview, photo, and recording functions, I also added an interface that returns the real-time YUV stream, based on my own needs. If you need to process the video stream in real time, for example for face recognition or for uploading to a backend, I believe it will help you.

"An article takes you to understand Android Usb camera"
https://www.jianshu.com/p/35124f098c24

demo address:
https://github.com/yorkZJC/UvcCameraDemo

In my own demo, if you need to change the resolution, modify it in the MyConstants.java file, as shown in the figure below.

[Screenshot: resolution settings in MyConstants.java]

The screenshot below shows the interface of the YUV stream callback.

[Screenshot: the YUV stream callback interface]
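
The library's own way of handing frames to application code is the IFrameCallback interface: register it on the UVCCamera together with a pixel format, and onFrame() delivers each preview frame as a ByteBuffer. A minimal sketch of that mechanism (illustrative only, not the exact code in my demo; the YuvStreamHelper name is just for this example):

import java.nio.ByteBuffer;

import com.serenegiant.usb.IFrameCallback;
import com.serenegiant.usb.UVCCamera;

public class YuvStreamHelper {

    // Registers a callback that receives every preview frame as NV21 data.
    public static void attachYuvCallback(UVCCamera camera) {
        camera.setFrameCallback(new IFrameCallback() {
            @Override
            public void onFrame(ByteBuffer frame) {
                // Copy the buffer out before handing it to face detection,
                // upload, etc. -- the library reuses the buffer for the next frame.
                byte[] nv21 = new byte[frame.remaining()];
                frame.get(nv21);
                // TODO: push nv21 to your consumer (face recognition, backend upload, ...)
            }
        }, UVCCamera.PIXEL_FORMAT_NV21);
    }
}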

5. Problems encountered and their solutions

1) SDK and NDK configuration

First we need to configure the SDK and NDK. Most of you will have no trouble configuring the SDK. We also need ndk-build to compile the .so libraries, so the NDK must be set up properly. My local NDK version is r17; I believe the exact NDK version has little impact.

There are two ways to configure the NDK: set it directly in the local.properties file (the ndk.dir entry), or select the local NDK path in the Project Structure dialog.

The screenshots below show the two approaches; either one works.

[Screenshot: NDK path set in local.properties]

[Screenshot: NDK path set in Project Structure]

2) Importing into Android Studio and the Gradle configuration

The following are the problems I ran into. If you apply the same changes, I believe you will be able to get it running.
【error 1】

Caused by: org.apache.http.conn.HttpHostConnectException: Connect to maven.google.com:443 
[maven.google.com/142.250.204.46] failed: Connection timed out: connect

[Screenshot]

【error 2】

ERROR: The minSdk version should not be declared in the android manifest file. You can move the version from the manifest to the defaultConfig in the build.gradle file.
Remove minSdkVersion and sync project
Affected Modules: libuvccamera

[Screenshot]

【error 3】

* What went wrong:
Execution failed for task ':libuvccamera:ndkBuild'.
> A problem occurred starting process 'command 'null/ndk-build.cmd''

[Screenshot]

Two more errors came up while building and running the demos:

【error 4】

Android NDK: The armeabi ABI is no longer supported. Use armeabi-v7a.
Android NDK: NDK Application 'local' targets unknown ABI(s): armeabi mips
D:/APPS/sdk/android-ndk-r17b/build//../build/core/setup-app.mk:79: *** Android NDK: Aborting    .  Stop.

【error 5】

2021-06-11 10:08:11.386 3105-3105/? E/AndroidRuntime: FATAL EXCEPTION: main
    Process: com.serenegiant.usbcameratest0, PID: 3105
    java.lang.RuntimeException: Unable to start activity ComponentInfo{com.serenegiant.usbcameratest0/com.serenegiant.usbcameratest0.MainActivity}: java.lang.IllegalStateException: You need to use a Theme.AppCompat theme (or descendant) with this activity.
        at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:3432)
Since I cloned the code with git, every local modification can be tracked and recorded through git. To fix the compilation errors, let's look at everything I changed.

[Screenshot: list of locally modified files]

In the screenshot above we can see that a total of 5 places were modified, including:
   i. build.gradle in the root directory of the project;
  ii. libuvccamera/build.gradle;

3) The application crashes and exits when the USB camera is unplugged

There is a bug in the original library: if we unplug the camera while the USB camera is in use, the .so library crashes and the application exits abnormally.

Solutions for this problem have already been shared by others online; I post the modified places here, and I have applied and verified the changes myself.

[Screenshot: files modified for the unplug-crash fix]

diff --git a/libuvccamera/src/main/jni/libusb/libusb/os/android_usbfs.c b/libuvccamera/src/main/jni/libusb/libusb/os/android_usbfs.c
index 8626595..c4842c4 100644
--- a/libuvccamera/src/main/jni/libusb/libusb/os/android_usbfs.c
+++ b/libuvccamera/src/main/jni/libusb/libusb/os/android_usbfs.c
@@ -2726,6 +2726,12 @@ static int handle_iso_completion(struct libusb_device_handle *handle,    // XXX add
 
        usbi_mutex_lock(&itransfer->lock);
        for (i = 0; i < num_urbs; i++) {
+           //+Add by york.zhou on 2021.05.19,fix issue app crash on remove usb device
+           if (tpriv->iso_urbs == NULL){
+            break;
+        }
+        //-Add by york.zhou on 2021.05.19,fix issue app crash on remove usb device
+
                if (urb == tpriv->iso_urbs[i]) {
                        urb_idx = i + 1;
                        break;
diff --git a/libuvccamera/src/main/jni/libuvc/src/stream.c b/libuvccamera/src/main/jni/libuvc/src/stream.c
index 8a1e90a..b7cedcc 100644
--- a/libuvccamera/src/main/jni/libuvc/src/stream.c
+++ b/libuvccamera/src/main/jni/libuvc/src/stream.c
@@ -641,7 +641,8 @@ static void _uvc_delete_transfer(struct libusb_transfer *transfer) {
                                libusb_cancel_transfer(strmh->transfers[i]);    // XXX 20141112追加
                                UVC_DEBUG("Freeing transfer %d (%p)", i, transfer);
                                free(transfer->buffer);
-                               libusb_free_transfer(transfer);
+                               //+Add york.zhou 2021.05-19,fix remove usb devices,app crash
+                               //libusb_free_transfer(transfer);
                                strmh->transfers[i] = NULL;
                                break;
                        }
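
Besides the native patch, it also helps to release the camera on the Java side as soon as USBMonitor reports the disconnect, as the demos do, so that nothing keeps calling into the native layer after the device is gone. A small sketch of such a helper, meant to be called from the listener's onDisconnect (and onDettach) callbacks; the CameraReleaser name is just for illustration:

import com.serenegiant.usb.UVCCamera;

public class CameraReleaser {

    // Releases a UVCCamera defensively; safe to call after the device has been unplugged.
    public static void releaseQuietly(UVCCamera camera) {
        if (camera == null) return;
        try {
            camera.stopPreview();
            camera.destroy();      // frees the native handle
        } catch (Exception e) {
            // the device is already gone; swallow the error and keep the app alive
        }
    }
}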
 
4) Some USB cameras are not recognized

Some of you may also run into USB cameras that are not recognized. The premise here is that the camera is recognized and works normally when plugged into a computer, but is not recognized when plugged into our Android device.

When this happens, capture a complete logcat log, search the log for "subclass", and add the class/subclass information you find to device_filter.xml under the xml directory, in the format shown in the screenshot below (the file consists of <usb-device .../> entries with attributes such as vendor-id, product-id, class and subclass).

[Screenshot: device_filter.xml entry format]

That wraps up the content on UVCCamera.

Thanks for reading, and you are welcome to get in touch to discuss.


Source: blog.csdn.net/qq_27489007/article/details/131326292