Qualcomm Camera HAL3: Key points of CamX and Chi-CDK

Table of contents

1. Overview

2. Directory

3. Relationship between CamX components


1. Overview

The Qualcomm CamX architecture is Qualcomm's implementation of the camera HAL3 interface, and it is widely adopted by OEM manufacturers.

2. Directory

The code is located under vendor/qcom/proprietary:

  • camx: the implementations of the common functional interfaces
  • chi-cdk: the customizable (OEM-extensible) code

CamX subdirectories:

  • core/: the core CamX modules, including the hal/ directory, which implements the HAL3 interface, and the chi/ directory, which is responsible for interacting with CHI
  • hwl/: hardware nodes with independent computing capability; these nodes are managed by CSL
  • swl/: software nodes without independent computing capability, which must run on the CPU
  • csl/: the communication module between CamX and the camera driver, providing CamX with a unified camera-driver control interface

Chi-CDK subdirectories:

  • chioverride/: the core module of the CHI implementation, responsible for interacting with CamX and implementing the overall CHI framework and the specific business logic
  • bin/: platform-related configuration items
  • topology/: user-defined Usecase XML (topology) configuration files
  • node/: user-defined function Nodes
  • module/: configuration files for different sensor modules, needed when initializing a sensor
  • tuning/: configuration files for image-quality (tuning) parameters in different scenarios
  • sensor/: private information and register settings for different sensors
  • actuator/: configuration for different focus (actuator) modules
  • ois/: configuration for the optical image stabilization (OIS) module
  • flash/: configuration for the flash module
  • eeprom/: configuration for the external EEPROM storage module
  • fd/: configuration for the face-detection module

3. Relationship between CamX components

 

Usecase: A Usecase represents a specific image-capture scenario, such as portrait capture or rear-camera capture, and is created during initialization based on information passed down from the upper layer. During this process, a specific Usecase is instantiated; this instance manages all resources of the scenario and is responsible for its business logic. The instance also obtains, from the matching topology configuration, the pipelines used to implement the scenario's functions.
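The pipelines a Usecase pulls in are declared in topology XML files under chi-cdk. The fragment below is an illustrative sketch only: the tag names follow the common Chi-CDK usecase schema, but exact element names and values vary by platform and release.

```xml
<!-- Illustrative usecase definition (hypothetical values; real files live
     under chi-cdk's topology/ directory and follow that release's schema) -->
<Usecase>
  <UsecaseName>UsecasePreview</UsecaseName>
  <Targets>
    <!-- The output buffer this usecase must be able to produce -->
    <Target>
      <TargetName>TARGET_BUFFER_PREVIEW</TargetName>
      <TargetDirection>TargetOutput</TargetDirection>
      <Formats>YUV420NV12</Formats>
    </Target>
  </Targets>
  <!-- A usecase selects one or more pipelines that implement it -->
  <Pipeline>
    <PipelineName>RealtimePreview</PipelineName>
    <!-- NodesList and PortLinkages (omitted) describe the nodes and links -->
  </Pipeline>
</Usecase>
```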

Feature: Within a Usecase, a Feature is optional. If the user selects HDR mode, or a special function such as capturing while zoomed is needed, one or more Features are created as required during Usecase creation. Generally, a Feature corresponds to one specific function; if the scenario requires no special function, no Feature is created at all.

Session: Each Usecase or Feature can contain one or more Sessions. A Session directly manages the data flow of its internal Pipelines: every Request issued by the Usecase or Feature is delivered through the Session to an internal Pipeline for processing. When processing completes, the result is returned through the Session to CHI, which then decides whether to send it directly to the upper layer or to repackage it and submit it to another Session for post-processing.

Pipeline: Session and Pipeline have a one-to-many relationship. Usually a Session contains only one Pipeline, which implements a specific image-processing function, but not always: the Session used by the MFNR Feature contains three Pipelines, and rear portrait preview uses one Session containing two Pipelines for the main- and secondary-camera previews. The number of Pipelines depends on how many the current function requires and whether they are correlated with each other.

Node: As the definition of Pipeline above implies, a Pipeline contains a certain number of Nodes; the more complex the function it implements, the more Nodes it contains and the more complicated the connections between them become. For example, the bokeh (blur) effect of rear portrait preview is implemented by the RTBOfflinePreview Pipeline, which merges the images from the main and secondary cameras into a single frame with the blur applied.

Finally, the way Nodes are connected inside a Pipeline is described by Link elements in the topology XML file. Each Link defines a source end and a destination end, corresponding to output and input ports on different Nodes; in this way, the output port of one Node is chained to the input port of another, one after the other. When image data enters at the head of the Pipeline, it flows between Nodes along this defined path, and each Node processes the data internally. By the time the data reaches the output of the last Node, it has been processed many times, and the accumulated effect of these processing stages is the function the Pipeline is meant to achieve, such as noise reduction or blurring.
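As a concrete illustration of such a Link, here is a hedged sketch of chaining an IFE output port to an IPE input port in the topology XML; the port names, buffer properties, and structure follow the common Chi-CDK link schema, but are illustrative rather than copied from a real topology file.

```xml
<!-- Illustrative Link: connects one Node's output port to another Node's
     input port. Names and values are examples, not authoritative. -->
<PortLinkages>
  <Link>
    <SrcPort>
      <PortName>IFEOutputPortFull</PortName>
      <PortId>0</PortId>
      <NodeName>IFE</NodeName>
      <NodeInstanceId>0</NodeInstanceId>
    </SrcPort>
    <DstPort>
      <PortName>IPEInputPortFull</PortName>
      <PortId>0</PortId>
      <NodeName>IPE</NodeName>
      <NodeInstanceId>0</NodeInstanceId>
    </DstPort>
    <!-- Properties of the buffer that flows across this link -->
    <BufferProperties>
      <BufferFormat>YUV420NV12</BufferFormat>
      <BufferQueueDepth>8</BufferQueueDepth>
    </BufferProperties>
  </Link>
</PortLinkages>
```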

Reference:

In-depth understanding of Android camera architecture (CSDN blog)

 

Origin blog.csdn.net/weixin_36389889/article/details/134581310