Qualcomm Camera HAL3: CAMX, CHI-CDK detailed explanation

There are few introductory documents about Qualcomm Camera HAL3 online. This article is a summary and organization of notes I made on Qualcomm Camera HAL3 earlier; it is a little rough, but should serve as a starting point.

1. Preliminary understanding

Qualcomm CameraHAL3 has a large architecture and a very large amount of code.

First, let's get a preliminary understanding of the key terms and directories of CAMX and CHI-CDK.

1.1 Several key concepts in the CAMX CHI-CDK system:

(1). Usecase: as the name suggests, a "use case" is a functional requirement.
For example, the ZSL function is a usecase, HDR is a usecase, and multi-camera is a usecase. A usecase may contain multiple pipelines, multiple features, multiple ISP modules, and multiple nodes.
The entire chi-cdk is a system built around implementing usecases.

(2). Pipeline: a data-flow pipeline. For example, a pipeline is created for a preview stream or a video stream; data flows through the system in the form of pipelines.

(3). Node: a function node. The camera algorithms we analyze later are nodes nested in the pipeline flow; a node is what actually implements an algorithm. For example, the dummyrtb node implements fusion of dual-camera data, the remosaic node implements pixel-data rearrangement, the staticaecalgo node implements access to a third-party AEC algorithm, and so on.

(4). Target: a target object, used to define some of the parameter information that is used (for example, the formats and sizes associated with a stream output).

(5). Topology: a usecase, its pipelines and its nodes form a topological link structure. These topologies are described in XML: the XML describes the usecases, pipelines and nodes of the camera functionality of the whole project, the link relationships between them, and their configuration parameters.

The SM7450 project uses chi-cdk/oem/qcom/topology/titan/fillmore/fillmore_usecase.xml by default

1.2 Terminology

  • ABF: Auto Bayer Filter, a Bayer-domain noise reduction algorithm
  • ACE: Advanced Chroma Enhancement
  • ADRC: Automatic Dynamic Range Compression
  • AFD: Auto Flicker Detection
  • ASD: Auto Scene Detection
  • ASF: Adaptive Spatial Filter
  • BDS: Bayer Downscaler
  • BPC: Bad Pixel Correction
  • BPS: Bayer Processing Segment (used for snapshot)
  • CDS: Chroma DownSampler
  • CDK: Camera Development Kit
  • CHI: Camera Hardware Interface
  • CS: Chroma Suppression
  • CSID: Camera Serial Interface Decoder module
  • CV: Chroma Enhancement
  • DPU: Display Processing Unit
  • GTM: Global Tone Mapping
  • IFE: Image Front End; data output by the sensor arrives at the IFE first
  • IPE: Image Processing Engine
  • KMD: Kernel Mode Driver
  • LPM: Low Power Manager (run at low power consumption)
  • LTM: Local Tone Mapping
  • MCTF: Motion Compensated Temporal Filtering, multi-frame noise reduction during video recording
  • MCE: Memory Color Enhancement
  • MFNR: Multi Frame Noise Reduction, multi-frame noise reduction for still capture
  • OPE: Offline Processing Engine
  • PDAF: Phase Detection Auto Focus
  • QCFA: Quad (Bayer Coding) Color Filter Array
  • RDI: Raw Dump Interface
  • RTB: Real Time Bokeh
  • SCE: Skin Color Enhancement
  • TNR: Temporal Noise Reduction
  • TFE: Thin Front End
  • UMD: User Mode Driver
  • VPU: Video Processing Unit (codec)
  • WNR: Wavelet Noise Reduction, a YUV-domain noise reduction algorithm

1.3 Main directory

1.3.1 There are the following main directories in CAMX:

  • core/: stores the core implementation modules of camx, including the hal/ directory that mainly implements the HAL3 interface, and the chi/ directory responsible for interacting with CHI
  • csl/: stores the communication module between camx and the camera driver, providing camx with a unified camera-driver control interface
  • hwl/: stores hardware nodes with independent computing capabilities; this part of the nodes is managed by csl
  • swl/: stores nodes that do not have independent computing capabilities and must rely on the CPU for their implementation

1.3.2 There are the following main directories in Chi-Cdk:

  • chioverride/: stores the core module of the CHI implementation, responsible for interacting with camx and implementing CHI's overall framework and specific business logic
  • bin/: stores platform-related configuration items
  • topology/: stores the user-defined Usecase XML configuration files
  • node/: stores user-defined function nodes
  • module/: stores the configuration files of different sensors, needed when initializing a sensor
  • tuning/: stores the effect (tuning) parameter configuration files for different scenarios
  • sensor/: stores the private information and register configuration parameters of different sensors
  • actuator/: stores the configuration information of different focus modules
  • ois/: stores the configuration information of the image stabilization module
  • flash/: stores the configuration information of the flash module
  • eeprom/: stores the configuration information of the EEPROM external storage module
  • fd/: stores the configuration information of the face detection module

2. Overall architecture of CAMX

2.1 Overall architecture diagram of CAMX:

2.2 CAMX CHI-CDK communication mechanism

CAMX and CHI-CDK obtain each other's entry points by dlopen'ing each other's .so libraries:

2.3 CameraHAL3 data flow direction

CameraHAL3 data flow diagram:

When camera data comes out of the sensor, it first passes through the IFE and is then split into two cases: preview/video and still capture.

For preview or recording, the data is processed by the IPE and finally output to the display.

For still capture, the data is first processed by the BPS, then passed through the JPEG encoder, and finally saved as a picture.

IFE, IPE, BPS and JPEG each stand for hardware processing units inside the chip.

The processing of data within these units is still fairly complicated; different processing units perform some complex algorithmic processing. For now it is enough to have a basic concept of them.
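The routing just described can be sketched as a tiny function. This is a toy illustration: only the hardware block names come from the text above; the function itself is invented.

```cpp
#include <cassert>
#include <string>
#include <vector>

// Illustrative routing of the data flow described above. The hardware
// block names (IFE, IPE, BPS, JPEG) match the text; the function and
// its return shape are invented for the sketch.
std::vector<std::string> HwPath(bool snapshot)
{
    std::vector<std::string> path = {"Sensor", "IFE"};
    if (snapshot)
    {
        path.push_back("BPS");   // Bayer Processing Segment
        path.push_back("JPEG");  // encoded and saved as a picture
    }
    else
    {
        path.push_back("IPE");   // preview/video path, output to display
    }
    return path;
}
```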

3. Basic components of CAMX CHI-CDK

3.1 UseCase

UseCase, literal meaning: use case

Official note:

A set of streams configured by the client combined with a set of static properties specifying the processing of those streams

See createCaptureSession in the Android CameraDevice documentation.

The following code makes this easier to understand:

// UseCase: preview + recording
List<Surface> surfaces = new ArrayList<>();

if(previewSurface != null && previewSurface.isValid()){
  surfaces.add(previewSurface);
  mPreviewBuilder.addTarget(previewSurface);
}

if(mMediaRecorder != null && mMediaRecorderSurface != null 
      && mMediaRecorderSurface.isValid()){
  surfaces.add(mMediaRecorderSurface);
  mPreviewBuilder.addTarget(mMediaRecorderSurface);
}

mCameraDevice.createCaptureSession(surfaces,...,...);

This code sets both the preview surface and the recording surface, and then creates a session.

This means that I need camera data for both preview and recording.

Suppose the preview size is set to 1080 x 720 and the video is 1080p; then "1080 x 720 preview + 1080p video"

is one usecase.

Other cases are analogous.

UsecaseId is defined in chi-cdk\core\chiutils\chxdefs.h:

/// @brief Usecase identifying enums
enum class UsecaseId
{
    NoMatch             = 0,
    Default             = 1,
    Preview             = 2,
    PreviewZSL          = 3,
    MFNR                = 4,
    MFSR                = 5,
    MultiCamera         = 6,
    QuadCFA             = 7,
    RawJPEG             = 8,
    MultiCameraVR       = 9,
    Torch               = 10,
    YUVInBlobOut        = 11,
    VideoLiveShot       = 12,
    SuperSlowMotionFRC  = 13,
    Feature2            = 14,
    Depth               = 15,
    AON                 = 16,
    MaxUsecases         = 17,
};

chi-cdk/oem/qcom/topology/titan/fillmore/fillmore_usecase.xml

This xml file describes 82 usecases, but our cameras may not necessarily run all of these usecases.

These XMLs only describe configurations; whether a described usecase actually runs depends on whether the code implements and enables it.

For example, we often come into contact with UsecaseTorch, UsecasePreview, UsecaseVideo, UsecaseSnapshot, UsecaseZSL, UsecaseQuadCFA, UsecaseRTB, UsecaseSAT, etc.

Select UsecaseId

Different UsecaseIds correspond to different "use cases".

This stage is achieved by calling the UsecaseSelector::GetMatchingUsecase() method.

In this function, the corresponding UsecaseId is selected based on the incoming operation_mode, the number of configured data streams num_streams, and the number of sensors currently in use.

For example, when the value of numPhysicalCameras is greater than 1, and the number of data streams configured at the same time, num_streams, is greater than 1, UsecaseId::MultiCamera is selected, indicating that a dual-camera scene is currently used.

chi-cdk\core\chiusecase\Chxusecaseutils.cpp

UsecaseId UsecaseSelector::GetMatchingUsecase(
    const LogicalCameraInfo*        pCamInfo,
    camera3_stream_configuration_t* pStreamConfig)
{
    UsecaseId usecaseId = UsecaseId::Default; // first line of the function
    ......
    CHX_LOG_INFO("usecase ID:%d",usecaseId);
    return usecaseId;                         // last line of the function
}
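As a toy illustration of the selection rule described above (the real GetMatchingUsecase checks far more conditions; the types and thresholds below are invented for the sketch):

```cpp
#include <cassert>
#include <cstdint>

// Simplified stand-ins for the real CHI types; illustrative only.
enum class UsecaseId { Default, Preview, MultiCamera };

struct StreamConfig
{
    uint32_t numStreams;     // num_streams from the stream configuration
    uint32_t operationMode;  // operation_mode from configure_streams
};

// Hypothetical simplification of UsecaseSelector::GetMatchingUsecase():
// more than one physical camera plus more than one configured stream
// selects MultiCamera; a single stream falls back to Preview here.
UsecaseId SelectUsecase(uint32_t numPhysicalCameras, const StreamConfig& cfg)
{
    if ((numPhysicalCameras > 1) && (cfg.numStreams > 1))
    {
        return UsecaseId::MultiCamera;
    }
    if (1 == cfg.numStreams)
    {
        return UsecaseId::Preview;
    }
    return UsecaseId::Default;
}
```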

chi-cdk\core\chiframework\Chxextensionmodule.h

UsecaseSelector*        m_pUsecaseSelector;                     ///< Usecase selector
UsecaseFactory*         m_pUsecaseFactory;                      ///< Usecase factory
Usecase*                m_pSelectedUsecase[MaxNumImageSensors]; ///< Selected usecase

Create Usecase:

Create the corresponding Usecase through UsecaseFactory based on the previously selected UsecaseId.

Class Usecase is the base class of all Usecases, which defines and implements some common interfaces.

CameraUsecaseBase inherits from Usecase and extends some functions.

AdvancedCameraUsecase inherits from CameraUsecaseBase and is the Usecase implementation class mainly responsible for most scenarios.

In addition, for multi-camera scenarios, UsecaseMultiCamera, which inherits from AdvancedCameraUsecase, is provided.

As the following code shows, except for the multi-camera scene, most other scenes use the AdvancedCameraUsecase class to create the Usecase.

chi-cdk\core\chiframework\Chxextensionmodule.cpp

CDKResult ExtensionModule::InitializeOverrideSession(
    uint32_t                        logicalCameraId,
    const camera3_device_t*         pCamera3Device,
    const chi_hal_ops_t*            chiHalOps,
    camera3_stream_configuration_t* pStreamConfig,
    int*                            pIsOverrideEnabled,
    VOID**                          pPrivate)
{
    ...
    selectedUsecaseId = m_pUsecaseSelector->GetMatchingUsecase(&m_logicalCameraInfo[logicalCameraId],
                                                               pStreamConfig);
    ...
    m_pSelectedUsecase[logicalCameraId] =
        m_pUsecaseFactory->CreateUsecaseObject(&m_logicalCameraInfo[logicalCameraId],
                                               selectedUsecaseId, m_pStreamConfig[logicalCameraId],
                                               m_multiCameraResources.hDescriptorConfig);
}

chi-cdk\core\chiusecase\Chxusecaseutils.cpp

Usecase* UsecaseFactory::CreateUsecaseObject(
    LogicalCameraInfo*              pLogicalCameraInfo,     ///< camera info
    UsecaseId                       usecaseId,              ///< Usecase Id
    camera3_stream_configuration_t* pStreamConfig,          ///< Stream config
    ChiMcxConfigHandle              hDescriptorConfig)      ///< mcx config
{
    Usecase* pUsecase  = NULL;
    UINT     camera0Id = pLogicalCameraInfo->ppDeviceInfo[0]->cameraId;
    switch (usecaseId)
    {
        case UsecaseId::PreviewZSL:
        case UsecaseId::VideoLiveShot:
            pUsecase = AdvancedCameraUsecase::Create(pLogicalCameraInfo, pStreamConfig, usecaseId);
            break;
        case UsecaseId::MultiCamera:
            if ((LogicalCameraType::LogicalCameraType_Default == pLogicalCameraInfo->logicalCameraType) &&
                (pLogicalCameraInfo->numPhysicalCameras > 1))
            {
                pUsecase = ChiMulticameraBase::Create(pLogicalCameraInfo, pStreamConfig, hDescriptorConfig);
            }
            break;
        case UsecaseId::MultiCameraVR:
            //pUsecase = UsecaseMultiVRCamera::Create(pLogicalCameraInfo, pStreamConfig);
            break;
        case UsecaseId::QuadCFA:
            pUsecase = AdvancedCameraUsecase::Create(pLogicalCameraInfo, pStreamConfig, usecaseId);
            break;
        case UsecaseId::Torch:
            pUsecase = UsecaseTorch::Create(pLogicalCameraInfo, pStreamConfig);
            break;
        case UsecaseId::Depth:
            pUsecase = AdvancedCameraUsecase::Create(pLogicalCameraInfo, pStreamConfig, usecaseId);
            break;
        case UsecaseId::AON:
            pUsecase = CHXUsecaseAON::Create(pLogicalCameraInfo);
            break;
        default:
            pUsecase = AdvancedCameraUsecase::Create(pLogicalCameraInfo, pStreamConfig, usecaseId);
            break;
    }
    
    return pUsecase;
}
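The inheritance chain described above (Usecase → CameraUsecaseBase → AdvancedCameraUsecase → UsecaseMultiCamera) can be sketched as a plain C++ skeleton; everything except the class names is invented for illustration:

```cpp
#include <cassert>
#include <string>

// Illustrative skeleton of the Usecase class hierarchy; only the class
// names come from the text, the methods are made up for the sketch.
class Usecase                                              // base class: common interfaces
{
public:
    virtual ~Usecase() = default;
    virtual std::string Name() const { return "Usecase"; }
};

class CameraUsecaseBase : public Usecase                   // extends common functionality
{
public:
    std::string Name() const override { return "CameraUsecaseBase"; }
};

class AdvancedCameraUsecase : public CameraUsecaseBase     // handles most scenarios
{
public:
    std::string Name() const override { return "AdvancedCameraUsecase"; }
};

class UsecaseMultiCamera : public AdvancedCameraUsecase    // multi-camera scenarios
{
public:
    std::string Name() const override { return "UsecaseMultiCamera"; }
};
```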

A lot of initialization operations are done in the AdvancedCameraUsecase::Create method, including the following stages:

  • Get Usecase configuration information in XML file
  • Create Feature
  • Save the data flow and rebuild the Usecase configuration information
  • Call the initialize method of the parent class CameraUsecaseBase to perform some general initialization work

Chi-cdk\core\chiusecase\Chxadvancedcamerausecase.cpp

Get Usecase configuration information in XML file

This part is mainly implemented by calling the CameraUsecaseBase::GetXMLUsecaseByName method.

The main job of this method is to find, in the PerNumTargetUsecases array, the Usecase matching the given usecaseName, and return it to the caller.

Inside the function, the name is compared against the default "UsecaseZSL" that was passed in, and pUsecase is updated and returned.

PerNumTargetUsecases is defined in g_pipeline.h. That file is generated during compilation by the \chi-cdk\tools\usecaseconverter\usecaseconverter.pl script, which converts the content of common_usecase.xml defined in the platform directory into g_pipeline.h.


/// AdvancedCameraUsecase::Create

AdvancedCameraUsecase* AdvancedCameraUsecase::Create(
    LogicalCameraInfo*              pCameraInfo,   ///< Camera info
    camera3_stream_configuration_t* pStreamConfig, ///< Stream configuration
    UsecaseId                       usecaseId)     ///< Identifier for usecase function
{
    CDKResult              result                 = CDKResultSuccess;
    AdvancedCameraUsecase* pAdvancedCameraUsecase = CHX_NEW AdvancedCameraUsecase;

    if ((NULL != pAdvancedCameraUsecase) && (NULL != pStreamConfig))
    {
        result = pAdvancedCameraUsecase->Initialize(pCameraInfo, pStreamConfig, usecaseId);
    }

    return pAdvancedCameraUsecase;
}


/// AdvancedCameraUsecase::Initialize
/// We will come back to this function repeatedly later

static const CHAR*  ZSL_USECASE_NAME   = "UsecaseZSL";
CDKResult AdvancedCameraUsecase::Initialize(
    LogicalCameraInfo*              pCameraInfo,   ///< Camera info
    camera3_stream_configuration_t* pStreamConfig, ///< Stream configuration
    UsecaseId                       usecaseId)     ///< Identifier for the usecase function
{
    ...
    m_pAdvancedUsecase = GetXMLUsecaseByName(ZSL_USECASE_NAME);
    ...
    if (CDKResultSuccess == result)
    {
        if ((UsecaseId::PreviewZSL    == m_usecaseId) ||
            (UsecaseId::YUVInBlobOut  == m_usecaseId) ||
            (UsecaseId::VideoLiveShot == m_usecaseId) ||
            (UsecaseId::QuadCFA       == m_usecaseId) ||
            (UsecaseId::RawJPEG       == m_usecaseId) ||
            (UsecaseId::Feature2      == m_usecaseId) ||
            (UsecaseId::MultiCamera   == m_usecaseId))
            {
                SelectFeatures(pStreamConfig);
            }
        result = SelectUsecaseConfig(pCameraInfo, pStreamConfig);
    }
     ...
 }


/// CameraUsecaseBase::GetXMLUsecaseByName

/// @brief Collection of usecases with matching properties (target count at this point)
struct ChiTargetUsecases
{
    UINT        numUsecases;  ///< The number of Usecases in this collection
    ChiUsecase* pChiUsecases; ///< An array of Usecases of size numUsecases
};

ChiUsecase* CameraUsecaseBase::GetXMLUsecaseByName(const CHAR* usecaseName)
{
    ChiUsecase* pUsecase   = NULL;
    UINT32      numTargets = 0;

    CHX_LOG("E. usecaseName:%s", usecaseName);

    struct ChiTargetUsecases* pPerNumTargetUsecases = UsecaseSelector::GetValueFromUsecasepChiTargetUsecases().at(
                                                      "PerNumTargetUsecases");


    numTargets = UsecaseSelector::GetValueFromUsecaseEnum().at("PerNumTargetUsecasesSize");
  

    for (UINT32 i = 0; i < numTargets; i++)
    {
        if (0 < pPerNumTargetUsecases[i].numUsecases)
        {
            ChiUsecase* pUsecasePerTarget = pPerNumTargetUsecases[i].pChiUsecases;

            for (UINT32 index = 0; index < pPerNumTargetUsecases[i].numUsecases; index++)
            {
                if (0 == strcmp(usecaseName, pUsecasePerTarget[index].pUsecaseName))
                {
                    // If the matched usecase differs from the default "UsecaseZSL" passed in, use the matched one
                    pUsecase = &pUsecasePerTarget[index];
                    break;
                }
            }
        }
    }

    CHX_LOG("pUsecase:%p", pUsecase);
    return pUsecase;
}

UseCase has many derived classes in camx. Camx creates different usecase objects for different stream configurations; these objects manage feature selection and create pipelines and sessions.

3.2 Feature

Feature represents a specific function.

The features of Qualcomm Camera HAL3 include HDR (high dynamic range), SuperNight (super night scene), MFNR (multi-frame noise reduction) and so on.

The Usecase selects the corresponding features and then associates them with a set of pipelines. When the upper layer issues a request, the HAL layer selects the corresponding feature based on that request.
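As a rough illustration of request-driven feature selection: the feature names match the ones mentioned above, but the rules and the function signature below are invented for the sketch (the real HAL inspects vendor tags and stream properties).

```cpp
#include <cassert>

// Illustrative feature selection from hints carried by a request.
// The enum values echo the features named in the text; the decision
// rules are invented for this sketch.
enum class Feature { None, ZSL, HDR, MFNR, SuperNight };

Feature SelectFeature(bool isSnapshot, bool lowLight, bool highDynamicRange)
{
    if (!isSnapshot)      return Feature::None;        // preview/video request
    if (lowLight)         return Feature::SuperNight;  // super night scene
    if (highDynamicRange) return Feature::HDR;         // high dynamic range
    return Feature::MFNR;  // default still-capture path in this sketch
}
```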

3.3 Node

Node is a single abstract module with independent processing functions, which can be a software unit or a hardware unit.

Node is a very important parent class in camx. It is an intermediate node for processing camera requests and is used to process requests issued by the pipeline.

Node structure:

Create Node process:

Node nodes are crucial in the camx chi architecture, and data processing is performed through encapsulated Node nodes.

Node initialization process: 

3.4 Pipeline

A pipeline is a collection of nodes. It provides the complete set of resources for a single specific function and maintains the flow of all hardware resources and data.

3.5 Session

A session is a collection of several associated pipelines: an abstract control unit used to manage them. It contains at least one pipeline and controls all hardware resources, as well as the request flow and data input/output within each pipeline.

3.6 Link

Defines the connection between different Ports (an input port and an output port).

3.7 Port

The input/output ports of a Node; they are defined in the XML files using the SrcPort and DstPort structures.

3.8 Topologies

3.9 Commonly used Usecases, Pipelines and their corresponding relationships

The purpose of componentizing Camx and Chi-Cdk:

Different models differ in performance and product positioning; even on the same baseline, the usecases and so on may differ.

This gives phone manufacturers plenty of room for customization: a UseCase can be reused across scenarios, and the corresponding pipelines can likewise be used or reused.

Correspondence between commonly used Usecase and Pipeline:

 

4. Relationship between components

4.1 Relationship between basic components:

  • According to the requirements of the upper layer, the corresponding stream config is sent down.
  • Next, the corresponding usecase is selected based on the configured streams.
  • After usecase selection is completed, the required features are selected.
  • Different features are associated with their corresponding pipelines.
  • A pipeline is composed of a series of nodes.
  • Finally, the streams configured by the upper layer are handed to each node for processing.
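The chain above (stream config → usecase → features → pipelines → nodes) can be modeled as plain data; all names below are invented for the sketch:

```cpp
#include <cassert>
#include <cstddef>
#include <string>
#include <vector>

// Illustrative data model of the component chain: a Usecase selects
// Features, each Feature maps to a Pipeline, and each Pipeline is a
// list of Nodes. All names here are invented for the sketch.
struct Node     { std::string name; };
struct Pipeline { std::string name; std::vector<Node> nodes; };
struct Feature  { std::string name; Pipeline pipeline; };
struct Usecase  { std::string name; std::vector<Feature> features; };

// Count every node a usecase would touch across all of its features.
std::size_t TotalNodes(const Usecase& uc)
{
    std::size_t n = 0;
    for (const Feature& f : uc.features)
    {
        n += f.pipeline.nodes.size();
    }
    return n;
}
```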

Component diagram: 

At present, the Qualcomm Camera HAL3 architecture has gradually become Feature-centric, abandoning the earlier Usecase-centric architectural model.

5. Interaction between basic components and upper layers

5.1 Overall rendering process of Camera App and interaction flow chart with CAMX:

figure 1:

figure 2:

5.2 Request flow

Each Request issued by the Session in the upper layer corresponds to three Result:

  • partial metadata
  • metadata
  • image data

For each Result, the upload process can be roughly divided into the following two stages:

  • Complete the processing of image data inside the Session and send the results to Usecase
  • Usecase receives data from Session and uploads it to Provider

5.3 The embodiment of session callback function in CAMX

5.3.1 Session::StreamOn()

This method is mainly used to start the hardware's data output.

Specifically, it configures the sensor registers so the sensor starts streaming frames, and informs each Node of the current Session state so that they can prepare internally to process data. The subsequent flow of related Requests is premised on this method, so its importance is evident.

The StreamOn method of Session mainly does the following two tasks:

  1. Call the FinalizeDeferPipeline() method.
    If the current pipeline has not been initialized, the pipeline's FinalizePipeline() method is called. In it, the FinalizeInitialization, CreateBufferManagers, NotifyPipelineCreated and PrepareNodeStreamOn operations are performed for each Node belonging to the current pipeline.

    FinalizeInitialization completes the initialization of the Node. NotifyPipelineCreated notifies the Node of the current Pipeline state, at which point the Node can act according to its own needs.

    The PrepareNodeStreamOn() method mainly completes the pre-streaming configuration of the Nodes that control hardware modules such as the Sensor and IFE, which includes setting exposure parameters.

    The CreateBufferManagers() method involves a very important buffer-management mechanism in CAMX CHI-CDK: it creates the Node's ImageBufferManager, the class that manages buffer allocation/transfer/release on the Node's output ports.

  2. Call the Pipeline's StreamOn() method.
    This further notifies the CSL layer to start the data stream and calls each Node's OnNodeStreamOn() method, which calls Activate() on the ImageBufferManager to actually allocate the buffers that will hold image data. After that, the user-defined pOnStreamOn() method implemented in the CHI part is called, in which the user can perform custom operations.
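A toy trace of the call order just described; the function and its string labels are invented, only the step names come from the text:

```cpp
#include <cassert>
#include <string>
#include <vector>

// Illustrative trace of the per-node call order described above for
// Session::StreamOn(); the real CAMX signatures differ.
std::vector<std::string> StreamOnTrace(int numNodes, bool pipelineFinalized)
{
    std::vector<std::string> trace;
    if (!pipelineFinalized)                    // FinalizeDeferPipeline()
    {
        for (int i = 0; i < numNodes; ++i)
        {
            trace.push_back("FinalizeInitialization");
            trace.push_back("CreateBufferManagers");
            trace.push_back("NotifyPipelineCreated");
            trace.push_back("PrepareNodeStreamOn");
        }
    }
    for (int i = 0; i < numNodes; ++i)         // Pipeline::StreamOn()
    {
        trace.push_back("OnNodeStreamOn");     // activates the ImageBufferManager
        trace.push_back("pOnStreamOn");        // user-defined CHI hook
    }
    return trace;
}
```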

5.3.2 Session::ProcessCaptureRequest()

Each Request transfer starts with this method as the entry point. The specific process is shown in the figure below:

The pipeline is first added to the DRQ (DeferredRequestQueue) by calling the AddDeferredNode method for each Node.

At this point all Nodes are added to m_readyNodes, and then calling the dispatchReadyNodes method triggers the DRQ to start the entire internal processing flow.
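The DRQ idea, a node being dispatched only once its dependencies are satisfied, can be modeled with a toy scheduler (types and names invented for the sketch; the real DRQ tracks property and buffer dependencies per request):

```cpp
#include <cassert>
#include <set>
#include <string>
#include <vector>

// Toy model of the DeferredRequestQueue behavior described above:
// a node becomes "ready" once all of its upstream dependencies have
// finished, and dispatch drains ready nodes until none remain.
struct DrqNode
{
    std::string              name;
    std::vector<std::string> deps;  // names of upstream nodes
};

std::vector<std::string> DispatchReadyNodes(std::vector<DrqNode> nodes)
{
    std::vector<std::string> order;
    std::set<std::string>    done;
    bool progressed = true;
    while (progressed)
    {
        progressed = false;
        for (const DrqNode& n : nodes)
        {
            if (done.count(n.name)) continue;
            bool ready = true;
            for (const std::string& d : n.deps)
            {
                if (!done.count(d)) { ready = false; break; }
            }
            if (ready)                   // node moves to the ready list
            {
                order.push_back(n.name); // ...and is executed
                done.insert(n.name);
                progressed = true;
            }
        }
    }
    return order;
}
```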

The basic process can be seen in the figure below:

How the result is sent to Usecase after the image data is processed inside the Session:

How does Usecase receive the Session's data and send it up to the Provider?

Take the commonly used AdvancedCameraUsecase as an example to sort out the code:

6. Log TAG:

6.1 Power on the camera driver:

Driver power-on log: cam_sensor_driver_cmd | Probe success

6.2 Power off the camera driver:

Driver power off log:

cam_sensor_driver_cmd: CAM_STOP_DEV Success for productname_ofilm_s5khm2_wide sensor_id:0x1ad2,sensor_slave_addr:0x20

cam_sensor_driver_cmd:  CAM_RELEASE_DEV Success for productname_ofilm_s5khm2_wide sensor_id:0x1ad2, slave_addr:0x20

6.3 Before turning on the camera and starting to transmit the first frame:

CAM_START_DEV

6.4 Bottom layer traversal of cameras:

  • CHIUSECASE: [INFO ] chifeature2graphselector.cpp:11256 BuildCameraIdSet() cameraId 4, set 54
  • CHIUSECASE: [INFO ] chifeature2graphselector.cpp:11256 BuildCameraIdSet() cameraId 0, set 50
  • CHIUSECASE: [INFO ] chifeature2graphselector.cpp:11256 BuildCameraIdSet() cameraId 1, set 50
  • CHIUSECASE: [INFO ] chifeature2graphselector.cpp:11256 BuildCameraIdSet() cameraId 2, set 50
  • CHIUSECASE: [INFO ] chifeature2graphselector.cpp:11256 BuildCameraIdSet() cameraId 3, set 50

6.5 Open the camera:

CameraService: CameraService::connect|first frame arrived|CameraService: disconnect: Disconnected|CAM_ACQUIRE_DEV|CAM_START_DEV|CAM_STOP_DEV|CAM_RELEASE_DEV

"configure_streams": Configure streams

"pipelineName": pipeline name

CAMX  :|CHIUSECASE:|STREAM_ONSelectFeatureGraphforRequestFromTable|Node::|CamX:|CHIUSECASE:|Camera3|CameraDevice

7. Others

7.1 Define the xml address of the node in the pipeline:

vendor/qcom/proprietary/chi-cdk/oem/qcom/topology/titan/usecase-components/usecases/UsecaseZSL/pipelines

7.2 Node link mode definition:

The Nodes in a Pipeline and the way they are connected are defined in XML, which mainly includes the following tags:

  • PipelineName: defines the name of the Pipeline
  • NodeList: defines all the Nodes of this Pipeline
  • PortLinkages: defines the connection relationships between ports on the Nodes
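A hand-written illustration of what such a pipeline definition could look like using these three tags (not copied from a real topology file; node names and port numbers are made up):

```xml
<Pipeline>
  <PipelineName>IllustrativePreview</PipelineName>
  <NodeList>
    <Node>
      <NodeName>com.example.node.sensor</NodeName>
      <NodeId>0</NodeId>
    </Node>
    <Node>
      <NodeName>com.example.node.ife</NodeName>
      <NodeId>1</NodeId>
    </Node>
  </NodeList>
  <PortLinkages>
    <Link>
      <SrcPort NodeId="0" PortId="0"/>
      <DstPort NodeId="1" PortId="0"/>
    </Link>
  </PortLinkages>
</Pipeline>
```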

7.3 vendortag definition file:

/vendor/qcom/proprietary/chi-cdk/api/common/chioemvendortagdefines.h 

7.4  sensor driver directory:

For a project named productname, the driver file directory of one of its cameras:

vendor\qcom\proprietary\chi-cdk\oem\qcom\sensor\productname_sensor\productname_ofilm_ov16a1q_front_sensor

8. End

Origin blog.csdn.net/geyichongchujianghu/article/details/131029549