Interpreting the Logical Model Orchestrator of the Huawei Cloud Digital Factory Platform, with Examples

Abstract: This article systematically introduces how to use the logical model orchestrator of the Huawei Cloud Digital Factory Platform, based on the business activity flows involved in a "production execution management" case scenario.

This article is shared from the HUAWEI CLOUD community post "Digital Factory In Depth and Simple Series (3): Introduction to the Use of the Logical Model Orchestrator", author: Yunqi MAE.

The Manufacturing App Engine (hereinafter "MAE engine") of the Huawei Cloud Digital Factory Platform adopts a real-time, event-driven architecture that decouples business objects from one another. The MAE engine provides powerful, easy-to-use logic flow orchestration tools, a rule engine, and a data flow engine that sense status changes of business objects in real time and automatically trigger data analysis, processing, and transfer between business objects. Because the logic flow orchestration tool is natively integrated with the information models of business objects, business personnel can understand and quickly orchestrate the context information a business process needs, and adjust the business logic in the digital system whenever requirements change. The digital system can thus truly adapt to increasingly flexible production processes, support agile and flexible configuration, and evolve digital applications driven by business needs.

The logical flow of the MAE engine of the HUAWEI CLOUD digital factory platform consists of three types of nodes:

  • Trigger: the entry point of a logic flow. Four trigger types are supported: data change (creation/modification), user operation, IoT message, and periodic schedule, so a flow can sense and respond in real time to changes in business data, IoT device data, and user actions;
  • Rule engine: models and configures business rules, supports complex nested rule conditions with real-time dynamic evaluation, and drives compliant execution of business flows;
  • Action executor: produces the execution result of the business flow. Four action types are supported: business flow (the result of business object A's logic flow triggers the logic flow of business object B), event record (creating event records or backfilling related business-object data), message notification (proactively sending business messages via system notification, email, SMS, IM tools, etc.), and external integration (calling an external system's API to trigger its business flow).
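As a mental model, the interaction of the three node types can be sketched in a few lines of Python. This is purely illustrative; the class, method, and field names below are invented and are not the platform's actual API.

```python
# A minimal, hypothetical sketch of the three node types described above:
# a trigger starts the flow, a rule gates it, action executors produce results.

class LogicFlow:
    def __init__(self, trigger_type, rule, actions):
        self.trigger_type = trigger_type  # e.g. "data_create", "user_operation"
        self.rule = rule                  # predicate over the trigger instance data
        self.actions = actions            # callables run when the rule passes

    def fire(self, instance):
        """Run the flow for one trigger-source data instance."""
        if self.rule(instance):
            return [action(instance) for action in self.actions]
        return []  # rule failed: no actions executed

# Example: when a production order is created, emit a notification payload.
flow = LogicFlow(
    trigger_type="data_create",
    rule=lambda order: order.get("status") == "released",
    actions=[lambda order: {"notify": f"order {order['code']} released"}],
)
print(flow.fire({"code": "PO-001", "status": "released"}))
# → [{'notify': 'order PO-001 released'}]
```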

This installment continues from the previous "production execution management" scenario case and explains in detail how to use the "Logical Model" orchestrator of the Huawei Cloud Digital Factory Platform.

(1) Case Scenario Description

In this scenario case, data must flow automatically, on demand, between the business objects involved in "production execution management". This is achieved by orchestrating and running the following four business logic flows:

  • Process task order generation: when a production order is created, automatically split it by the process steps of its process route and generate the corresponding process task orders;
  • Production material requirement generation: provide a "generate material requirement list" operation on the production order that, based on the order's product and planned output, automatically generates the material requirement list from the material list in the product's manufacturing BOM;
  • Production order closure: provide a "close production order" operation on the production order; the user must fill in a "closing reason", and the status of the production order and its process task orders is updated to "closed";
  • Completion warehousing updates product and warehouse inventory: after the user submits and confirms a completion warehousing document, the on-hand inventory of the product and the on-hand inventory of that product in the corresponding warehouse are automatically written back and updated according to the document's "product", "warehouse", and "quantity".

(2) Construction steps

The following describes how to use the logic orchestrator of the HUAWEI CLOUD Digital Factory Platform MAE engine to implement the four business logic flows in this scenario:

2.1. Relevant information model preparation:

In the previous installment, when introducing the information configurator of the Huawei Cloud Digital Factory Platform, we completed the information model configuration of the following business objects: production order, process task order, and material requirement list. To support the fourth business flow, "update inventory", we need to supplement the configuration with the information model of the "completion warehousing document".

In the "Transaction" model builder, select the "Production" domain tab, create (or reuse) the business scenario "Production Management", and create a new transaction model "Completion Warehousing Document" in that scenario:

For how to configure the information model fields of the "completion warehousing document", refer to the usage of the information configurator introduced in the previous installment; it is not repeated here.

2.2. Production execution related business logic flow modeling:

On the "Modeling Workbench" of the HUAWEI CLOUD Digital Factory Platform, use the "Transaction" model builder to select the corresponding transaction model, then create and publish the required logic flows under the transaction model's "Logical Model":

Usually, we create the logic flow under the transaction model of the trigger source to handle the business flow. When the logic flow runs, the instance data of the trigger-source model is automatically brought in as input. Within the flow, you can then connect, as needed, to other business models that have an information association with the source model, so that data flows automatically from the trigger-source business object to the target business objects.

The logical flow model mainly provides the following operations:

a. Editing: Modify the basic information such as the name, priority and status of the logical flow;

b. Enable: enable or disable the logic flow;

c. Configuration: open the logical flow orchestrator and configure the execution logic of the logical flow;

d. Copy: copy the current logical flow and create a new logical flow;

e. Log: view the logic flow's historical run log. The detailed log of a given run can be inspected to diagnose the cause of an exception, and run instances whose result status is abnormal can be re-run via the "retry" operation.

Logic flow 1: Create production order and generate process order

Under the "Logical Model" of the "Production Order" model, click the "New Process" button to add a logic flow "Generate Process Order":

After the addition is complete, click the "Configure" operation on the right side of the logical flow to open the logical flow orchestrator and configure the execution logic of the logical flow:

(1) Trigger configuration

Triggers start a logic flow run. The logic orchestrator currently supports the following trigger types; select the one that matches the actual business need:

a. Data creation: when the data instance of the selected trigger object model is created, the execution of the logic flow is automatically triggered;

b. Data change: when the value of some fields set in the selected trigger object model data instance changes, the execution of the logic flow is automatically triggered;

c. User operation: when the user clicks a "user operation" button defined by the current model on the front-end interface, execution of the logic flow is triggered automatically;

d. Timed trigger: triggered according to a schedule strategy. You need to set the task execution cycle (hourly/daily/weekly/monthly/yearly), the plan validity range (start date/end date), and so on; the logic flow is then scheduled and triggered automatically according to the strategy. A timed logic flow also supports "filtering rules" that restrict, by conditions, the range of trigger-model data instances.

The trigger's "trigger object" may be the main information model of the current business model or one of its sub-models (for example, for a logic flow created under the "product" model, the trigger object can be the "product information" master data model or a sub-model such as "BOM/BOM list"). When the logic flow runs, the data instance of the model selected as trigger object is automatically placed in the flow's input context, where subsequent rule or action nodes can use it.

After a "production order" is created, it must automatically be split into process data to generate process task orders according to its process route. Select "data creation" as the "trigger type", select "production order" as the trigger object, and fill in a business-meaningful trigger name, "production order creation":

(2) Create an action node

Click the "+" icon after the created "trigger" and select "Add Action (Current)" or "Add Action (Branch)" to create an action node. The former creates an action node after the previous node (here, the trigger node); the latter creates an action node in parallel with the previous node, as a branch.

The "action" node of the logic flow mainly has the following two configuration items:

  1. Output type: four output types are currently supported: create data, update data, pipeline cache, and message notification. "Create data" and "update data" create or update data of the upstream and downstream business objects related to the current trigger object model; "pipeline cache" temporarily caches the action node's result data as input for subsequent nodes (rules/actions) of the logic flow; "message notification" proactively sends the flow's result data to users.
  2. Output model: you can select any business model associated with the trigger object model, as well as business models newly added during action-node processing:
  • the input model of the logic flow (i.e. the trigger object, such as "production order") and the models brought into the pipeline by processing (for example, "product" and "process route" associated with the production order in a logic flow action node);
  • sub-models and parent models of the models above (such as the "inventory entry and exit record" sub-model of "product");
  • models associated with the models above: if an information field of some business model references one of them, that model becomes selectable. For example, if the "material requirement list" model has a field associated with the "production order" model, the "material requirement list" model can be chosen as the output model.

If the required business model is not selectable as the output model when creating an action node, leave it blank for now. Once the action node's data conversion logic has brought other related business object models into the pipeline, the required business model can be selected.

After a "production order" is created, it must automatically be split according to its routing and the corresponding process task order data created. Select "create data" as the "output type", select "process task order" as the output model, and fill in a business-meaningful action name, "Create Process Task":

(3) Configure the data conversion logic of the action node

After adding and creating an action node, enter the "Action Configurator", and further arrange the "Data Conversion Logic" in the Action Configurator according to the actual business needs to meet the data requirements of the target output model:

"Action Configurator" has the following 3 configuration items:

  1. Input data field display: the left side shows the input data fields of the current action node. The input mainly comes from the output data fields of the preceding logic-flow node and the input parameter fields defined by the user operation.
  2. Logical arrangement of data conversion: Currently, four types of data conversion operator nodes are provided:

Association node: obtains the data fields of other business objects that are associated with the input data or with the output data of the previous data conversion node;

Split node: obtains the sub-model data of the input data or of the output data of the previous data conversion node;

Calculation node: performs the four arithmetic operations on numeric field values in the input data or in the output data of the previous data conversion node, or references standard functions for data processing;

Aggregation node: summarizes the input data or the output data of the previous data conversion node by selected dimension fields.

By flexibly combining and arranging the four types of data conversion operator nodes according to actual business needs, the action node's input data can be converted and processed to meet the node's final output data requirements.
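To make the four operator types concrete, here is a hypothetical Python sketch of each, operating on plain dicts and lists. The function names and data shapes are invented for illustration and are not the platform's actual engine.

```python
def associate(source_row, target_table, key):
    """Association node: fetch the related row of another business object by a shared key."""
    return next((r for r in target_table if r[key] == source_row[key]), None)

def split(parent_row, child_field):
    """Split node: expand a parent row into its child-model rows."""
    return parent_row.get(child_field, [])

def calculate(row, out_field, formula):
    """Calculation node: add a computed numeric field to a row."""
    return {**row, out_field: formula(row)}

def aggregate(rows, dims, sum_field, out_field):
    """Aggregation node: group rows by dimension fields and sum a numeric field."""
    totals = {}
    for r in rows:
        k = tuple(r[d] for d in dims)
        totals[k] = totals.get(k, 0) + r[sum_field]
    return [dict(zip(dims, k), **{out_field: v}) for k, v in totals.items()]
```

Chaining these helpers over the pipeline data mimics how the operator nodes are combined in the configurator.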

3. Output data field mapping: after "data conversion", the action node's results are cached as data fields in the data pipeline and can be mapped to the output model's field values. Manual entry of constant values, references to standard function results, and system variables (system time, current logged-in user, etc.) are also supported.

Per the business requirement, the production order must be split by the process steps of its process route, and the corresponding process task order data created in the "Create Process Task" action node:

a. Add an association node "Associate Routing" to obtain the related "process route" master data from the input "production order" data:

The associated conversion node has the following three configuration areas:

  • Source model: select the required model from the action node's input data, or from the output data of a previous data conversion node, as the input source of the association node. For the current case, select "production order" as the "source model";
  • Target model: based on the business object selected as the "source model", you can select the data models of other business objects associated with it. At run time, the data instances of the selected target model are automatically fetched from the input "source model" instance. For the current case, select the "process route" associated with the "production order" as the "target association model";
  • Target fields: configure which data fields of the output "target model" the association node should return. For the current case, select the fields to be returned from the "process route" data as the "target fields".

b. Add a split conversion node "Expand Process Steps" that, based on the "process route" data output by the previous association node, obtains the "process step" sub-model data under the route to complete the data split conversion:

The split conversion node has the following 3 configuration items:

  • Source model: select the required model from the action node's input data, or from the output data of a previous data conversion node, as the input source of the split node. For the current case, select the "process route" output by the previous association node as the "source model";
  • Target model: based on the business object selected as the "source model", you can select a sub-model of that business object's data model, or other business object models associated with it. At run time, the data instances of the selected target model are automatically fetched from the input "source model" instance. For the current case, select the "process step" sub-model of the "process route" model as the "target model";
  • Target fields: configure which data fields of the output "target model" the split node should return. For the current case, select the fields to be returned from the "process step" data as the "target fields".
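The association and split steps above can be sketched end to end for logic flow 1: associate the production order's process route, split it into process steps, and emit one process task order per step. All data shapes and field names below are illustrative assumptions, not the platform's data model.

```python
# Hypothetical "process route" master data, keyed by route code.
routes = {
    "RT-01": {"steps": [{"seq": 10, "name": "cutting"},
                        {"seq": 20, "name": "welding"}]},
}

def generate_process_orders(production_order):
    route = routes[production_order["route_code"]]  # association node: fetch related route
    steps = route["steps"]                          # split node: expand into process steps
    return [                                        # output model mapping: one task per step
        {"order_code": production_order["code"],
         "step_seq": s["seq"],
         "step_name": s["name"],
         "status": "not_started"}
        for s in steps
    ]

tasks = generate_process_orders({"code": "PO-001", "route_code": "RT-01"})
```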

(4) Output model configuration

Having orchestrated the "data conversion logic" of the "Create Process Task" action node in the "Action Configurator", and obtained the process route and process step data needed to create the output model "process task order", finally map and configure the output model's field values as needed.

The "Output Model Configurator" has the following 3 configuration areas:

  1. Pipeline data display: The left side displays the data fields after the input data of the current action node has been "data converted".
  2. Output model field value mapping: The mapping method of the output field value supports the following four methods:
  • Drag a pipeline field: drag a data field from the data pipeline on the left; the "Output Model Configurator" automatically verifies that the data types of the pipeline field and output field are compatible (click the prompt icon after the "Field Value" column header of the "Output Model" to view the detailed field mapping rules);
  • Manually enter a fixed constant: output model field values support direct manual entry of fixed constant values;
  • Reference a standard function calculation: the platform has built-in common data calculation functions. Drag a function onto an output model field, then configure its parameter values (a parameter may reference a pipeline field value or another output-model field value). Taking "calculate actual work hours" as an example, you can drag the "date interval function" onto the "actual work hours (minutes)" field of the "process task order":

Configure the standard function's parameter values as needed; pipeline data fields can be referenced:

  • Reference system global variables: The platform has built-in some system global variables, such as system time, current login personnel, current personnel organization, etc., and supports dragging a global variable to map to an output model field.

For action nodes whose output type is "create data", the output model configuration automatically verifies that every required model field except the "data identifier" field has a mapped value (when the data identifier is left blank, the backend fills it automatically). For other output types such as "update data", only the fields that need to change require mapped values.
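The required-field check described above can be sketched as follows; `validate_mapping`, its arguments, and the field names are hypothetical illustrations of the rule, not the platform's implementation.

```python
def validate_mapping(required_fields, id_field, mapping):
    """Return the required output-model fields (other than the data identifier)
    that are missing a mapped value; an empty list means the mapping passes."""
    return [f for f in required_fields
            if f != id_field and f not in mapping]

# Example: "code" is the data identifier, so it may be left unmapped.
missing = validate_mapping(["code", "product", "qty"], "code",
                           {"product": "pipeline.product", "qty": "pipeline.qty"})
```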

Through the above four configuration steps, the modeling configuration of the business logic flow of "creating a production order and generating a process order" is completed.

Logical flow 2: Operate production orders and generate material requirements

"Operate production order, generate material requirement list" logic flow requirements:

When the user clicks the "Generate Material Requirements" button on a production order, the product's BOM data is expanded from the product master data according to the order's product field and planned output; the base quantity of each expanded material line is multiplied by the planned output of the "production order" to obtain the required quantity of that material, and the corresponding material requirement list and its detail data are created. The material requirement list number generated in the flow must also be backfilled to the "production order".

(1) Add the user operation "Generate Material Requirements"

On the "Production Order" model, add a new user action "Generate Material Requirements":

"User Action" has the following 3 configuration items:

  1. Operation object: you can select the main information model of the current model or one of its sub-models. For example, user operations can be defined separately for the main model of the "material requirement list" (i.e. a whole material requirement list) and for its "material requirement details" sub-model;
  2. Enable parameters: configure whether the user must enter parameter information when using the operation on the front-end interface. For example, in the next case the "close production order" operation requires the user to enter a "closing reason"; the configuration of user-operation "parameters" is introduced in detail there;
  3. Front-end display control of user operations: rule conditions can be configured to control when the operation is displayed on the front end:

For example, to prevent the "generate material requirements" operation from being used again on a production order whose material requirements have already been created, configure the operation's display rule condition as: the "associated material requirement" field of the "production order" is empty.

(2) Create a logical flow triggered by user operations

In the configuration window of the newly added "Generate Material Requirements" operation, you can select "Save and Configure Logic Flow" to quickly create a logic flow triggered by the user operation, or first save the "user operation" and then bind it in the trigger configuration of a logic flow:

In the orchestration and configuration of the "generate material requirements" logic flow, steps that mirror the "create production order, generate process task order" flow above are not explained again:

The differing requirements and their corresponding logic-flow configurations are as follows:

1. Calculating the required material quantity: the "required quantity" field of the created material requirement details is computed by expanding the product master data into BOM material-list data, then multiplying each material line's base quantity by the planned output of the "production order" to obtain the "material requirement quantity".

To meet this requirement, use a "calculation node" in the "data conversion logic" of the "Create Material Requirement Details" action node:

The following is the configuration of the calculation node "Calculate Material Requirement Quantity":

Click the "Add" button to the right of the "calculation information field" to add a calculation field "Material Requirement Quantity"; configure in sequence the calculation field name and calculation field code, then edit the formula: drag the "planned output" field of the "production order" and the "quantity" field of the "BOM bill of materials" from the data pipeline on the left as calculation parameters, and enter the multiplication operator "*" between them. Calculation formulas support the four arithmetic operators "+ - * /" (addition, subtraction, multiplication, division) and drag-in references to standard function calculations.
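The calculation node's formula (BOM line quantity multiplied by planned output) amounts to the following sketch; the helper name and field names are illustrative assumptions.

```python
def expand_bom(bom_lines, planned_output):
    """Multiply each BOM line's base quantity by the order's planned output
    to obtain the material requirement quantity per line."""
    return [{**line, "required_qty": line["qty"] * planned_output}
            for line in bom_lines]

# Example: planned output of 50 units against a two-line BOM.
lines = expand_bom([{"material": "M-100", "qty": 2},
                    {"material": "M-200", "qty": 4}], planned_output=50)
```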

2. Generating material requirement details after summarizing required quantities by material:

The BOM data output by the preceding conversion nodes must be grouped by material code and the required quantities summed, and the corresponding material requirement detail data then generated.

In the action node of "Create Material Requisition Details", the "Aggregation Operator Node" needs to be used in the "Data Conversion Logic":

The following is the configuration of the aggregation node "Summary Demand by Item":

Configuration instructions for aggregation conversion nodes:

  • Check the dimension fields to output: in the data pipeline fields on the left, check the dimension fields the aggregation node should output; they become the summary dimensions of the aggregation. Unchecked fields do not appear in the aggregation node's output. For the current case, the output model needs the material and unit-of-measure fields of the BOM list and the code field of the material requirement list, so check those fields in the data pipeline on the left;
  • Add the numeric fields to be summarized: click the "Add" button to the right of "Aggregate Information Field" to add an aggregate calculation field "Material Requirement Quantity Summary"; configure in sequence the aggregate field name and code, select "Sum" as the aggregate calculation type, then edit the aggregate formula: drag the "material requirement quantity" field to be summed from the data pipeline on the left as the calculation parameter. Formulas support the four arithmetic operators "+ - * /" as well as drag-in references to standard function calculations.

After saving the aggregation node, when configuring the output model you will see that only the checked aggregation dimension fields and the calculation fields remain in the data pipeline:
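The aggregation node's behavior (group by material, sum the required quantity) can be sketched as below; the function name and data shapes are assumptions for illustration.

```python
from collections import defaultdict

def summarize_by_material(rows):
    """Group expanded BOM rows by material code, keeping the unit of measure
    and summing the required quantity per material."""
    totals = defaultdict(float)
    units = {}
    for r in rows:
        totals[r["material"]] += r["required_qty"]
        units[r["material"]] = r["unit"]
    return [{"material": m, "unit": units[m], "required_qty_sum": q}
            for m, q in totals.items()]
```

Only the grouping dimensions and the summed field survive in the output, mirroring how unchecked fields drop out of the data pipeline.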

3. The material requirement list has a header/line structure, so creating a material requirement list requires creating its requirement details at the same time:

In the logic flow, first create the "Generate Material Requirement List" action node, then the "Generate Material Requirement Details" action node. The former outputs and creates the material requirement list data from the input "production order" instance, and the platform automatically adds the created data to the data pipeline. The latter, when creating the corresponding requirement details (the "requirement details" sub-model of the material requirement list), can fetch from the data pipeline the associated "material requirement list number" needed to create the detail data.
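The header/line creation order described above can be sketched as two actions sharing a data pipeline; the pipeline dict and all names below are illustrative, not the platform's API.

```python
def create_requirement_list(pipeline, production_order):
    """First action: create the header and put it into the data pipeline."""
    mr = {"mr_code": f"MR-{production_order['code']}",
          "order": production_order["code"]}
    pipeline["material_requirement_list"] = mr  # platform adds created data to the pipeline
    return mr

def create_requirement_details(pipeline, bom_lines):
    """Second action: read the list number back from the pipeline for each detail line."""
    mr_code = pipeline["material_requirement_list"]["mr_code"]
    return [{"mr_code": mr_code, **line} for line in bom_lines]

pipeline = {}
create_requirement_list(pipeline, {"code": "PO-001"})
details = create_requirement_details(pipeline, [{"material": "M-1", "required_qty": 150}])
```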

4. The created material requirement order number needs to be backfilled to update the "production order":

The previous "Create Material Requisition" action node outputs the created Material Requirement Number, which needs to be backfilled and updated to the "Associated Material Requirement" field of "Production Order".

After the "Create Material Requisition Document" action node, click "Add Action (Branch)" to add a branch node "Backfill Order Requisition Number" parallel to the "Create Material Requisition Details" action node:

Select "Update Data" as the output type of the "Backfill Order Requisition Number" action node, and then configure the output model fields:

Output model configuration notes for the "update data" output type:

  • If the "data identifier" field of the output model exists in the data pipeline (for example, the "code" of the material requirement list), drag it onto the output model's "data identifier" field value and check "Filter Field", so the "data identifier" value determines the specific data instance of the output model to update;
  • If certain field conditions must be combined to determine the range of output-model data to update, drag the required pipeline fields onto the corresponding output-model field values and check "Filter Field", then use "Configure filter rules" to the right of the "output model" box to combine the target data filter conditions:
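An "update data" action with filter fields behaves roughly like the following sketch; `update_data` and the field names are hypothetical.

```python
def update_data(table, filters, values):
    """Update every row matching all filter fields with the mapped values;
    return how many rows were updated."""
    updated = 0
    for row in table:
        if all(row.get(k) == v for k, v in filters.items()):
            row.update(values)
            updated += 1
    return updated

# Example: backfill the material requirement number onto one production order.
orders = [{"code": "PO-001", "linked_mr": None},
          {"code": "PO-002", "linked_mr": None}]
count = update_data(orders, {"code": "PO-001"}, {"linked_mr": "MR-PO-001"})
```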

So far, the modeling configuration of the business logic flow of "operating production orders and generating material requirements" has been completed.

Logical flow 3: Close the production order and automatically close the process order

Requirements for the logical flow of "close the production order and automatically close the process order":

For a "production order" in any of the states "not started/in progress/completed", a "close order" operation button is provided. When the user clicks "close order", they must fill in a "closing reason", and the production order is then validated. If the condition is not satisfied, the reason is shown to the user; if it is satisfied, the status of the "production order" is updated to "closed" and the "closing reason" filled in by the user is written to the order's "closing reason" field. After the "production order" is closed, all "process task orders" related to it must be closed synchronously, i.e. their status updated to "closed".

(1) Add user operation "Close"

On the "Production Order" model, add a user action "Close" and check "Enable Parameters?":

The parameter configuration window of "User Operation" has the following two configuration items:

1. Parameter type: The parameter supports two definition methods:

  • Reference a model field: a field of the business model the operation belongs to can be referenced as a parameter. When the user uses the operation, the parameter value automatically defaults to the corresponding field value of the current model's data instance. For example, if a parameter "executor" references the "person in charge" field of the "production order" model, its value defaults to the "person in charge" of the current "production order";
  • Custom: select the parameter's field type, which supports numeric, text, date, time, and array. An "array" parameter serves business needs where the user must fill in multiple rows of data as input; for example, splitting a production order into several orders requires an array parameter to carry the split information.

2. Parameter display control:

  • "Display only" : It is used to control the parameters to only be displayed on the front-end interface and cannot be edited. It is usually applicable to parameters that refer to model fields to display the corresponding field values ​​​​of the current model data instance as contextual reference information for user operations, such as for splitting In the production order scenario, define a display-only parameter that refers to the "Planned Output" field of the "Production Order", which is used to display the "Planned Output" data of the "Production Order" before splitting in the front-end operation window, as the user uses the split Reference information during operation;
  • "Required": It is used to control whether the parameter value must be filled in when the user uses the operation.

For the "close order" operation, we add a parameter of the "custom" type, select "text" as the field type, then fill in the name of the parameter displayed on the front end, and the "parameter code" for background transmission, and finally check the "required Fill" to complete the parameter configuration.

After configuring the parameters of "Close Order", click "Save and Configure Logic Flow" to start configuring the processing logic flow of the "Close Order" operation.

(2) Add a rule node to verify the order status

Add a "rule" node and configure the rule conditions:

The rule node has the following three configuration items:

1. Rule editing mode: rule conditions can be edited in two ways, "General" and "Script":

  • General: fully graphical editing of rule conditions; this is the default and covers most scenarios;
  • Script: manually write rule expressions; data fields from the left-hand pipeline, system global variables, and standard functions can be dragged in as expression parameters, allowing complex rule expressions to be built.

2. Matching rules: complex rule conditions can be combined by adding multiple "condition groups", and each condition group's combination mode can be set to "satisfy all conditions" or "satisfy any condition". Under a condition group, add a specific condition by selecting a condition field, selecting a condition operator, and selecting or entering a condition value.

3. Verification prompt text: the configured prompt text is shown to the user on the front end when, at run time, the rule node judges that the conditions are not met.
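Evaluation of condition groups with "all"/"any" modes can be sketched as follows. This is an assumption-laden illustration of the semantics described above, not platform code; how multiple groups combine with each other is not specified in the article, so this sketch combines them with "any" purely for illustration.

```python
# Sketch of rule-node evaluation, assuming the "condition group" semantics
# described above. Operator set and combination of groups are illustrative.

OPERATORS = {
    "==": lambda a, b: a == b,
    "!=": lambda a, b: a != b,
    ">":  lambda a, b: a > b,
}

def evaluate_group(group, record):
    """A group holds (field, operator, value) conditions plus a mode:
    'all' ("satisfy all conditions") or 'any' ("satisfy any condition")."""
    results = (OPERATORS[op](record.get(field), value)
               for field, op, value in group["conditions"])
    return all(results) if group["mode"] == "all" else any(results)

def evaluate_rule(groups, record):
    # Combining groups with "any" is an assumption made for this sketch.
    return any(evaluate_group(g, record) for g in groups)
```

For the close-order rule, a single "any" group with three equality conditions on the status field would admit exactly the three allowed states.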

For the rule node of "Close Order", select "Any Condition" for "Matching Rule", and then add three conditions, and then configure the three status values ​​that the status field of "Production Order" allows to meet: "In progress", "Completed completed" and "not started".

The remaining orchestration and configuration steps of the "Close Order" logic flow are similar to those of the "create a production order and generate a process order" logic flow described above, and are not repeated here:

At this point, the modeling configuration of the "close the production order and automatically close the process orders" business logic flow is complete.

Logic flow 4: Confirm the completion receipt, and automatically update product and warehouse inventory

Requirements for the "confirm the completion receipt and automatically update product and warehouse inventory" logic flow:

"Completion Receipt" provides an operation button of "Confirm Complete". When the user operates the "Confirm Complete" button, it needs to verify whether the status of the "Complete Receipt" meets the conditions. If the conditions are not met, the user will be prompted. If the conditions are met, then According to information such as "Product", "Warehouse" and "Receipt Quantity" of "Completion Receipt Document", update the "Inventory on hand" of the corresponding product master data and the inventory on hand of the product under the corresponding warehouse. Finally update the status of "Completion Receipt Document" to "Completed".

The orchestration and configuration methods involved in the "Confirm Complete" logic flow have already been introduced in the logic flow cases above and are not repeated here.

Here we focus on the configuration method of the "Update Warehouse Product Inventory" action node:

  • Select "Update Data" for the output type, and check "Create data if no match": because when a product enters a certain warehouse for the first time, there is no inventory data of the product under the warehouse (that is, "Warehouse The data instance of the product inventory does not exist in the sub-model "Product Inventory" of the "spatial location model), then the first warehousing is to create the corresponding product inventory data, and the subsequent completed warehousing operation is to update the corresponding product inventory data , so for this business scenario, you need to check "Create data if no match".
  • Configure the field mapping of the output model "Warehouse-Product Inventory" to the "Warehouse Location" and "Product" field values ​​of "Completion Receipt", and check "Filter Field" to match the target data of the "Warehouse-Product Inventory" model . Finally, refer to the standard function "current value self-increment function" to calculate the "inventory on hand" after completion and storage:

"Basic Field" parameter selects the "Inventory On Hand" field of the output model "Warehouse-Product Inventory" (the current inventory of the corresponding warehouse product can be obtained), and the "Incremental Field" parameter selects the "Receipt" of "Completed Receipt" Inventory Quantity" to increase the "Incoming Quantity" to the current inventory of a product in the warehouse.

At this point, we have completed the modeling configuration of all four business logic flows in the case scenario. After modeling is complete, each logic flow and the business model it belongs to must be published for the logic flow to take effect:

(3) Operation effect verification

Logic flow 1: Create a production order and automatically generate a process order

Create a production order and enter its "product", "routing", and "planned output" field information:

After saving the "production order", enter the detail editing page of the created production order to check and verify whether the corresponding process order is created synchronously:

Logic flow 2: Operate the production order and generate material requirements

Using the production order created in the previous step, verify that a "Generate Material Requirements" operation button is present:

Then click the "Generate Material Requirement" button, and after the execution is complete, check whether the corresponding Material Requisition has been generated, whether the field value of the "Associated Material Requirement" on the production order has been automatically backfilled and updated, and whether the "Generate Material Requirement" button is no longer visible:

Check whether the demand details of the generated material requirements list cover the BOM materials of the product on the production order:

Logic flow 3: Close the production order and automatically close the process orders

Using the production order created in the previous step, verify that a "Close Order" operation button is present:

Then click the "Close" button to verify whether the pop-up window needs to enter the "Close Reason":

After confirming, verify whether the status of the production order and its related process orders has been automatically updated to "closed":

Logic flow 4: Confirm the completion receipt, and automatically update product and warehouse inventory

Create a "Completion Receipt" and enter the field information of the production "product", "receipt quantity" and "warehouse location" of the order:

After saving the "Receipt of Completion", on the data viewing page of the Receipt of Completion, click Operation, and then select "Confirm Complete":

After the operation completes, check in the "Product" master data function whether the on-hand quantity of the product has increased:

In the "Factory > Warehouse" master data function, check whether the existing quantity of products under the "Finished Warehouse" warehouse has increased:

The above walks through the business activity processes involved in an actual "production execution management" scenario and systematically introduces how to use the logical model orchestrator of the Huawei Cloud Digital Factory Platform to model and use the related business process logic. The next issue will systematically introduce how to use the "Analysis" model configurator of the HUAWEI CLOUD Digital Factory Platform to build business data statistical analysis functions.

Add the HUAWEI CLOUD IoT assistant on WeChat (hwc-iot) and reply "Digital Factory" to apply to experience HUAWEI CLOUD Digital Factory.


Click to follow and learn about Huawei Cloud's fresh technologies for the first time~

Origin my.oschina.net/u/4526289/blog/8904149