The SAP DataSource catalog, the DataSource catalog in BW, and loading/transforming source data

This is also a fairly small topic; it may be expanded in the future.

In BW's DataSource tree, if you select an SAP source system, you can see all of that SAP system's DataSource folders. These folders have a name: application components.

Load master data from the SAP source system

So how does BW access SAP's DataSources?
To access a DataSource and map its data to a BW InfoProvider, we have to tell BW the DataSource's name and fields.
This is done by replicating the DataSource metadata.

Now, under our BW DataSource tree, we can see many application components.
[Screenshot]
These directories are fixed. On the SAP side they live in transaction RSA5:
[Screenshot]
If you compare the two and find that BW shows fewer application components than RSA5, that is because components whose metadata has not been replicated are hidden in BW. They need to be made visible:
[Screenshot]

Now you only need to right-click the application component and choose Replicate Metadata.
[Screenshot]
Of course, you can also create DataSources yourself.
All of this can be done directly on the BW side, because the RFC connection to SAP is already in place.

Or you can go to the SAP system and build a DataSource yourself with transaction SBIW. Generally, we build DataSources on top of views.
[Screenshots]
Or, in BW, right-click the source system and choose to customize the extractors, which logs you in to SBIW on the SAP side.
[Screenshot]

"SAP" above always refers to the ERP source system.

If the DataSource you want to create does not fit under any of the existing components, you can create a new component folder:
in SBIW, choose the postprocessing of DataSources under the general settings, insert the name of the new component at the desired place in the hierarchy, then change the description and save.
[Screenshot]
When you create a new DataSource, you have to select an application component, i.e. decide which component it will sit under.
[Screenshot]
Then select the fields to be extracted.
[Screenshot]
I'll skip the actual loading of master data from the source system; the DataSource has been replicated in any case. Whether you built it yourself or the system shipped with it, the next step is simply to build a transformation to an InfoObject (provided that InfoObject has master data or texts) plus a DTP, then select the extraction mode, full or delta, and add a filter if there is a time restriction.
After extracting the master data or texts, just look at them in the InfoObject.

If there are a lot of DataSources, just put an InfoSource in between.

Load transaction data from the SAP source system

For example: the DataSource is 0CO_OM_CCA_1
InfoSource: Cost Centers and Costs
The extraction structure is written up as an InfoSource, which I don't fully understand. Presumably the InfoSource serves as a unified target for the different DataSources.
[Screenshot]


Let's look at the transformation first. The following buttons are quite useful:
1. See at a glance how many fields there are.
2. See technical fields such as the request number.
3. View the transformation in detail.
[Screenshot]
After opening button 3, it looks like this:
[Screenshot]

The role of the transformation

A transformation is really a set of rules: every field of every data record in every data package is processed through the corresponding transformation rule. (The packages come from the DTP.)
From the transformation above we can see how many fields each record contains.
The number of records per package is generally set in the DTP. Data is then extracted package by package, so the transformation also processes package by package.
If the packages are large, processing is slow. (That is, provided you have transformation rules; if you just move the data straight across it hardly matters. It is the rule lookups that take time.)

Rules can be written in the start routine, the end routine or the expert routine, and also on the individual fields.

Start routine

ABAP that runs for each package when the transformation starts. The results of its calculations are stored in a global data structure or table. (The memory resides in the transformation program's buffer.)
This structure or table can be accessed from the other routines. (Read from the buffer, or write to it again.)

Purpose: package-based data preparation before the transformation starts.
For example: delete records that do not need updating, or buffer a database table into an internal table for the transformation.
[Screenshots]
Reads then go from the internal table into the work area.
If a lot of data has to be processed, this avoids each individual rule accessing the database repeatedly, and reading directly from the buffer improves performance.
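
As an illustration, here is a minimal start routine sketch in the style of a BW 7.x class-based transformation. It is only a sketch: the global table gt_cc_lookup (declared in the global part of the routine class), the database table ZCC_LOOKUP and the field names AMOUNT, COSTCENTER and COMP_CODE are hypothetical.

METHOD start_routine.
* Package-level preparation: drop records that never need updating.
  DELETE SOURCE_PACKAGE WHERE amount = 0.

* Buffer the lookup table into a global internal table once, so that
* the field routines read from memory instead of the database.
  IF gt_cc_lookup IS INITIAL.
    SELECT costcenter comp_code
           FROM zcc_lookup
           INTO TABLE gt_cc_lookup.
    SORT gt_cc_lookup BY costcenter.
  ENDIF.
ENDMETHOD.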

Field assignment

For simple field mapping, setting a constant value, or using a formula, no ABAP coding is required. The field assignment here transforms the corresponding field on every data record of the package, after the start routine has run.
Direct assignment means a straight mapping; a routine means writing code.
[Screenshot]
You can also switch to other rule types, depending on your needs:
[Screenshot]
The source can have multiple fields, while the target generally has one field.
Let’s take a closer look at the options above.

No transformation

Generally you wouldn't choose this one...

Constant

Specify a constant value.

Direct assignment

A straight mapping, filled from the source, or from a matching field of the same type. If the types match but the units differ, such as currency units or units of measure, you have to use the currency/unit-of-measure conversion function to convert the source unit to the target unit.

Formula

Update the InfoObject with the calculated value of a formula.

Read master data

This reads the master data of an InfoObject.
Use it when the target has a field with no directly matching field in the source, but a matching field exists as an attribute of one of the source InfoObjects. For example, if the target has a company code and the source only has a cost center, but the cost center has a company code attribute, then I can read the company code from the cost center's master data table to fill the target company code.
If your cost center's master data changes, the change has to be reflected here as well.
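
Under the hood this is just a master data lookup. A rough hand-written equivalent as a field routine body might look like the following; this is a sketch, not the code BW generates, and a real lookup would also restrict on the compounded controlling area.

* Look up the company code attribute in the cost center master data
* table (active version only).
  SELECT SINGLE comp_code
         FROM /bi0/pcostcenter
         INTO RESULT
         WHERE costcenter = SOURCE_FIELDS-costcenter
           AND objvers    = 'A'.
  IF sy-subrc <> 0.
    CLEAR RESULT.
  ENDIF.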

Routine

This is a field routine with a single return value. There is an option for whether the routine is valid for all attributes of the InfoObject or only for the display attributes.
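
A minimal field routine sketch. The method name follows the generated compute_<targetfield> pattern, and the DOC_DATE source field is a hypothetical stand-in:

METHOD compute_0calday.
* Skip the whole record if the date is missing, otherwise map it
* as the single return value.
  IF SOURCE_FIELDS-doc_date IS INITIAL.
    RAISE EXCEPTION TYPE cx_rsrout_skip_record.
  ENDIF.
  RESULT = SOURCE_FIELDS-doc_date.
ENDMETHOD.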

Rule group

So where does this "group" come from?
I don't know much about it.
In transaction data I generally only see a standard group and a technical group.
The standard group contains all the fields and transformation rules.
[Screenshot]
The technical group usually just holds the record mode,
[Screenshot]
but there is also an option there to create a new rule group.
What is it for?

The following explanation is quite interesting, although we didn't use it that way.

We have seen above that a transformation can contain many rule groups, and each rule group defines a complete set of transformation rules.

Imagine your source has:

Order date
Delivery date
Order quantity
Delivery quantity

The target has only one date characteristic: 0CALDAY.
Then we can split the rule groups. Rule group 1: for the order-quantity key figure, map the order date to 0CALDAY.
Rule group 2: for the delivery-quantity key figure, map the delivery date to 0CALDAY.

(Of course this can also be achieved by writing code; see the sketch below.)
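
A sketch of the code alternative, written as an expert routine that emits two target records per source record, one per key figure. The structure names and the fields ORDER_DATE, DELIV_DATE, ORDER_QTY, DELIV_QTY and QUANTITY are hypothetical stand-ins for the generated types:

METHOD expert_routine.
  DATA ls_result TYPE _ty_s_tg_1.
  FIELD-SYMBOLS <ls_src> TYPE _ty_s_sc_1.

  LOOP AT SOURCE_PACKAGE ASSIGNING <ls_src>.
*   Rule group 1 equivalent: order quantity keyed by the order date.
    CLEAR ls_result.
    ls_result-calday   = <ls_src>-order_date.
    ls_result-quantity = <ls_src>-order_qty.
    APPEND ls_result TO RESULT_PACKAGE.

*   Rule group 2 equivalent: delivery quantity keyed by the delivery date.
    CLEAR ls_result.
    ls_result-calday   = <ls_src>-deliv_date.
    ls_result-quantity = <ls_src>-deliv_qty.
    APPEND ls_result TO RESULT_PACKAGE.
  ENDLOOP.
ENDMETHOD.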

End routine

Post-processing of the data, executed after the package-by-package transformation.
For example, deleting records that are not needed.
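
A minimal end routine sketch; the COMP_CODE filter value is hypothetical:

METHOD end_routine.
* Post-process the package after all field rules have run:
* drop records we do not want in the target.
  DELETE RESULT_PACKAGE WHERE comp_code = 'TEST'.
ENDMETHOD.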

Expert routine

A fully self-programmed transformation for special purposes.
You have to take care of the monitor messages yourself, otherwise you cannot monitor the DTP. Also, if transformation rules already exist when you create an expert routine, those rules are deleted.
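
For the monitor messages, the routine exposes a MONITOR table that you fill yourself; a minimal sketch, assuming the usual rstmonitor line type (the message class and number are hypothetical):

* Append a message so the DTP monitor does not stay empty.
  DATA ls_monitor TYPE rstmonitor.
  ls_monitor-msgid = 'ZBW'.      "hypothetical message class
  ls_monitor-msgty = 'I'.        "information message
  ls_monitor-msgno = '001'.
  APPEND ls_monitor TO MONITOR.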

Aggregation type

One more thing worth mentioning: the aggregation type of key figures. It determines how a key figure is updated when records have the same primary key: summation, minimum, maximum.
InfoObjects seem to offer only overwrite.
A DSO can apparently choose between overwrite and the other aggregation methods.
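
A quick worked example: suppose two records with the same key arrive in order, (cost center C100, amount 40) and then (C100, amount 60). With summation the target ends up with 100; with overwrite, 60 (the last record wins); with maximum, 60; with minimum, 40.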

Source: blog.csdn.net/weixin_45689053/article/details/111036315