Get System Info, Table Input, and Text File Output:
(Figure: the complete transformation.)
Set a parameter in the Get System Info step. The parameter supplies the WHERE condition of the SELECT statement in the Table Input step. In the Table Input step, remember to tick "Replace variables in script" and set "Insert data from step" to the Get System Info step.
The system information is set when the transformation starts.
Before running the transformation, set the parameter: device_id is 550377, so the SQL statement for the Table Input is:
SELECT *
FROM zj_jyxx_info1 WHERE DEVICE_ID>'550377'
Text File Output: this step writes the rows read from the table out to a TXT file.
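The flow above can be sketched in plain Python: a parameter value is substituted into the Table Input SQL, and the result rows are written out as delimited text. This is a stand-in, not Kettle itself; an in-memory SQLite table and the ";" separator are assumptions, while the table name and WHERE condition mirror the example.

```python
import sqlite3

device_id = "550377"  # the parameter set before running the transformation

# Stand-in source table with the same name as in the example.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE zj_jyxx_info1 (DEVICE_ID TEXT, INFO TEXT)")
con.executemany("INSERT INTO zj_jyxx_info1 VALUES (?, ?)",
                [("550376", "old"), ("550378", "new"), ("550400", "newer")])

# Kettle replaces ${device_id} in the SQL; a bound parameter does the same job here.
rows = con.execute(
    "SELECT * FROM zj_jyxx_info1 WHERE DEVICE_ID > ? ORDER BY DEVICE_ID",
    (device_id,),
).fetchall()

# Text File Output: one delimited line per row.
lines = [";".join(r) for r in rows]
print("\n".join(lines))
```

Note that DEVICE_ID is compared as text here, matching the quoted '550377' in the original SQL.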
Text file input to the database:
The overall flow is as follows:
Read data from a text file and load it into a table in the database.
Double-click the Text File Input step, click Browse to locate the text file, click Open, and then click Add.
Here you can set the input options for the text file: file type, separator, encoding, and so on.
File type:
For a plain text file choose CSV (Comma Separated Values), a plain text format for storing tabular data, commonly used with spreadsheets and database software.
Separator: the key option. The text is split into fields on this character, so choose it to match the file's actual layout; for the sample data here the author recommends several spaces plus ":".
Table Output: after selecting the database connection and target table, set the input field mapping. Because this text file was exported from the same table earlier, the fields map one-to-one; then click Run.
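The reverse direction, Text File Input feeding a Table Output, can be sketched the same way: split each line on the chosen separator and insert the fields positionally into the table. The in-memory SQLite table and the ";" separator are assumptions for illustration.

```python
import io
import sqlite3

# Stand-in for the text file selected in the Text File Input step.
text_file = io.StringIO("550378;new\n550400;newer\n")

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE zj_jyxx_info1 (DEVICE_ID TEXT, INFO TEXT)")

# Split each line on the separator, then map fields positionally
# onto the table columns, like the field mapping in Table Output.
rows = [line.rstrip("\n").split(";") for line in text_file]
con.executemany("INSERT INTO zj_jyxx_info1 VALUES (?, ?)", rows)

count = con.execute("SELECT COUNT(*) FROM zj_jyxx_info1").fetchone()[0]
```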
Excel input:
The overall flow is as follows:
Excel input works much like text file input:
browse, select, and add the file in the same way.
Then, on the Sheets tab, select the worksheet that holds your data (usually Sheet1).
On the Fields tab you can fetch the fields directly.
Click OK. The Table Output step is then configured like the text file case above; because this Excel data was exported straight from the database, the field definitions can be obtained directly and loaded without recreating the table.
Click Run, and the data is imported from Excel into the database.
XML file input
As shown in the figure, the overall flow of XML file input.
The File tab is configured much like text file input.
On the Content tab, select the encoding and the loop XPath (the node path that is read repeatedly);
For example, given this XML file:
<?xml version="1.0" encoding="utf-8"?>
<res>
<item>
<Sqid>SPSCSP3317000636</Sqid>
<QYMC>Shanghai Hualing Technology Development Co., Ltd.</QYMC>
<Fzjg></Fzjg>
<Lxrxm></Lxrxm>
<Lxrsj></Lxrsj>
<InsertTime>2017-11-23 09:49:00.0</InsertTime>
</item>
<item>
<Sqid>SPSCSP3317000636</Sqid>
<QYMC>Shanghai Hualing Technology Development Co., Ltd.</QYMC>
<Fzjg></Fzjg>
<Lxrxm></Lxrxm>
<Lxrsj></Lxrsj>
<InsertTime>2017-11-23 09:49:00.0</InsertTime>
</item>
</res>
the loop XPath is /res/item, the full path to each repeating node.
On the Fields tab, click Get Fields to pull in all the fields under each item node.
Then configure the Table Output: select the target table, set the field mapping, click OK, and run.
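The same loop over repeating item nodes can be reproduced with Python's standard library, as a sketch of what the XML Input step does. The document below is a trimmed copy of the sample above (the empty Fzjg/Lxrxm/Lxrsj fields are omitted for brevity).

```python
import xml.etree.ElementTree as ET

xml_doc = """<?xml version="1.0" encoding="utf-8"?>
<res>
  <item>
    <Sqid>SPSCSP3317000636</Sqid>
    <QYMC>Shanghai Hualing Technology Development Co., Ltd.</QYMC>
    <InsertTime>2017-11-23 09:49:00.0</InsertTime>
  </item>
  <item>
    <Sqid>SPSCSP3317000636</Sqid>
    <QYMC>Shanghai Hualing Technology Development Co., Ltd.</QYMC>
    <InsertTime>2017-11-23 09:49:00.0</InsertTime>
  </item>
</res>"""

root = ET.fromstring(xml_doc)

# One record per repeating <item> node, with child tags as field names --
# the same shape the loop XPath and Get Fields produce in Kettle.
records = [
    {child.tag: (child.text or "") for child in item}
    for item in root.findall("item")
]
```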
Generate random numbers and output after a calculation
The figure shows the whole flow.
In the Generate Random Value step, name each random field and select its type.
In the Calculator step, specify the new field to create, the calculation, the input fields, and the type and length of the result.
The figure shows the final text file output: num1, num2, num3, and the value of num1*num2.
In the Write To Log step, click Get Fields to choose which field values to print; the generated random numbers and the calculation result are then written to the log.
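A minimal Python sketch of Generate Random Value feeding a Calculator: three random fields are created, then a new field holds num1 * num2. The fixed seed and the field name "product" are assumptions added so the run is reproducible.

```python
import random

rng = random.Random(42)  # seeded so the run is reproducible

# Generate Random Value: three named random fields.
row = {
    "num1": rng.random(),
    "num2": rng.random(),
    "num3": rng.random(),
}

# Calculator: new field = A * B, with num1 and num2 as inputs.
row["product"] = row["num1"] * row["num2"]
```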
CSV file input
This is the overall flow of CSV file input.
First prepare some data in a TXT file.
The data format is shown above.
In the file input step, set the separator and encoding, get the fields, and preview to check for problems.
Then configure the Table Output and the field mapping, and execute.
This is the data after a successful run.
JSON input
The figure below shows the entire flow.
To read JSON, first save the JSON string as a file (the author saves it with a .js extension so Kettle recognizes it) and set the encoding to ANSI to avoid garbled characters.
Then add the file, select the fields, and in the Table Output step choose the database connection, target table, and field mapping.
The figure shows the input JSON string and the rows inserted into the target table.
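The parsing half of this flow can be sketched with the standard json module: parse the string, then flatten each object into a row. The "data" key and the id/name fields are hypothetical stand-ins, not taken from the screenshots; Kettle itself selects fields with JSONPath expressions such as $.data[*].id.

```python
import json

# Hypothetical JSON string, standing in for the file the step reads.
raw = '{"data": [{"id": 1, "name": "a"}, {"id": 2, "name": "b"}]}'
parsed = json.loads(raw)

# Flatten each object into a row tuple, like the field selection
# in the Json Input step before Table Output.
rows = [(obj["id"], obj["name"]) for obj in parsed["data"]]
```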
Generate Rows
The figure shows the entire flow; it writes the generated rows to a TXT file.
In the Generate Rows step you can define each field's name, type, length, and constant value. The Limit setting caps the number of rows emitted, for example five, as shown below.
The result of the text file output is shown in the following figure:
A, B, and C are the names of the three fields; below them are their values.
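A tiny sketch of Generate Rows with a row limit: every row carries the same constant field values, and a limit caps how many rows are emitted. The field names and constant values here are placeholders, and tab is an assumed output separator.

```python
# Generate Rows: name -> constant value for each field.
fields = {"a": "1", "b": "2", "c": "3"}
limit = 5  # the Limit setting: emit at most this many rows

rows = [fields.copy() for _ in range(limit)]

# Text File Output style: one delimited line per row.
lines = ["\t".join(r.values()) for r in rows]
```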
Get subfolder names:
This step outputs every subfolder of a directory as field data.
The following figure shows the result after writing it to a text file.
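What the step does can be sketched with os.scandir: list every direct subdirectory of a folder as row data. A throwaway temporary directory tree stands in for real input; the sub_a/sub_b names are invented for the example.

```python
import os
import tempfile

# Build a throwaway directory tree to scan.
base = tempfile.mkdtemp()
for name in ("sub_a", "sub_b"):
    os.mkdir(os.path.join(base, name))

# One row per direct subdirectory, like the step's output fields.
subdirs = sorted(e.name for e in os.scandir(base) if e.is_dir())
```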
Get file names
This step emits the matching file names as rows, which can also be inserted into the database.
This preview shows the file name and path, creation time, size, and other details.
Get files rows count:
This step returns the number of data rows in a file.
The following figure shows the preview.
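The last two steps combined, collecting a file's name, size, and row count, can be sketched like this; the sample file and its three lines are invented for the example.

```python
import os
import tempfile

# Create a sample file standing in for the step's input.
path = os.path.join(tempfile.mkdtemp(), "sample.txt")
with open(path, "w", encoding="utf-8") as f:
    f.write("row1\nrow2\nrow3\n")

# Get File Names style metadata...
info = {
    "filename": os.path.basename(path),
    "size": os.path.getsize(path),
}

# ...plus Get Files Rows Count: one count per line.
with open(path, encoding="utf-8") as f:
    info["rows"] = sum(1 for _ in f)
```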
Get table names:
The figure below gives an overview.
The Get Table Names step lists all the tables in a database and passes each name downstream as field data, which can then be inserted into a table.
In the step you can choose whether to include catalog names, table names, views, procedure names, and so on.
The following figure shows the preview with only table names selected.
The following figure shows the result in the database after Table Output.
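The idea of treating catalog metadata as row data can be sketched with SQLite, whose sqlite_master plays the role that information_schema plays in other databases. The t1/t2 table names are invented for the example.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE t1 (id INTEGER)")
con.execute("CREATE TABLE t2 (id INTEGER)")

# Query the catalog for table names; each name becomes a data row,
# like the output of the Get Table Names step.
tables = sorted(
    name
    for (name,) in con.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'"
    )
)
```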
Get repository configuration
So far the author has only found that this step lists all the jobs and transformations in the repository database Kettle is connected to.
The following figure shows the preview.
Excel output:
This step writes table data to Excel. Exporting used to be tedious; with Kettle you simply add a Table Input with your SQL, connect an Excel Output, and click Run, and it is fast.
The upper figure is the overview; the lower one shows Table Input feeding Excel Output.
In the Excel Output step you can choose the extension of the output file, xls or xlsx.
JSON output
This step serializes table rows into a JSON string.
The figure below gives an overview.
The Table Input is configured the same as before, unchanged.
In the Json Output step you can set the output file name, the number of output files, the encoding, and so on.
The following figure shows the result of outputting four rows.
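The serialization half of this flow is a one-liner with the json module: a few rows wrapped under a single result key. The "data" key name and the id/name fields are assumptions for illustration, not Kettle's fixed output shape.

```python
import json

# Four rows, standing in for the Table Input result.
rows = [
    {"id": 1, "name": "a"},
    {"id": 2, "name": "b"},
    {"id": 3, "name": "c"},
    {"id": 4, "name": "d"},
]

# Json Output style: serialize all rows under one block key.
out = json.dumps({"data": rows}, ensure_ascii=False)
```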
SQL file output
Fed by a Table Input, this step turns the input table's structure and data into SQL statements written out through a file.
The figure below gives an overview.
Table Input is the same as before.
The following figure shows the SQL File Output step.
Select the database connection and target table. Under the output file settings you can set the file name, whether to emit CREATE TABLE or TRUNCATE statements, and options such as including the date in the file name.
On the Content tab you can choose the date format and encoding.
The following figure shows the output file, which can be executed directly against the database.
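The whole round trip can be sketched in Python: turn a table's structure and rows into CREATE TABLE and INSERT statements, then execute that script against another database to prove it rebuilds the data. The demo table, its columns, and its rows are invented for the example; a real implementation would also need to escape quotes in string values.

```python
import sqlite3

# Source table standing in for the Table Input.
source = sqlite3.connect(":memory:")
source.execute("CREATE TABLE demo (id INTEGER, name TEXT)")
source.executemany("INSERT INTO demo VALUES (?, ?)", [(1, "a"), (2, "b")])

# SQL File Output style: DDL first, then one INSERT per row.
script = ["CREATE TABLE demo (id INTEGER, name TEXT);"]
for rid, name in source.execute("SELECT id, name FROM demo ORDER BY id"):
    script.append(f"INSERT INTO demo VALUES ({rid}, '{name}');")
sql_text = "\n".join(script)  # this is the file the step would write

# The generated script can be executed directly against another database.
target = sqlite3.connect(":memory:")
target.executescript(sql_text)
copied = target.execute("SELECT COUNT(*) FROM demo").fetchone()[0]
```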