[Kettle] Study notes: importing HBase data into MySQL

1. Connect to the Hadoop cluster and MySQL

 1-1 Download the current cluster's HBase configuration files

 1-2 Extract them into the cdh510 configuration directory, replacing the files there

 1-3 Check the plugin.properties configuration under D:\kettle\data-integration\plugins\pentaho-big-data-plugin
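      For reference, the key line in plugin.properties is the active Hadoop configuration, which should point at the shim directory that now holds the cluster files (cdh510 here). This is a sketch from my own setup; the property name may vary slightly between Kettle versions:

      # D:\kettle\data-integration\plugins\pentaho-big-data-plugin\plugin.properties
      # Point the plugin at the cdh510 shim that contains the cluster's configuration files
      active.hadoop.configuration=cdh510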

 1-4 Configure the Hadoop cluster in Kettle

      After testing the connection, you will find that User Home Directory Access and Root Directory Access report errors. This may be because we are operating on the Hadoop data from a host that is not part of the cluster, which leaves us without permission to write data into HBase. I have not found an exact solution, but it does not affect reading data from HBase, so I am setting it aside for now. (If anyone knows how to solve this, please share your guidance~)

 1-5 Connect to MySQL
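      A minimal sketch of the connection settings (the host, port, and database name below are placeholders); note that Kettle needs the MySQL JDBC driver jar, e.g. mysql-connector-java, copied into data-integration\lib before the connection test will succeed:

      Connection type : MySQL
      Access          : Native (JDBC)
      Host Name       : 192.168.1.100   (placeholder)
      Port Number     : 3306
      Database Name   : test_db         (placeholder)
      User Name / Password : as configured on the MySQL server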

 2. The Kettle flow

 2-1 Read data from HBase

 2-2 Row/column transformation

       Since HBase is not a relational database, its fields need a row/column transformation first; otherwise the exported data does not come out in the relational shape we need.

   Select it from Core Objects - Transform - the column-to-row conversion step

 I recommend this blog post; its example explains the column-to-row step very vividly: https://www.cnblogs.com/OliverQin/p/5871330.html

 Then rename the stream fields so that they correspond to the fields of the MySQL output table. PS: the fields must correspond to each other exactly here.
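 As a rough illustration (the field names and values below are made up), the key/value rows coming out of the HBase input are pivoted into one wide row per row key, which is then renamed to match the MySQL table:

      Before (one row per HBase column)          After (one row per row key)
      rowkey   column   value                    rowkey   name    age
      001      name     Tom                      001      Tom     25
      001      age      25                       002      Jerry   23
      002      name     Jerry
      002      age      23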

2-3 Table output

Output the data to the target MySQL table.
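The main Table output settings, sketched with a hypothetical target table name; the stream field names must already match the table columns after the renaming in 2-2:

      Connection   : the MySQL connection created in 1-5
      Target table : user_info          (hypothetical)
      Commit size  : 1000
      Optionally tick "Truncate table" for a full reload, or
      "Specify database fields" to map stream fields to columns explicitly.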

 2-4 Run results

PS: This is exploratory learning; discussion and mutual guidance are welcome~


Origin www.cnblogs.com/AutumnNight/p/10938709.html