logstash-input-jdbc sync issue notes

Copyright notice: this is an original post by the author and may not be reproduced without permission. https://blog.csdn.net/jjshouji/article/details/80373906

1. The sync reports no errors, but no index is created in ES

[elk@test1 bin]$ ./logstash -f ../data_config/account_1.conf 
ERROR StatusLogger No log4j2 configuration file found. Using default configuration: logging only errors to the console.
Sending Logstash's logs to /home/elk/logstash-5.5.1/logs which is now configured via log4j2.properties
[2018-05-19T13:49:44,131][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://10.0.7.71:9200/]}}
[2018-05-19T13:49:44,148][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://10.0.7.71:9200/, :path=>"/"}
[2018-05-19T13:49:44,340][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>#<Java::JavaNet::URI:0x2f62d3b4>}
[2018-05-19T13:49:44,350][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-05-19T13:49:44,437][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-05-19T13:49:44,450][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>[#<Java::JavaNet::URI:0x7456407b>]}
[2018-05-19T13:49:44,453][INFO ][logstash.pipeline        ] Starting pipeline {"id"=>"main", "pipeline.workers"=>6, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>750}
[2018-05-19T13:49:44,869][INFO ][logstash.pipeline        ] Pipeline main started
[2018-05-19T13:49:44,956][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2018-05-19T13:49:46,381][INFO ][logstash.inputs.jdbc     ] (0.025000s) SELECT count(*) AS `count` FROM (SELECT id,type as dic_type,code,value,create_time,idx,remarks,crm_id,hr_id from jw_slave.dictionary) AS `t1` LIMIT 1
[2018-05-19T13:49:46,416][INFO ][logstash.inputs.jdbc     ] (0.012000s) SELECT * FROM (SELECT id,type as dic_type,code,value,create_time,idx,remarks,crm_id,hr_id from jw_slave.dictionary) AS `t1` LIMIT 50000 OFFSET 0
{"code":"course_common","@timestamp":"2018-05-19T05:49:46.541Z","create_time":"2016-10-24T03:06:06.000Z","crm_id":"43A0FF8A-D499-4491-B6B8-FA73EA73265F","@version":"1","id":1,"idx":1,"type":"dic","value":"常规课","remarks":"","hr_id":"","dic_type":"course"}
{"code":"term_2018_fall","@timestamp":"2018-05-19T05:49:46.824Z","create_time":"2017-03-21T02:37:14.000Z","crm_id":"","@version":"1","id":161,"idx":201804,"type":"dic","value":"2018秋季","remarks":"","hr_id":"","dic_type":"term"}
{"code":"level_glory","@timestamp":"2018-05-19T05:49:46.824Z","create_time":"2017-06-02T08:53:00.000Z","crm_id":"","@version":"1","id":162,"idx":9,"type":"dic","value":"荣耀班","remarks":"","hr_id":"","dic_type":"level"}
{"code":"level_dyw","@timestamp":"2018-05-19T05:49:46.825Z","create_time":"2017-09-13T07:33:33.000Z","crm_id":"DYW","@version":"1","id":163,"idx":10,"type":"dic","value":"大语文","remarks":"","hr_id":"","dic_type":"level"}
{"code":"course_special","@timestamp":"2018-05-19T05:49:46.826Z","create_time":"2017-12-08T06:48:32.000Z","crm_id":"","@version":"1","id":164,"idx":13,"type":"dic","value":"特色课","remarks":"","hr_id":"","dic_type":"course"}
{"code":"term_2019_winter","@timestamp":"2018-05-19T05:49:46.827Z","create_time":"2018-03-14T03:35:17.000Z","crm_id":"","@version":"1","id":165,"idx":201901,"type":"dic","value":"2019寒假","remarks":"","hr_id":"","dic_type":"term"}
{"code":"term_2019_spring","@timestamp":"2018-05-19T05:49:46.829Z","create_time":"2018-03-14T03:35:19.000Z","crm_id":"","@version":"1","id":166,"idx":201902,"type":"dic","value":"2019春季","remarks":"","hr_id":"","dic_type":"term"}
{"code":"term_2019_summer","@timestamp":"2018-05-19T05:49:46.829Z","create_time":"2018-03-14T03:35:47.000Z","crm_id":"","@version":"1","id":167,"idx":201903,"type":"dic","value":"2019暑假","remarks":"","hr_id":"","dic_type":"term"}
{"code":"term_2019_fall","@timestamp":"2018-05-19T05:49:46.831Z","create_time":"2018-03-14T03:36:07.000Z","crm_id":"","@version":"1","id":168,"idx":201904,"type":"dic","value":"2019秋季","remarks":"","hr_id":"","dic_type":"term"}
[2018-05-19T13:49:47,902][WARN ][logstash.agent           ] stopping pipeline {:id=>"main"}

# The sync ran without errors, but ES did not create the corresponding index
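Before touching the config, it is worth confirming that the index really is missing rather than just invisible to the client you are using. The cluster address comes from the log above (10.0.7.71:9200; adjust for your environment), and _cat/indices lists every index ES knows about:

[elk@test1 bin]$ curl 'http://10.0.7.71:9200/_cat/indices?v'

If dictionary does not appear in that list, the documents never reached Elasticsearch even though they were printed to stdout.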

2. Configuration file problem

[elk@test1 bin]$ cat ../data_config/account_1.conf
input {
  jdbc {
    jdbc_connection_string => "jdbc:mysql://10.0.7.244:3306/jw_account?zeroDateTimeBehavior=convertToNull"
    jdbc_user => "root"
    jdbc_password => "Abc123"
    jdbc_driver_library => "/home/elk/mysql-connector-java-5.1.44-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_paging_enabled => "true"
    jdbc_page_size => "50000"
    statement => "SELECT id,type as dic_type,code,value,create_time,idx,remarks,crm_id,hr_id from jw_slave.dictionary"
    #schedule => "* */2 * * *"
    type => "dic"
  }
}
output {
  stdout {
    codec => json_lines
  }
  if [type] == "dic" {
    elasticsearch {
      hosts => "10.0.7.71:9200"
      index => "dictionary"
      document_type => "dictionary"
    }
  }
}
Root cause: the custom type => "dic" set in the config file has the same name as the type column in the MySQL table. The resulting duplicate field name in the synced documents is what kept them out of the index, so the table column has to be aliased in the SELECT statement.
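The fix is confined to the jdbc input. A minimal before/after sketch (the column name type and the alias dic_type come from the statement above; any alias that does not clash with an event field would work just as well):

# broken: the SQL column "type" collides with the event field produced by type => "dic"
# statement => "SELECT id,type,code,value,create_time,idx,remarks,crm_id,hr_id from jw_slave.dictionary"

# fixed: alias the column, so the event carries dic_type (from MySQL) alongside type (from the input)
statement => "SELECT id,type as dic_type,code,value,create_time,idx,remarks,crm_id,hr_id from jw_slave.dictionary"
type => "dic"

With the aliased statement in place, rerun ./logstash -f ../data_config/account_1.conf and check _cat/indices again; the dictionary index should now be created.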

