Testing Logstash: writing data into Elasticsearch

After getting some single-machine experience with Logstash, I had a rough feel for it, but that wasn't enough.

Next question: how do I push data into ES straight from the command line, typing the input by hand?

Only a small tweak to the syntax is needed:

logstash -e 'input { stdin{} }  output {  elasticsearch {}  }'

The full session looks like this:
C:\Users\Administrator>D:\es\logstash-5.4.1\logstash-5.4.1\bin\logstash -e 'input { stdin{} }  output {  elasticsearch {}  }'
Sending Logstash's logs to D:/es/logstash-5.4.1/logstash-5.4.1/logs which is now configured via log4j2.properties
[2018-02-04T05:17:38,403][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://127.0.0.1:9200/]}}
[2018-02-04T05:17:38,419][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://127.0.0.1:9200/, :path=>"/"}
[2018-02-04T05:17:38,606][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>#<URI::HTTP:0x3afb9a53 URL:http://127.0.0.1:9200/>}
[2018-02-04T05:17:38,606][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-02-04T05:17:38,669][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword"}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-02-04T05:17:38,684][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>[#<URI::Generic:0x384fe3aa URL://127.0.0.1>]}
[2018-02-04T05:17:38,700][INFO ][logstash.pipeline        ] Starting pipeline {"id"=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>250}
[2018-02-04T05:17:38,747][INFO ][logstash.pipeline        ] Pipeline main started
The stdin plugin is now waiting for input:
[2018-02-04T05:17:38,872][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
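Once the one-liner works, the same pipeline can live in a config file instead of on the command line. A minimal sketch (the filename and the explicit hosts setting are my own additions; the one-liner above simply relies on the 127.0.0.1:9200 default):

```conf
# logstash-stdin-es.conf -- same pipeline as the -e one-liner above
input {
  stdin {}
}
output {
  elasticsearch {
    hosts => ["127.0.0.1:9200"]   # the default; shown here for clarity
  }
}
```

Run it with `logstash -f logstash-stdin-es.conf` instead of `-e`.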
hello.arpenker

In Kibana, open the Dev Tools console and query everything directly:
GET _search
{
  "query": {
    "match_all": {}
  }
}

Sure enough, one new record has appeared:



  {
    "_index": "logstash-2018.02.03",
    "_type": "logs",
    "_id": "AWFdiFrCBZpn0eY58si-",
    "_score": 1,
    "_source": {
      "@timestamp": "2018-02-03T21:17:52.331Z",
      "@version": "1",
      "host": "iZ94hfcf8jiZ",
      "message": "hello.arpenker\r"
    }
  }
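Any client that can parse JSON can consume such a hit. A minimal sketch, using the hit above as sample data (only the standard library; note that the stdin input on Windows keeps the trailing carriage return, which is why `message` ends in `\r`):

```python
import json

# Sample hit copied from the _search response above.
hit = json.loads("""
{
  "_index": "logstash-2018.02.03",
  "_type": "logs",
  "_id": "AWFdiFrCBZpn0eY58si-",
  "_score": 1,
  "_source": {
    "@timestamp": "2018-02-03T21:17:52.331Z",
    "@version": "1",
    "host": "iZ94hfcf8jiZ",
    "message": "hello.arpenker\\r"
  }
}
""")

# Strip the carriage return that the Windows console appended to stdin.
message = hit["_source"]["message"].rstrip("\r")
print(message)  # hello.arpenker
```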

With that, the whole flow is working end to end: Logstash writes data into ES, and Kibana queries it back out.


Reposted from arpenker.iteye.com/blog/2410171