Integrating Spring Cloud / Spring Boot with ELK

Elasticsearch, Logstash, and Kibana are all version 5.6.2 here; take care to keep the three versions matched.
1. Start Elasticsearch.
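On Linux/macOS it is started from the installation directory (adjust the path to your install) and listens on http://localhost:9200 by default:

bin/elasticsearch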
2. Create a new log.conf file under the logstash/config directory with the following content:
input {
  # For the log4j input plugin (an alternative to tcp), see:
  # https://www.elastic.co/guide/en/logstash/current/plugins-inputs-log4j.html
  tcp {
    mode => "server"
    host => "127.0.0.1"
    port => 4567
    codec => json_lines
  }
}
filter {
  # Only matched data is sent to the output.
}
output {
  # For detailed configuration of the elasticsearch output, see:
  # https://www.elastic.co/guide/en/logstash/current/plugins-outputs-elasticsearch.html
  elasticsearch {
    hosts => ["127.0.0.1:9200"]   # Elasticsearch host; can be an array.
    index => "applog"             # The index to write data to.
  }
}
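Then start Logstash from its installation directory, pointing it at this file:

bin/logstash -f config/log.conf

Logstash now listens on TCP port 4567 and forwards each incoming JSON line to Elasticsearch.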
3. Configure Kibana: edit the kibana.yml configuration file in the kibana/config directory:
server.port: 5601
server.host: "localhost"
elasticsearch.url: "http://localhost:9200"
kibana.index: ".kibana"
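Start Kibana with bin/kibana and open http://localhost:5601 in a browser. In Kibana 5.x, go to Management → Index Patterns and create an index pattern for applog (the index configured in the Logstash output above); the application logs then show up under Discover.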

4. In the Spring Cloud / Spring Boot application, configure logback-spring.xml to write logs to Logstash:

    <appender name="logstash2" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
        <remoteHost>127.0.0.1</remoteHost>
        <port>4567</port>
        <filter class="ch.qos.logback.classic.filter.ThresholdFilter">
            <!-- Minimum logging level to be presented in the console logs-->
            <level>INFO</level> <!--写入logstash的日志级别-->
        </filter>
        <encoder charset="UTF-8" class="net.logstash.logback.encoder.LogstashEncoder"/>
    </appender>
    ​
    <root level="INFO">
        <appender-ref ref="console"/>
        <appender-ref ref="logstash"/>
        <appender-ref ref="logstash2"/>
        <!--<appender-ref ref="flatfile"/>-->
    </root>

Postscript: here the logs are written straight through to Elasticsearch; Kafka or Redis can be integrated in between as a buffer.
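A minimal sketch of the buffered variant on the Logstash side, assuming a local Kafka broker with a topic named applog (the application would then need a logback Kafka appender instead of the TCP appender, e.g. a third-party logback-kafka-appender):

input {
  kafka {
    bootstrap_servers => "127.0.0.1:9092"
    topics => ["applog"]
    codec => json
  }
}

The filter and output sections stay the same as above.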

Reposted from blog.csdn.net/foolone/article/details/80981174