Integrating Spring Boot with ELK (Elasticsearch 6.8.1, Logstash 6.3.0, Kibana 6.8.1)
The demo project and ELK packages for this article can be downloaded here:
https://download.csdn.net/download/dayonglove2018/12881644
Logstash should ideally match the same version (6.8.1), but my download speed was painfully slow (a few KB/s, sometimes stalling completely), so I settled for 6.3.0.
Downloads:
https://www.elastic.co/cn/downloads/past-releases#elasticsearch
https://artifacts.elastic.co/downloads/logstash/logstash-6.3.0.zip
https://artifacts.elastic.co/downloads/kibana/kibana-6.8.1-windows-x86_64.zip
The downloads took ages:
Installation and configuration
Configuring Elasticsearch
Edit the YML configuration
Open elasticsearch.yml and append the following at the end of the file:
http.cors.enabled: true
http.cors.allow-origin: "*"
http.cors.allow-headers: Authorization,Content-Type
xpack.security.enabled: true
xpack.security.transport.ssl.enabled: true
Double-click elasticsearch.bat to start Elasticsearch.
Visit http://localhost:9200; a JSON response like the following means startup succeeded.
Set the ES passwords
Note: the ES service must already be running before you open a new window to set passwords!
Open a cmd window in D:\elk\elasticsearch-6.8.1\bin and run:
elasticsearch-setup-passwords interactive
(The screenshot below is from an earlier installation; the steps are identical.)
Elasticsearch is now installed and configured. Close the console window and start it once more.
Startup OK!
Configuring Kibana
Edit the kibana.yml file
Add the following, using the ES username and password you just set:
server.host: "0.0.0.0"
# ES address and credentials (since Kibana 6.6 the address setting is named
# elasticsearch.hosts; elasticsearch.url still works here but is deprecated)
elasticsearch.url: "http://localhost:9200"
elasticsearch.username: "elastic"
elasticsearch.password: "123456"
Start Kibana
Double-click kibana.bat, then visit http://localhost:5601/.
Startup succeeded. Log in with the username (elastic) and password (123456) configured above.
Configuring Logstash
Create a logstash.conf file and put it in the bin directory to keep startup simple.
Configuration (logstash.conf), reading from stdin and from a TCP socket:
input {
  stdin { }
}
input {
  tcp {
    host => "127.0.0.1"
    port => 9250
    mode => "server"
    # tags => ["tags"]
    codec => json_lines
  }
}
output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "logback-%{+YYYY.MM.dd}"
    user => "elastic"
    password => "123456"
  }
}
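The `%{+YYYY.MM.dd}` in the `index` setting makes Logstash roll over to a new index every day, which keeps old logs cheap to delete. A minimal Java sketch of the naming convention (illustrative only — this is not Logstash's own code; only the `logback-` prefix comes from the config above):

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;

public class DailyIndexName {
    // Mirrors index => "logback-%{+YYYY.MM.dd}": one index per day.
    static String indexFor(LocalDate date) {
        return "logback-" + date.format(DateTimeFormatter.ofPattern("yyyy.MM.dd"));
    }

    public static void main(String[] args) {
        System.out.println(indexFor(LocalDate.of(2020, 10, 1))); // logback-2020.10.01
    }
}
```

In Kibana, all of these daily indices are then matched by the single index pattern `logback-*`.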
Reading from a log file instead:
input {
  file {
    path => "c:/opt/logs/java-contract-info.log"  # log file to tail
    type => "elasticsearch"
    discover_interval => 3        # interval (seconds) for checking file changes
    start_position => "beginning" # read from the start of the file
  }
}
output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "logstash-log4j2-%{+YYYY.MM.dd}"
    user => "elastic"
    password => "123456"
  }
}
Start Logstash
In D:\elk\logstash-6.3.0\bin, run:
logstash -f logstash.conf
All three ELK components are now running. Next, integrate them with Spring Boot.
Spring Boot integration
The project structure is:
pom.xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>2.2.6.RELEASE</version>
<relativePath/>
</parent>
<groupId>com.zjy</groupId>
<artifactId>elk</artifactId>
<version>0.0.1-SNAPSHOT</version>
<name>elk</name>
<description>Demo project for Spring Boot</description>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
<java.version>1.8</java.version>
</properties>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter</artifactId>
<exclusions>
<exclusion>
<artifactId>logback-core</artifactId>
<groupId>ch.qos.logback</groupId>
</exclusion>
<exclusion>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-logging</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<!--elasticsearch-->
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-elasticsearch</artifactId>
<version>3.2.1.RELEASE</version>
</dependency>
<!--fastjson-->
<dependency>
<groupId>com.alibaba</groupId>
<artifactId>fastjson</artifactId>
<version>1.2.47</version>
</dependency>
<!--knife4j-->
<dependency>
<groupId>com.github.xiaoymin</groupId>
<artifactId>knife4j-spring-boot-starter</artifactId>
<!-- check Maven Central for the latest version -->
<version>2.0.2</version>
</dependency>
<dependency>
<groupId>org.projectlombok</groupId>
<artifactId>lombok</artifactId>
<version>1.18.12</version>
<scope>provided</scope>
</dependency>
<!--logstash-->
<dependency>
<groupId>net.logstash.logback</groupId>
<artifactId>logstash-logback-encoder</artifactId>
<version>5.2</version>
</dependency>
<dependency>
<groupId>net.logstash.log4j</groupId>
<artifactId>jsonevent-layout</artifactId>
<version>1.6</version>
</dependency>
<dependency>
<groupId>ch.qos.logback</groupId>
<artifactId>logback-classic</artifactId>
<exclusions>
<exclusion>
<groupId>ch.qos.logback</groupId>
<artifactId>logback-core</artifactId>
</exclusion>
</exclusions>
<version>1.1.8</version>
</dependency>
<dependency>
<groupId>ch.qos.logback</groupId>
<artifactId>logback-core</artifactId>
<version>1.1.8</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
</plugin>
</plugins>
</build>
</project>
application.properties
spring.application.name=es
server.port=2080
# Elasticsearch settings
elasticsearch.host=127.0.0.1
elasticsearch.port=9200
elasticsearch.clustername=elasticsearch
elasticsearch.search.pool.size=5
elasticsearch.username=elastic
elasticsearch.password=123456
# Logstash server address
# logstash.host=127.0.0.1
# Logstash port
# logstash.port=9250
logging.config=classpath:logback-spring.xml
logback-spring.xml
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
<include resource="org/springframework/boot/logging/logback/base.xml" />
<!-- expose spring.application.name to Logback as springAppName; without this
     the ${springAppName:-} reference in the pattern below resolves to empty -->
<springProperty scope="context" name="springAppName" source="spring.application.name"/>
<appender name="LOGSTASH"
class="net.logstash.logback.appender.LogstashTcpSocketAppender">
<!-- Logstash server address -->
<destination>127.0.0.1:9250</destination>
<!-- log output encoder -->
<encoder charset="UTF-8"
class="net.logstash.logback.encoder.LoggingEventCompositeJsonEncoder">
<providers>
<timestamp>
<timeZone>UTC</timeZone>
</timestamp>
<pattern>
<pattern>
{
"LogLevel": "%level",
"ServiceName": "${springAppName:-}",
"Pid": "${PID:-}",
"Thread": "%thread",
"Class": "%logger{40}",
"Date": "%d",
"ExceptionInfo": "%ex{full}",
"message": "%message"
}
</pattern>
</pattern>
</providers>
</encoder>
</appender>
<root level="INFO">
<appender-ref ref="LOGSTASH" />
<appender-ref ref="CONSOLE" />
</root>
</configuration>
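With this appender, every log event reaches Logstash as one JSON object per line over TCP, which is exactly what the `codec => json_lines` setting on the Logstash input expects. As a rough illustration of the wire format (hand-built JSON, not logstash-logback-encoder's actual serializer), an event shaped by the `<pattern>` above looks like:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class JsonLogLine {
    // Builds one JSON line using the field names from logback-spring.xml.
    // Values are not escaped here; the real encoder handles that.
    static String toJsonLine(String level, String service, String thread,
                             String clazz, String message) {
        Map<String, String> fields = new LinkedHashMap<>();
        fields.put("LogLevel", level);
        fields.put("ServiceName", service);
        fields.put("Thread", thread);
        fields.put("Class", clazz);
        fields.put("message", message);
        StringBuilder sb = new StringBuilder("{");
        for (Map.Entry<String, String> e : fields.entrySet()) {
            if (sb.length() > 1) sb.append(",");
            sb.append("\"").append(e.getKey()).append("\":\"").append(e.getValue()).append("\"");
        }
        // the sender terminates each event with '\n' (one event per line)
        return sb.append("}").toString();
    }

    public static void main(String[] args) {
        System.out.println(toJsonLine("INFO", "es", "main",
                "com.zjy.elk.controller.EKLController", "Data saved successfully!"));
    }
}
```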
ElasticsearchConfig
package com.zjy.elk.config;
import org.apache.http.HttpHost;
import org.apache.http.auth.AuthScope;
import org.apache.http.auth.UsernamePasswordCredentials;
import org.apache.http.client.CredentialsProvider;
import org.apache.http.impl.client.BasicCredentialsProvider;
import org.apache.http.impl.nio.client.HttpAsyncClientBuilder;
import org.elasticsearch.client.RestClient;
import org.elasticsearch.client.RestClientBuilder;
import org.elasticsearch.client.RestHighLevelClient;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.elasticsearch.core.ElasticsearchRestTemplate;
import org.springframework.data.elasticsearch.repository.config.EnableElasticsearchRepositories;
@Configuration
@EnableElasticsearchRepositories(basePackages = "com.zjy.elk.dao")
public class ElasticsearchConfig {
    /** logger */
    private static final Logger logger = LoggerFactory.getLogger(ElasticsearchConfig.class);
    @Value("${elasticsearch.host}")
    private String esHost;
    @Value("${elasticsearch.port}")
    private int esPort;
    @Value("${elasticsearch.clustername}")
    private String esClusterName;
    @Value("${elasticsearch.search.pool.size}")
    private Integer threadPoolSearchSize;
    @Value("${elasticsearch.username}")
    private String userName;
    @Value("${elasticsearch.password}")
    private String password;
    @Bean
    public RestHighLevelClient client() {
        /* credentials provider for basic auth */
        final CredentialsProvider credentialsProvider = new BasicCredentialsProvider();
        /* set the username and password */
        credentialsProvider.setCredentials(AuthScope.ANY, new UsernamePasswordCredentials(userName, password));
        /* build the REST client */
        RestClientBuilder builder = RestClient.builder(new HttpHost(esHost, esPort))
                .setHttpClientConfigCallback(new RestClientBuilder.HttpClientConfigCallback() {
                    @Override
                    public HttpAsyncClientBuilder customizeHttpClient(HttpAsyncClientBuilder httpAsyncClientBuilder) {
                        return httpAsyncClientBuilder.setDefaultCredentialsProvider(credentialsProvider);
                    }
                });
        return new RestHighLevelClient(builder);
    }
    @Bean(name = "elasticsearchTemplate")
    public ElasticsearchRestTemplate elasticsearchRestTemplate() {
        return new ElasticsearchRestTemplate(client());
    }
}
Swagger2Config
package com.zjy.elk.config;
import com.github.xiaoymin.knife4j.spring.annotations.EnableKnife4j;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import springfox.documentation.builders.ApiInfoBuilder;
import springfox.documentation.builders.PathSelectors;
import springfox.documentation.builders.RequestHandlerSelectors;
import springfox.documentation.service.ApiInfo;
import springfox.documentation.spi.DocumentationType;
import springfox.documentation.spring.web.plugins.Docket;
import springfox.documentation.swagger2.annotations.EnableSwagger2;
@Configuration
@EnableSwagger2
@EnableKnife4j
public class Swagger2Config {
@Bean
public Docket createRestApi() {
return new Docket(DocumentationType.SWAGGER_2)
.useDefaultResponseMessages(false)
.apiInfo(apiInfo())
.select()
.apis(RequestHandlerSelectors.basePackage("com.zjy.elk.controller"))
.paths(PathSelectors.any())
.build();
}
private ApiInfo apiInfo() {
return new ApiInfoBuilder()
.title("swagger-bootstrap-ui RESTful APIs")
.description("swagger-bootstrap-ui")
.termsOfServiceUrl("http://localhost:8999/")
.version("1.0")
.build();
}
}
EKLController
package com.zjy.elk.controller;
import com.alibaba.fastjson.JSONArray;
import com.zjy.elk.dao.ESRepository;
import com.zjy.elk.entity.ESData;
import com.zjy.elk.entity.ResultBO;
import io.swagger.annotations.ApiOperation;
import lombok.extern.slf4j.Slf4j;
import org.elasticsearch.index.query.MatchQueryBuilder;
import org.elasticsearch.index.query.QueryBuilders;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.domain.PageRequest;
import org.springframework.data.domain.Pageable;
import org.springframework.data.domain.Sort;
import org.springframework.web.bind.annotation.*;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.List;
import java.util.Optional;
import java.util.UUID;
@Slf4j
@RestController
@RequestMapping(value = "/demo")
public class EKLController {
@Autowired
private ESRepository esRepository;
    /**
     * Save a batch of documents to ES.
     * @param esDataList
     * @return
     */
    @ApiOperation(value = "Save ES data", notes = "Save ES data")
    @PostMapping("save")
    public ResultBO save(@RequestBody List<ESData> esDataList) {
        log.info("Received ES data: {}, {} documents in total", esDataList, esDataList.size());
        for (ESData esData : esDataList) {
            String id = UUID.randomUUID().toString().replaceAll("-", "");
            esData.setId(id);
            esData.setCreateTime(new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").format(new Date()));
            esData.setSortTime(System.currentTimeMillis());
        }
        esRepository.saveAll(esDataList);
        // log the same message at several levels to demonstrate level filtering in Kibana
        log.info("Data saved successfully!");
        log.debug("Data saved successfully!");
        log.error("Data saved successfully!");
        log.warn("Data saved successfully!");
        return ResultBO.success(esDataList);
    }
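The id and timestamp bookkeeping in `save` is worth isolating: each document gets a 32-character hex id (a UUID with the dashes stripped), a human-readable `createTime`, and an epoch-millisecond `sortTime` used later for sorting. A standalone sketch of the same scheme (field names mirror `ESData`):

```java
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.UUID;

public class IdAndTimestamps {
    // Same id scheme as the save() endpoint: a UUID with its dashes
    // removed, always 32 hex characters.
    static String newId() {
        return UUID.randomUUID().toString().replaceAll("-", "");
    }

    public static void main(String[] args) {
        String id = newId();
        String createTime = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").format(new Date());
        long sortTime = System.currentTimeMillis();
        System.out.println(id + " | " + createTime + " | " + sortTime);
    }
}
```

Keeping `sortTime` as a plain long sidesteps date parsing when sorting in ES, which is why the paged query sorts on `sortTime` rather than `createTime`.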
    /**
     * Look up one document by id.
     * @param id
     * @return
     */
    @ApiOperation(value = "Query ES data", notes = "Query ES data")
    @GetMapping("detail")
    public ResultBO detail(@RequestParam(value = "id", required = true) String id) {
        log.info("Received ES id: {}", id);
        Optional<ESData> byId = esRepository.findById(id);
        log.info("ES document found: {}", byId);
        return ResultBO.success(byId);
    }
    /**
     * Query all documents.
     * @return
     */
    @ApiOperation(value = "Query all ES data", notes = "Query all ES data")
    @GetMapping("list")
    public ResultBO list() {
        log.info("Querying all ES data");
        // deliberately trigger a NullPointerException here so the error log
        // gets collected and can be inspected in Kibana
        try {
            ESData esData = null;
            String content = esData.getContent();
            System.out.println(content);
        } catch (Exception e) {
            log.error("Error details", e);
        }
        Iterable<ESData> all = esRepository.findAll();
        log.info("All ES documents found: {}", all);
        return ResultBO.success(all);
    }
    /**
     * Full-text search on the title field.
     * @return
     */
    @ApiOperation(value = "Search by keyword in title", notes = "Search by keyword in title")
    @GetMapping("findByTitle")
    public ResultBO findByTitle(@RequestParam(value = "title", required = true) String title) {
        log.info("Keyword for title: {}", title);
        MatchQueryBuilder matchQueryBuilder = QueryBuilders.matchQuery("title", title);
        Iterable<ESData> search = esRepository.search(matchQueryBuilder);
        log.info("ES documents found: {}", search);
        return ResultBO.success(search);
    }
    /**
     * Full-text search on the content field.
     * @return
     */
    @ApiOperation(value = "Search by keyword in content", notes = "Search by keyword in content")
    @GetMapping("findByContent")
    public ResultBO findByContent(@RequestParam(value = "content", required = true) String content) {
        log.info("Keyword for content: {}", content);
        MatchQueryBuilder matchQueryBuilder = QueryBuilders.matchQuery("content", content);
        Iterable<ESData> search = esRepository.search(matchQueryBuilder);
        Object json = JSONArray.toJSON(search);
        log.info("ES documents found: {}", json);
        return ResultBO.success(search);
    }
    /**
     * Paged, sorted full-text search on the title field.
     * @return
     */
    @ApiOperation(value = "Paged search by keyword in title", notes = "Paged search by keyword in title")
    @PostMapping("findByTitlePage")
    public ResultBO findByTitlePage(@RequestParam(value = "title", required = true) String title, Integer page, Integer size) {
        log.info("Keyword for title: {}, page: {}, size: {}", title, page, size);
        // sort by creation time, descending, with paging
        Sort sort = Sort.by("sortTime").descending();
        Pageable pageable = PageRequest.of(page, size, sort);
        MatchQueryBuilder matchQueryBuilder = QueryBuilders.matchQuery("title", title);
        Iterable<ESData> search = esRepository.search(matchQueryBuilder, pageable);
        Object json = JSONArray.toJSON(search);
        log.info("ES documents found: {}", json);
        return ResultBO.success(search);
    }
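One thing to keep in mind: Spring Data's `PageRequest.of(page, size)` is zero-based, so `page = 0` requests the first page. A plain-Java sketch of the offset arithmetic this implies (illustrative only, not Spring Data code):

```java
public class PageMath {
    // PageRequest.of(page, size) is zero-based: page p starts at
    // absolute offset p * size.
    static long offset(int page, int size) {
        return (long) page * size;
    }

    public static void main(String[] args) {
        System.out.println(offset(0, 10)); // first page: items 0..9
        System.out.println(offset(2, 10)); // third page: items 20..29
    }
}
```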
    /**
     * Delete one document by id.
     * @return
     */
    @ApiOperation(value = "Delete by id", notes = "Delete by id")
    @PostMapping("delete")
    public ResultBO delete(@RequestParam(value = "id", required = true) String id) {
        log.info("Id to delete: {}", id);
        esRepository.deleteById(id);
        log.info("ES document deleted");
        return ResultBO.success();
    }
    /**
     * Delete all documents.
     * @return
     */
    @ApiOperation(value = "Delete all", notes = "Delete all")
    @PostMapping("deleteAll")
    public ResultBO deleteAll() {
        log.info("Deleting all documents");
        esRepository.deleteAll();
        log.info("All ES data deleted");
        return ResultBO.success();
    }
}
ESRepository
package com.zjy.elk.dao;
import com.zjy.elk.entity.ESData;
import org.springframework.data.elasticsearch.repository.ElasticsearchRepository;
import org.springframework.stereotype.Repository;
@Repository
public interface ESRepository extends ElasticsearchRepository<ESData, String> {
}
The document entity: ESData
package com.zjy.elk.entity;
import lombok.Data;
import lombok.ToString;
import org.springframework.data.elasticsearch.annotations.DateFormat;
import org.springframework.data.elasticsearch.annotations.Document;
import org.springframework.data.elasticsearch.annotations.Field;
import org.springframework.data.elasticsearch.annotations.FieldType;
@Data
@ToString
@Document(indexName = "blog", type = "article")
public class ESData {
    /**
     * Primary key
     */
    @Field(type = FieldType.Keyword)
    private String id;
    /**
     * Article title (the ik_max_word analyzer requires the IK analysis
     * plugin to be installed in Elasticsearch)
     */
    @Field(type = FieldType.Text, analyzer = "ik_max_word", searchAnalyzer = "ik_max_word")
    private String title;
    /**
     * Article content
     */
    @Field(type = FieldType.Text, analyzer = "ik_max_word", searchAnalyzer = "ik_max_word")
    private String content;
    /**
     * Creation time
     */
    @Field(type = FieldType.Date, pattern = "yyyy-MM-dd HH:mm:ss", format = DateFormat.custom, fielddata = true)
    private String createTime;
    /**
     * Creation time in epoch millis, used for sorting
     */
    @Field(type = FieldType.Long, fielddata = true)
    private long sortTime;
}
ResultBO
package com.zjy.elk.entity;
import io.swagger.annotations.ApiModelProperty;
import lombok.Data;
@Data
public class ResultBO<T> {
@ApiModelProperty(required = true, notes = "返回数据")
private T content;
@ApiModelProperty(required = true, notes = "返回成功与否", example = "true")
private boolean succeed = true;
@ApiModelProperty(required = true, notes = "结果码", example = "200")
private int code = 0;
@ApiModelProperty(required = true, notes = "返回信息说明", example = "SUCCESS")
private String msg;
public ResultBO(T content) {
this.content = content;
}
public ResultBO(boolean succeed, int code, String msg, T content) {
this.succeed = succeed;
this.code = code;
this.msg = msg;
this.content = content;
}
public ResultBO(boolean succeed, int code, String msg) {
this.succeed = succeed;
this.code = code;
this.msg = msg;
}
public ResultBO() {
}
public static <T> ResultBO<T> success(T content) {
return new ResultBO<T>(content);
}
public static ResultBO success() {
return new ResultBO();
}
public static ResultBO fail(int code, String msg) {
return new ResultBO(false, code, msg);
}
public static ResultBO fail(String msg) {
return new ResultBO(false, -1, msg);
}
public static ResultBO fail() {
return fail("fail");
}
}
That's all the code; time to start the project and test.
Testing
Start the project and open Swagger at http://localhost:2080/doc.html#
Query all ES data:
(The null pointer was triggered deliberately so that the error details show up in Kibana.)
Console output:
Logstash 6.3.0 console output:
Kibana query:
Viewing the logback* index:
INFO-level logs:
ERROR-level logs:
The full error details came through.
Viewing the blog* index:
You can display just these three fields; feel free to dig deeper on your own.
All tests pass!
Comments and corrections are welcome — feel free to leave a message!
Original article; please credit the source when reposting.