Elasticsearch 7 Installation and Spring Boot Integration

Elasticsearch 7 installation

Steps

Note:

1. The version used here is 7.5.1; the ES plugins and dependencies installed later should match this version, and you can choose a different version as needed

2. Installing ES in Docker is recommended; for ease of demonstration, the Windows version is used directly here

  • Enter the extracted elasticsearch-7.5.1 directory, open the bin directory, and double-click elasticsearch.bat to start it

  • Verify that the startup succeeded by visiting port 9200 directly in a browser

  • Or access it with another tool

Common commands

After a successful start, you can send operation commands to ES with Postman.

1. Add or update an index and document

Method one (recommended): PUT /{index}/{document}/{id}. The id must be passed: if the id does not exist, the data is inserted; if it exists, the data is updated (if only the index is passed, the index is created).

Method two: POST /{index}/{document}/{id}. The id can be omitted; if it is not passed, ES generates one.
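For example, creating or updating document 1 in the newindex index used in the examples below (the field names are illustrative and match the Student entity later in this article):

```
PUT http://127.0.0.1:9200/newindex/newdoc/1
{
  "name": "王二狗",
  "info": "大王派我来巡山"
}
```

Sending the same request again with a changed body updates document 1 in place.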

2. Get all documents

GET /{index}/{document}/_search

e.g.: http://127.0.0.1:9200/newindex/newdoc/_search

3. Get a document by id

GET /{index}/{document}/{id}

e.g.: http://127.0.0.1:9200/newindex/newdoc/1

4. Fuzzy query

GET /{index}/{document}/_search?q=*keyword*

e.g.: http://127.0.0.1:9200/newindex/newdoc/_search?q=*王*

5. Delete a document

DELETE /{index}/{document}/{id}

e.g.: http://127.0.0.1:9200/newindex/newdoc/1

For more statements, refer to the official documentation.

Visualization Tools

  • Install elasticsearch-head; download the source code
git clone https://github.com/mobz/elasticsearch-head.git
  • Install the grunt build tool globally
npm install -g grunt-cli
  • Install dependencies
cd elasticsearch-head/
npm install
  • Modify the Elasticsearch configuration file
vim ../elasticsearch-7.5.1/config/elasticsearch.yml
  • Append the cross-origin access configuration; you can add it at the end of the file
http.cors.enabled: true
http.cors.allow-origin: "*"

  • Start elasticsearch-head
cd -	// back to the head root directory
grunt server
  • Access localhost:9100 in a browser to view it

IK tokenizer

Download

  1. Download from GitHub with git, or download the archive directly; I chose the latter

  2. After extracting, copy the extracted folder into the elasticsearch-7.5.1\plugins directory and rename the folder to ik

  3. Test the IK tokenizer's Chinese segmentation results
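A quick way to test the tokenizer (the sample text is illustrative) is to call the _analyze API via Postman:

```
POST http://127.0.0.1:9200/_analyze
{
  "analyzer": "ik_max_word",
  "text": "大王派我来巡山"
}
```

With ik_max_word the text is split at the finest granularity; ik_smart would produce a coarser split.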

Custom dictionary extension

  1. Create a new custom.dic file in the elasticsearch-7.5.1\plugins\ik\config directory;

  2. Add your own custom vocabulary to it;

  3. Modify IKAnalyzer.cfg.xml in the same directory so that its <entry key="ext_dict"> property points to the custom dictionary;

  4. Restart Elasticsearch; the effect is as follows:
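The modified IKAnalyzer.cfg.xml would look roughly like this (a sketch based on the stock file shipped with the plugin, with ext_dict pointing at custom.dic):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd">
<properties>
    <comment>IK Analyzer extension configuration</comment>
    <!-- configure your own extension dictionary here -->
    <entry key="ext_dict">custom.dic</entry>
    <!-- configure your own extension stop-word dictionary here -->
    <entry key="ext_stopwords"></entry>
</properties>
```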

Integrating with Spring Boot

Preparation

  • Add the Spring Data Elasticsearch dependency
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-data-elasticsearch</artifactId>
</dependency>
  • Add the configuration
spring:
  data:
    elasticsearch:
      cluster-nodes: 127.0.0.1:9300
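Note: port 9300 is the transport port used by the TransportClient, which is deprecated in Elasticsearch 7. If you are on a recent Spring Boot 2.x, the high-level REST client over port 9200 is the preferred route; the equivalent configuration (property names as of Spring Boot 2.1+) would be:

```yaml
spring:
  elasticsearch:
    rest:
      uris: http://127.0.0.1:9200
```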

Coding

  • Create the entity class
@Data
@Accessors(chain = true)
@Document(indexName = "school", type = "student") // indexName is the ES index name, type is the document type name
public class Student implements Serializable {

    // @Id marks the document id
    // index=true enables indexing for the field (enabled by default)
    // type is the field type
    // analyzer = "ik_max_word" controls how the field is tokenized for matching; ik_max_word is IK's finest granularity
    // searchAnalyzer = "ik_max_word" is the analyzer applied to search keywords
    @Id
    private String id;
    
    // note: analyzers only apply to Text fields; Keyword fields are not analyzed
    @Field(type = FieldType.Text, analyzer = "ik_max_word", searchAnalyzer = "ik_max_word")
    private String name;

    private Integer age;

    @Field(type = FieldType.Double)
    private Double score;

    @Field(type = FieldType.Text, analyzer = "ik_max_word")
    private String info;
}
  • Paging entity
@Data
@Accessors(chain = true)
public class QueryPage {

    /**
     * Current page number
     */
    private Integer current;

    /**
     * Records per page
     */
    private Integer size;
}
  • Data persistence layer
public interface EsRepository extends ElasticsearchRepository<Student, String> {

    /**
     * Fuzzy query by student name or info
     */
    Page<Student> findByNameOrInfoLike(String name, String info, Pageable pageable);
}
  • Business-layer interface and its implementation
public interface EsService {

    /**
     * Insert
     */
    void add(Student student);
    
    /**
     * Batch insert
     */
    void addAll(List<Student> student);

    /**
     * Fuzzy query
     */
    Page<Student> search(String keyword, QueryPage queryPage);
}
@Service
public class EsServiceImpl implements EsService {

    @Autowired
    private EsRepository esRepository;

    @Override
    public void add(Student student) {
        esRepository.save(student);
    }
    
    @Override
    public void addAll(List<Student> student) {
        esRepository.saveAll(student);
    }    

    @Override
    public Page<Student> search(String keyword, QueryPage queryPage) {
        // ES pages are 0-based by default, while MP (MyBatis-Plus) pages are 1-based
        PageRequest pageRequest = PageRequest.of(queryPage.getCurrent() - 1, queryPage.getSize());
        return esRepository.findByNameOrInfoLike(keyword, keyword, pageRequest);
    }
}
  • Write a test class
@SpringBootTest
public class EsServiceImplTest {

    @Autowired
    private EsService esService;

    @Test
    public void insert() {
        List<Student> students = new ArrayList<>();
        for (int i = 10; i <= 12; i++) {
            Student student = new Student();
            student.setId(i + "").setAge(10 + i).setName("王二狗" + i).setScore(72.5 + i).setInfo("大王派我来巡山" + i);
            students.add(student);
        }
        esService.addAll(students);
    }

    @Test
    public void fuzzySearch() {
        QueryPage queryPage = new QueryPage();
        queryPage.setCurrent(1).setSize(5);
        Page<Student> list = esService.search("二狗2", queryPage);
        list.forEach(System.out::println);
    }
}

Importing MySQL data into Elasticsearch

Installation

Configuration

  • Extract the archive
  • Copy \logstash-7.5.1\config\logstash-sample.conf within the same directory and rename the copy to logstash.conf
  • Modify it as follows
# Sample Logstash configuration for creating a simple
# Beats -> Logstash -> Elasticsearch pipeline.

input {
  jdbc {
	# MySQL connection configuration
    jdbc_connection_string => "jdbc:mysql://127.0.0.1:3306/springboot_es?characterEncoding=UTF8"
    jdbc_user => "root"
    jdbc_password => "1234"
    jdbc_driver_library => "D:\Develop_Tools_Others\logstash-7.5.1\mysql-connector-java-5.1.26.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_paging_enabled => "true"
    jdbc_page_size => "50000"
    # SQL query whose results are imported into Elasticsearch
    statement => "select id,name,age,score,info from t_student"
    # Schedule (cron-like): minute, hour, day, month, weekday; all * means run every minute
    schedule => "* * * * *"
  }
}
output {
  elasticsearch {
    hosts => "localhost:9200"
    # index name
    index => "school"
    # document type
    document_type => "student"
    # use the auto-increment id column as the document id
    document_id => "%{id}"
  }
  stdout {
    # output in JSON format
    codec => json_lines
  }
}
  • Create a database named springboot_es and import the following script into it
SET NAMES utf8mb4;
SET FOREIGN_KEY_CHECKS = 0;

-- ----------------------------
-- Table structure for t_student
-- ----------------------------
DROP TABLE IF EXISTS `t_student`;
CREATE TABLE `t_student`  (
  `id` int(11) NOT NULL AUTO_INCREMENT COMMENT 'Primary key',
  `name` varchar(50) CHARACTER SET utf8mb4 COLLATE utf8mb4_general_ci NULL DEFAULT NULL COMMENT 'Student name',
  `age` int(11) NULL DEFAULT NULL COMMENT 'Age',
  `score` double(255, 0) NULL DEFAULT NULL COMMENT 'Score',
  `info` varchar(255) CHARACTER SET utf8mb4 COLLATE utf8mb4_general_ci NULL DEFAULT NULL COMMENT 'Info',
  PRIMARY KEY (`id`) USING BTREE
) ENGINE = InnoDB AUTO_INCREMENT = 4 CHARACTER SET = utf8mb4 COLLATE = utf8mb4_general_ci ROW_FORMAT = Dynamic;

-- ----------------------------
-- Records of t_student
-- ----------------------------
INSERT INTO `t_student` VALUES (1, '小明', 18, 88, '好好学习');
INSERT INTO `t_student` VALUES (2, '小红', 17, 85, '天天向上');
INSERT INTO `t_student` VALUES (3, '王二狗', 30, 59, '无产阶级');

SET FOREIGN_KEY_CHECKS = 1;

Run

  • Start ES first, then start es-head
  • Finally start Logstash; the start command is:
D:\Develop_Tools_Others\logstash-7.5.1>.\bin\logstash.bat -f .\config\logstash.conf
  • Access localhost:9600 to confirm that Logstash started successfully

  • Check the console to see whether data is being synchronized

  • Finally, open es-head to view the synchronized data


Origin juejin.im/post/5e16d0525188254bec681d20