Installing Elasticsearch, the Kibana visualization tool, and the IK tokenizer with Docker

1. Pull the images

docker pull elasticsearch:7.4.2

Pull the Kibana visualization interface; its version must match the Elasticsearch version:

docker pull kibana:7.4.2
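
To confirm that both images were pulled, a quick check:

docker images | grep -E "elasticsearch|kibana"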

2. Create an instance

mkdir -p /usr/local/elasticsearch/config
mkdir -p /usr/local/elasticsearch/data
echo "http.host: 0.0.0.0">>/usr/local/elasticsearch/config/elasticsearch.yml
docker run -itd --name elasticsearch -p 9200:9200 -p 9300:9300 -e "discovery.type=single-node" -e ES_JAVA_OPTS="-Xms64m -Xmx128m" -v /usr/local/elasticsearch/config/elasticsearch.yml:/usr/share/elasticsearch/config/elasticsearch.yml -v /usr/local/elasticsearch/data:/usr/share/elasticsearch/data -v /usr/local/elasticsearch/plugins:/usr/share/elasticsearch/plugins elasticsearch:7.4.2

Special attention:
-e ES_JAVA_OPTS="-Xms64m -Xmx128m" sets the initial and maximum heap memory of ES for a test environment; without it, ES uses a default heap that is too large for a small machine and may fail to start.

You will find that Elasticsearch is still not accessible. Check the startup log; it shows that the mapped directories lack write permission.
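A minimal way to view that log, using the container name from the run command above:

docker logs elasticsearch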
So we need to grant permissions on all the folders under /usr/local/elasticsearch:

chmod 777 -R /usr/local/elasticsearch/

Start the elasticsearch container again

docker start elasticsearch

3. Verify in a browser that port 9200 is reachable. If the firewall is enabled, either stop it or open port 9200.
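A quick reachability check with curl, plus firewalld commands to open the port (assuming a CentOS-style host running firewalld; adapt if you use a different firewall):

curl http://localhost:9200
firewall-cmd --zone=public --add-port=9200/tcp --permanent
firewall-cmd --reload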
4. Install the Kibana visualization interface. First copy the kibana.yml that ships inside the Kibana image to the directory /usr/local/kibana/conf (one way to do this is sketched below), then execute the docker run command that follows.
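
One possible way to extract the default kibana.yml from the image, using a throwaway container (kibana-tmp is just a placeholder name):

mkdir -p /usr/local/kibana/conf
docker create --name kibana-tmp kibana:7.4.2
docker cp kibana-tmp:/usr/share/kibana/config/kibana.yml /usr/local/kibana/conf/
docker rm kibana-tmp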

docker run --name kibana -v /usr/local/kibana/conf:/usr/share/kibana/config  -e ELASTICSEARCH_HOSTS=http://81.68.112.20:9200 -p 5601:5601 -itd kibana:7.4.2

Here, replace ELASTICSEARCH_HOSTS=http://81.68.112.20:9200 with your own host address. Then modify the configuration in kibana.yml, changing the elasticsearch.hosts address to the IP of your ES host:

server.name: kibana
server.host: "0"
elasticsearch.hosts: [ "http://192.168.56.10:9200" ]
xpack.monitoring.ui.container.elasticsearch.enabled: true

After modifying the configuration file, restart the Kibana container, and you can then access Kibana.
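
For example, using the container name and port from the run command above:

docker restart kibana

Then open http://<your-host-ip>:5601 in a browser.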

5. Install the IK tokenizer
A tokenizer receives a character stream, divides it into individual tokens (usually individual words), and then outputs a token stream.

For example, the whitespace tokenizer splits text on whitespace characters: it splits the text "Quick brown fox" into [Quick, brown, fox].

The tokenizer is also responsible for recording the order or position of each term (used for phrase and word-proximity queries), as well as the start and end character offsets of the original word that each term represents (used for highlighting matched content).

Elasticsearch provides many built-in tokenizers, which can be used to build custom analyzers.
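
As a quick illustration of the whitespace example above, you can call the _analyze API directly (a sketch assuming ES is reachable at localhost:9200):

curl -X POST "http://localhost:9200/_analyze" -H 'Content-Type: application/json' -d '{
  "tokenizer": "whitespace",
  "text": "Quick brown fox"
}'

The response lists each token together with its position and start/end offsets, matching the description above.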

Note: the default elasticsearch-plugin install xxx.zip cannot be used for automatic installation here.

Download the version corresponding to your ES from https://github.com/medcl/elasticsearch-analysis-ik/releases

Create a directory ik under the mounted plugins directory /usr/local/elasticsearch/plugins (which maps to /usr/share/elasticsearch/plugins inside the container):

mkdir -p /usr/local/elasticsearch/plugins/ik

Upload the locally downloaded elasticsearch-analysis-ik-7.4.2.zip to the ik directory and unzip it.
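
Alternatively, if the server has direct internet access, a sketch for downloading and unpacking in place (the release URL follows GitHub's usual pattern; verify it against the releases page above):

cd /usr/local/elasticsearch/plugins/ik
wget https://github.com/medcl/elasticsearch-analysis-ik/releases/download/v7.4.2/elasticsearch-analysis-ik-7.4.2.zip
unzip elasticsearch-analysis-ik-7.4.2.zip
rm elasticsearch-analysis-ik-7.4.2.zip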

docker exec -it elasticsearch bash
cd /usr/share/elasticsearch/bin
elasticsearch-plugin list

Check whether the output of elasticsearch-plugin list includes ik.
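
Once ik appears in the list, restart Elasticsearch so the plugin is actually loaded, then try the ik_smart analyzer that the plugin provides (a sketch):

docker restart elasticsearch
curl -X POST "http://localhost:9200/_analyze" -H 'Content-Type: application/json' -d '{
  "analyzer": "ik_smart",
  "text": "中华人民共和国国歌"
}'

The response should contain multi-character Chinese terms rather than single characters, which confirms the IK tokenizer is working.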


Origin: blog.csdn.net/u014496893/article/details/113769456