ES plug-in installation

Reference: http://blog.csdn.net/napoay/article/details/53896348

# update
sudo yum update -y


sudo rpm -ivh http://dl.fedoraproject.org/pub/epel/epel-release-latest-7.noarch.rpm
sudo rpm -ivh https://kojipkgs.fedoraproject.org//packages/http-parser/2.7.1/3.el7/x86_64/http-parser-2.7.1-3.el7.x86_64.rpm


sudo yum install npm

sudo yum install -y git

sudo yum install -y bzip2

git clone git://github.com/mobz/elasticsearch-head.git

# after downloading, move the source package to the /bigdata directory and change its owner and group
sudo chown -R xiaoniu:xiaoniu /bigdata/elasticsearch-head

# enter elasticsearch-head
cd elasticsearch-head
# compile and install
npm install


Open elasticsearch-head/Gruntfile.js, find the connect property below, and add hostname: '0.0.0.0':
        connect: {
                server: {
                        options: {
                                hostname: '0.0.0.0',
                                port: 9100,
                                base: '.',
                                keepalive: true
                        }
                }
        }
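If you script the setup, the same edit can be applied non-interactively with sed. The following is only a sketch against a throwaway copy of the connect block; the real Gruntfile.js path and its exact surrounding contents are assumptions, so check your checkout before running it there:

```shell
# Work on a throwaway copy of the connect block for demonstration.
mkdir -p /tmp/head-demo
cat > /tmp/head-demo/Gruntfile.snippet.js <<'EOF'
connect: {
        server: {
                options: {
                        port: 9100,
                        base: '.',
                        keepalive: true
                }
        }
}
EOF
# Insert hostname: '0.0.0.0' right after the "options: {" line (GNU sed).
sed -i "s/options: {/options: {\n                        hostname: '0.0.0.0',/" /tmp/head-demo/Gruntfile.snippet.js
grep "hostname" /tmp/head-demo/Gruntfile.snippet.js
```

The same sed expression, pointed at the real Gruntfile.js, performs the edit described above.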



Edit elasticsearch-5.4.3/config/elasticsearch.yml and add the following:
http.cors.enabled: true
http.cors.allow-origin: "*"
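Since elasticsearch.yml is plain YAML key-value lines, the two settings can also be appended idempotently from a script. A minimal sketch; the file path here is a stand-in for demonstration, not your real config:

```shell
# Stand-in path for demonstration; point ES_YML at your real
# elasticsearch-5.4.3/config/elasticsearch.yml when using this.
ES_YML=/tmp/demo-elasticsearch.yml
touch "$ES_YML"
# Append the CORS settings only if they are not already present.
grep -q '^http.cors.enabled' "$ES_YML" || cat >> "$ES_YML" <<'EOF'
http.cors.enabled: true
http.cors.allow-origin: "*"
EOF
grep 'http.cors' "$ES_YML"
```

Because of the `grep -q ... ||` guard, running the script twice does not duplicate the settings.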

# run the service
npm run start

---------------------------------------------------------------------------------------------


Install the IK tokenizer
Download the plug-in version matching your ES version:
https://github.com/medcl/elasticsearch-analysis-ik/releases


First download the IK tokenizer zip package matching your ES version and upload it to the ES server. Under the ES installation directory there is a plugins directory; create a directory named ik inside it,
then extract the zip and copy its contents into the ik directory.
Copy the ik directory to the other ES nodes as well,
then restart all ES instances.
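The unzip-and-copy steps above can be sketched as a script. ES_HOME and the zip filename below are assumptions for illustration; substitute your actual install path and plugin version:

```shell
# Assumed locations -- adjust to your installation.
ES_HOME=/tmp/demo-es            # stand-in for your real ES install dir
IK_ZIP=elasticsearch-analysis-ik-5.4.3.zip
mkdir -p "$ES_HOME/plugins/ik"
# Extract the plugin straight into the ik directory (commented out here
# because the zip is not present in this demo):
# unzip "$IK_ZIP" -d "$ES_HOME/plugins/ik"
ls "$ES_HOME/plugins"
```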


# create an index named news
curl -XPUT http://192.168.100.211:9200/news

# create a mapping (equivalent to the schema of a table: field names and field types)
curl -XPOST http://192.168.100.211:9200/news/fulltext/_mapping -d'
{
        "properties": {
                "content": {
                        "type": "text",
                        "analyzer": "ik_max_word",
                        "search_analyzer": "ik_max_word"
                }
        }
}'
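Inline -d bodies are easy to get wrong with shell quoting. One option is to keep the mapping JSON in a file and sanity-check it before POSTing; the file path and the use of python3 as a JSON checker below are choices of this sketch, not part of the original steps:

```shell
# Keep the mapping body in a file instead of quoting it inline.
cat > /tmp/news-mapping.json <<'EOF'
{
    "properties": {
        "content": {
            "type": "text",
            "analyzer": "ik_max_word",
            "search_analyzer": "ik_max_word"
        }
    }
}
EOF
# Validate the JSON locally before sending it to ES.
python3 -m json.tool /tmp/news-mapping.json > /dev/null && echo "mapping JSON OK"
# Then POST it with:
# curl -XPOST http://192.168.100.211:9200/news/fulltext/_mapping -d @/tmp/news-mapping.json
```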
 

curl -XPOST http://192.168.100.211:9200/news/fulltext/1 -d'
{"content": "Is what the US left behind in Iraq a mess?"}'

curl -XPOST http://192.168.100.211:9200/news/fulltext/2 -d'
{"content": "Ministry of Public Security: school buses will enjoy the highest right of way"}'

curl -XPOST http://192.168.100.211:9200/news/fulltext/3 -d'
{"content": "Investigation into the China-South Korea fishery police conflicts: Korean police seize an average of one Chinese fishing boat per day"}'

curl -XPOST http://192.168.100.211:9200/news/fulltext/4 -d'
{"content": "Shooting at the Chinese Consulate General in Los Angeles: the suspect, an Asian man, has surrendered"}'
 
curl -XPOST http://192.168.100.211:9200/news/fulltext/_search -d'
{
        "query": {"match": {"content": "China"}},
        "highlight": {
                "pre_tags": ["<font color='red'>", "<tag2>"],
                "post_tags": ["</font>", "</tag2>"],
                "fields": {
                        "content": {}
                }
        }
}'
 
---------------------------------------------------------------------------------------------


curl -XGET 'http://centos7-2:9200/_analyze?pretty&analyzer=ik_max_word' -d 'Lenovo is the largest notebook manufacturer in the world'

curl -XGET 'http://centos7-2:9200/_analyze?pretty&analyzer=ik_smart' -d 'Lenovo is the largest notebook manufacturer in the world'
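ES 5.x also accepts the _analyze parameters as a JSON request body ({"analyzer": ..., "text": ...}), which sidesteps URL-encoding and shell-quoting problems with non-ASCII text. A sketch that builds the body with python3 (using python3 as the JSON producer is an assumption of this example; any JSON generator works):

```shell
# Build {"analyzer": ..., "text": ...} safely, including non-ASCII text.
python3 -c 'import json; print(json.dumps({"analyzer": "ik_max_word", "text": "中华人民共和国"}))' > /tmp/analyze-body.json
cat /tmp/analyze-body.json
# Then: curl -XGET 'http://centos7-2:9200/_analyze?pretty' -d @/tmp/analyze-body.json
```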


curl -XPUT 'http://192.168.100.211:9200/iktest?pretty' -d '{
    "settings" : {
        "analysis" : {
            "analyzer" : {
                "ik" : {
                    "tokenizer" : "ik_max_word"
                }
            }
        }
    },
    "mappings" : {
        "article" : {
            "dynamic" : true,
            "properties" : {
                "subject" : {
                    "type" : "text",
                    "analyzer" : "ik_max_word"
                }
            }
        }
    }
}'



curl -XGET 'http://192.168.10.16:9200/_analyze?pretty&analyzer=ik_max_word' -d "People's Republic of China"

---------------------------------------------------------------------------------------------

Install the ES SQL plug-in
./bin/elasticsearch-plugin install https://github.com/NLPchina/elasticsearch-sql/releases/download/5.4.3.0/elasticsearch-sql-5.4.3.0.zip

# then copy the extracted plugin contents from this node's plugins directory to the plugins directory of the other ES nodes
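Distributing the plugin directory to the other nodes can be scripted. This sketch only prints the scp commands it would run; the node names, ES_HOME, and the plugin directory name are all assumptions for illustration:

```shell
ES_HOME=/tmp/demo-es2            # stand-in for the real ES install dir
mkdir -p "$ES_HOME/plugins/sql"
# Dry run: print the copy command for each remote node instead of executing it.
for node in node2 node3; do
    echo "scp -r $ES_HOME/plugins/sql $node:$ES_HOME/plugins/"
done
```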

Download the SQL web server:
wget https://github.com/NLPchina/elasticsearch-sql/releases/download/5.4.1.0/es-sql-site-standalone.zip

Compile and install with npm:
unzip es-sql-site-standalone.zip
cd site-server/
npm install express --save

Modify the SQL server port:
vi site_configuration.json

Start the service:
node node-server.js &

 

Origin: www.cnblogs.com/JBLi/p/11402996.html