Elasticsearch provides the Bulk API for batch operations: it can add, modify, and delete records in bulk, whereas Multi Get only supports batch retrieval.
The request body consists of an action (with its metadata) and a requestBody (the data). Each command takes two lines, and every line must end with a newline:
{action:{metadata}}\n
{requestbody}\n
action specifies the operation and can be one of:
create: create a document only if it does not already exist
update: update an existing document
index: create a new document or replace an existing one
delete: delete a document
metadata contains _index, _type, and _id: the index, type, and document id on which to operate.
The difference between create and index: if the document already exists, create fails with a "document already exists" error, while index executes successfully and replaces it.
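The two-line action/body format above can be assembled programmatically. A minimal Python sketch (the helper name `bulk_body` is my own, not part of any Elasticsearch client library):

```python
import json

def bulk_body(operations):
    """Build an NDJSON bulk request body from (action, metadata, body) tuples.

    `body` is None for actions that take no request body (e.g. delete).
    Every line, including the last, must end with a newline.
    """
    lines = []
    for action, metadata, body in operations:
        lines.append(json.dumps({action: metadata}))
        if body is not None:
            lines.append(json.dumps(body))
    return "\n".join(lines) + "\n"

body = bulk_body([
    ("index", {"_id": 1}, {"title": "Html5", "price": 45}),
    ("delete", {"_index": "lib2", "_type": "books", "_id": 3}, None),
])
```

The resulting string can be sent as the body of a `POST /lib2/books/_bulk` request; note the delete action contributes only one line.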
Specific examples are as follows:
1. Bulk add
Add documents with ids 1, 3, and 4 to type books in index lib2.
POST /lib2/books/_bulk
{"index":{"_id":1}}
{"title":"Html5","price":45}
{"index":{"_id":3}}
{"title":"Php","price":35}
{"index":{"_id":4}}
{"title":"Python","price":50}
Use a Multi Get command to check that the operation succeeded:
GET /lib2/books/_mget
{
  "ids":["1","3","4"]
}
2. Bulk update
Modify the documents with ids 1, 3, and 4, changing their price to 51, 53, and 54 respectively.
POST /lib2/books/_bulk
{"update":{"_index":"lib2","_type":"books","_id":1}}
{"doc":{"price":51}}
{"update":{"_index":"lib2","_type":"books","_id":3}}
{"doc":{"price":53}}
{"update":{"_index":"lib2","_type":"books","_id":4}}
{"doc":{"price":54}}
3. Bulk delete
POST /lib2/books/_bulk
{"delete":{"_index":"lib2","_type":"books","_id":1}}
{"delete":{"_index":"lib2","_type":"books","_id":3}}
{"delete":{"_index":"lib2","_type":"books","_id":4}}
4. Mixed operations in a single POST
As follows: add documents with ids 1, 3, and 4; change the price of document 1 to 53 and of document 4 to 54; then delete documents 3 and 4. Afterwards only document 1 remains, with a price of 53.
POST /lib2/books/_bulk
{"index":{"_id":1}}
{"title":"Html5","price":45}
{"index":{"_id":3}}
{"title":"Php","price":35}
{"index":{"_id":4}}
{"title":"Python","price":50}
{"update":{"_index":"lib2","_type":"books","_id":1}}
{"doc":{"price":53}}
{"update":{"_index":"lib2","_type":"books","_id":4}}
{"doc":{"price":54}}
{"delete":{"_index":"lib2","_type":"books","_id":3}}
{"delete":{"_index":"lib2","_type":"books","_id":4}}
Query the results; they match expectations:
{
  "docs": [
    {
      "_index": "lib2",
      "_type": "books",
      "_id": "1",
      "_version": 2,
      "found": true,
      "_source": {
        "title": "Html5",
        "price": 53
      }
    },
    {
      "_index": "lib2",
      "_type": "books",
      "_id": "3",
      "found": false
    },
    {
      "_index": "lib2",
      "_type": "books",
      "_id": "4",
      "found": false
    }
  ]
}
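To see why only document 1 survives, the mixed request can be replayed in order against a plain dict keyed by id. This is a simplified in-memory model for illustration, not real client code:

```python
# Replay the mixed bulk request against an in-memory store keyed by _id.
store = {}

ops = [
    ("index",  1, {"title": "Html5",  "price": 45}),
    ("index",  3, {"title": "Php",    "price": 35}),
    ("index",  4, {"title": "Python", "price": 50}),
    ("update", 1, {"price": 53}),
    ("update", 4, {"price": 54}),
    ("delete", 3, None),
    ("delete", 4, None),
]

for action, _id, doc in ops:
    if action == "index":
        store[_id] = doc              # create or replace the whole document
    elif action == "update":
        store[_id].update(doc)        # partial update, like {"doc": ...}
    elif action == "delete":
        store.pop(_id, None)          # delete if present

# Only document 1 remains, with price 53.
```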
The data processed by bulk is loaded into memory, so the amount per request is limited. The optimal batch size is not a fixed value; it depends on the hardware, the document size and complexity, and the indexing and search load. A general recommendation is 1000-5000 documents per request, with a total size of 5-15 MB; by default a request may not exceed 100 MB, which can be configured in the Elasticsearch configuration file elasticsearch.yml.
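Following the guideline above, a large document list can be split into batches that respect both a document count and a byte budget. A sketch with assumed defaults (the helper `chunk_docs` is my own; tune `max_docs` and `max_bytes` for your hardware and document sizes):

```python
import json

def chunk_docs(docs, max_docs=1000, max_bytes=5 * 1024 * 1024):
    """Yield batches of documents, each under both the count and byte limits.

    The defaults follow the rough guideline of 1000 documents / ~5 MB per
    bulk request; they are assumptions, not values mandated by Elasticsearch.
    """
    batch, batch_bytes = [], 0
    for doc in docs:
        size = len(json.dumps(doc).encode("utf-8")) + 1  # +1 for the newline
        if batch and (len(batch) >= max_docs or batch_bytes + size > max_bytes):
            yield batch
            batch, batch_bytes = [], 0
        batch.append(doc)
        batch_bytes += size
    if batch:
        yield batch
```

Each yielded batch can then be serialized into one bulk request, keeping every request comfortably under the server's size limit.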