Business scenario: a car sales company owns many 4S dealerships, and the sales data from those dealerships trickles in over a period of time. The goal is to cache a batch of sales records in memory, say 1,000 of them, and then upload the whole batch to Elasticsearch in a single bulk request.
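The cache-then-flush idea above can be sketched as a small in-memory buffer that hands full batches to a flush callback (which would perform the bulk upload). This is a minimal sketch, not the original project code: the `SalesBuffer` class name, the `batchSize` parameter, and the flush callback are illustrative assumptions.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Illustrative sketch: buffer incoming sales records in memory and hand
// them to a flush callback (e.g. a bulk upload to Elasticsearch) once
// the batch size is reached.
public class SalesBuffer {
    private final int batchSize;                  // e.g. 1000 in the scenario above
    private final Consumer<List<String>> flusher; // receives each full batch
    private final List<String> buffer = new ArrayList<>();

    public SalesBuffer(int batchSize, Consumer<List<String>> flusher) {
        this.batchSize = batchSize;
        this.flusher = flusher;
    }

    public void add(String saleJson) {
        buffer.add(saleJson);
        if (buffer.size() >= batchSize) {
            flush();
        }
    }

    // Also call this on shutdown so a final partial batch is not lost.
    public void flush() {
        if (!buffer.isEmpty()) {
            flusher.accept(new ArrayList<>(buffer));
            buffer.clear();
        }
    }

    public static void main(String[] args) {
        List<List<String>> batches = new ArrayList<>();
        SalesBuffer buf = new SalesBuffer(3, batches::add); // small batch size for the demo
        for (int i = 1; i <= 7; i++) {
            buf.add("{\"sale_id\":" + i + "}");
        }
        buf.flush(); // push the remaining partial batch
        System.out.println(batches.size()); // 3 (two full batches of 3, one partial of 1)
    }
}
```

In the real scenario the callback would build and send the `_bulk` request instead of collecting batches into a list.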
Add test data
PUT /car_shop/sales/1
{
  "brand": "宝马",
  "name": "宝马320",
  "price": 320000,
  "produce_date": "2017-01-01",
  "sale_price": 300000,
  "sale_date": "2017-01-21"
}

PUT /car_shop/sales/2
{
  "brand": "宝马",
  "name": "宝马320",
  "price": 320000,
  "produce_date": "2017-01-01",
  "sale_price": 300000,
  "sale_date": "2017-01-21"
}
Execute the bulk operation
Perform insert, update, and delete operations in a single bulk request
package com.es.core.senior;
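The body of the Java class is not shown above, only its package declaration. As a minimal self-contained sketch, the three bulk actions that explain the query result below (index document 3, update document 1's sale_price, delete document 2) can be expressed as the NDJSON body that `POST /_bulk` expects: one action metadata line per operation, followed by a source line for index/update actions. The `BulkBodyBuilder` class and its method names are illustrative assumptions, not part of the Elasticsearch Java client API; a real project would send this body over HTTP or build the equivalent actions with the client's bulk API.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative helper (not an Elasticsearch API): builds the NDJSON body
// that POST /_bulk expects -- one action line, then an optional source line.
public class BulkBodyBuilder {
    private final List<String> lines = new ArrayList<>();

    public BulkBodyBuilder index(String index, String type, String id, String sourceJson) {
        lines.add("{\"index\":{\"_index\":\"" + index + "\",\"_type\":\"" + type + "\",\"_id\":\"" + id + "\"}}");
        lines.add(sourceJson); // the full document to index
        return this;
    }

    public BulkBodyBuilder update(String index, String type, String id, String docJson) {
        lines.add("{\"update\":{\"_index\":\"" + index + "\",\"_type\":\"" + type + "\",\"_id\":\"" + id + "\"}}");
        lines.add("{\"doc\":" + docJson + "}"); // partial document merge
        return this;
    }

    public BulkBodyBuilder delete(String index, String type, String id) {
        // delete actions have no accompanying source line
        lines.add("{\"delete\":{\"_index\":\"" + index + "\",\"_type\":\"" + type + "\",\"_id\":\"" + id + "\"}}");
        return this;
    }

    public String build() {
        // the bulk body must end with a trailing newline
        return String.join("\n", lines) + "\n";
    }

    public static void main(String[] args) {
        String body = new BulkBodyBuilder()
            .index("car_shop", "sales", "3",
                   "{\"brand\":\"奔驰\",\"name\":\"奔驰C200\",\"price\":350000,\"produce_date\":\"2017-01-20\",\"sale_price\":320000,\"sale_date\":\"2017-01-25\"}")
            .update("car_shop", "sales", "1", "{\"sale_price\":290000}")
            .delete("car_shop", "sales", "2")
            .build();
        System.out.print(body);
    }
}
```

Note that each action in the response is reported independently: one failed action does not abort the rest of the batch.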
Check the result
GET /car_shop/sales/_search

Result (document 1 was updated, document 2 deleted, document 3 inserted):
{
  "took": 2,
  "timed_out": false,
  "_shards": {
    "total": 5,
    "successful": 5,
    "failed": 0
  },
  "hits": {
    "total": 2,
    "max_score": 1,
    "hits": [
      {
        "_index": "car_shop",
        "_type": "sales",
        "_id": "1",
        "_score": 1,
        "_source": {
          "brand": "宝马",
          "name": "宝马320",
          "price": 320000,
          "produce_date": "2017-01-01",
          "sale_price": 290000,
          "sale_date": "2017-01-21"
        }
      },
      {
        "_index": "car_shop",
        "_type": "sales",
        "_id": "3",
        "_score": 1,
        "_source": {
          "brand": "奔驰",
          "name": "奔驰C200",
          "price": 350000,
          "produce_date": "2017-01-20",
          "sale_price": 320000,
          "sale_date": "2017-01-25"
        }
      }
    ]
  }
}