MongoDB also provides two command-line tools, mongoexport and mongoimport, for exporting and importing data in JSON or CSV format. They can also serve as a simple backup and restore mechanism.
Run mongoexport --help to see the full list of options. Here we focus on how to write the -q query condition.
For example, to export all records with username='test':
- mongoexport -d search_logs -c key_words -q "{'username' : 'test'}" -o mongo_$(date +%F).json
Note: $(date +%F) is shell command substitution that outputs the current date in the form 2012-02-22
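To see what that substitution produces on its own (this is plain shell, nothing MongoDB-specific):

```shell
# date +%F prints the current date as YYYY-MM-DD,
# so the export filename carries today's date.
today=$(date +%F)            # e.g. 2012-02-22
echo "mongo_${today}.json"   # e.g. mongo_2012-02-22.json
```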
Suppose you want to export data newer than a certain point in time. For example, my documents look like this:
- db.key_words.find({request_time:new Date(1329493503417)})
- { "_id" : ObjectId("4f3e75ffd6194c0b1e000001"), "username" : "test800", "request_time" : ISODate("2012-02-17T15:45:03.417Z"), "search_word" : "s" }
To export documents whose request_time is greater than "2012-02-17T15:00:00Z", I first need to convert that time to a millisecond timestamp:
- > ISODate("2012-02-17T15:00:00Z").valueOf()
- 1329490800000
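If GNU date is available, the same millisecond value can be computed without opening the mongo shell (the -d flag is GNU-specific; BSD/macOS date spells this differently):

```shell
# Convert the ISO timestamp to epoch seconds with GNU date (-u keeps it UTC),
# then multiply by 1000 to get the milliseconds that new Date() expects.
secs=$(date -u -d "2012-02-17T15:00:00Z" +%s)
echo $((secs * 1000))    # 1329490800000
```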
Then run:
- mongoexport -d search_logs -c key_words -q '{request_time:{$gte:new Date(1329490800000)}}' -o mongo_$(date +%F).json
Note that if the query after -q is enclosed in "" double quotes, the $ character must be escaped as \$:
- mongoexport -d search_logs -c key_words -q "{request_time:{'\$gte':new Date(1329490800000)}}" -o mongo_$(date +%F).json
I spent a whole night on this problem, so I'm recording it here. It may be related to the difference between JSON and BSON; I'm not entirely sure of the exact cause, and would appreciate pointers from anyone who knows.
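For what it's worth, the escaping requirement looks like ordinary shell quoting behavior rather than anything JSON/BSON-specific: inside double quotes the shell treats $gte as a variable and expands it (usually to nothing) before mongoexport ever sees the query. A minimal demonstration with no MongoDB involved:

```shell
unset gte   # ensure $gte is not set, so the expansion below is empty
echo "{request_time:{$gte:new Date(0)}}"    # double quotes: $gte expands away
echo "{request_time:{\$gte:new Date(0)}}"   # escaped: literal $gte survives
echo '{request_time:{$gte:new Date(0)}}'    # single quotes: no expansion at all
```

The first line prints {request_time:{:new Date(0)}}, which is why the unescaped double-quoted query silently matches nothing.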
To restore the data, use mongoimport:
- mongoimport -d search_logs -c key_words --file mongo_$(date +%F).json