MongoDB (4): indexes, aggregation, and large file storage

from pymongo import MongoClient

# Create a connection object
conn = MongoClient('localhost', 27017)

# Get the database object, then the collection object
db = conn.stu

my_set = db.class1

Create a single-field index

index = my_set.ensure_index('name')
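ensure_index() is the older pymongo spelling; since pymongo 3.0 it is deprecated in favor of create_index(), and it is removed in pymongo 4. A minimal equivalent sketch, assuming the same stu.class1 collection:

from pymongo import MongoClient

conn = MongoClient('localhost', 27017)
my_set = conn.stu.class1

# create_index() returns the generated index name, e.g. 'name_1'
print(my_set.create_index('name'))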

 

Create a composite index

index = my_set.ensure_index([('name', 1), ('Age', 1)])
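The 1 / -1 direction values can also be written with pymongo's named constants; a small sketch using the same fields:

from pymongo import MongoClient, ASCENDING, DESCENDING

conn = MongoClient('localhost', 27017)
my_set = conn.stu.class1

# ASCENDING == 1, DESCENDING == -1
index = my_set.create_index([('name', ASCENDING), ('Age', DESCENDING)])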

 

Create a unique index

index = my_set.ensure_index('name', unique=True)
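With the unique index in place, inserting a second document with the same 'name' value raises DuplicateKeyError; a minimal sketch, assuming the collection starts without conflicting data:

from pymongo import MongoClient
from pymongo.errors import DuplicateKeyError

conn = MongoClient('localhost', 27017)
my_set = conn.stu.class1
my_set.create_index('name', unique=True)

my_set.insert_one({'name': 'Tom'})
try:
    my_set.insert_one({'name': 'Tom'})   # same 'name', violates the unique index
except DuplicateKeyError as e:
    print('duplicate key:', e)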

 

Create a sparse index (only documents that actually contain the 'name' field are indexed)

index = my_set.ensure_index('name', sparse=True)
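Sparse is often combined with unique, because documents that omit the field are simply left out of the index rather than all colliding on a null key. A sketch; the 'id_card' field name is only illustrative:

from pymongo import MongoClient

conn = MongoClient('localhost', 27017)
my_set = conn.stu.class1

my_set.create_index('id_card', unique=True, sparse=True)

# Both inserts succeed: neither document has 'id_card', so neither enters the sparse index.
# With a non-sparse unique index the second insert would fail on a duplicate null key.
my_set.insert_one({'name': 'Tom'})
my_set.insert_one({'name': 'Jerry'})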

 

 

View the collection's indexes

list_indexes()

for i in my_set.list_indexes():
    print(i)
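index_information() returns the same data as a plain dict keyed by index name, which is often more convenient than iterating a cursor; a minimal sketch:

from pymongo import MongoClient

conn = MongoClient('localhost', 27017)
my_set = conn.stu.class1

# e.g. '_id_' -> [('_id', 1)], 'name_1' -> [('name', 1)]
for name, info in my_set.index_information().items():
    print(name, info['key'])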

 

Delete indexes

drop_index(): delete a single index

my_set.drop_index('name_1')   # name_1 is the name of the index

 

drop_indexes(): delete all indexes (the built-in _id index is kept)

my_set.drop_indexes()

 

Aggregation operations

aggregate([])

Parameters: a list of aggregation pipeline stages, written the same way as in the mongo shell

Returns: a cursor (iterator), the same kind of value that find() returns

# group documents by gender, count each group, then keep groups with count > 1
l = [
    {'$group': {'_id': '$gender', 'count': {'$sum': 1}}},
    {'$match': {'count': {'$gt': 1}}},
]
cursor = my_set.aggregate(l)
for i in cursor:
    print(i)
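Further stages chain onto the same list; a hedged sketch adding a $sort stage to the pipeline above:

from pymongo import MongoClient

conn = MongoClient('localhost', 27017)
my_set = conn.stu.class1

pipeline = [
    {'$group': {'_id': '$gender', 'count': {'$sum': 1}}},  # count documents per gender
    {'$match': {'count': {'$gt': 1}}},                      # keep genders appearing more than once
    {'$sort': {'count': -1}},                               # largest group first
]
for doc in my_set.aggregate(pipeline):
    print(doc)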

 

Mongo large file storage

from pymongo import MongoClient
import bson.binary

conn = MongoClient('localhost', 27017)

db = conn.file

my_set = db.img

# store the file
with open('picture.jpg', 'rb') as f:
    # wrap the raw bytes read from the file in a BSON binary value
    content = bson.binary.Binary(f.read())

my_set.insert({'filename': 'picture.jpg', 'data': content})

conn.close()

  

> show dbs
admin    0.000GB
config   0.000GB
file     0.005GB
grid_db  0.005GB
local    0.000GB
stu      0.000GB
> 

> show tables
img

  

Retrieve the file

from pymongo import MongoClient

import bson.binary

conn = MongoClient('localhost', 27017)

db = conn.file

my_set = db.img

# look up the stored document and write its binary payload back out to a file
data = my_set.find_one({'filename': 'picture.jpg'})
with open(data['filename'], 'wb') as f:
    f.write(data['data'])
conn.close()
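A single BSON document is limited to 16 MB, so for genuinely large files MongoDB's usual answer is GridFS, which pymongo exposes through the gridfs module (the grid_db database in the show dbs output above presumably comes from such an example). A minimal sketch; the database and file names here are only illustrative:

import gridfs
from pymongo import MongoClient

conn = MongoClient('localhost', 27017)
fs = gridfs.GridFS(conn.grid_db)   # GridFS stores files in fs.files / fs.chunks collections

# write: the file is split into chunks and put() returns an ObjectId
with open('picture.jpg', 'rb') as f:
    file_id = fs.put(f, filename='picture.jpg')

# read: fetch by id (or fs.find_one({'filename': 'picture.jpg'})) and write it back out
with open('picture_copy.jpg', 'wb') as out:
    out.write(fs.get(file_id).read())

conn.close()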

  
