Multiprocessing lock (multiprocessing.Lock) and queue (multiprocessing.Queue)

Process synchronization (lock)

Suppose we imitate ticket-rush software: a file records ten tickets, and twenty people try to grab them, each grab reducing the count by one. To get a concurrent effect we obviously need multiple processes, and then a problem appears: several people read the file at the same time, all see 10 tickets left, and each grabs one, yet the count only drops to 9. What we want is that while one person is grabbing a ticket, nobody else can grab. So we need to lock this step: while one process is executing it, the other processes cannot execute that code and can only wait.

To do this we import the Lock class from multiprocessing.

# The file db.txt contains: {"count":1}
# Note: you must use double quotes, otherwise json cannot parse the file
from multiprocessing import Process,Lock
import time,json,random

def search():
    dic = json.load(open('db.txt'))
    print('\033[43mRemaining tickets: %s\033[0m' % dic['count'])

def get():
    dic = json.load(open('db.txt'))
    time.sleep(0.1)  # simulate network latency while reading
    if dic['count'] > 0:
        dic['count'] -= 1
        time.sleep(0.2)  # simulate network latency while writing
        json.dump(dic, open('db.txt', 'w'))
        print('\033[43mTicket purchased\033[0m')

def task(lock):
    search()
    lock.acquire()
    get()
    lock.release()

if __name__ == '__main__':
    lock = Lock()
    for i in range(100):  # simulate 100 concurrent clients grabbing tickets
        p = Process(target=task, args=(lock,))
        p.start()

Locking: ticket purchase changes from concurrent to serial. We sacrifice running efficiency but guarantee data safety.
A process lock only serializes the locked section of code;
joining every child process serializes the processes entirely.


 

We generate a lock in the main process (only there, so there is a single lock; generating it inside the method above would create many separate locks), then pass the lock object into the children as an argument. lock.acquire() grabs the lock: only one process can get in and run the protected code, and the next process that wants in has to wait for lock.release() to release it. This achieves our goal: only one process can run that section at a time, i.e. only one process manipulates the file at a time.

A lock guarantees that when multiple processes modify the same piece of data, only one task can modify it at a time, i.e. the modification becomes serial. Yes, it is slow, but speed is sacrificed to guarantee data safety.
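As a minimal sketch of that guarantee (a made-up counter example, not the ticket code above): two processes each increment a shared counter many times, and because every read-modify-write happens under the lock, no update is lost and the final count is exact.

```python
from multiprocessing import Process, Lock, Value

def add(counter, lock, n):
    # counter.value += 1 is a read-modify-write; without the lock,
    # concurrent processes could interleave and lose increments
    for _ in range(n):
        with lock:  # acquire()/release() via context manager
            counter.value += 1

def run(n=50000, workers=2):
    lock = Lock()
    counter = Value('i', 0, lock=False)  # raw shared int, no built-in lock
    procs = [Process(target=add, args=(counter, lock, n)) for _ in range(workers)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    return counter.value

if __name__ == '__main__':
    print(run())  # exact: workers * n
```

`with lock:` is equivalent to lock.acquire() followed by lock.release() in a finally block, so the lock is released even if the protected code raises.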

Of course I am not willing to give up speed. Is there really no way to have the best of both worlds? There is.

Namely, queues and pipes.

queue ≈ pipe + lock

 

Queue

Here's an example.

A customer walks into a bun shop. The cook just makes buns at his own pace, putting each finished one onto a platter, and the customer takes buns from the platter without having to wait on the cook. That platter is the queue.

How to create one

Queue([maxsize])

maxsize is the maximum number of items allowed in the queue; omit it for no size limit.
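A minimal sketch of the platter (the bun names are made up): whatever is put in first comes out first.

```python
from multiprocessing import Queue

q = Queue(maxsize=3)  # the platter holds at most 3 buns
for bun in ['bun0', 'bun1', 'bun2']:
    q.put(bun)  # the cook places each finished bun on the platter
first = q.get()  # the customer takes the oldest bun first: FIFO order
print(first)
```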

1 q.put inserts an item into the queue. put takes two optional parameters, block and timeout. If block is True (the default) and timeout is a positive number, the method blocks for up to timeout seconds until the queue has free space; if it times out, it raises the queue.Full exception. If block is False and the queue is already full, it raises queue.Full immediately.
2 q.get reads and removes one item from the queue. get also takes the optional block and timeout. If block is True (the default), timeout is positive, and no item is obtained within the wait time, it raises queue.Empty. If block is False there are two cases: if an item is available it is returned immediately; otherwise, if the queue is empty, queue.Empty is raised immediately.
3 q.get_nowait(): same as q.get(False) — don't wait: take an item if there is one, otherwise smash the shop (raise immediately)
4 q.put_nowait(item): same as q.put(item, False); same as above (the cook: why would I smash my own shop?)
5 q.empty(): returns True if q is empty at the moment of the call. The result is unreliable: items may be added to the queue while True is being returned.
6 q.full(): returns True if q is full at the moment of the call. Also unreliable: items may be taken out of the queue while True is being returned.
7 q.qsize(): returns the current number of items in the queue; also unreliable, for the same reason as q.empty() and q.full()
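A small sketch of the non-blocking variants, on a hypothetical 2-slot queue. Note that the Full and Empty exception classes live in the standard queue module, not in multiprocessing.

```python
import queue  # the Full and Empty exception classes come from here
from multiprocessing import Queue

q = Queue(maxsize=2)
q.put('a')
q.put('b')
try:
    q.put_nowait('c')  # queue is full: raises immediately instead of blocking
    overflowed = False
except queue.Full:
    overflowed = True

got = [q.get(), q.get()]  # drains the queue in FIFO order
try:
    q.get_nowait()  # queue is empty: raises immediately
    underflowed = False
except queue.Empty:
    underflowed = True
print(overflowed, got, underflowed)
```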

Example

from multiprocessing import Process,Queue
import time,random

def producer(q, name, food):
    '''Producer'''
    for i in range(3):
        print(f'{name} produced {food}{i}')
        time.sleep(random.randint(1, 3))
        res = f'{food}{i}'
        q.put(res)
    # q.put(None)

def consumer(q, name):
    '''Consumer'''
    while True:
        res = q.get(timeout=5)
        if res is None:
            break
        time.sleep(random.randint(1, 3))
        print(f'{name} ate {res}')

if __name__ == '__main__':
    q = Queue()
    p1 = Process(target=producer, args=(q, 'rocky', 'bun'))
    p2 = Process(target=producer, args=(q, 'mac', 'leek'))
    p3 = Process(target=producer, args=(q, 'nick', 'garlic'))
    c1 = Process(target=consumer, args=(q, 'Cheng'))
    c2 = Process(target=consumer, args=(q, 'Haonan'))
    p1.start()
    p2.start()
    p3.start()
    c1.start()
    c2.start()
    p1.join()  # join all three producers so we know production has finished,
    p2.join()  # i.e. every producer process has ended; these joins come after
    p3.join()  # the start of c1/c2 so eating happens while producing
    q.put(None)  # put one None per consumer; a consumer that receives None exits
    q.put(None)

Now that we have talked about the buns, let's also talk about the producer-consumer model!

Producers and consumers

Quite straightforward: the cook is the producer, and the customer is the consumer.

Producer: the task that produces data

Consumer: the task that processes data

With this we can string everything above together; let's look at an example.

No, wait — there is one more thing to cover first.

JoinableQueue

This also lives in multiprocessing.

from multiprocessing import Process,Queue,JoinableQueue


q = JoinableQueue()

q.put('zhao')  # put a task into the queue
q.put('qian')
print(q.get())
q.task_done()  # one task completed
print(q.get())
q.task_done()  # one task completed
q.join()  # blocks while the counter is nonzero; returns once it reaches 0
# think of it as a counter: put is +1, task_done is -1

Every time something is put into a JoinableQueue, its task counter goes up by 1; every time something is taken out and we manually call q.task_done(), the counter goes down by 1. The queue also has a join method: join blocks until the counter reaches 0, then returns.

Now for a comprehensive example!

Integrated example

from multiprocessing import Process,Queue,JoinableQueue
import time,random

def producer(q, name, food):
    '''Producer'''
    for i in range(3):
        print(f'{name} produced {food}{i}')
        time.sleep(random.randint(1, 3))
        res = f'{food}{i}'
        q.put(res)
    # q.put(None)

def consumer(q, name):
    '''Consumer'''
    while True:
        res = q.get()
        # if res is None: break
        time.sleep(random.randint(1, 3))
        print(f'{name} ate {res}')
        q.task_done()

if __name__ == '__main__':
    q = JoinableQueue()
    p1 = Process(target=producer, args=(q, 'rocky', 'bun'))
    p2 = Process(target=producer, args=(q, 'mac', 'leek'))
    p3 = Process(target=producer, args=(q, 'nick', 'garlic'))
    c1 = Process(target=consumer, args=(q, 'Cheng'))
    c2 = Process(target=consumer, args=(q, 'Haonan'))
    p1.start()
    p2.start()
    p3.start()
    c1.daemon = True  # daemonize the consumers so they die with the main process
    c2.daemon = True
    c1.start()
    c2.start()
    p1.join()
    p2.join()
    p3.join()  # the producers have finished producing
    # q.put(None)  # no longer need one None per consumer
    # q.put(None)
    q.join()
    # Analysis: the producers finish, q.join() returns once the consumers have
    # emptied the queue, and that is the last line of the main process. The
    # consumers have nothing left to do, and being daemon processes they die
    # with the main process.

rocky produced bun0
mac produced leek0
nick produced garlic0
nick produced garlic1
mac produced leek1
rocky produced bun1
nick produced garlic2
Haonan ate leek0
mac produced leek2
Cheng ate garlic0
rocky produced bun2
Cheng ate garlic1
Haonan ate bun0
Haonan ate garlic2
Haonan ate bun1
Cheng ate leek1
Haonan ate leek2
Cheng ate bun2

As for why the output is such a mess: only the operating system scheduler knows; we cannot tell which process it will run first. Finally, the program ends.


Origin www.cnblogs.com/whnbky/p/11528075.html