113 Python process operations: process synchronization (multiprocessing.Lock)

Although concurrent programming lets us make fuller use of I/O resources, it also brings a new problem: when multiple processes use the same data resource, the data can become unsafe or end up in a confused order.

First, process synchronization

When multiple processes run at the same time, we restrict how they access shared resources so that the processes execute in a coordinated, synchronized way.

In my understanding, process synchronization can be regarded as one means of inter-process communication (IPC).

Second, why process synchronization is needed

Multiple processes can compete for the same resources. To solve this problem we need synchronization between processes, which controls how each process accesses critical resources.

There are many ways to achieve process synchronization in C, for example semaphores and locking mechanisms.

Example: demonstrating resource contention between multiple processes

import os
import time
import random
from multiprocessing import Process

def work(n):
    print('%s: %s is running' %(n,os.getpid()))
    time.sleep(random.random())
    print('%s:%s is done' %(n,os.getpid()))

if __name__ == '__main__':
    for i in range(3):
        p=Process(target=work,args=(i,))
        p.start()

Third, implementing process synchronization in Python

  1. First of all, from multiprocessing import Lock
  2. In the main process, instantiate a lock, lock = Lock(), and pass it to the child processes
  3. In the child process, control access to the resource by locking and unlocking: lock.acquire() locks, lock.release() unlocks

Demo: controlling resource access with Python's lock mechanism

from multiprocessing import Process, Lock
import os
import time
import random

def work(lock, n):
    print(f"waiting for unlock, process {n}")
    # acquire the lock
    lock.acquire()     # once locked, other processes cannot enter and will block here
    print(f'process {n} pid: %s is running' % (os.getpid()))
    time.sleep(random.random())
    print(f'process {n} pid: %s is done' % (os.getpid()))
    # release the lock
    lock.release()

if __name__ == '__main__':
    lock = Lock()  # instantiate the lock
    for i in range(3):
        p = Process(target=work, args=(lock, i))
        p.start()

waiting for unlock, process 1
process 1 pid: 8696 is running
waiting for unlock, process 0
waiting for unlock, process 2
process 1 pid: 8696 is done
process 0 pid: 14264 is running
process 0 pid: 14264 is done
process 2 pid: 22724 is running
process 2 pid: 22724 is done

From the output we can see that the code before the lock is acquired still runs concurrently in all processes, but once a process acquires the lock, the next one can only enter after the lock is released, and so on. It is like queuing for a toilet: one person goes in and locks the door, and the next person has to wait outside.

But notice one thing: the locked code is executed serially, with each process accessing it one at a time rather than concurrently.

Indeed, the lock mechanism guarantees the safety of the data, but it sacrifices efficiency.
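
As a side note beyond the original demo, multiprocessing.Lock also supports the context-manager protocol, so the acquire/release pair can be written as a with block, which releases the lock even if the locked code raises an exception. A minimal sketch of the same work function in that style:

import os
import time
import random
from multiprocessing import Process, Lock

def work(lock, n):
    print(f"waiting for unlock, process {n}")
    with lock:  # acquires on entry, releases on exit, even if an exception is raised
        print(f'process {n} pid: {os.getpid()} is running')
        time.sleep(random.random())
        print(f'process {n} pid: {os.getpid()} is done')

if __name__ == '__main__':
    lock = Lock()
    for i in range(3):
        Process(target=work, args=(lock, i)).start()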

Fourth, simulating a ticket scramble with multiple processes

# The content of the file db is: {"count":1}
# Note: the key must use double quotes, otherwise json cannot parse it
# Concurrent execution is efficient, but the processes compete to write the same file, so the data gets corrupted
from multiprocessing import Process,Lock
import time,json,random
def search():
    dic=json.load(open('db'))
    print('remaining tickets: %s' %dic['count'])

def get():
    dic=json.load(open('db'))
    time.sleep(0.1)  # simulate the network latency of reading the data
    if dic['count'] >0:
        dic['count']-=1
        time.sleep(0.2)  # simulate the network latency of writing the data
        json.dump(dic,open('db','w'))
        print('ticket purchased successfully')

def task():
    search()
    get()

if __name__ == '__main__':
    for i in range(100):  # simulate 100 concurrent clients grabbing tickets
        p=Process(target=task)
        p.start()

In the code above, multiple processes grab tickets at the same time and compete for the same file. Each process reads the file into its own memory, sees a ticket left, and "purchases" it. In reality there is only one ticket available, so this causes a problem.

4.1 Controlling resource access with a lock

from multiprocessing import Process, Lock
import os, time, json

def search():
    time.sleep(1)
    with open("db", "rt", encoding="utf8") as f:  # open for reading ("w" would truncate the file)
        data = json.load(f)
        print(f'{data["count"]} tickets left')

def get():
    with open('db','rt',encoding='utf-8') as f:
        res = json.load(f)
    time.sleep(1)  # simulate network io
    if res['count'] > 0:
        res['count'] -= 1
        with open('db', 'w', encoding='utf-8') as f:
            json.dump(res, f)
            print(f'process {os.getpid()} got a ticket')
        time.sleep(1.5)  # simulate network io
    else:
        print('Tickets are sold out!!!!!!!!!!!')

def task(lock):
    print(f"process {os.getpid()} is trying to grab a ticket...")

    # acquire the lock: every process runs this code, so the section from acquire to release becomes serial
    lock.acquire()  # once locked, other processes cannot enter and will block here
    get()
    # release the lock
    lock.release()

if __name__ == '__main__':
    # create the lock
    lock = Lock()

    pro_list = []
    # create the processes
    for i in range(10):
        p = Process(target=task, args=(lock,))
        p.start()
        pro_list.append(p)
    # wait for (reclaim) the child processes
    for i in range(10):
        pro_list[i].join()
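
A quick check (not in the original post): with the db file starting as {"count":1}, only one of the ten processes should print a success message, and after all children have joined the file should read {"count": 0}. A minimal sketch to inspect it afterwards:

import json

with open('db', 'rt', encoding='utf-8') as f:
    print(json.load(f))  # expected: {'count': 0} if the file started as {"count": 1}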

To sum up

Process synchronization, in theory, belongs to inter-process communication and is a big topic. Every language implements some way to meet the need for process synchronization, that is, restricting how processes access shared resources, each with its own limitations. Here we have only described this one kind of process synchronization, the lock.

The lock mechanism turns the locked code into a serial section: it guarantees the safety of the data, but sacrifices efficiency.

Origin: www.cnblogs.com/XuChengNotes/p/11529599.html