Multi-process operations: using the process lock multiprocessing.Lock


With the Process module covered earlier, we achieved concurrent programming and made fuller use of I/O resources, but there is a drawback: when multiple processes share the same data resource, they can corrupt the data or leave it in a chaotic order (a data-safety problem).

To solve this problem, we introduce a process lock to control the order of execution.

Take simulating a ticket-grabbing rush as an example to see why data safety matters:

from multiprocessing import Process, Lock
import json, time, os

# Check how many tickets are left
def search():
    time.sleep(1)  # simulate network I/O (network latency)
    with open('db.txt', 'rt', encoding='utf-8') as fr:
        res = json.load(fr)
        print(f"{res['count']} tickets left")

def get():
    with open('db.txt', 'rt', encoding='utf-8') as fr:
        res = json.load(fr)

    time.sleep(1)  # simulate network I/O (network latency)
    if res['count'] > 0:
        res['count'] -= 1
        with open('db.txt', 'wt', encoding='utf-8') as fw:
            json.dump(res, fw)
            print(f'Process {os.getpid()} grabbed a ticket')
        time.sleep(1)  # simulate network I/O (network latency)

    else:
        print('Tickets are sold out!!!')

def func(lock):
    search()

    # lock the critical section
    lock.acquire()
    get()
    lock.release()


if __name__ == '__main__':
    lock = Lock()  # created in the parent process so every child gets the same lock
    for i in range(10):
        p = Process(target=func, args=(lock,))
        p.start()
        # p.join()

# The process lock serializes only the locked code
# join would serialize everything the child processes do, not just the critical section

# To keep the data safe, serial execution sacrifices efficiency
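The script above reads and writes a db.txt file that it never creates itself; from the code it is clearly expected to hold a JSON object with a count field. A minimal sketch to create it (the initial ticket count of 3 is an arbitrary choice, not from the original post):

import json

# Create the ticket database the example expects: {"count": 3}
with open('db.txt', 'wt', encoding='utf-8') as fw:
    json.dump({'count': 3}, fw)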

A lock guarantees that when multiple processes modify the same piece of data, only one task can modify it at a time; in other words, the modifications become serial. Yes, this is slower, but speed is sacrificed to guarantee data safety.
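multiprocessing.Lock also works as a context manager, so the acquire/release pair in func can be written as a with block, which releases the lock even if get() raises. A minimal sketch of this alternative form (not how the original code is written):

def func(lock):
    search()

    # equivalent to lock.acquire() / lock.release(), but the lock is
    # released even if get() raises an exception
    with lock:
        get()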

Although sharing a data file does achieve inter-process communication, there are problems:

  1. It is inefficient (the shared data lives in a file on the hard disk)
  2. You have to handle the locking yourself

So we would rather have a solution that offers both:

  1. High efficiency (multiple processes sharing data in memory)
  2. Locking handled for us. This is exactly what the message-based IPC mechanisms in the multiprocessing module offer: queues and pipes (see the pipe sketch after this list).
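For the pipe half of that statement, here is a minimal sketch of multiprocessing.Pipe: it returns two connected endpoints, and whatever one end sends the other end receives (the message text is just an illustration):

from multiprocessing import Process, Pipe

def worker(conn):
    conn.send('hello from the child')  # push a message through the pipe
    conn.close()

if __name__ == '__main__':
    parent_conn, child_conn = Pipe()  # two connected endpoints
    p = Process(target=worker, args=(child_conn,))
    p.start()
    print(parent_conn.recv())         # prints 'hello from the child'
    p.join()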

Queues and pipes both keep their data in memory, and the queue is implemented on top of a pipe plus a lock, which frees us from handling locks ourselves. We should avoid shared state as much as possible and prefer message passing and queues instead; this avoids complex synchronization and locking problems and usually scales better as the number of processes grows.
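For comparison, a minimal sketch of several processes communicating through multiprocessing.Queue, where put() and get() are already lock-protected so no explicit Lock is needed (the message format and process count are illustrative):

from multiprocessing import Process, Queue
import os

def worker(q):
    q.put(f'message from process {os.getpid()}')  # put() is already lock-protected

if __name__ == '__main__':
    q = Queue()
    procs = [Process(target=worker, args=(q,)) for _ in range(3)]
    for p in procs:
        p.start()
    for _ in procs:
        print(q.get())  # receive one message per child
    for p in procs:
        p.join()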
