Day 29 process mutex / queue / thread

Process mutex

A mutex turns the code that operates on shared data (and only that code) from concurrent into serial execution

Step one: import the Lock class from the multiprocessing module

Step two: inside if __name__ == '__main__':, call mutex = Lock() to get a lock object

Step three: wrap the code that reads and writes the shared data between mutex.acquire() and mutex.release()

'''
Ticket-grabbing demo:
    1. check the remaining tickets
    2. grab a ticket
'''

from multiprocessing import Process, Lock
import time
import json


def search(user):
    with open('data.txt', 'r', encoding='utf8') as f:
        dic = json.load(f)
    print(f'user {user} checked: {dic.get("ticket_num")} tickets left')


def buy(user):
    with open('data.txt', 'r', encoding='utf8') as f:
        dic = json.load(f)

    time.sleep(1)

    if dic.get('ticket_num') > 0:
        dic['ticket_num'] -= 1
        with open('data.txt', 'w', encoding='utf8') as f:
            json.dump(dic, f)
        print(f'user {user} grabbed a ticket')
    else:
        print(f'user {user} failed to grab a ticket')


def run(user, mutex):
    # search runs concurrently; only buy is serialized by the lock
    search(user)

    mutex.acquire()
    buy(user)
    mutex.release()


if __name__ == '__main__':
    mutex = Lock()

    for i in range(10):
        p = Process(target=run, args=(f'user {i}', mutex))
        p.start()
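The demo above assumes a data.txt file already exists with the ticket count; a one-line script can create it (the starting count of 3 is arbitrary):

```python
import json

# create the ticket file that search() and buy() read and write
with open('data.txt', 'w', encoding='utf8') as f:
    json.dump({'ticket_num': 3}, f)
```

With 3 tickets and 10 users, exactly 3 grabs succeed when the lock is in place; without the lock, the read-sleep-write race lets every user "buy" the same ticket.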

Queue

Concepts

Queue creates a queue shared between processes. It is multi-process safe, so you can use a Queue to pass data between multiple processes

Methods Introduction

Queue([maxsize]): creates a queue shared between processes

Parameter: maxsize is the maximum number of items allowed in the queue; if omitted, there is no size limit.

The underlying queue is implemented with pipes and locks.

q.get([block[, timeout]]): returns an item from q. If q is empty, the call blocks until an item is available. block controls the blocking behaviour and defaults to True; if set to False, the call raises queue.Empty (defined in the standard queue module) when the queue is empty. timeout is an optional timeout used in blocking mode: if no item becomes available within that interval, queue.Empty is raised.

q.get_nowait(): same as q.get(False)

q.put(item[, block[, timeout]]): puts item into the queue. If the queue is full, the call blocks until space becomes available. block controls the blocking behaviour and defaults to True; if set to False, the call raises queue.Full (defined in the standard queue module) when the queue is full. timeout specifies how long to wait for free space in blocking mode; when it expires, queue.Full is raised.

q.empty(): returns True if q is empty at the moment of the call. The result is unreliable: between the return and the use of the result, other processes or threads may have added items to the queue.

q.full(): returns True if q is full at the moment of the call; unreliable for the same reason as q.empty().

q.qsize(): returns the approximate number of items currently in the queue; also unreliable for the same reason.

from multiprocessing import Queue


q = Queue(3)

q.put([1, 2, 3], block=True, timeout=5)
q.put(2, block=True, timeout=3)
q.put({'username': 'tiny'}, timeout=4)
# q.put(5, timeout=5)  # queue is full: would block 5 s, then raise queue.Full

print(q.qsize())

print(q.get())
print(q.get())
print(q.get())
# print(q.get(timeout=2))  # queue is empty: would block 2 s, then raise queue.Empty


print(q.qsize())
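The non-blocking rules described above can be checked directly. Note that the exceptions live in the standard queue module, not in multiprocessing:

```python
import queue
from multiprocessing import Queue

q = Queue(1)
q.put('only item')

try:
    q.put('no room', block=False)   # queue is full
except queue.Full:
    print('queue.Full raised')

print(q.get())                      # 'only item'

try:
    q.get_nowait()                  # same as q.get(False); queue is now empty
except queue.Empty:
    print('queue.Empty raised')
```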

Passing data between sub-processes through a queue

from multiprocessing import Queue
from multiprocessing import Process


def task1(q):
    data = 'I am the data'
    q.put(data)
    print('process 1 put data into the queue...')


def task2(q):
    data = q.get()
    print(f'process 2 got data from the queue: {data}')


if __name__ == '__main__':
    q = Queue()
    p1 = Process(target=task1, args=(q,))
    p2 = Process(target=task2, args=(q,))

    p1.start()
    p2.start()
    
    print('main process')

Producer consumer model

In concurrent programming, the producer-consumer pattern solves most concurrency problems of this kind. The overall idea is to raise a program's data-processing throughput by balancing the working speed of the producer threads and the consumer threads.

Why use producer and consumer patterns?

In the threading world, a producer is a thread that generates data and a consumer is a thread that processes it. In multi-threaded development, if the producer works quickly and the consumer works slowly, the producer has to wait for the consumer to catch up before it can produce more data. By the same token, if the consumer's capacity is greater than the producer's, the consumer has to wait for the producer. The producer-consumer pattern was introduced to solve this problem.

What is a producer-consumer model?

The producer-consumer pattern removes the strong coupling between producers and consumers through a container: the two sides do not communicate with each other directly, but through a blocking queue. Producers do not wait for consumers to finish; they throw produced data straight into the queue, and consumers take data straight from the queue instead of asking a producer for it. The blocking queue acts as a buffer that balances the processing capacity of producers and consumers.

from multiprocessing import Queue
from multiprocessing import Process
import time


def producer(name, food, q):
    for i in range(9):
        data = (food, i)
        print(f'{name} made {food} {i}')
        q.put(data)
        time.sleep(0.1)
    q.put(None)  # sentinel: one per producer, telling one consumer to stop


def consumer(name, q):
    while True:
        data = q.get()
        if data is None:
            break
        print(f'{name} finished eating {data}')
        # time.sleep(0.1)


if __name__ == '__main__':
    q = Queue()

    p1 = Process(target=producer, args=('tiny', 'baozi', q))
    p2 = Process(target=producer, args=('xiaoxiao', 'bone', q))
    p3 = Process(target=consumer, args=('jane', q))
    p4 = Process(target=consumer, args=('nick', q))

    p1.start()
    p2.start()
    p3.start()
    p4.start()

    # each producer puts one None sentinel, so each consumer exits on its own
    p3.join()
    p4.join()
    print('main process')

Thread

The process is the smallest unit of resource allocation

The thread is the smallest unit of CPU scheduling

Each process has at least one thread

The difference between processes and threads

  1. Address space and other resources (such as open files): processes are independent of each other, while threads share the resources of their process; threads inside one process are not visible to other processes
  2. Communication: processes communicate via IPC, while threads can directly read and write their process's data segment (such as global variables) -- but they need synchronization and mutual-exclusion primitives to keep the data consistent
  3. Scheduling and switching: a thread context switch is much faster than a process context switch
  4. In a multithreaded operating system, the process is not the executable entity; its threads are what get scheduled and executed

Features thread

  1. A thread entity owns essentially no system resources of its own, only the few resources needed to guarantee it can run independently

  2. A thread entity consists of a program, data and a TCB (thread control block). A thread is a dynamic concept, and its dynamic behaviour is described by the TCB, which contains the program counter indicating the next instruction to execute, a set of registers holding local variables, a small amount of state, the return address, and a stack

    The TCB includes the following information:

    • Thread state
    • Saved context (register contents and the like) for when the thread is not running
    • An execution stack
    • A main-memory area for each thread's local variables
    • Access to the main memory and other resources of the same process
  3. Threads share the resources of their process

    Each thread in the same process can share the resources the process owns. First and foremost, all threads have the same process id, which means every thread can access the process's memory; they can also access the files the process has opened, as well as its timers, semaphores and similar facilities. Because threads within the same process share memory and files, they can communicate with each other without invoking the kernel.

  4. Threads can execute concurrently

    Multiple threads in one process can execute concurrently -- even all of the threads in a process at once; likewise, threads in different processes can execute concurrently, making full use of the processor's and the peripheral devices' ability to work in parallel.
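The resource-sharing point can be seen from os.getpid(): every thread in a process reports the same process id, because they all run inside that one process:

```python
import os
from threading import Thread

pids = []


def record_pid():
    # each thread appends its process id to a shared list
    pids.append(os.getpid())


threads = [Thread(target=record_pid) for _ in range(3)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(pids)
print(os.getpid() == pids[0])  # True: every thread runs in the same process
```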

Two ways to start a thread

from threading import Thread
import time


# way 1: pass a target function to Thread
def task():
    print('thread started')
    time.sleep(1)
    print('thread finished')


if __name__ == '__main__':
    # instantiate the Thread class to get a thread object
    t = Thread(target=task)
    t.start()
    

# way 2: subclass Thread and override run
class MyThread(Thread):
    def run(self):
        print('thread started')
        time.sleep(1)
        print('thread finished')


t = MyThread()  # pass arguments via __init__ if needed
t.start()
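To pass arguments with the subclass style, override __init__ and remember to call the parent's __init__ first (the class and its name_tag argument here are illustrative, not part of the original post):

```python
from threading import Thread
import time


class ArgThread(Thread):
    def __init__(self, name_tag):
        super().__init__()       # must initialise the parent Thread first
        self.name_tag = name_tag

    def run(self):
        print(f'thread for {self.name_tag} started')
        time.sleep(0.1)
        print(f'thread for {self.name_tag} finished')


t = ArgThread('tiny')
t.start()
t.join()
```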

Thread object attributes

from threading import Thread
import time
from threading import current_thread

'''
current_thread().name  name of the current thread
is_alive()  whether the thread is still running
'''


def task():
    print('thread started', current_thread().name)
    time.sleep(3)
    print('thread finished', current_thread().name)


if __name__ == '__main__':
    for i in range(10):
        t = Thread(target=task)
        t.start()
    print(t.is_alive())  # the camelCase isAlive() alias was removed in Python 3.9

Thread mutex

from threading import Thread, Lock
import time


mutex = Lock()
n = 100


def task(i):
    print(f'thread {i} started')
    global n
    mutex.acquire()
    temp = n
    time.sleep(0.1)  # with the lock held, the 100 threads sleep one after another: about 10 s in total
    n = temp - 1
    print(n)
    mutex.release()


if __name__ == '__main__':
    lis = []
    for i in range(100):
        t = Thread(target=task, args=(i,))
        lis.append(t)
        t.start()

    for i in lis:
        i.join()

    print(n)
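Lock also supports the with statement, which releases the lock on exit even if the locked code raises an exception; the counter example above can be written more compactly as:

```python
from threading import Thread, Lock

mutex = Lock()
n = 100


def task():
    global n
    with mutex:   # acquire on entry, release on exit, even on error
        n -= 1


threads = [Thread(target=task) for _ in range(100)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(n)  # 0
```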


Origin www.cnblogs.com/2222bai/p/11721663.html