Python multiprocessing: zombie and orphan processes

1. How to use multiple processes in Python

Ways to create a child process

1. Import the Process class from multiprocessing, instantiate it, and specify the task to perform via the target argument

import os
from multiprocessing import Process
"""
Process 就表示进程
为什么要开进程
"""

def task():
    print("this is sub process")
    print("sub process id %s" % os.getpid())


if __name__ == '__main__':
    # NOTE: the code that starts a process must go under the __main__ guard
    # Instantiate a process object and specify its task with a function
    p = Process(target=task)
    p.start()  # send a message to the OS asking it to start the child process
    print("this is parent process")
    print("parent process is: %s" % os.getpid())
    print("over")

Linux and Windows start processes in different ways

On Linux, the parent process's memory is completely copied to the child process (fork).

Note:

On Windows, the child process re-imports the parent's module from scratch in order to obtain the task it needs to execute.

So on Windows, the code that starts processes must be placed under the `if __name__ == '__main__'` guard.

On Linux the guard can be omitted.

2. Subclass the Process class and override its run method; when the process starts, run is executed automatically in the child.

from multiprocessing import Process
import os


class Downloader(Process):

    # def __init__(self,url,size,name):
    #     super().__init__()
    #     self.url = url
    #     self.size = size
    #     self.name = name

    def run(self):
        # run executes automatically in the child process
        print(os.getpid())

if __name__ == '__main__':
    m = Downloader()
    m.start()
    print("parent over", os.getpid())

If you need a highly customized process object, you can subclass Process like this.

Memory is isolated between processes

from multiprocessing import Process
import os,time

a = 257


def task():
    global a
    # print("2",a,id(a))
    a = 200
     

if __name__ == '__main__':
    p = Process(target=task)
    p.start()  # send the start instruction to the OS

    time.sleep(4)
    print(a)  # still prints 257: the child's a = 200 happened in its own copy of memory

The join method

from multiprocessing import Process
import time
def task1(name):
    for i in range(10000):
        print("%s run" % name)

def task2(name):
    for i in range(100):
        print("%s run" % name)

if __name__ == '__main__':  # args: the arguments passed to the child process; must be a tuple
    p1 = Process(target=task1, args=("p1",))
    p1.start()  # send the start instruction to the OS
    # p1.join() # joining here would make the parent wait for p1 before starting p2, serialising the tasks

    p2 = Process(target=task2, args=("p2",))
    p2.start()  # send the start instruction to the OS

    p2.join()  # make the parent wait until this child has finished before continuing
    p1.join()


    # Goal: the two children must run concurrently, and "over" must print
    # only after all tasks have finished
    print("over")

Example:

# Using join
from multiprocessing import Process
import time
def task1(name):
    for i in range(10):
        print("%s run" % name)


if __name__ == '__main__':  # args: the arguments passed to the child process; must be a tuple


    ps = []
    for i in range(10):
        p = Process(target=task1,args=(i,))
        p.start()
        ps.append(p)

    # join each child in turn
    for i in ps:
        i.join()

    print("over")

2. Zombie and orphan processes

Orphan process: when the parent process has ended while the child process is still running, the child is called an orphan process. Orphans can legitimately exist when needed and have no adverse effects.

Zombie process: when a process has ended but some of its data still remains, it is called a zombie process.

Linux has a mechanism that lets the parent obtain some of a child's data at any time.

After a child's task finishes, the process is certainly over, but it still retains some data (such as its exit status) precisely so that the parent process can obtain this information.

On Linux, the parent calls waitpid to completely remove the child's residual information (reaping the zombie).

Python's multiprocessing module already encapsulates this zombie-reaping handling, so you normally do not need to worry about it.


Origin www.cnblogs.com/chuwanliu/p/11121734.html