A brief history of computer development

Multiprogramming (multi-channel technology)

1. Multiplexing in space: multiple programs share one set of computer hardware

2. Multiplexing in time: switching + saving state

  1. When a program runs into an IO operation, the operating system takes the CPU away from it (this improves CPU utilization without hurting the program's performance)

  2. When a program occupies the CPU for too long, the operating system also takes the CPU away from it (this lowers the program's performance)

  Concurrency: tasks look like they are running at the same time
  Parallelism: tasks are truly running at the same time
        a single-core computer cannot achieve parallelism, but it can achieve concurrency

Synchronous / asynchronous: describes how a task is submitted (a short sketch follows below)
        synchronous: after submitting a task, wait in place for its result and do nothing else until it returns (the program looks stuck while waiting)
        asynchronous: after submitting a task, do not wait in place; move straight on to the next line of code (the result, if it is needed, is obtained through some other means)

Blocking / non-blocking: describes the running state of a program
        blocking: the blocked state
        non-blocking: the ready state or the running state
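
A minimal sketch of the synchronous/asynchronous difference, using the multiprocessing module that the rest of these notes cover (the task function and its one-second sleep are made-up values for illustration): a synchronous submission calls the task and waits in place for it to finish, while an asynchronous submission starts the task in another process and moves straight on.

from multiprocessing import Process
import time

def task():
    time.sleep(1)
    print('task done')

if __name__ == '__main__':
    # synchronous: call the task directly and wait in place for it to finish
    task()
    print('this line runs only after task() has returned')

    # asynchronous: hand the task to another process and move on immediately
    p = Process(target=task)
    p.start()
    print('this line runs right away, without waiting for the child')
    p.join()  # only used here so the example exits cleanly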

Create a process in two ways

# When a process is created (on Windows), the child re-imports the module, so the module's code is executed again from top to bottom; that is why the Process must be created inside if __name__ == '__main__':
The first way
from multiprocessing import Process
import time

def test(name):
    print('%s' % name)
    time.sleep(1)
    print('hi')

if __name__ == '__main__':  # the Process must be created inside if __name__ == '__main__':
    p = Process(target=test, args=('egon',))  # create a process object
    p.start()  # tell the operating system to create the process
    time.sleep(1)
    print('hello')
The second way
from multiprocessing import Process
import time
class MyClass(Process):
    def __init__(self,name):
        super().__init__()
        self.name = name  # reuses the name attribute that Process already provides
    def run(self):  # start() runs this method in the newly created child process
        print('%s'%self.name)
        time.sleep(1)
        print('hi')
if __name__ == '__main__':
    p = MyClass('egon')
    p.start()  # tell the operating system to create the process; it then calls p.run()
    print('gun')

The join method: make the main process wait for a child process to finish before continuing

from multiprocessing import Process
import time
def test(name,i):
    time.sleep(i)
    print('hello')
if __name__ == '__main__':

    # p_list = []
    # for i in range(3):
    #     p = Process(target=test,args=('process',i))
    #     p.start()
    #     p_list.append(p)
    # for p in p_list:
    #     p.join()
    # print('gun')

    p = Process(target=test,args=('egon',1))
    p1 = Process(target=test,args=('jason',2))
    start_time = time.time()
    p.start()
    p1.start()
    p.join()   # the main process waits for p to finish
    p1.join()  # the main process waits for p1 to finish
    print('hi')
    print(time.time()-start_time)  # roughly 2 seconds: the two children run concurrently

Other methods of process objects

from multiprocessing import Process,current_process
import os
import time
def test(name):
    print('%s'%name,current_process().pid,'child pid %s'%os.getpid(),'parent pid %s'%os.getppid())
    time.sleep(2)
    print('hello')
if __name__ == '__main__':
    p = Process(target=test,args=('egon',))
    p.start()
    time.sleep(1)
    p.terminate()   # kill the child process; essentially asks the operating system to kill it for you
    time.sleep(1)
    print(p.is_alive())   # check whether the process is still alive
    print('gun')

Zombie processes and orphan processes

    There are two ways for a parent process to reclaim a child process's resources:
       1. the join method
       2. the parent process dies normally
    Every process briefly becomes a zombie after it ends, until its resources are reclaimed (a small Linux-only demonstration follows below)

    Orphan process
        the child process is still alive but its parent has died unexpectedly
          on Linux, init acts as an "orphanage": it adopts the child processes of any parent that died unexpectedly
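
A minimal, Linux-only sketch of the zombie state (the task function and the ps invocation are assumptions added for illustration, not part of the notes above): after the child exits and before the parent reclaims it, ps reports the child as Z (defunct); calling join reaps it.

from multiprocessing import Process
import os
import time

def task():
    pass  # the child exits immediately

if __name__ == '__main__':
    p = Process(target=task)
    p.start()
    time.sleep(1)  # the child has exited, but its resources have not been reclaimed yet
    os.system('ps -o pid,stat,comm -p %s' % p.pid)  # STAT column shows Z (zombie / defunct)
    p.join()  # the parent reclaims the child; the zombie entry disappears
    os.system('ps -o pid,stat,comm -p %s' % p.pid)  # the pid is no longer listed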

Daemon processes

from multiprocessing import Process
import time
def test(name):
    print('%s'%name)
    time.sleep(1)
    print('%s'%name)
if __name__ == '__main__':
    p = Process(target=test,args=('egon',))
    p.daemon = True    # mark the process as a daemon; must be set before start()
    p.start()
    time.sleep(1)
    print('bye')  # when the main process ends, the daemon child is killed, so the child's second print typically does not appear

Mutex (mutual exclusion lock)
        When multiple processes operate on the same data at the same time, the data can become corrupted
        In that case the processes must be locked
            a lock turns concurrent execution into serial execution
                which lowers efficiency but improves the safety of the data
            Notes:
                1. do not use locks casually; they can easily cause deadlocks
                2. only lock the part of the code that modifies the shared data; do not lock the whole program

        The lock must be created in the main process and handed to the child processes to use

from multiprocessing import Process,Lock
import time
import json
def search(i):
    with open('data','r',encoding='utf-8')as f:
        data = f.read()
    t_d = json.loads(data)
    print('user %s checked the remaining tickets: %s' % (i, t_d.get('ticket')))
def buy(i):
    with open('data','r',encoding='utf-8')as f:
        data = f.read()
    t_d = json.loads(data)
    time.sleep(1)
    if t_d.get('ticket') > 0:
        t_d['ticket'] -= 1
        with open('data','w',encoding='utf-8')as f:
            json.dump(t_d,f)
        print('grabbed a ticket successfully')
    else:
        print('no tickets left')
def run(i, mutex):
    search(i)
    mutex.acquire()  # grab the lock
    buy(i)
    mutex.release()  # release the lock
if __name__ == '__main__':
    mutex = Lock()   # the lock is created in the main process
    for i in range(10):
        p = Process(target=run,args=(i,mutex))
        p.start()

The data file (a plain text document named data):
{"ticket": 0}

 

Origin www.cnblogs.com/zrh-960906/p/11329407.html