Review check-in -- 0821: Multithreading

Concurrency and parallelism

Concurrency: the number of tasks is greater than the number of CPU cores. The task scheduler switches back and forth between tasks so that multiple tasks appear to run "together". They are not really running at the same time; the switching is just fast enough that they look simultaneous.

Parallelism: the number of tasks is less than or equal to the number of CPU cores, so the tasks really do run at the same time.

Synchronous: when code performs an IO operation, it must wait for the IO operation to finish before it can return to the caller; there is only one line of execution.

Asynchronous: when code performs an IO operation, it does not wait for the IO operation to finish before returning to the caller; there are multiple lines of execution.

from threading import Thread
import time

def timer(fun):
    """Decorator that reports how long the decorated function takes."""
    def wrapper(*args, **kwargs):
        time1 = time.time()
        fun(*args, **kwargs)
        time2 = time.time()
        print("function run time: {}".format(time2 - time1))
        return time2 - time1
    return wrapper

def work1():
    for i in range(6):
        time.sleep(1)
        print(f'watering, round {i}')

def work2(name):
    for i in range(5):
        time.sleep(1)
        print(f'{name} checks in, round {i}')

# Synchronous execution: takes a little over 11s
@timer
def main():
    work1()
    work2('musen')

# Asynchronous execution: takes only a little over 6s
@timer
def main2():
    # pass arguments to the thread's target function via args; the keyword form
    # Thread(target=work2, kwargs={'name': 'musen'}) works just as well
    t1 = Thread(target=work2, args=('musen',))
    # start() readies and runs the new thread; it is not guaranteed to execute
    # after the main thread's following statements and may even run before them
    t1.start()
    work1()

main()
print("asynchronous task execution")
main2()

Notes on the threading module:

Create a thread object: t1 = threading.Thread(target=func), where func is the function the thread will run.

Common methods and attributes of the Thread class:

1. start(): start the thread's activity.
2. run(): this method represents the thread's activity and can be overridden in a subclass.
3. join(timeout=None): make the calling (main) thread wait until the child thread finishes before continuing. A timeout parameter can be set to avoid waiting forever. Because the two threads then finish in order, as if they were a single thread, this is also described as merging threads.
A timeout is set by passing a parameter to join; once the specified time is exceeded, join stops blocking. In practice this does not mean that every thread finishes within the timeout: the joins run in order, and each join's timeout is counted from the moment the previous join returns. For example, with a timeout of 2s, if the previous thread has not finished, the next thread's join only starts counting its own 2s from the time the previous join ends (see the sketch after this list).
4. name: the thread's name.
5. getName(): return the thread's name.
6. setName(): set the thread's name.
7. ident: the thread's identifier.
8. is_alive(): return whether the thread is alive.
9. daemon: Boolean value indicating whether this thread is a daemon thread.
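
The timeout behaviour of join described in point 3 can be seen with a rough sketch like the one below (my own example, not from the original post): two threads each sleep for 5 seconds, and each join(timeout=2) only starts counting after the previous join has returned.

import time
from threading import Thread

def slow():
    time.sleep(5)

t1 = Thread(target=slow)
t2 = Thread(target=slow)
t1.start()
t2.start()

start = time.time()
t1.join(timeout=2)   # blocks for at most 2 seconds
t2.join(timeout=2)   # its own 2 seconds start only after the first join returns
print(f"waited {time.time() - start:.1f}s")   # roughly 4s in total
print(t1.is_alive(), t2.is_alive())           # both threads are still alive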

threading.active_count(): returns the number of Thread objects currently alive; this equals the length of the list returned by threading.enumerate().

threading.current_thread(): returns the current Thread object.

threading.enumerate(): returns a list of all Thread objects currently alive.

threading.main_thread(): returns the main Thread object. Under normal circumstances, the main thread is the thread from which the Python interpreter was started.
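
A small sketch (my own, for illustration) that exercises the module-level helpers listed above:

import time
import threading

def worker():
    time.sleep(1)
    print("running in:", threading.current_thread().name)

threads = [threading.Thread(target=worker, name=f"worker-{i}") for i in range(3)]
for t in threads:
    t.start()

print(threading.active_count())        # 4: the main thread plus 3 workers
print(len(threading.enumerate()))      # the same count, as a list of Thread objects
print(threading.main_thread() is threading.current_thread())   # True in the main thread

for t in threads:
    t.join()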

# Defining a thread class
import time
import requests
from threading import Thread

def timer(fun):
    def wrapper(*args, **kwargs):
        time1 = time.time()
        fun(*args, **kwargs)
        time2 = time.time()
        print("function run time: {}".format(time2 - time1))
        return time2 - time1
    return wrapper

class MyTread(Thread):
    def __init__(self, url):
        super().__init__()
        self.url = url

    def run(self):
        headers = {"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/75.0.3770.80 Safari/537.36"}
        for i in range(100):
            res = requests.get(url=self.url, headers=headers)
            print(res)

@timer
def main():
    t_list = []
    for i in range(10):
        t = MyTread('https://www.baidu.com')
        t.start()
        # calling t.join() here would make the code effectively single-threaded:
        # each child thread would have to finish before the next one is created
        t_list.append(t)
    # traverse all child threads and wait for them, so the main thread
    # only continues after every child thread has finished
    for j in t_list:
        j.join()

if __name__ == "__main__":
    main()

Threads in the same process share global variables (they use the same block of memory), so modifying them creates competition for resources. In Python, multithreading within one process cannot achieve real parallelism, only concurrency: execution switches back and forth between the threads.

Which operations cause a thread switch?

1. Time-consuming IO operations: network access, file access, and other time-consuming IO operations automatically trigger a thread switch;

2. A thread switch also happens when a thread's execution time reaches a certain threshold (a sketch of how to inspect that threshold follows below).
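
In Python 3 that threshold is the interpreter's switch interval, which can be inspected and tuned through the sys module; a tiny sketch (my own, not from the original) is shown here:

import sys

print(sys.getswitchinterval())    # 0.005 seconds by default
sys.setswitchinterval(0.01)       # ask the interpreter to switch threads less often
print(sys.getswitchinterval())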

Workarounds for the 1,000,000-increment bug (two threads each increment a shared counter a million times; without protection the final value comes out wrong):

1. Handle it with a lock;

2. Handle it with a queue (see the sketch after the lock example below).

from threading import Thread, Lock

num = 0
meta = Lock()   # create a lock

def work1():
    global num
    for i in range(1000000):
        meta.acquire()   # lock
        num += 1
        meta.release()   # unlock

def work2():
    global num
    for i in range(1000000):
        meta.acquire()   # lock
        num += 1
        meta.release()   # unlock

def main():
    t1 = Thread(target=work1)
    t2 = Thread(target=work2)
    t1.start()
    t2.start()
    t1.join()
    t2.join()
    print(num)   # 2000000, as expected

if __name__ == '__main__':
    main()
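
The lock-based fix is shown above. For the second idea, here is a rough sketch (my own, not from the original post) that pushes every increment onto a thread-safe queue.Queue and lets a single consumer apply them, so only one thread ever touches the shared counter:

from queue import Queue
from threading import Thread

num = 0
q = Queue()

def work(times):
    for _ in range(times):
        q.put(1)            # each increment is sent as a message

def consume():
    global num
    while True:
        item = q.get()
        if item is None:    # sentinel: no more increments are coming
            break
        num += item

producers = [Thread(target=work, args=(1000000,)) for _ in range(2)]
consumer = Thread(target=consume)
consumer.start()
for p in producers:
    p.start()
for p in producers:
    p.join()
q.put(None)                 # tell the consumer to stop
consumer.join()
print(num)                  # 2000000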

GIL Global Interpreter Lock:

IO-intensive tasks: low CPU usage; most of the time is spent waiting on IO operations. These are suited to multithreading.

CPU-intensive tasks: high CPU usage; they need a lot of computation. For these, multithreading brings no benefit and a single thread does just as well.

Some notes about the GIL:

1. The GIL is not part of the Python language itself; it exists for historical reasons in the CPython virtual machine (interpreter), where it is difficult to remove;

2. GIL, the global interpreter lock: every thread must acquire the GIL before it executes, which guarantees that only one thread executes code at a time;

3. When a thread releases the GIL: before IO and other operations that may block in a system call, a thread temporarily releases the GIL and must re-acquire it once the operation completes; in Python 3 the GIL is also released when a timer shows the thread's execution time has reached a threshold (in Python 2, when a tick counter reached 100);

4. Python can use multiple processes to exploit multi-core CPU resources (a sketch follows below).
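
A minimal sketch of point 4 (my own, assuming a simple CPU-bound countdown function): multiprocessing spreads the work over several cores, each process with its own GIL.

from multiprocessing import Pool

def count_down(n):
    # purely CPU-bound work: no IO, just computation
    while n > 0:
        n -= 1
    return n

if __name__ == '__main__':
    with Pool(processes=4) as pool:
        # each task runs in its own process, on its own core, with its own GIL
        results = pool.map(count_down, [10_000_000] * 4)
    print(results)   # [0, 0, 0, 0]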

Deadlock:

With multiple threads, a mutex can ensure that a resource is held exclusively by one thread at a time. But once the program becomes more complicated, the following can happen: thread A locks resource A and later also needs resource B, only releasing resource A when it is finished; meanwhile thread B has locked resource B and later needs resource A, only releasing resource B when it is finished. If thread A and thread B run at the same time, thread A may end up waiting for thread B to release resource B while thread B is waiting for thread A to release resource A. This is a deadlock.

Solutions: avoid deadlock when designing the program; or add a wait timeout so the program can detect whether a deadlock has occurred; the banker's algorithm can also be used to avoid deadlock.
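
The two-lock situation above, together with the timeout-based detection just mentioned, can be sketched roughly like this (my own example; Lock.acquire accepts a timeout and returns False instead of blocking forever):

import time
from threading import Thread, Lock

lock_a = Lock()
lock_b = Lock()

def thread_a():
    with lock_a:
        time.sleep(0.1)                    # give thread B time to grab lock_b
        if lock_b.acquire(timeout=2):      # would block forever without a timeout
            lock_b.release()
        else:
            print("thread A gave up waiting for resource B (possible deadlock)")

def thread_b():
    with lock_b:
        time.sleep(0.1)
        if lock_a.acquire(timeout=2):
            lock_a.release()
        else:
            print("thread B gave up waiting for resource A (possible deadlock)")

t1, t2 = Thread(target=thread_a), Thread(target=thread_b)
t1.start(); t2.start()
t1.join(); t2.join()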

Banker's algorithm: the idea behind the banker's algorithm is this. Suppose a bank has 10 yuan and three customers want loans: A wants to borrow 9 yuan, B wants 3 yuan, and C wants 8 yuan. The bank clearly cannot satisfy everyone at once; this is the situation the banker's algorithm was born to handle.

To keep all of its customers while making sure it never runs out of money, the bank lends in batches: it lends 2 yuan to A, 2 yuan to B, and 4 yuan to C, keeping 2 yuan. At this point B only needs 1 more yuan to be satisfied, so the bank lends him 1 yuan and keeps 1. When B is done he returns his 3 yuan, the bank now has 4 yuan and lends them to C, and once C is satisfied he repays his 8 yuan, after which the bank lends 7 of those yuan to A. In this way the needs of all three customers are met dynamically.

In a program, the banker's algorithm simulates this lending process: the operating system allocates resources to threads dynamically, and before each allocation it checks whether granting the request would put the system into an unsafe state. If it would, the resources are not allocated and the thread is made to wait.
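
A compact sketch (my own, not from the original post) of the safety check at the heart of the banker's algorithm, applied to the single-resource bank story above; all function and variable names here are made up for illustration:

def is_safe(available, allocated, maximum):
    """Is there an order in which every customer can still be fully satisfied?"""
    need = [m - a for m, a in zip(maximum, allocated)]
    finished = [False] * len(allocated)
    while True:
        progressed = False
        for i, done in enumerate(finished):
            if not done and need[i] <= available:
                available += allocated[i]   # customer i can finish and repays everything
                finished[i] = True
                progressed = True
        if not progressed:
            return all(finished)

def request(available, allocated, maximum, i, amount):
    """Grant `amount` to customer i only if the resulting state is still safe."""
    if amount > available or amount > maximum[i] - allocated[i]:
        return False
    allocated = allocated[:]            # work on a copy of the allocation table
    allocated[i] += amount
    return is_safe(available - amount, allocated, maximum)

# the state from the story above: the bank lent A 2, B 2, C 4, keeping 2 yuan
maximum   = [9, 3, 8]
allocated = [2, 2, 4]
print(request(2, allocated, maximum, 1, 1))  # True: lend B his last yuan, he can finish and repay
print(request(2, allocated, maximum, 0, 2))  # False: lending A 2 more leaves no way for anyone to finish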
