Python basic learning 21 ---- process

Because of the Global Interpreter Lock (GIL), multithreading in Python does not give true parallelism for CPU-bound code: only one thread executes Python bytecode at a time. To make full use of a multi-core CPU, in most cases you need the multiprocessing module.
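
A rough illustration (this sketch is not from the original post; the loop size and the number of processes are arbitrary): the same CPU-bound loop finishes noticeably faster on a multi-core machine when split across several processes than when run sequentially, because each process has its own interpreter and its own GIL.

import multiprocessing
import time

def cpu_bound(n):
    # purely CPU-bound busy work
    total = 0
    for i in range(n):
        total += i * i

if __name__ == "__main__":
    start = time.time()
    ps = [multiprocessing.Process(target=cpu_bound, args=(10 ** 7,)) for _ in range(4)]
    for p in ps:
        p.start()
    for p in ps:
        p.join()
    print("4 processes:", time.time() - start)

    start = time.time()
    for _ in range(4):
        cpu_bound(10 ** 7)
    print("sequential:", time.time() - start)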

There are many similarities between processes and threads. For more information about threads, please refer to https://www.cnblogs.com/sfencs-hcy/p/9721362.html

multiprocessing module

1. Process creation

import multiprocessing

def func(msg):
    print(msg)
    print("This is a process")

if __name__ == "__main__":
    # args must be a tuple, hence the trailing comma
    p = multiprocessing.Process(target=func, args=("hello world",))
    p.start()

Creating a process by inheriting from multiprocessing.Process

import multiprocessing

class Myprocessing(multiprocessing.Process):
    def __init__(self, name, age):
        multiprocessing.Process.__init__(self)
        self.name = name
        self.age = age

    def run(self):
        # run() is overridden here, just as it would be with threading.Thread
        print("%s is %d" % (self.name, self.age))

if __name__ == "__main__":
    t = Myprocessing("sfencs", 19)
    t.start()

2. Parallel process

import multiprocessing
import time
class Myprocessing(multiprocessing.Process):
    def __init__(self,name,age,second):
        multiprocessing.Process.__init__(self)
        self.name=name
        self.second=second
        self.age=age

    def run(self):
        print(self.name)
        time.sleep(self.second)
        print(self.age)

if __name__ == "__main__":
    time_begin = time.time()

    p1 = Myprocessing("sfencs", 19, 2)
    p2 = Myprocessing("Tom", 25, 5)
    p1.start()
    p2.start()
    # join() blocks until the corresponding child process has finished
    p1.join()
    p2.join()
    time_end = time.time()
    print(time_end - time_begin)
'''
Tom
19
25
5.198107481002808
'''

join works the same as with threads: the caller blocks until the child process finishes. That is why the time printed above is roughly 5 seconds (the longer of the two sleeps) rather than 7: the two processes sleep concurrently.

3. Daemons

A daemon process follows the same principle as a daemon thread: it is terminated automatically when the main process exits. The only difference is how it is set: p.daemon = True, before p.start(). A minimal sketch follows.
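
A minimal sketch (not from the original post, with an arbitrary workload): the child loops forever, but because it is marked as a daemon it is killed as soon as the main process exits.

import multiprocessing
import time

def background():
    while True:
        print("daemon still running...")
        time.sleep(1)

if __name__ == "__main__":
    p = multiprocessing.Process(target=background)
    p.daemon = True  # must be set before start()
    p.start()
    time.sleep(3)
    print("main process exits, the daemon is killed with it")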

4. Lock

Lock works the same as in multithreading. There are two ways to use it: as a context manager (with lock) or by calling acquire() and release() explicitly.

import multiprocessing

def func2(lock, f):
    # acquire the lock as a context manager
    with lock:
        fs = open(f, 'a+')
        fs.write('Lock acquired via with\n')
        fs.close()

def func1(lock, f):
    # acquire and release the lock explicitly
    lock.acquire()
    fs = open(f, 'a+')
    fs.write('Lock acquired directly\n')
    fs.close()
    lock.release()
if __name__=="__main__":
    lock=multiprocessing.Lock()
    f = "file.txt"
    p1=multiprocessing.Process(target=func2,args=(lock,f,))
    p2=multiprocessing.Process(target=func1,args=(lock,f,))
    p1.start()
    p2.start()
    p1.join()
    p2.join()

Unlike with threads, the lock is passed to each worker as an argument, because separate processes do not share ordinary Python objects the way threads share global variables.

5. Semaphore

A semaphore limits how many processes can access a shared resource at the same time.

import multiprocessing
import time

def func(s, i):
    s.acquire()
    print(multiprocessing.current_process().name + " acquire")
    time.sleep(2)
    print(multiprocessing.current_process().name + " release\n")
    s.release()

if __name__ == "__main__":
    # at most two processes can hold the semaphore at once
    s = multiprocessing.Semaphore(2)
    for i in range(5):
        p = multiprocessing.Process(target=func, args=(s, i))
        p.start()

6. Event is the same as the threading Event: a process blocks in wait() until another process calls set(). A minimal sketch follows.
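
A minimal sketch (not from the original post): the child process blocks on e.wait() until the main process calls e.set() two seconds later.

import multiprocessing
import time

def waiter(e):
    print("waiting for the event...")
    e.wait()  # blocks until the event is set
    print("event received, continuing")

if __name__ == "__main__":
    e = multiprocessing.Event()
    p = multiprocessing.Process(target=waiter, args=(e,))
    p.start()
    time.sleep(2)
    e.set()  # wake up the waiting process
    p.join()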

7. Queue

There is a process-safe queue dedicated to multiprocessing: multiprocessing.Queue

import multiprocessing

def writer(q):
    q.put("hello world")

def reader(q):
    # get() blocks until an item is available
    print(q.get())

if __name__ == "__main__":
    q = multiprocessing.Queue()
    pwriter=multiprocessing.Process(target=writer,args=(q,))
    preader = multiprocessing.Process(target=reader, args=(q,))
    pwriter.start()
    preader.start()

8. Pipe

The Pipe() function returns a pair (conn1, conn2) representing the two ends of a pipe. Pipe has a duplex parameter. If duplex is True (the default), the pipe is full duplex: both conn1 and conn2 can send and receive. If duplex is False, conn1 can only receive messages and conn2 can only send them.

import multiprocessing

def sender(conn):
    conn.send("hello world")

def receiver(conn):
    print(conn.recv())

if __name__ == "__main__":
    # with the default duplex=True, either end could send or receive
    conn1, conn2 = multiprocessing.Pipe()
    psender = multiprocessing.Process(target=sender, args=(conn1,))
    preceiver = multiprocessing.Process(target=receiver, args=(conn2,))
    psender.start()
    preceiver.start()

9. Manager

A Manager provides shared data structures (lists, dictionaries, and so on) that several processes can read and write.

import multiprocessing

def func(list1,d,i):
    list1[i]=i
    d["a"]=i

if __name__ == "__main__":
    with multiprocessing.Manager() as manager:
        list1=manager.list(range(5,10))
        d=manager.dict()
        plist=[]
        for i in range(5):
            p=multiprocessing.Process(target=func,args=(list1,d,i))
            plist.append(p)
            p.start()
        for p in plist:
            p.join()
        # each process overwrote one slot, so list1 ends up as [0, 1, 2, 3, 4]
        print(list1)
        # every process set d["a"]; the final value depends on which process ran last
        print(d)

 

To be continued.
