Python: how do I keep functions always running in parallel? (multiprocessing)

Asked 2 years ago, Updated 2 years ago, 75 views

Hello, I'm writing Python code. I've searched on my own and gone through a lot of trial and error, but it still doesn't work, so I'm asking here.

from time import sleep
from multiprocessing import Process


def append0():
    global result
    while True:
        result.append('0')


def append1(): 
    global result2

    while True: 
        result2.append('1')

if __name__ == '__main__': 
    result2 = []
    result = []
    result.append('0')
    result2.append('1')
    p1 = Process(target=append0)
    p2 = Process(target=append1)
    p1.start()
    p2.start()
    sleep(3)
    print(result)
    print(result2)
    p1.join()
    p2.join()

What I want is a structure where the main function proceeds as usual, while two other functions are always running in parallel alongside it. So I think I should use multiprocessing.

To keep things simple, I'm testing with the code above. What I want is for append0 and append1 to keep running and continuously append '0' and '1' to result and result2, respectively, while the main function keeps going. So I coded it as above. If it had worked the way I wanted, then while the main function sleeps for 3 seconds, result and result2 should have been filled with '0's and '1's by append0 and append1. But the only thing printed is what was appended at the start of the main function: ['0'] and ['1']. So it seems the append functions aren't actually running.

What should I do?

python multiprocessing

2022-09-22 18:03

1 Answer

Python has a long-standing headache called the GIL (Global Interpreter Lock).

In the past, the GIL was beneficial for performance because CPUs had only one core, but now multi-core is the norm. There were attempts to remove the GIL in Python 3, but it couldn't be removed for various reasons. In the end, a workaround was chosen, and the result was the introduction of the multiprocessing module.

To understand why your code behaves this way, you first need to understand what a process is.

Each process has its own memory space. On a 32-bit operating system, each process gets its own 4 GB address space and operates independently; one process cannot reach into another process's memory.

With that concept in mind, look at the code above. It creates two more processes; that is, the main process has spawned two child processes.

The problem is that the results generated by each child process are not accessible from the main process, because the processes are independent. Each child appends to its own copy of result or result2, so the lists in the main process never change.

Then, how can we send or share variables?

For passing data, you can use IPC with a Queue or a Pipe; for sharing variables, you typically use shared memory (or a Manager).
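For example, here is a minimal sketch of passing data from a child back to the parent with multiprocessing.Queue. The producer function, item count, and sentinel value are just made up for illustration, not taken from your code:

from time import sleep
from multiprocessing import Process, Queue


def producer(q):
    # Runs in a child process; sends items to the parent through the queue.
    for i in range(5):
        q.put(f'item {i}')
        sleep(0.1)
    q.put(None)  # sentinel: tells the parent we are done


if __name__ == '__main__':
    q = Queue()
    p = Process(target=producer, args=(q,))
    p.start()
    # Read items from the child until the sentinel arrives.
    for item in iter(q.get, None):
        print(item)
    p.join()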

https://docs.python.org/3.7/library/multiprocessing.html

Please read the sections "Exchanging objects between processes" and "Sharing state between processes" in the documentation first.
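As one possible way to adapt your example (a sketch, not the only approach), a multiprocessing.Manager list is a shared object that child processes can append to and the parent can read. Note that every append goes through the manager process, so it is much slower than a local list:

from time import sleep
from multiprocessing import Process, Manager


def append0(result):
    # Appends forever to a Manager-backed list shared with the parent.
    while True:
        result.append('0')


def append1(result2):
    while True:
        result2.append('1')


if __name__ == '__main__':
    with Manager() as manager:
        result = manager.list(['0'])
        result2 = manager.list(['1'])
        p1 = Process(target=append0, args=(result,))
        p2 = Process(target=append1, args=(result2,))
        p1.start()
        p2.start()
        sleep(3)
        # Both lists keep growing while the main process sleeps.
        print(len(result), len(result2))
        # The loops never exit, so terminate instead of waiting on join().
        p1.terminate()
        p2.terminate()
        p1.join()
        p2.join()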


2022-09-22 18:03
