To reduce runtime, I'm trying to run two or more functions simultaneously using multiprocessing, and I learned that if I want to add items to or remove items from a list in the middle of a function, I can share memory between the separate processes.
While looking for a way to do that, I noticed that ProcessPoolExecutor in concurrent.futures looks similar to multiprocessing. When I looked it up, the documentation says that module is built on top of the multiprocessing module.
What I don't understand is why there is a multiprocessing module and then another module that runs on top of it.
Just looking at the examples, I can't see much difference between them. I searched for articles about ProcessPoolExecutor, but there aren't many, so it was hard for me to work out the difference on my own.
Does anyone know what the difference is?
python multiprocessing
This is easier to understand if we go through it in order.
First, you need to understand why Python has to implement concurrency with multiple processes.
Other runtimes and languages, such as the JVM and CLR, usually use threads to implement concurrency: create and run two threads, and they are naturally scheduled onto two cores. Python, however, uses a reference-counting GC model, so the interpreter updates a reference count every time it touches an object. Allowing multiple threads to update those counts at the same time would require a complex implementation and cost performance.
So Python has a mechanism called the GIL: the Python VM allows only one thread to run at a time. This keeps the implementation simple and sacrifices no performance in single-threaded code, but it means only one core of an 8-core CPU can be used. Python was designed in the late 1980s, when multiple cores in a processor were not something to worry about.
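The reference counting mentioned above can be observed directly in CPython with `sys.getrefcount` (a small illustrative sketch; note that `getrefcount` reports one extra reference for its own temporary argument, and the exact numbers are CPython-specific):

```python
import sys

a = []                        # a fresh list with one variable referring to it
before = sys.getrefcount(a)   # includes getrefcount's own temporary reference
b = a                         # create a second reference to the same list
after = sys.getrefcount(a)
print(after == before + 1)    # aliasing added exactly one reference
```

Every one of these count updates would need synchronization if multiple threads ran Python bytecode at once, which is the cost the GIL avoids.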
But now it's the multi-core era, and a language like Go supports concurrency very comfortably with goroutines.
So is there a way to use all the CPU cores in Python? Yes: since OS processes are independent, you can simply run multiple Python interpreters. That's what the multiprocessing module does. But there's a downside. We want a single program that uses all the cores, so the processes have to share variables and data with each other. Unlike threads, however, OS processes are independent address spaces, so one process cannot touch another's memory. With threads you can simply share variables; with processes you can't, because each one lives in a completely separate space.
So processes solve this problem by using shared memory to share variables and data.
However, trying to implement concurrency stably with processes that are independent of each other involves many things to consider, which is why high-level modules like concurrent.futures appeared.
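As a minimal sketch of this, each `Process` really is a separate OS process with its own interpreter, which you can see by having the children report their PIDs back over a `Queue` (the function name `report_pid` is just for illustration):

```python
from multiprocessing import Process, Queue
import os

def report_pid(q):
    # Runs in a separate OS process with its own interpreter and memory
    q.put(os.getpid())

if __name__ == "__main__":
    q = Queue()
    procs = [Process(target=report_pid, args=(q,)) for _ in range(2)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    child_pids = [q.get() for _ in procs]
    print(child_pids)  # two PIDs, both different from os.getpid()
```

The `if __name__ == "__main__":` guard matters here: on platforms that spawn rather than fork, the child re-imports the main module, and the guard prevents it from recursively starting more processes.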
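For example, `multiprocessing.Value` places a single C value in shared memory that both parent and children can see (a minimal sketch; the helper name `increment` is illustrative):

```python
from multiprocessing import Process, Value

def increment(counter):
    # `counter` lives in shared memory; the lock prevents lost updates
    with counter.get_lock():
        counter.value += 1

if __name__ == "__main__":
    counter = Value("i", 0)  # shared C int, initial value 0
    procs = [Process(target=increment, args=(counter,)) for _ in range(4)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    print(counter.value)  # 4
```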
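This is exactly what `ProcessPoolExecutor` gives you: process creation, task distribution, and result passing are all handled for you behind a simple interface (a minimal sketch):

```python
from concurrent.futures import ProcessPoolExecutor

def square(n):
    return n * n

if __name__ == "__main__":
    # The pool hides worker creation, scheduling, and result transfer
    with ProcessPoolExecutor(max_workers=2) as pool:
        print(list(pool.map(square, range(5))))  # [0, 1, 4, 9, 16]
```

Compare this with managing `Process` objects and queues by hand: the high-level module does the same multiprocessing work, but you only write the function and the inputs.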
Unfortunately, Python has clear performance and concurrency limitations that are difficult to solve. Moreover, with languages and environments better suited to the multi-core era continuing to appear, it is hard to be entirely optimistic about Python's future as a development language.
The concurrent.futures module provides a high-level interface for asynchronously executing callables.
multiprocessing is a package that supports spawning processes using an API similar to the threading module.
"To reduce working time, we are trying to run two or more functions simultaneously using multiprocessing."
"To add or exclude a list in the middle of a function..."
"I learned that they share memory through a separate task."
I'm not sure exactly what you want to do, but I'd guess multiprocessing can handle it.
Reference:
https://docs.python.org/3/library/multiprocessing.html#multiprocessing-managers
© 2024 OneMinuteCode. All rights reserved.