Please tell me how to prevent this: if I run the following main with an appropriate argument, the number of processes keeps growing and memory becomes tight.
import multiprocessing

def score4abc(a, b):
    score = a, b
    return score

def wrapper_score4multi(args):
    return score4abc(*args)

def main(a, foo):
    for i in range(500):
        pool = multiprocessing.Pool(32)
        # 32 processes are always in R state, and terminated processes remain in S state.
        pool.map(wrapper_score4multi, [[a, b] for b in foo])
        pool.close()
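For context, a minimal sketch of the pattern I understand is usually recommended (this is my reworking, not my actual code): create the Pool once outside the loop, and call join() after close() so the workers are actually reaped. The arguments to main here are placeholders.

```python
import multiprocessing

def score4abc(a, b):
    score = a, b
    return score

def wrapper_score4multi(args):
    return score4abc(*args)

def main(a, foo):
    # Create the pool once instead of once per iteration,
    # so at most 32 worker processes ever exist.
    pool = multiprocessing.Pool(32)
    results = []
    for i in range(500):
        results.append(pool.map(wrapper_score4multi, [[a, b] for b in foo]))
    pool.close()  # stop accepting new tasks
    pool.join()   # wait for the worker processes to actually exit
    return results

if __name__ == '__main__':
    out = main(1, [2, 3])
    print(out[0])  # each iteration returns [(1, 2), (1, 3)]
```

Whether this also avoids the state problem in my real (Cython-compiled) setting is what I am unsure about.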
To put it plainly, I would be happy if I could just kill the leftover processes with something like a pool.kill() method.
When I checked with ps aux, the processes spawned for parallelization by the code above stayed around long after they should have finished, so I suspect that is the cause. According to the official documentation, Pool.terminate() stops the workers and Pool.close() shuts the pool down, but the processes remain even after Pool.close().

I forget the details, but I couldn't use Process directly because I need to pass multiple arguments to the function being parallelized, so I think I had no practical choice but to go through pool.map with a wrapper as shown above. Also, if I identify the processes and kill them from the shell every minute or so, memory recovers quickly, but the process count climbs again over time (this is too low-level for me to identify the cause myself).

The code above is compiled with Cython and then executed. The Python that calls main is 2.7.3.