Get results from subprocesses, and close the pool once a result is True

from multiprocessing import Pool
import Queue
import time

def test(p):
    time.sleep(0.001)
    if p == 10000:
        return True
    else:
        return False

if __name__ == "__main__":
    pool = Pool(processes=10)
    q = Queue.Queue()
    for i in xrange(50000):
        # save each AsyncResult into the queue;
        # the pool keeps 10 workers busy, starting the next task as one finishes
        q.put(pool.apply_async(test, args=(i,)))
    while 1:
        if q.get().get():
            pool.terminate()  # kill all workers once one task returns True
            break
    pool.join()

Note:
There are 50000 tasks in total (10 running concurrently). As soon as one task returns True, the whole pool is terminated. Because apply_async is used, the for loop first submits every task, and then a while loop checks the results one by one.

Advantage:
no need to wait for all tasks to finish.

Shortcoming:
when the number of tasks is large, the for loop must submit all of them before any result is checked, which wastes a lot of time.
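One way to avoid submitting every task up front is Pool.imap_unordered, which consumes its input iterable lazily. The sketch below is Python 3; check, find_first, and the threshold 100 are illustrative names for this example, not from the original code:

```python
from multiprocessing import Pool

def check(p):
    # stand-in for the post's test(): True for exactly one input
    return (p, p == 100)

def find_first(n_tasks, processes=4):
    """Return the first input for which check() is True, or None."""
    with Pool(processes=processes) as pool:
        # imap_unordered pulls from range() lazily, so the main
        # process never has to queue all n_tasks in advance
        for p, hit in pool.imap_unordered(check, range(n_tasks), chunksize=16):
            if hit:
                pool.terminate()  # stop the remaining workers early
                return p
    return None

if __name__ == "__main__":
    print(find_first(10000))
```

Results arrive in completion order rather than submission order, which is exactly what you want here: any True result ends the search.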

Multiprocessing + threading

from multiprocessing import Pool
import Queue
import threading
import time

def test(p):
    time.sleep(0.001)
    if p == 10000:
        return True
    else:
        return False

if __name__ == "__main__":
    result = Queue.Queue()
    pool = Pool()

    def pool_th():
        for i in xrange(500000000):
            try:
                result.put(pool.apply_async(test, args=(i,)))
            except Exception:
                # apply_async raises once the pool has been terminated
                break

    def result_th():
        while 1:
            a = result.get().get()
            if a:
                pool.terminate()  # kill all workers once one task returns True
                break

    # one thread keeps feeding tasks into the pool while the
    # other reads the results back out of the queue
    t1 = threading.Thread(target=pool_th)
    t2 = threading.Thread(target=result_th)
    t1.start()
    t2.start()
    t1.join()
    t2.join()

    pool.join()

How it runs:
Two threads are created: pool_th submits tasks to the pool and puts each AsyncResult into a queue, while result_th takes them from the queue and calls get() on each. As soon as a result is True, result_th terminates the pool; pool_th's next apply_async call then raises, ending its loop, so both threads finish.
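The same idea can be expressed without a dedicated result-reading thread by passing apply_async a callback, which the pool invokes from its own internal result-handler thread. A Python 3 sketch, where check, run_until_hit, and the 100 threshold are illustrative assumptions rather than part of the original:

```python
from multiprocessing import Pool
import threading

def check(p):
    # stand-in for the post's test(): True for exactly one input
    return p == 100

def run_until_hit(n_tasks, processes=4):
    """Submit tasks until one returns True, then terminate the pool."""
    found = threading.Event()

    def on_result(hit):
        # runs in the pool's internal result-handler thread
        if hit:
            found.set()

    pool = Pool(processes=processes)
    for i in range(n_tasks):
        if found.is_set():
            break  # stop submitting as soon as a hit is reported
        pool.apply_async(check, args=(i,), callback=on_result)
    found.wait(timeout=30)  # give in-flight tasks time to report back
    pool.terminate()
    pool.join()
    return found.is_set()
```

An Event replaces the result queue: the submission loop checks it between tasks, so there is no need to drain AsyncResult objects by hand.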