python - Why is multiprocessing's apply_async so picky?




This sample code works without issue:

from multiprocessing import *
import time
import random

def myfunc(d):
    a = random.randint(0, 1000)
    d[a] = a
    print("Process; %s" % a)

print("Starting mass threads")
man = Manager()
d = man.dict()
p = Pool(processes=8)
for i in range(0, 100):
    p.apply_async(myfunc, [d])
p.close()
p.join()
print(d)
print("Ending multiprocessing")

If I alter p.apply_async(myfunc, [d]) to p.apply_async(myfunc, (d)) or p.apply_async(myfunc, d), the pool does not work at all. If I add another argument to myfunc and pass in None, it works: p.apply_async(myfunc, (None, d)) — why?

The documentation for apply_async refers you to apply, which says the following:

apply(func[, args[, kwds]])

Call func with arguments args and keyword arguments kwds. It blocks until the result is ready. Given this blocks, apply_async() is better suited for performing work in parallel. Additionally, func is only executed in one of the workers of the pool.

Thus, instead of taking star and double-star arguments, apply_async takes the positional arguments and keyword arguments to be passed to the target function as its second and third arguments; the second must be an iterable and the third a mapping, respectively.
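As a minimal sketch of that calling convention (greet is a made-up function, not from the question; the args iterable and kwds mapping are forwarded to it inside a worker):

```python
from multiprocessing import Pool

def greet(name, punctuation="!"):
    # the positional argument comes from args, the keyword argument from kwds
    return "hello %s%s" % (name, punctuation)

if __name__ == "__main__":
    with Pool(processes=2) as p:
        # args must be an iterable, kwds must be a mapping
        r = p.apply_async(greet, ("world",), {"punctuation": "?"})
        print(r.get())  # prints: hello world?
```

Note the trailing comma in ("world",) — exactly the detail the rest of this answer turns on.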

Notice that since apply_async works asynchronously, you won't see any exceptions unless you .wait or .get them from the results;

You can try simply:

for i in range(0, 100):
    result = p.apply_async(myfunc, d)
    print(result.get())

In the code above, result.get() waits for the completion of each task and returns its return value - or tries to, but fails, because the managed dictionary cannot be used as the positional arguments:

Traceback (most recent call last):
  File "test.py", line 21, in <module>
    print(result.get())
  File "/usr/lib/pythonN.N/multiprocessing/pool.py", line 558, in get
    raise self._value
KeyError: 0
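That KeyError: 0 can be reproduced without a pool at all. Star-unpacking calls iter() on the args object, and for an object that defines only __getitem__, Python falls back to the old sequence protocol and probes index 0 — which a mapping-like object answers with a KeyError. FakeProxy below is a hypothetical stand-in for the managed dictionary, used only to illustrate the mechanism:

```python
class FakeProxy:
    """Mapping-like object with __getitem__ but no __iter__."""
    def __init__(self, data):
        self.data = data

    def __getitem__(self, key):
        return self.data[key]   # a missing key raises KeyError

def myfunc(d):
    return d

try:
    myfunc(*FakeProxy({"a": 1}))        # unpacking probes index 0 first
except KeyError as exc:
    print("KeyError:", exc.args[0])     # prints: KeyError: 0
```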

Thus, looking at the original question: note that [d] is a list of length 1; (d) is exactly the same as d; to get a tuple of length 1 you need to type (d,). From the Python 3 tutorial, section 5.3:

A special problem is the construction of tuples containing 0 or 1 items: the syntax has some extra quirks to accommodate these. Empty tuples are constructed by an empty pair of parentheses; a tuple with one item is constructed by following a value with a comma (it is not sufficient to enclose a single value in parentheses). Ugly, but effective. For example:

>>> empty = ()
>>> singleton = 'hello',    # <-- note trailing comma
>>> len(empty)
0
>>> len(singleton)
1
>>> singleton
('hello',)

(d,), [d], {d}, iter(frozenset(d)) or {d: True} would all work nicely as the positional arguments; each of these results in an iterable or iterator that yields exactly 1 value - that of d. On the other hand, if you had passed some other kind of value than the unfortunate managed dictionary, you would have gotten a much more usable error; say, if the value had been 42, you'd have got:

TypeError: myfunc() argument after * must be a sequence, not int
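The same failure can be seen locally by star-unpacking a plain int; the exact TypeError wording varies between Python versions, so the message may read slightly differently on yours:

```python
def myfunc(d):
    return d

try:
    myfunc(*42)   # 42 is not iterable, so unpacking fails immediately
except TypeError as exc:
    print(type(exc).__name__, "-", exc)
```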

python multithreading multiprocessing python-multithreading gil
