Multiprocessing AsyncResult.get() hangs in Python 3.7.2 but not in 3.6

I think this is a regression in Python 3.7.2, as described here. It seems to only affect users running in a virtualenv. For the time being you can work around it by doing what's described in this comment on the bug thread:

```python
import _winapi
import multiprocessing.spawn

# Point multiprocessing at the real interpreter executable.
multiprocessing.spawn.set_executable(_winapi.GetModuleFileName(0))
```

That will force the subprocesses to spawn using …

Python Multiprocessing error: AttributeError: module '__main__' has no attribute '__spec__'

The problem is not with the code or with Python 3.6; it is with Spyder. After some investigation I found that the code runs fine when executed in an external system terminal, but not when run in Spyder's IPython console. I was able to dump the contents of spec and assign them to a variable that …
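Since the excerpt above is truncated, here is a minimal sketch of the workaround that is commonly cited for this Spyder issue (the `__spec__ = None` assignment and the `square` helper are assumptions, not the original answer's exact code):

```python
import multiprocessing
import sys

def square(x):
    return x * x

def main():
    # Assumed workaround: give __main__ an explicit __spec__ so that
    # multiprocessing's bootstrap does not fail with AttributeError
    # under Spyder's IPython console.
    sys.modules["__main__"].__spec__ = None
    with multiprocessing.Pool(2) as pool:
        return pool.map(square, range(4))

if __name__ == "__main__":
    print(main())
```

Running the script from an external terminal works either way; the assignment only matters inside environments that strip `__spec__` from `__main__`.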

multiprocessing pool example does not work and freezes the kernel

This happens because you didn't protect the "procedural" part of your code from re-execution when your child processes import f. They need to import f, because Windows doesn't support fork as a start method for new processes (only spawn). A new Python process has to be started from scratch, f imported, and this import will …
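A minimal guarded version of such a pool example might look like this (the function name `f` matches the excerpt; the rest is an illustrative sketch):

```python
import multiprocessing

def f(x):
    return x * x

def main():
    # The __main__ guard below keeps this "procedural" part from
    # re-running when child processes re-import the module under
    # the spawn start method (the only one available on Windows).
    with multiprocessing.Pool(4) as pool:
        return pool.map(f, range(10))

if __name__ == "__main__":
    print(main())
```

Without the guard, each spawned child would re-execute the pool creation on import and the program would hang or error out.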

What’s the difference between ThreadPool vs Pool in the multiprocessing module?

The multiprocessing.pool.ThreadPool behaves the same as the multiprocessing.Pool, with the only difference that it uses threads instead of processes to run the workers' logic. The reason you see hi outside of main() being printed multiple times with the multiprocessing.Pool is due to the fact that the pool will spawn 5 independent processes. Each process will initialize …
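The API equivalence can be seen in a short sketch (illustrative code, assuming a trivial `f`): ThreadPool is a drop-in replacement for Pool, but because the workers are threads of the current process, the module is never re-imported and no extra interpreter processes start.

```python
from multiprocessing.pool import ThreadPool

def f(x):
    return x * x

def main():
    # Same map() interface as multiprocessing.Pool, but backed by
    # threads in this process rather than by child processes.
    with ThreadPool(5) as pool:
        return pool.map(f, range(5))

if __name__ == "__main__":
    print(main())
```

With multiprocessing.Pool here, any top-level print would run once per spawned worker process; with ThreadPool it runs only once.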

How to get the return value of a function passed to multiprocessing.Process?

Use a shared variable to communicate. For example like this:

```python
import multiprocessing

def worker(procnum, return_dict):
    """worker function"""
    print(str(procnum) + " represent!")
    return_dict[procnum] = procnum

if __name__ == "__main__":
    # A Manager dict is shared across processes, so workers can
    # write their results into it.
    manager = multiprocessing.Manager()
    return_dict = manager.dict()
    jobs = []
    for i in range(5):
        p = multiprocessing.Process(target=worker, args=(i, return_dict))
        jobs.append(p)
        p.start()
    for proc in jobs:
        proc.join()
    print(return_dict.values())
```

Processes stuck in loop with PyInstaller-executable

You need to use multiprocessing.freeze_support() when you produce a Windows executable with PyInstaller. Straight from the docs:

multiprocessing.freeze_support()
Add support for when a program which uses multiprocessing has been frozen to produce a Windows executable. (Has been tested with py2exe, PyInstaller and cx_Freeze.)

One needs to call this function straight after the if __name__ == "__main__" …
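In script form that placement looks like the sketch below (the `work` function is illustrative). Outside a frozen executable, freeze_support() is a no-op, so it is safe to leave in during normal development:

```python
import multiprocessing

def work(x):
    return x + 1

def main():
    # Must be the first call after entering the __main__ guard in a
    # frozen Windows executable; harmless everywhere else.
    multiprocessing.freeze_support()
    with multiprocessing.Pool(2) as pool:
        return pool.map(work, range(3))

if __name__ == "__main__":
    print(main())
```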

Using stdin in a child Process

I solved a similar issue by passing the original stdin file descriptor to the child process and re-opening it there:

```python
import os
import sys
from multiprocessing import Queue

def sub_proc(q, fileno):
    sys.stdin = os.fdopen(fileno)  # reopen stdin in this process
    some_str = ""
    while True:
        some_str = raw_input("> ")  # use input() on Python 3
        if some_str.lower() == "quit":
            return
        q.put_nowait(some_str)

if __name__ == "__main__":
    q = Queue()
    fn = sys.stdin.fileno()
```

…

How to change the serialization method used by the multiprocessing module?

I believe the patch you're referring to works if you're using a multiprocessing "context" object. Using your pickle2reducer.py, your client should start with:

```python
import pickle2reducer
import multiprocessing as mp

ctx = mp.get_context()
ctx.reducer = pickle2reducer.Pickle2Reducer()
```

And ctx has the same API as multiprocessing. Hope that helps!
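The pickle2reducer.py file itself is not shown in the excerpt, so here is a hedged sketch of what such a reducer module might contain: a ForkingPickler subclass that pins the pickle protocol (protocol 2 in this assumed example) and a reducer class exposing it. The names `ForkingPickler2`, `dump2`, and `Pickle2Reducer` are illustrative:

```python
import pickle
from multiprocessing.reduction import AbstractReducer, ForkingPickler


class ForkingPickler2(ForkingPickler):
    @classmethod
    def dumps(cls, obj, protocol=None):
        # Ignore the requested protocol and always serialize with protocol 2.
        return ForkingPickler.dumps(obj, 2)


def dump2(obj, file, protocol=None):
    # Module-level helper mirroring multiprocessing.reduction.dump().
    ForkingPickler2(file, 2).dump(obj)


class Pickle2Reducer(AbstractReducer):
    # Plug the customized pickler into the reducer interface that
    # multiprocessing contexts consult for serialization.
    ForkingPickler = ForkingPickler2
    register = ForkingPickler2.register
    dump = staticmethod(dump2)
```

Assigning an instance of this class to ctx.reducer, as shown above, makes pools and processes created from that context use the customized pickler.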