Hi all, I’m experiencing an issue with multiprocessing on the new MacBook Pro with an Apple M1 chip.
I’m running pymc3 under emulation through Rosetta 2, and I’m trying to run NUTS with 4 parallel chains:
with model:
    trace = pm.sample(5000, tune=1000, chains=4)
but it returns the following error:
Traceback (most recent call last):
  File "/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.8/lib/python3.8/multiprocessing/forkserver.py", line 278, in main
    code = _serve_one(child_r, fds,
  File "/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.8/lib/python3.8/multiprocessing/forkserver.py", line 317, in _serve_one
    code = spawn._main(child_r, parent_sentinel)
  File "/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.8/lib/python3.8/multiprocessing/spawn.py", line 125, in _main
    prepare(preparation_data)
  File "/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.8/lib/python3.8/multiprocessing/spawn.py", line 236, in prepare
    _fixup_main_from_path(data['init_main_from_path'])
  File "/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.8/lib/python3.8/multiprocessing/spawn.py", line 287, in _fixup_main_from_path
    main_content = runpy.run_path(main_path,
  File "/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.8/lib/python3.8/runpy.py", line 262, in run_path
    code, fname = _get_code_from_file(run_name, path_name)
  File "/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.8/lib/python3.8/runpy.py", line 232, in _get_code_from_file
    with io.open_code(fname) as f:
FileNotFoundError: [Errno 2] No such file or directory: '/Users/benedetto/PycharmProjects/BayesianProject/<input>'
Traceback (most recent call last):
  File "/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.8/lib/python3.8/code.py", line 90, in runcode
    exec(code, self.locals)
  File "<input>", line 2, in <module>
  File "/Users/benedetto/PycharmProjects/BayesianProject/venv/lib/python3.8/site-packages/pymc3/sampling.py", line 559, in sample
    trace = _mp_sample(**sample_args, **parallel_args)
  File "/Users/benedetto/PycharmProjects/BayesianProject/venv/lib/python3.8/site-packages/pymc3/sampling.py", line 1461, in _mp_sample
    sampler = ps.ParallelSampler(
  File "/Users/benedetto/PycharmProjects/BayesianProject/venv/lib/python3.8/site-packages/pymc3/parallel_sampling.py", line 431, in __init__
    self._samplers = [
  File "/Users/benedetto/PycharmProjects/BayesianProject/venv/lib/python3.8/site-packages/pymc3/parallel_sampling.py", line 432, in <listcomp>
    ProcessAdapter(
  File "/Users/benedetto/PycharmProjects/BayesianProject/venv/lib/python3.8/site-packages/pymc3/parallel_sampling.py", line 292, in __init__
    self._process.start()
  File "/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.8/lib/python3.8/multiprocessing/process.py", line 121, in start
    self._popen = self._Popen(self)
  File "/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.8/lib/python3.8/multiprocessing/context.py", line 290, in _Popen
    return Popen(process_obj)
  File "/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.8/lib/python3.8/multiprocessing/popen_forkserver.py", line 35, in __init__
    super().__init__(process_obj)
  File "/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.8/lib/python3.8/multiprocessing/popen_fork.py", line 19, in __init__
    self._launch(process_obj)
  File "/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.8/lib/python3.8/multiprocessing/popen_forkserver.py", line 58, in _launch
    f.write(buf.getbuffer())
BrokenPipeError: [Errno 32] Broken pipe
On the other hand, running NUTS without parallelisation works smoothly, with no errors.
I read that some other people have had problems with pymc3 multiprocessing and M1 chips, so I was wondering whether there is any official update or workaround for this.
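For what it’s worth, the first traceback suggests the worker processes are trying to re-import the main module (the forkserver/spawn start methods do this), which fails because I was running the code from an interactive console (`<input>` is not a real file). A workaround I’ve seen suggested, though I haven’t confirmed it fixes things on the M1, is to force the "fork" start method and run the code as a script behind a `__main__` guard. A minimal sketch with only the standard library (`square` is a placeholder for the per-process work, e.g. one chain; the real code would call `pm.sample` instead):

```python
import multiprocessing as mp


def square(x):
    # Placeholder for the work done in each child process (e.g. one NUTS chain).
    return x * x


if __name__ == "__main__":
    # On macOS the default start methods ("spawn"/"forkserver") re-import
    # __main__ in each child, which breaks when the parent is an interactive
    # console. "fork" inherits the parent's memory instead, so nothing is
    # re-imported. (Caveat: "fork" can be fragile with some threaded
    # libraries on macOS.)
    ctx = mp.get_context("fork")
    with ctx.Pool(4) as pool:
        print(pool.map(square, [1, 2, 3, 4]))  # [1, 4, 9, 16]
```

Has anyone tried this (or simply `pm.sample(..., cores=1)` to sidestep multiprocessing entirely) on an M1 machine?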
Thanks a lot in advance!