Multithreading broken pipeline on custom Env
First of all, thank you for this wonderful project; I can't stress enough how badly baselines was in need of such a project.
Now, the Multiprocessing Tutorial from the stable-baselines documentation states that the following is to be used to generate multiple envs (as an example, of course):
```python
import gym
from stable_baselines.common import set_global_seeds

def make_env(env_id, rank, seed=0):
    """
    Utility function for multiprocessed env.

    :param env_id: (str) the environment ID
    :param rank: (int) index of the subprocess
    :param seed: (int) the initial seed for RNG
    """
    def _init():
        env = gym.make(env_id)
        env.seed(seed + rank)
        return env
    set_global_seeds(seed)
    return _init
```
However, for some obscure reason, Python never seems to call _init: even though it takes no arguments, it is still a function, hence please replace the last line with `return _init()`.
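For context, the factory pattern above is meant to defer the call: SubprocVecEnv stores the returned `_init` callables and invokes each one later, inside its worker processes. A dependency-free sketch of that shape (no gym or stable-baselines here; `DummyEnv` is a made-up stand-in):

```python
class DummyEnv:
    """Hypothetical stand-in for a gym.Env; records the seed it was given."""
    def __init__(self, seed):
        self.seed_value = seed

def make_env(rank, seed=0):
    """Same factory shape as the tutorial: returns a zero-argument callable."""
    def _init():
        return DummyEnv(seed + rank)
    return _init

# SubprocVecEnv keeps the callables and calls each one later, in a worker
# process; calling them here directly shows that _init does run, just deferred.
env_fns = [make_env(rank=i) for i in range(4)]
envs = [fn() for fn in env_fns]
seeds = [env.seed_value for env in envs]
print(seeds)  # -> [0, 1, 2, 3]
```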
Secondly, even after doing so, building the vectorized env with `SubprocVecEnv([make_env(env_id, i) for i in range(numenvs)])` fails with:
```
Traceback (most recent call last):
  File "<ipython-input-4-1379f0286cfd>", line 1, in <module>
    runfile('C:/Users/X/Desktop/thesis.py', wdir='C:/Users/X/Desktop')
  File "D:\Programs\Anaconda3\lib\site-packages\spyder\utils\site\sitecustomize.py", line 705, in runfile
    execfile(filename, namespace)
  File "D:\Programs\Anaconda3\lib\site-packages\spyder\utils\site\sitecustomize.py", line 102, in execfile
    exec(compile(f.read(), filename, 'exec'), namespace)
  File "C:/Users/X/Desktop/thesis.py", line 133, in <module>
    env = SubprocVecEnv([make_env(env_id, i) for i in range(numenvs)])
  File "D:\Programs\Anaconda3\lib\site-packages\stable_baselines\common\vec_env\subproc_vec_env.py", line 52, in __init__
    process.start()
  File "D:\Programs\Anaconda3\lib\multiprocessing\process.py", line 105, in start
    self._popen = self._Popen(self)
  File "D:\Programs\Anaconda3\lib\multiprocessing\context.py", line 223, in _Popen
    return _default_context.get_context().Process._Popen(process_obj)
  File "D:\Programs\Anaconda3\lib\multiprocessing\context.py", line 322, in _Popen
    return Popen(process_obj)
  File "D:\Programs\Anaconda3\lib\multiprocessing\popen_spawn_win32.py", line 65, in __init__
    reduction.dump(process_obj, to_child)
  File "D:\Programs\Anaconda3\lib\multiprocessing\reduction.py", line 60, in dump
    ForkingPickler(file, protocol).dump(obj)
BrokenPipeError: [Errno 32] Broken pipe
```
Any ideas on how to fix this? I have implemented a simple Gym env; does it need to extend/implement SubprocVecEnv?
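Note that the traceback bottoms out in `ForkingPickler(...).dump`: on Windows, multiprocessing spawns workers and pickles everything it sends to them, and the spawn start method also re-imports the main module, so the SubprocVecEnv construction normally needs to sit under an `if __name__ == '__main__':` guard. stable-baselines wraps the env factories in cloudpickle precisely because stdlib pickle cannot handle closures like `_init`; a dependency-free sketch of that limitation (the env id and the factory body are stand-ins, not the real tutorial code):

```python
import pickle

def make_env(env_id, rank, seed=0):
    # Same closure shape as the tutorial's factory; the body is a stand-in.
    def _init():
        return (env_id, seed + rank)
    return _init

fn = make_env("CustomEnv-v0", 0)  # "CustomEnv-v0" is a placeholder id
try:
    pickle.dumps(fn)
    picklable = True
except (pickle.PicklingError, AttributeError):
    # CPython raises AttributeError: "Can't pickle local object ..."
    picklable = False
print(picklable)  # -> False: stdlib pickle cannot serialize local closures
```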
Issue Analytics
- Created 5 years ago
- Comments: 38
Top GitHub Comments
I had no idea such a thing existed (MultiDiscrete). You are the most helpful person of all time. I'll give it a try and give some feedback afterwards so you can close the ticket - hopefully - and I won't waste more of your time.
On a side note, should I scale my rewards and observations, given that NNs tend to learn better at smaller scales, or does baselines automatically do it for us? (For the actions and observations, that is; I take it for rewards one has to do it manually.)
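As an aside, stable-baselines does ship a wrapper for this, VecNormalize, which keeps a running mean/std of observations and rewards; whether it suits a given task is a separate judgment call. The mechanics are just running standardization, sketched here with no dependencies (`RunningNormalizer` is an illustrative name, not a library class):

```python
class RunningNormalizer:
    """Welford-style running mean/variance for standardizing a scalar stream."""
    def __init__(self, eps=1e-8):
        self.count, self.mean, self.m2, self.eps = 0, 0.0, 0.0, eps

    def update(self, x):
        # Welford's online update: numerically stable running mean/variance.
        self.count += 1
        delta = x - self.mean
        self.mean += delta / self.count
        self.m2 += delta * (x - self.mean)

    def normalize(self, x):
        var = self.m2 / max(self.count, 1)
        return (x - self.mean) / ((var + self.eps) ** 0.5)

norm = RunningNormalizer()
for reward in [10.0, 12.0, 8.0, 11.0]:
    norm.update(reward)
print(round(norm.mean, 2))  # -> 10.25
```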
What you described is called “MultiDiscrete” space, see https://github.com/openai/gym/blob/master/gym/spaces/multi_discrete.py
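For readers unfamiliar with it, a MultiDiscrete([3, 5, 2]) space describes actions that are vectors of independent discrete choices, with component i ranging over 0..n_i-1. A minimal stand-in of those semantics (plain Python, no gym; the class name is illustrative):

```python
import random

class MultiDiscreteSketch:
    """Illustrative stand-in for gym.spaces.MultiDiscrete:
    component i of a valid action is an integer in [0, nvec[i])."""
    def __init__(self, nvec):
        self.nvec = list(nvec)

    def contains(self, x):
        # Valid iff the vector has the right length and every slot is in range.
        return (len(x) == len(self.nvec)
                and all(0 <= v < n for v, n in zip(x, self.nvec)))

    def sample(self):
        # One uniform draw per slot, independent of the others.
        return [random.randrange(n) for n in self.nvec]

space = MultiDiscreteSketch([3, 5, 2])
print(space.contains([2, 4, 1]))  # -> True
print(space.contains([3, 0, 0]))  # -> False (first component out of range)
```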