
[Bug][RLlib] Gym environment registration does not work when using Ray Client and ray.init


Search before asking

  • I searched the issues and found no similar issues.

Ray Component

RLlib

What happened + What you expected to happen

When using RLlib with Ray Client, you receive the error below when connecting via ray.init("ray://127.0.0.1:10001"), whereas the same code works when the address is supplied through export RAY_ADDRESS="ray://127.0.0.1:10001".
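
For reference, both connection styles can be driven from Python; ray.init() also honors the RAY_ADDRESS environment variable, so setting it programmatically is equivalent to exporting it in the shell (a minimal sketch, assuming a Ray Client server on the default port 10001):

import os
import ray

# Style 1: pass the Ray Client address explicitly (triggers the bug below).
# ray.init("ray://127.0.0.1:10001")

# Style 2: let ray.init() pick the address up from RAY_ADDRESS (works).
os.environ["RAY_ADDRESS"] = "ray://127.0.0.1:10001"
ray.init()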

In particular, this error only happens when using the default Gym-registered environment strings. When using a custom registration, the code runs as expected.

So:

  • gym-string + ray.init -> error
  • gym-string + RAY_ADDRESS -> works
  • self-registration + ray.init -> works
  • self-registration + RAY_ADDRESS -> works

2022-01-20 03:24:32,339 INFO trainer.py:2054 -- Your framework setting is 'tf', meaning you are using static-graph mode. Set framework='tf2' to enable eager execution with tf2.x. You may also then want to set eager_tracing=True in order to reach similar execution speed as with static-graph mode.
Traceback (most recent call last):
  File "rllib4.py", line 28, in <module>
    trainer = PPOTrainer(config=config)
  File "/home/ray/anaconda3/lib/python3.8/site-packages/ray/rllib/agents/trainer.py", line 728, in __init__
    super().__init__(config, logger_creator, remote_checkpoint_dir,
  File "/home/ray/anaconda3/lib/python3.8/site-packages/ray/tune/trainable.py", line 122, in __init__
    self.setup(copy.deepcopy(self.config))
  File "/home/ray/anaconda3/lib/python3.8/site-packages/ray/rllib/agents/trainer.py", line 754, in setup
    self.env_creator = _global_registry.get(ENV_CREATOR, env)
  File "/home/ray/anaconda3/lib/python3.8/site-packages/ray/tune/registry.py", line 168, in get
    return pickle.loads(value)
EOFError: Ran out of input
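
The final frame hints at the mechanism: Tune's registry stores env creators as pickled bytes, and over the Ray Client connection the lookup for "CartPole-v1" apparently comes back empty, so unpickling fails on zero bytes. The exact error is easy to reproduce in isolation (a minimal sketch, independent of Ray):

import pickle

# Unpickling an empty payload raises the same error as in the traceback above.
pickle.loads(b"")  # EOFError: Ran out of input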

Versions / Dependencies

Ray 1.10.0-py38 Docker image with TensorFlow installed.

>>> ray.__commit__
'1583379dce891e96e9721bb958e80d485753aed7'
>>> ray.__version__
'1.10.0'

Reproduction script

# Import the RL algorithm (Trainer) we would like to use.
import ray

ray.init(f"ray://127.0.0.1:10001")  # Comment out to make this work.

from ray.rllib.agents.ppo import PPOTrainer
from ray.tune.registry import register_env
from gym.envs.classic_control.cartpole import CartPoleEnv

def env_creator(config):
    return CartPoleEnv()

register_env("my_env", env_creator)


# Configure the algorithm.
config = {
    # Environment (RLlib understands openAI gym registered strings).
    "env" : "CartPole-v1",  # <-- Fails
    #"env" : "my_env",  # <-- Works
    "num_workers": 2,
    "framework": "tf"
}

trainer = PPOTrainer(config=config)
for _ in range(3):
    print(trainer.train())
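
As the script shows, routing the environment through register_env sidesteps the failure. The same trick works for the built-in Gym string itself, without importing the env class (a hypothetical workaround sketch; the name "cartpole_wrapped" is illustrative):

import gym
from ray.tune.registry import register_env

# Wrap the Gym string so the creator function is shipped to the cluster
# explicitly instead of being resolved through the global registry.
register_env("cartpole_wrapped", lambda env_config: gym.make("CartPole-v1"))

# Then point the trainer config at the wrapped name:
# config = {"env": "cartpole_wrapped", ...}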


Anything else

It happens every time.

Are you willing to submit a PR?

  • Yes I am willing to submit a PR!

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Comments: 17 (17 by maintainers)

Top GitHub Comments

1 reaction
jovany-wang commented, Apr 19, 2022

This is a P0 issue from our side. @ericl CC

0 reactions
mwtian commented, Apr 20, 2022

Let’s see if https://github.com/ray-project/ray/pull/24058 can fix the issue.

