DQN does not allow custom models
System information
- OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Ubuntu 18.04
- Ray installed from (source or binary): source
- Ray version: 0.8.0.dev6
- Python version: 3.7.5
- Exact command to reproduce:
The following code registers the built-in TF VisionNetwork as a custom model and errors out as described below. However, the code succeeds if no custom model is set, in which case the exact same VisionNetwork is selected automatically by `_get_v2_model`. The cause of the issue is explained below, but I'm not sure about the fix.
import ray
from ray.rllib.agents.dqn import DQNTrainer
from ray.rllib.models import ModelCatalog
from ray.rllib.models.tf.visionnet_v2 import VisionNetwork

ModelCatalog.register_custom_model("my_model", VisionNetwork)

config = {
    "model": {
        "custom_model": "my_model",
        "custom_options": {},  # extra options to pass to your model
    }
}

ray.init()
agent = DQNTrainer(config=config, env="BreakoutNoFrameskip-v4")
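For comparison, the same trainer construction without the `custom_model` entry succeeds, since `_get_v2_model` then picks the same VisionNetwork automatically for the image observations (a minimal sketch; the variable name is mine):

```python
# Default model selection: no error is raised here, even though
# _get_v2_model ends up choosing the very same VisionNetwork.
agent_default = DQNTrainer(config={}, env="BreakoutNoFrameskip-v4")
```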
Describe the problem
The current code in master does not allow the use of custom models in DQN. When trying to use a custom model (either for TF or PyTorch), an error is thrown indicating that the model has not been subclassed from `DistributionalQModel`. This happens even when the custom model is set to RLlib's own `ray.rllib.models.tf.visionnet_v2.VisionNetwork`.
Error message:
('The given model must subclass', <class 'ray.rllib.agents.dqn.distributional_q_model.DistributionalQModel'>)
Source code / logs
The cause of this issue is this check: notice that it is only applied when custom_model is set. Apparently the built-in models don't subclass DistributionalQModel either; however, since the check is not applied to built-in models, they work fine.
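For reference, the check amounts to an issubclass test against the model interface that DQN requests. A rough paraphrase (not the exact RLlib source; the function name below is illustrative):

```python
# Rough paraphrase of the failing check (illustrative, not the exact RLlib source).
# DQN requests its model with model_interface=DistributionalQModel; this test is
# only reached on the custom-model code path, which is why built-in models
# slip through unchecked.
def _check_custom_model_interface(model_cls, model_interface):
    if model_interface and not issubclass(model_cls, model_interface):
        raise ValueError("The given model must subclass", model_interface)
```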
Top GitHub Comments
@sytelus Hey, I ran into this exact issue a few days back; all I did was subclass the right model class and everything works as expected. Copy-paste the model code below.
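A minimal sketch of such a subclass, assuming the RLlib 0.8.x TF ModelV2 API (the class name MyVisionQModel and the layer sizes are illustrative, not the commenter's original code):

```python
import tensorflow as tf

from ray.rllib.agents.dqn.distributional_q_model import DistributionalQModel
from ray.rllib.models import ModelCatalog


class MyVisionQModel(DistributionalQModel):
    """Custom vision model that satisfies DQN's DistributionalQModel check."""

    def __init__(self, obs_space, action_space, num_outputs, model_config,
                 name, **kwargs):
        # DistributionalQModel takes extra DQN-specific kwargs (num_atoms,
        # dueling, etc.); forward them to the parent constructor.
        super(MyVisionQModel, self).__init__(
            obs_space, action_space, num_outputs, model_config, name, **kwargs)

        # Small example conv stack; sizes are arbitrary, not tuned.
        self.inputs = tf.keras.layers.Input(shape=obs_space.shape, name="obs")
        x = tf.keras.layers.Conv2D(16, 8, strides=4, activation="relu")(self.inputs)
        x = tf.keras.layers.Conv2D(32, 4, strides=2, activation="relu")(x)
        x = tf.keras.layers.Flatten()(x)
        out = tf.keras.layers.Dense(num_outputs, activation=None)(x)
        self.base_model = tf.keras.Model(self.inputs, out)
        self.register_variables(self.base_model.variables)

    def forward(self, input_dict, state, seq_lens):
        return self.base_model(input_dict["obs"]), state


# Registering this subclass passes the DistributionalQModel check.
ModelCatalog.register_custom_model("my_model", MyVisionQModel)
```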
That sounds good!