
Support for Fetch environments?


It seems that baselines does not directly support Box() action spaces. The exact same code works for the CartPole environment but fails on FetchReach-v1. Here is the code:

import gym
from baselines import deepq


def callback(lcl, _glb):
    # stop training once the mean reward over the last 100 episodes reaches 199 (CartPole's solved threshold)
    is_solved = lcl['t'] > 100 and sum(lcl['episode_rewards'][-101:-1]) / 100 >= 199
    return is_solved


def main():
    env = gym.make("FetchReach-v1")
    model = deepq.models.mlp([64])
    act = deepq.learn(
        env,
        q_func=model,
        lr=1e-3,
        max_timesteps=100000,
        buffer_size=50000,
        exploration_fraction=0.1,
        exploration_final_eps=0.02,
        print_freq=10,
        callback=callback
    )
    print("Saving model to Fetch_model.pkl")
    act.save("Fetch_model.pkl")


if __name__ == '__main__':
    main()

When I try to run the same deepq setup that works on CartPole (a discrete action space) on FetchReach-v1, I get the following:

File "train_FetchReach.py", line 31, in <module>
    main()
  File "train_FetchReach.py", line 24, in main
    callback=callback
  File "/home/jeremy/.local/share/virtualenvs/cgw-i4TbRcn4/lib/python3.6/site-packages/baselines/deepq/simple.py", line 180, in learn
    num_actions=env.action_space.n,
AttributeError: 'Box' object has no attribute 'n'

I tried adding

env.action_space.n = len(env.action_space.sample())

but that just led to more errors:

Traceback (most recent call last):
  File "train_FetchReach.py", line 32, in <module>
    main()
  File "train_FetchReach.py", line 25, in main
    callback=callback
  File "/home/jeremy/.local/share/virtualenvs/cgw-i4TbRcn4/lib/python3.6/site-packages/baselines/deepq/simple.py", line 184, in learn
    param_noise=param_noise
  File "/home/jeremy/.local/share/virtualenvs/cgw-i4TbRcn4/lib/python3.6/site-packages/baselines/deepq/build_graph.py", line 376, in build_train
    act_f = build_act(make_obs_ph, q_func, num_actions, scope=scope, reuse=reuse)
  File "/home/jeremy/.local/share/virtualenvs/cgw-i4TbRcn4/lib/python3.6/site-packages/baselines/deepq/build_graph.py", line 177, in build_act
    observations_ph = make_obs_ph("observation")
  File "/home/jeremy/.local/share/virtualenvs/cgw-i4TbRcn4/lib/python3.6/site-packages/baselines/deepq/simple.py", line 175, in make_obs_ph
    return BatchInput(observation_space_shape, name=name)
  File "/home/jeremy/.local/share/virtualenvs/cgw-i4TbRcn4/lib/python3.6/site-packages/baselines/deepq/utils.py", line 66, in __init__
    super().__init__(tf.placeholder(dtype, [None] + list(shape), name=name))
TypeError: 'NoneType' object is not iterable
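
For reference, the second traceback points at the observation space rather than the action space: FetchReach-v1 has a dictionary-based observation space, and Dict spaces have no shape, so BatchInput is handed None. A short sanity check (a sketch, not taken from the issue) makes both mismatches visible:

import gym

env = gym.make("FetchReach-v1")

# The observation space is a Dict ('observation', 'achieved_goal',
# 'desired_goal'); its .shape is None, which is what BatchInput chokes on.
print(type(env.observation_space))   # gym.spaces.Dict
print(env.observation_space.shape)   # None

# The action space is a continuous Box, so it has no .n attribute either.
print(type(env.action_space))        # gym.spaces.Box
print(env.action_space.shape)        # (4,)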

Issue Analytics

  • State: closed
  • Created: 5 years ago
  • Comments: 12 (1 by maintainers)

Top GitHub Comments

1 reaction
carlo- commented, Oct 5, 2018

The robotics environments are not compatible with baselines out of the box (except maybe with HER), because they have a dictionary-based observation space. To use them, you have to flatten the observation space into an array like so:

env = gym.wrappers.FlattenDictWrapper(env, dict_keys=['observation', 'desired_goal'])

This worked for me perfectly with DDPG and FetchPickAndPlace.

More info on this at the bottom of this page.
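
As a concrete sketch (assuming an older gym release that still ships gym.wrappers.FlattenDictWrapper, which newer gym versions may no longer provide), the wrapper concatenates the chosen dictionary keys into a single flat Box observation that baselines can consume:

import gym

env = gym.make("FetchReach-v1")
print(env.observation_space)   # Dict with 'observation', 'achieved_goal', 'desired_goal'

# Flatten the selected keys into one 1-D Box observation.
env = gym.wrappers.FlattenDictWrapper(
    env, dict_keys=['observation', 'desired_goal'])
print(env.observation_space)   # now a flat Box that baselines algorithms can handle

obs = env.reset()
print(obs.shape)               # observation and desired_goal concatenated into one vector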

1 reaction
elifriedman commented, May 1, 2018

The problem is that DQN is built to handle a discrete set of actions, so it works for CartPole, which uses action_space = spaces.Discrete(2).

The FetchReach environment, however, uses a 4-dimensional continuous action space with values in [-1, 1] rather than a fixed number of actions, so DQN won't work out of the box. You could try one of the other algorithms that do support continuous action spaces.
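
A quick way to see which family of algorithms applies (a small sketch, not from the original thread) is to check the action space type directly; DQN relies on action_space.n, which only Discrete spaces define:

import gym
from gym import spaces

for env_id in ["CartPole-v0", "FetchReach-v1"]:
    env = gym.make(env_id)
    space = env.action_space
    if isinstance(space, spaces.Discrete):
        # Fixed number of actions -- DQN-style algorithms apply.
        print(env_id, "-> Discrete with", space.n, "actions (DQN works)")
    elif isinstance(space, spaces.Box):
        # Continuous actions -- use DDPG / PPO / TRPO / HER-style algorithms instead.
        print(env_id, "-> Box", space.shape, "bounded by",
              (space.low.min(), space.high.max()),
              "(pick a continuous-control algorithm)")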


