
[rllib] How can I record the full episode results of rllib?

See original GitHub issue

What is your question?

Ray version and other system information (Python version, TensorFlow version, OS):

Python version: Python 3.6.10 :: Anaconda, Inc.
Anaconda version: 4.7.12
TensorFlow version: 1.15.0
OS: Ubuntu 16.04
Ray version: 0.8.1

Hello, I’ve trained an RL model using RLlib. I tested it on the Breakout environment, and the agent runs successfully.

Here is the Python code I used (via RLlib’s Python API). When I test the trained agent in the OpenAI Gym environment after learning, it works successfully.

I want to record the agent’s test results. I ran the agent test using RLlib’s rollout command; my result screenshot is below. This is the command I executed:

rllib rollout checkpoint_4301/checkpoint-4301 \
    --run PPO --env BreakoutNoFrameskip-v4 --monitor --config '{"monitor": true}'

[screenshot of rollout output]

When testing with rollout, about 8 games run on average. Many episodes ran, but only one game’s results (four lives, in the case of Breakout) were saved as video, and the footage appears to be a mix of segments from different games.

I additionally tested Ms. Pac-Man, and its recording showed the same problem.

I have attached my RLlib model and the recorded results file. I would be grateful if you could point me to documentation or a method that would help me record the full videos.

PPO_BreakoutNoFrameskip-v4_2020-02-11_21-32-52m48o1b48.zip BreakoutModel.zip

Issue Analytics

  • State: closed
  • Created 4 years ago
  • Reactions: 1
  • Comments:5 (4 by maintainers)

Top GitHub Comments

1 reaction
sven1977 commented, Feb 26, 2020

I can reproduce the issue with CartPole as well. It’s only recording a very short sequence of the rolled out episode, and only a few of these as well (4 out of 50 rolled out episodes have mpeg snippet files). Taking a look. …
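The "4 out of 50" pattern is consistent with gym Monitor's default recording behavior: when no `video_callable` is supplied, the Monitor wrapper (the machinery behind `--monitor`) records only episodes on a "capped cubic" schedule (0, 1, 8, 27, 64, ..., then every 1000th). As an illustration of that likely cause (not the actual fix), here is a minimal sketch mirroring gym's `capped_cubic_video_schedule`:

```python
# Sketch of gym Monitor's default "capped cubic" video schedule: with no
# video_callable override, only episodes whose index is a perfect cube
# (0, 1, 8, 27, 64, ...) are recorded, then every 1000th episode after that.
def capped_cubic_video_schedule(episode_id):
    if episode_id < 1000:
        return round(episode_id ** (1.0 / 3)) ** 3 == episode_id
    return episode_id % 1000 == 0

# Out of 50 rolled-out episodes, only four land on the schedule:
recorded = [ep for ep in range(50) if capped_cubic_video_schedule(ep)]
print(recorded)  # [0, 1, 8, 27]
```

If this is the cause, overriding the wrapper's `video_callable` (e.g. with `lambda episode_id: True`) would record every episode in full when wrapping the env manually.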

0 reactions
sven1977 commented, Feb 27, 2020

OK, check out this PR and let me know whether it fixes your problems: https://github.com/ray-project/ray/pull/7347

I separated the --monitor option from the --out option (which used to be silently(!) required to record any videos into the --out + "monitor" directory).

The new command line would be (tested on CartPole with episodes of 1000 ts):

rllib rollout [your checkpoint file] --run PPO --env BreakoutNoFrameskip-v4 --video-dir [some dir where all(!) episode videos will be stored in full length]

Read more comments on GitHub >

Top Results From Across the Web

Getting Started with RLlib — Ray 2.2.0 - the Ray documentation
By default, the results will be logged to a subdirectory of ~/ray_results . ... a file result.json which contains a training summary for...
Read more >
Examples — Ray 2.2.0
Example of how to setup an RLlib Algorithm against a locally running Unity3D editor instance to learn any Unity3D game (including support for...
Read more >
ray.rllib.evaluation.sampler — Ray 2.2.0
Users can add these "extra batches" to an episode by calling the episode's ... at the end of the episode and instead record...
Read more >
ray.rllib.algorithms.algorithm — Ray 3.0.0.dev0
As a result, Algorithm should never access the underlying actor handles directly. Instead, always access them via all the foreach APIs with assigned...
Read more >
How To Customize Policies — Ray 2.2.0
You might find this useful if modifying or adding new algorithms to RLlib. ... you want to compute actions based on the current...
Read more >
