
[tune] adjusting custom trial names

See original GitHub issue

I want to change my trial name, as suggested here: https://ray.readthedocs.io/en/latest/tune-usage.html#custom-trial-names

Example (from https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/logging_example.py)

def trial_name_string(trial):
    return "{}_{}_123".format(trial.trainable_name, trial.trial_id)

Are there more options for changing the name? I tried custom return values (like str(trial).split('-')[0] or just "Pendulumv0") and got errors, or the trial name remained the default. I also want to create an 'info.txt' file in that directory; can I access the trial name outside of the tune.run() function?
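For reference, the creator only needs to return a plain string built from the trial object; a minimal sketch using the trainable_name and trial_id attributes, demonstrated with a stand-in object so it runs without Ray (the stand-in and its values are illustrative, not from the original issue):

```python
from types import SimpleNamespace

def trial_name_string(trial):
    # Any plain string works as the display name; per the discussion
    # below, Tune still appends a date and trial id to the directory name.
    return "{}_{}".format(trial.trainable_name, trial.trial_id)

# Stand-in for a ray.tune Trial object, just for illustration:
fake = SimpleNamespace(trainable_name="SAC", trial_id="abc123")
print(trial_name_string(fake))  # SAC_abc123
```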

Similar issue: #3034

Issue Analytics

  • State: closed
  • Created 4 years ago
  • Comments:5 (3 by maintainers)

Top GitHub Comments

2 reactions
richardliaw commented, Jan 30, 2020

Can you provide a reproduction of what you tried and the errors you got?

Also, you should be able to access trial names as:

analysis = tune.run(...)
names = [str(trial) for trial in analysis.trials]
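Building on that, a sketch of writing an 'info.txt' into each trial's directory after tune.run() returns, assuming (as was true of Ray Tune at the time) that each Trial object exposes a logdir attribute; the demo uses a stand-in trial so it runs without Ray:

```python
import os
import tempfile
from types import SimpleNamespace

def write_info_files(trials, text="experiment notes\n"):
    # `trials` would be `analysis.trials`; each trial is assumed
    # to expose a `logdir` attribute pointing at its output directory.
    for trial in trials:
        with open(os.path.join(trial.logdir, "info.txt"), "w") as f:
            f.write("trial name: {}\n{}".format(str(trial), text))

# Demo with a stand-in trial object (no Ray required):
tmpdir = tempfile.mkdtemp()
write_info_files([SimpleNamespace(logdir=tmpdir)])
print(os.path.exists(os.path.join(tmpdir, "info.txt")))  # True
```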

1 reaction
ninafiona commented, Feb 3, 2020

Thank you @richardliaw, I was able to access the trial names and create files in the directories. (I think it would be better to create the files before running tune.run(), in case an error comes up, but it's better than nothing.)

I wasn't able to reproduce the error for the trial name; the code below now runs without any errors. The resulting filename is "SAC_Pendulum-v0_MLP_2020-02-03_14-00-56g21_qlu9", so the date and id are always added automatically. I tried to call my info_to_file() function inside trial_str_creator to save the info file before running the algorithm, but the trial_id isn't the same in that case.

def trial_str_creator(trial):
    # algorithm, environment, model and info_to_file() are defined elsewhere
    trialname = algorithm + "_" + environment + "_" + model
    # Note: trial.trial_id here differs from the id in the final directory name
    info_to_file(trialname + "_" + time.strftime("%Y-%m-%d_%H-%M-%S") + trial.trial_id)
    return trialname

ray.init()
analysis = tune.run(
    algorithm,
    local_dir=output_dir,
    stop={"episode_reward_mean": -200},
    config=config,
    trial_name_creator=trial_str_creator,
)
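As a side note on the snippet above: trial_str_creator reads algorithm, environment and model from module-level globals. A factory closure makes those inputs explicit; the names and values below mirror the snippet and are purely illustrative:

```python
def make_trial_name_creator(algorithm, environment, model):
    # Returns a trial_name_creator bound to explicit values rather
    # than globals; Tune still appends the date/id to directory names.
    def trial_str_creator(trial):
        return "{}_{}_{}".format(algorithm, environment, model)
    return trial_str_creator

creator = make_trial_name_creator("SAC", "Pendulum-v0", "MLP")
print(creator(None))  # SAC_Pendulum-v0_MLP
```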
