Can I see evaluation metrics during training and send them to W&B?
I’ve been having a great time playing with the library, nice work!
I was wondering if I’m doing something wrong in this gist? https://gist.github.com/galtay/10852bb03b354b2562997973bc29c679
I’m hoping to monitor metrics like “roc_auc_score” during training (and, ideally, send them to a Weights & Biases project). When I run that code I see the “Running loss” printed out, but not the extra metrics I specified. Is there a way to log these to the screen, a file, or wandb?
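For reference, simpletransformers exposes model args that enable evaluation during training and W&B logging; a minimal sketch of the relevant settings is below (arg names are taken from the library’s docs, but verify them against your installed version; the project name is a placeholder):

```python
# Sketch of simpletransformers model args that enable mid-training
# evaluation and Weights & Biases logging. "my-project" is a
# hypothetical W&B project name.
model_args = {
    "evaluate_during_training": True,        # run metrics on eval_df while training
    "evaluate_during_training_steps": 500,   # how often (in steps) to evaluate
    "wandb_project": "my-project",           # send results to this W&B project
}

# Extra metric functions would then be passed as keyword arguments, e.g.:
#   model = ClassificationModel("roberta", "roberta-base", args=model_args)
#   model.train_model(train_df, eval_df=eval_df,
#                     roc_auc=sklearn.metrics.roc_auc_score)
```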
I do get the metrics in the model.eval_model(eval_df, **eval_metrics) call, but not during training. It would also be nice if metrics that take predictions instead of probabilities (like sklearn.metrics.f1_score) could be calculated using the user-defined per-class threshold.
https://github.com/ThilinaRajapakse/simpletransformers#special-attributes
Great package, thanks!
Issue Analytics
- State:
- Created 3 years ago
- Comments: 8 (8 by maintainers)
Top GitHub Comments
PR is now merged, thanks @ThilinaRajapakse !
https://github.com/ThilinaRajapakse/simpletransformers/pull/351