
Save CatBoost model Object from Optuna Objective

See original GitHub issue

Hello,

I’m currently tuning a CatBoost model using the standard Optuna Objective function approach like so:

def objective(trial: optuna.Trial):

    ...
    ...
    ...

    # Initialising CatBoost model
    model = cat.CatBoostClassifier(
        loss_function = "CrossEntropy",
        eval_metric = "AUC",
        # task_type = "GPU",
        **params)

    model.fit(training_set, early_stopping_rounds = 10)

    # Predicting
    prediction = model.predict(validation_set)
    prediction_label = np.rint(prediction)

    # Evaluating
    roc_auc = roc_auc_score(y_val, prediction_label)
    f1 = f1_score(y_val, prediction_label)
    recall = recall_score(y_val, prediction_label)

    print('ROC AUC Score:', roc_auc)
    print('F1 Score:', f1)
    print('Recall Score:', recall)

    return roc_auc, f1, recall

However, I’m trying to save the best CatBoost model that was trained and evaluated on my train and validation sets. I want to save the model as an object to then test on my test set. I appreciate Optuna is used for hyperparameter tuning but I wanted to know whether it was possible to just extract the best model directly from my objective function to avoid re-fitting a new model.

I’ve attempted to follow this example, which achieves this using a LightGBM model. Is there a way to achieve the same with a CatBoost model instead? Thank you

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Comments: 6

Top GitHub Comments

1 reaction
nzw0301 commented, Sep 1, 2021

I suppose you can save CatBoost models by using the save_model function rather than re-fitting.

0 reactions
ziadzee commented, Sep 1, 2021

Hi, @nzw0301 - thank you for the reply. It seems to work as intended when I convert the objective function to a single-objective function; however, I’m not having much luck when I attempt it with multiple objectives. In this case, I’ll just have to re-fit a new CatBoost model with the hyperparameters obtained from my multi-objective function.


Top Results From Across the Web

optuna.integration - Read the Docs
The integration module contains classes used to integrate Optuna with external machine learning frameworks. For most of the ML frameworks supported by Optuna, ...

Optuna tutorial for hyperparameter optimization - Kaggle
Optuna tutorial for beginners: hyperparameter optimization framework. When I try building a model (XGBoost, LightGBM, CatBoost, Neural Network ...

Optimize your optimizations using Optuna - Analytics Vidhya
Optuna is a state-of-the-art automatic hyperparameter tuning ... A trial object is the input in an objective method and returns a score.

Parameter tuning - CatBoost
This can be done by setting the number of iterations to a large value, using the overfitting detector parameters and turning the use ...

Why Is Everyone at Kaggle Obsessed with Optuna For ...
To start the optimization, we create a study object from Optuna and pass the objective function to its optimize method: ...
