Optuna CatBoost pruning
Is there a way to use pruning with CatBoost and Optuna? (In LightGBM it's easy, but for CatBoost I can't find any hint.) My code looks like this:
from catboost import cv  # train_dataset below is assumed to be a catboost.Pool

def objective(trial):
    param = {
        "iterations": trial.suggest_int("iterations", 100, 1500, step=100),
        "learning_rate": trial.suggest_uniform("learning_rate", 0.001, 0.3),
        "random_strength": trial.suggest_int("random_strength", 1, 10),
        "max_bin": trial.suggest_categorical("max_bin", [2, 3, 4, 5, 6, 8, 10, 20, 30]),
        "grow_policy": trial.suggest_categorical("grow_policy", ["SymmetricTree", "Depthwise", "Lossguide"]),
        "colsample_bylevel": trial.suggest_uniform("colsample_bylevel", 0.1, 1),
        "od_type": "Iter",
        "od_wait": 30,
        "depth": trial.suggest_int("max_depth", 1, 12),
        "l2_leaf_reg": trial.suggest_loguniform("l2_leaf_reg", 1e-8, 100),
        "custom_metric": ["AUC"],
        "loss_function": "Logloss",
    }
    # "Ordered" boosting is only valid with the SymmetricTree grow policy
    if param["grow_policy"] == "SymmetricTree":
        param["boosting_type"] = trial.suggest_categorical("boosting_type", ["Ordered", "Plain"])
    else:
        param["boosting_type"] = "Plain"
    # Added subsample manually
    param["subsample"] = trial.suggest_float("subsample", 0.1, 1)

    ### CV ###
    # How to add a callback for pruning?
    scores = cv(train_dataset,
                param,
                fold_count=5,
                early_stopping_rounds=30,
                plot=False,
                verbose=False)
    return scores["test-AUC-mean"].mean()
Solution 1:[1]
No. CatBoost doesn't provide a pruning callback the way the other boosting libraries do. However, CatBoost plans to introduce a callback mechanism in the near future; once that feature is released, Optuna may implement an integration for CatBoost like the one it has for LightGBM. See also the feature request on GitHub: https://github.com/optuna/optuna/issues/2464.
Solution 2:[2]
Yes, CatBoost now supports pruning with Optuna. Adding to @nzw0301's comment, please see Optuna's example of pruning a CatBoost model here:
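In outline, that example wires optuna.integration.CatBoostPruningCallback into CatBoostClassifier.fit. Below is a minimal sketch of the same approach; the dataset, split, and search space are illustrative placeholders, not part of the original question. Note that the callback attaches to fit rather than catboost.cv (cv does not expose a callbacks argument), so a manual train/validation split replaces the CV loop.

import optuna
from optuna.integration import CatBoostPruningCallback
from catboost import CatBoostClassifier
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split

def objective(trial):
    # Placeholder data; substitute your own features and labels.
    X, y = load_breast_cancer(return_X_y=True)
    X_train, X_valid, y_train, y_valid = train_test_split(X, y, test_size=0.25, random_state=0)

    param = {
        "iterations": 1000,
        "learning_rate": trial.suggest_float("learning_rate", 0.001, 0.3, log=True),
        "depth": trial.suggest_int("depth", 1, 12),
        "eval_metric": "AUC",  # must match the metric name given to the callback
        "loss_function": "Logloss",
    }

    model = CatBoostClassifier(**param)
    # Reports the eval-set AUC to Optuna after each boosting iteration and
    # stops training early if the pruner decides the trial is unpromising.
    pruning_callback = CatBoostPruningCallback(trial, "AUC")
    model.fit(
        X_train, y_train,
        eval_set=[(X_valid, y_valid)],
        callbacks=[pruning_callback],  # fit callbacks require CPU training
        verbose=False,
    )
    # Propagate the pruning decision by raising optuna.TrialPruned if needed.
    pruning_callback.check_pruned()

    return model.get_best_score()["validation"]["AUC"]

study = optuna.create_study(direction="maximize", pruner=optuna.pruners.MedianPruner())
study.optimize(objective, n_trials=20)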
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | |
| Solution 2 | K. Thorspear |