Issue
How do I optimize for multiple metrics simultaneously inside the objective
function of Optuna? For example, I am training a LightGBM classifier and want to find the best hyperparameter set across all common classification metrics, like F1, precision, recall, accuracy, AUC, etc.
def objective(trial):
    # Train
    gbm = lgb.train(param, dtrain)
    preds = gbm.predict(X_test)
    pred_labels = np.rint(preds)
    # Calculate metrics (sklearn expects y_true first, then y_pred)
    accuracy = sklearn.metrics.accuracy_score(y_test, pred_labels)
    recall = sklearn.metrics.recall_score(y_test, pred_labels)
    precision = sklearn.metrics.precision_score(y_test, pred_labels)
    f1 = sklearn.metrics.f1_score(y_test, pred_labels, pos_label=1)
    ...
How do I do it?
Solution
After defining the search space, fitting the model with the suggested parameters, and generating predictions, calculate all the metrics you want to optimize for:
def objective(trial):
    param_grid = {"n_estimators": trial.suggest_int("n_estimators", 2000, 10000, step=200)}
    clf = lgbm.LGBMClassifier(objective="binary", **param_grid)
    clf.fit(X_train, y_train)
    preds = clf.predict(X_valid)
    probs = clf.predict_proba(X_valid)
    # Metrics
    f1 = sklearn.metrics.f1_score(y_valid, preds)
    accuracy = sklearn.metrics.accuracy_score(y_valid, preds)
    precision = sklearn.metrics.precision_score(y_valid, preds)
    recall = sklearn.metrics.recall_score(y_valid, preds)
    logloss = sklearn.metrics.log_loss(y_valid, probs)
Then return them from the objective in whatever order you choose:
def objective(trial):
    ...
    return f1, logloss, accuracy, precision, recall
Then, when creating the study object, pass a list to `directions` specifying whether each metric should be minimized or maximized, in the same order as the returned values, like so:
study = optuna.create_study(directions=['maximize', 'minimize', 'maximize', 'maximize', 'maximize'])
study.optimize(objective, n_trials=100)
For more details, see Multi-objective Optimization with Optuna in the documentation.
Answered By - Bex T.