Issue
I have a neural network where I'm using Optuna to find optimal hyperparameters, e.g. batch size.
I want to save the network's parameters whenever Optuna finds a new best parameter combination.
I have tried the following two approaches:
SCORE = 0

def objective(trial):
    BATCH_SIZE = trial.suggest_int("BATCH_SIZE", 20, 100)
    LEARNING_RATE = trial.suggest_float("LEARNING_RATE", 0.05, 1)
    DROPOUT = trial.suggest_float("DROPOUT", 0.1, 0.9)
    Y_SCORE, Y_VAL = train_NN(X, y, word_model, BATCH_SIZE, 250, LEARNING_RATE, DROPOUT)
    y_val_pred = Y_SCORE.argmax(axis=1)
    labels = encode.inverse_transform(np.arange(6))
    a = classification_report(Y_VAL, y_val_pred, zero_division=0, target_names=labels, output_dict=True)
    score = a.get("macro avg").get("f1-score")
    if score > SCORE:  # new best weights found - save the net parameters
        SCORE = score
        torch.save(net, "../model_weights.pt")
    return score
which fails with UnboundLocalError: local variable 'SCORE' referenced before assignment, but if I move SCORE = 0 inside the function at the top, it resets at each trial.
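The error itself comes from Python's scoping rules: because objective assigns to SCORE, Python treats SCORE as a local name throughout the function, so the comparison reads it before any assignment. Declaring the variable global avoids the error; a minimal sketch of that variant, with the training and scoring lines elided:

SCORE = 0

def objective(trial):
    global SCORE  # bind to the module-level SCORE instead of creating a local
    # ... suggest hyperparameters, train, and compute score as above ...
    if score > SCORE:
        SCORE = score
        torch.save(net, "../model_weights.pt")
    return score

This still leans on module-level state, though, which the callback-based solution below avoids.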
The reason I want to save the weights right away, rather than just rerun the training with study.best_params at the end, is that the random initialization of the weights sometimes has an impact and yields a higher score (although if the training is robust, it should not make a difference) - but that is not the point of this issue.
Solution
This answer is helpful for saving the neural network's weights: we can use a callback function to save the model's checkpoint whenever a new best trial is found.
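A minimal sketch of that pattern, where train_and_evaluate is a hypothetical stand-in for the training and scoring code from the question: the objective stores its trained network as a trial user attribute, and a callback passed to study.optimize saves it whenever the trial that just finished is the new best one.

import copy

import optuna
import torch

def objective(trial):
    BATCH_SIZE = trial.suggest_int("BATCH_SIZE", 20, 100)
    LEARNING_RATE = trial.suggest_float("LEARNING_RATE", 0.05, 1)
    DROPOUT = trial.suggest_float("DROPOUT", 0.1, 0.9)

    # Hypothetical helper standing in for train_NN plus the
    # classification_report scoring from the question.
    net, score = train_and_evaluate(BATCH_SIZE, LEARNING_RATE, DROPOUT)

    # Stash the trained model on the trial so the callback can reach it.
    # This works with the default in-memory storage; with an RDB storage,
    # save the weights to disk here instead, since user attributes must
    # be serializable there.
    trial.set_user_attr("trained_net", copy.deepcopy(net))
    return score

def save_best_model(study, trial):
    # Called after every finished trial; study.best_trial is already
    # updated at this point, so the condition holds exactly when the
    # trial that just finished set a new best score.
    if study.best_trial.number == trial.number:
        torch.save(trial.user_attrs["trained_net"], "../model_weights.pt")

study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=50, callbacks=[save_best_model])

Because the callback compares study.best_trial.number with the finished trial's number, no global score bookkeeping is needed, and the checkpoint on disk always corresponds to the best trial seen so far.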
Answered By - nzw0301