Issue
I am trying to measure the prediction speed (in seconds) of my Python machine learning model, but I can't find a good answer for this yet. Can you help me? This is my example code:
from sklearn.tree import DecisionTreeClassifier

# Create Decision Tree classifier object
clf = DecisionTreeClassifier(criterion="entropy", max_depth=3)

# Train the Decision Tree classifier
clf = clf.fit(X_train, y_train)

# Predict the response for the test dataset. I want to measure the speed of this prediction.
y_pred = clf.predict(X_test)
Solution
You just need to time it! Take a timestamp before and after the inference and compute the difference.
import time

# Get the timestamp before inference, in seconds
start_ts = time.time()

# Predict the response for the test dataset; this is the call being timed
y_pred = clf.predict(X_test)

# Get the timestamp after inference, in seconds
end_ts = time.time()

# Print the difference between the start and end timestamps, in seconds
print(f"Prediction Time [s]: {(end_ts - start_ts):.3f}")
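If you want a more stable measurement, a common refinement is to use time.perf_counter(), which has higher resolution than time.time() and is not affected by system clock adjustments, and to average over several runs. Here is a minimal sketch of that idea; the repetition count n_runs is just an illustrative choice, not something from the original question:

import time

n_runs = 10  # illustrative number of repetitions for averaging

# perf_counter() is a monotonic, high-resolution clock suited to benchmarking
start_ts = time.perf_counter()
for _ in range(n_runs):
    y_pred = clf.predict(X_test)
end_ts = time.perf_counter()

# Average time per prediction call, and time per individual sample
avg_s = (end_ts - start_ts) / n_runs
print(f"Average prediction time [s]: {avg_s:.6f}")
print(f"Per-sample prediction time [s]: {avg_s / len(X_test):.9f}")

The per-sample figure is often the more useful number when comparing models, since it does not depend on the size of your test set.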
Answered By - William Wang