Issue
When I run best_model = compare_models(), there is a huge load on CPU memory while my GPU sits unutilized. How do I run setup() or compare_models() on the GPU? Is there a built-in method in PyCaret?
Solution
Only some models can run on the GPU, and they must be properly installed to use it. For example, xgboost must be installed with pip and CUDA 10+ must be present (or you can install a GPU-enabled xgboost build from Anaconda, etc.). Here is the list of estimators that can use the GPU and their requirements: https://pycaret.readthedocs.io/en/latest/installation.html?highlight=gpu#pycaret-on-gpu
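If you want to verify the xgboost side on its own, a small smoke test like the sketch below will raise an error when the installed build or CUDA setup lacks GPU support. This is only a sketch and assumes an xgboost release that still accepts the tree_method='gpu_hist' and gpu_id parameters (newer 2.x releases use device='cuda' instead):

    import xgboost as xgb

    # Tiny toy dataset; fitting with the GPU tree method fails fast
    # if xgboost was not built with GPU support or CUDA is missing.
    X = [[0, 1], [1, 0], [1, 1], [0, 0]]
    y = [0, 1, 1, 0]

    clf = xgb.XGBClassifier(n_estimators=2, tree_method='gpu_hist', gpu_id=0)
    clf.fit(X, y)
    print(clf.predict(X))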
As Yatin said, you need to pass use_gpu=True to setup(). Alternatively, you can specify GPU parameters when creating an individual model, e.g. xgboost_gpu = create_model('xgboost', fold=3, tree_method='gpu_hist', gpu_id=0).
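Putting that together, a minimal sketch looks like the following. The DataFrame data and the column name 'target' are placeholders for your own dataset, and the classification module is just one example (the regression module works the same way):

    from pycaret.classification import setup, compare_models, create_model

    # Enable GPU once in setup(); GPU-capable estimators (xgboost, lightgbm,
    # catboost, and the cuML-backed models) will then train on the GPU.
    exp = setup(data=data, target='target', use_gpu=True)

    # compare_models() now uses the GPU where supported
    best_model = compare_models()

    # Or pass estimator-specific GPU arguments to a single model
    xgboost_gpu = create_model('xgboost', fold=3, tree_method='gpu_hist', gpu_id=0)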
For installing CUDA, I like using Anaconda since it makes it easy, e.g. conda install -c anaconda cudatoolkit. For the non-boosted methods, it looks like you need to install cuML for GPU use.
Also note that PyCaret can't use tune-sklearn with the GPU (see the warning at the bottom of the tune_model section of the docs).
Answered By - wordsforthewise