Issue
Is it necessary to convert tensors and the model to CUDA with tensor.to
in Colab when I've selected the GPU runtime type?
I want to use CUDA for training my model.
Solution
- tensor.to(device) transfers the data to the given device.
- Yes. Selecting the GPU runtime in Colab only makes a GPU available; it does not move anything onto it. You still need to transfer the model, inputs, labels, etc. to whichever device you intend to use.
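A minimal sketch of the pattern described above, assuming a toy linear model and random data (the model and tensor names are illustrative, not from the question):

```python
import torch
import torch.nn as nn

# Tensors and models start on the CPU even in a GPU runtime;
# they must be moved to the device explicitly.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = nn.Linear(10, 2).to(device)            # move model parameters
inputs = torch.randn(4, 10).to(device)         # move the input batch
labels = torch.randint(0, 2, (4,)).to(device)  # move the labels

outputs = model(inputs)                        # runs on the chosen device
loss = nn.functional.cross_entropy(outputs, labels)
```

Falling back to "cpu" when CUDA is unavailable keeps the same script runnable on any runtime; the only change between CPU and GPU training is the value of device.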
Answered By - CodeBuster17