Issue
While working with PyTorch I noticed two functions in torch.nn.utils: clip_grad_norm and clip_grad_norm_. I wanted to know the difference, so I went to check the documentation, but I could only find clip_grad_norm_ and not clip_grad_norm. Does anyone know the difference between the two?
Solution
PyTorch uses a trailing underscore convention for in-place operations, so clip_grad_norm_ signals that it modifies the gradients of the given parameters in place (and returns the total norm of the gradients computed before clipping). clip_grad_norm is simply the older, deprecated name for the same function, which is why it no longer appears in the documentation; new code should use clip_grad_norm_.
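A minimal sketch of how clip_grad_norm_ is typically used (the model, max_norm value, and dummy input here are illustrative, not from the question):

```python
import torch
import torch.nn as nn

# Build a toy model, run a forward/backward pass to populate .grad.
model = nn.Linear(10, 1)
loss = model(torch.randn(4, 10)).sum()
loss.backward()

# clip_grad_norm_ rescales the gradients *in place* so their combined
# norm is at most max_norm, and returns the pre-clipping total norm.
max_norm = 0.01
total_norm = nn.utils.clip_grad_norm_(model.parameters(), max_norm)

# Verify: the gradients themselves were modified by the call.
new_norm = torch.norm(
    torch.stack([p.grad.norm() for p in model.parameters()])
)
print(float(total_norm), float(new_norm))  # new_norm <= max_norm
```

In a training loop this call goes between loss.backward() and optimizer.step(), so the optimizer sees the clipped gradients.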
Answered By - jodag