Issue
I am not asking about hardware registers (memory locations that store content).
I am asking about the use of the word 'register' in the PyTorch documentation.
While reading the PyTorch documentation on MODULE, I encountered the words 'register' and 'registered' several times.
The contexts of usage are as follows:
1. tensor (Tensor) – buffer to be registered.
2. Submodules assigned in this way will be registered, and will have their parameters converted too when you call to(), etc.
3. Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.
4. Registers a backward hook on the module.
5. Registers a forward hook on the module.
.....
And the word 'register' appears in the names of several methods:
1. register_backward_hook(hook)
2. register_buffer(name, tensor, persistent=True)
3. register_forward_hook(hook)
4. register_forward_pre_hook(hook)
5. register_parameter(name, param)
......
What does the word 'register' mean here, programmatically?
Does it just mean the act of recording a name or information on an official list, as in plain English, or does it have some additional programmatic significance?
Solution
In the PyTorch docs and method names, "register" means exactly that: the act of recording a name or information on an official list.
For instance, register_backward_hook(hook) adds the function hook to a list of functions that the nn.Module executes during the backward pass.
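To make the mechanism concrete, here is a minimal sketch using register_forward_hook (one of the methods listed in the question); the module keeps the hook on an internal list and calls it on every forward pass until the returned handle is removed:

```python
import torch
import torch.nn as nn

layer = nn.Linear(4, 2)
captured = []

def hook(module, inputs, output):
    # Called by nn.Module each time the forward pass runs
    captured.append(output.shape)

handle = layer.register_forward_hook(hook)  # add hook to the module's list
layer(torch.randn(3, 4))                    # hook fires, records the output shape
handle.remove()                             # take the hook off the list again
layer(torch.randn(3, 4))                    # hook no longer fires
print(captured)                             # one recorded shape: [torch.Size([3, 2])]
```

Note that "registering" here is reversible: calling remove() on the handle deletes the entry from the module's list.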
Similarly, register_parameter(name, param) adds an nn.Parameter param with name name to the list of trainable parameters of the nn.Module.
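For example, a parameter registered explicitly via register_parameter shows up in named_parameters() under the given name (this is equivalent to assigning an nn.Parameter as an attribute, which registers it implicitly):

```python
import torch
import torch.nn as nn

class MyModule(nn.Module):
    def __init__(self):
        super().__init__()
        # Explicit registration; equivalent to `self.scale = nn.Parameter(torch.ones(1))`
        self.register_parameter("scale", nn.Parameter(torch.ones(1)))

m = MyModule()
print([name for name, _ in m.named_parameters()])  # ['scale']
print("scale" in m.state_dict())                   # True
```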
It is crucial to register trainable parameters so PyTorch knows which tensors to pass to the optimizer and which tensors to store as part of the nn.Module's state_dict.
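The flip side is that an ordinary tensor attribute is invisible to PyTorch's bookkeeping. A small sketch contrasting a registered buffer with a plain tensor attribute:

```python
import torch
import torch.nn as nn

class Stats(nn.Module):
    def __init__(self):
        super().__init__()
        # Registered buffer: saved in state_dict, moved by .to(), etc.
        self.register_buffer("running_mean", torch.zeros(3))
        # Plain attribute: PyTorch does not track it
        self.plain = torch.zeros(3)

s = Stats()
print("running_mean" in s.state_dict())  # True
print("plain" in s.state_dict())         # False
```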
Answered By - Shai