Issue
What does the "negative slope" in a LeakyReLU function refer to?
The term "negative slope" is used in the documentation of both TensorFlow and PyTorch, but it does not seem to match reality: the slope of a LeakyReLU function is typically non-negative for both positive and negative inputs.
The PyTorch and TensorFlow docs both give examples that set the negative slope to a positive value, and TensorFlow explicitly enforces non-negative values (see below).
Are they just wrong, or am I missing something?
PyTorch: class torch.nn.LeakyReLU(negative_slope=0.01, inplace=False)
TensorFlow: alpha — Float >= 0. Negative slope coefficient. Defaults to 0.3.
Solution
negative_slope in this context means the slope over the negative half of the Leaky ReLU's domain; it does not describe a slope that is itself negative.
When naming kwargs it is normal to use concise terms, and here "negative slope" and "positive slope" refer to the slopes of the linear splines spanning the negative (-∞, 0] and positive (0, ∞) halves of the Leaky ReLU's domain.
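A minimal pure-Python sketch of the piecewise definition may make this concrete (the parameter name mirrors PyTorch's negative_slope; the function itself is an illustration, not the library implementation):

```python
def leaky_relu(x, negative_slope=0.01):
    """Piecewise-linear Leaky ReLU.

    negative_slope scales inputs on the negative half of the domain;
    the positive half always has slope 1. Note that negative_slope
    itself is a non-negative number.
    """
    return x if x >= 0 else negative_slope * x

print(leaky_relu(-2.0, negative_slope=0.1))  # -0.2 (slope 0.1 applied)
print(leaky_relu(3.0, negative_slope=0.1))   # 3.0  (slope 1 applied)
```

So with negative_slope=0.1, an input of -2.0 maps to -0.2, while positive inputs pass through unchanged.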
Answered By - iacob