Issue
I've been reading the TensorFlow tutorials, where they write
with tf.name_scope('read_inputs') as scope:
    # something
The example
a = tf.constant(5)
and
with tf.name_scope('s1') as scope:
    a = tf.constant(5)
seem to have the same effect. So why do we use name_scope?
Solution
I don't see a use case for reusing constants, but here is some relevant information on scopes and variable sharing.
Scopes
name_scope will add its scope as a prefix to the names of all operations
variable_scope will add its scope as a prefix to the names of all variables and operations (see the sketch below)
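A minimal sketch of that difference (assuming TF 1.x; the scope and tensor names here are just illustrative):
import tensorflow as tf

with tf.name_scope("ns"):
    a = tf.constant(5, name="a")   # op name is prefixed: "ns/a"
with tf.variable_scope("vs"):
    b = tf.constant(5, name="b")   # variable_scope also opens a name scope, so ops are prefixed: "vs/b"

print(a.name)  # ns/a:0
print(b.name)  # vs/b:0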
Instantiating Variables
The tf.Variable() constructor prefixes the variable name with the current name_scope and variable_scope
The tf.get_variable() constructor ignores name_scope and only prefixes the name with the current variable_scope
For example:
with tf.variable_scope("variable_scope"):
    with tf.name_scope("name_scope"):
        var1 = tf.get_variable("var1", [1])

with tf.variable_scope("variable_scope"):
    with tf.name_scope("name_scope"):
        var2 = tf.Variable([1], name="var2")
Produces
var1 = <tf.Variable 'variable_scope/var1:0' shape=(1,) dtype=float32_ref>
var2 = <tf.Variable 'variable_scope/name_scope/var2:0' shape=(1,) dtype=int32_ref>
Reusing Variables
Always use tf.variable_scope to define the scope of a shared variable. The easiest way to reuse variables is to call reuse_variables(), as shown below:
with tf.variable_scope("scope"):
    var1 = tf.get_variable("variable1", [1])
    tf.get_variable_scope().reuse_variables()
    var2 = tf.get_variable("variable1", [1])
assert var1 == var2
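Continuing the block above, the same sharing can also be expressed by reopening the scope with reuse=True (a small sketch against the TF 1.x variable_scope API):
with tf.variable_scope("scope", reuse=True):
    var3 = tf.get_variable("variable1", [1])
assert var1 is var3  # returns the same underlying variable object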
tf.Variable() always creates a new variable; when a variable is constructed with an already-used name, it just appends _1, _2, etc. to the name - which can cause conflicts :(
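A quick sketch of that renaming behaviour (TF 1.x; the name "v" is just illustrative):
v1 = tf.Variable([1], name="v")
v2 = tf.Variable([1], name="v")
print(v1.name)  # v:0
print(v2.name)  # v_1:0 -- a new, distinct variable, not shared with v1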
Answered By - Soph