Issue
I need conditional control flow in my graph. If pred is True, the graph should call an op that updates a variable and then returns it; otherwise it should return the variable unchanged. A simplified version is:
import tensorflow as tf

pred = tf.constant(True)
x = tf.Variable([1])
assign_x_2 = tf.assign(x, [2])

def update_x_2():
    # Add a control dependency on the assign op before reading x.
    with tf.control_dependencies([assign_x_2]):
        return tf.identity(x)

y = tf.cond(pred, update_x_2, lambda: tf.identity(x))

with tf.Session() as session:
    session.run(tf.initialize_all_variables())
    print(y.eval())
However, I find that both pred=True and pred=False lead to the same result y=[2], which means the assign op also runs when update_x_2 is not selected by tf.cond. How can this behavior be explained, and how can the problem be solved?
Solution
TL;DR: If you want tf.cond() to perform a side effect (like an assignment) in one of the branches, you must create the op that performs the side effect inside the function that you pass to tf.cond().
The behavior of tf.cond() is a little unintuitive. Because execution in a TensorFlow graph flows forward through the graph, all operations that you refer to in either branch must execute before the conditional is evaluated. This means that both the true and the false branches receive a control dependency on the tf.assign() op, and so y always evaluates to [2], even if pred is False.
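To make this concrete, here is a minimal sketch (the counter variable v and the names inc_v and result are my own, not from the original question) showing that an op created outside the branch functions runs on every evaluation, whichever branch is taken:

import tensorflow as tf

v = tf.Variable(0)
# Created OUTSIDE the branch functions: a plain graph node that the
# true branch merely refers to.
inc_v = tf.assign_add(v, 1)

result = tf.cond(tf.constant(False),
                 lambda: tf.identity(inc_v),  # refers to the external op
                 lambda: tf.constant(0))

with tf.Session() as session:
    session.run(tf.initialize_all_variables())
    session.run(result)
    # v was incremented even though the true branch was never taken.
    print(session.run(v))  # ==> 1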
The solution is to create the tf.assign() op inside the function that defines the true branch. For example, you could structure your code as follows:
import tensorflow as tf

pred = tf.placeholder(tf.bool, shape=[])
x = tf.Variable([1])

def update_x_2():
    # The assign op is created here, inside the branch function, so it
    # only runs when tf.cond() selects this branch.
    with tf.control_dependencies([tf.assign(x, [2])]):
        return tf.identity(x)

y = tf.cond(pred, update_x_2, lambda: tf.identity(x))

with tf.Session() as session:
    session.run(tf.initialize_all_variables())
    print(y.eval(feed_dict={pred: False}))  # ==> [1]
    print(y.eval(feed_dict={pred: True}))   # ==> [2]
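As an aside, the same principle carries over to TensorFlow 2.x: the assignment must happen inside the branch function. The following is a minimal sketch assuming a TF 2.x installation (the function name branch is my own):

import tensorflow as tf  # assumes TensorFlow 2.x

x = tf.Variable([1])

@tf.function
def branch(pred):
    def update_x_2():
        # The assignment is created inside the branch function, so the
        # traced cond performs it only when pred is True.
        x.assign([2])
        return x.read_value()
    return tf.cond(pred, update_x_2, lambda: x.read_value())

print(branch(tf.constant(False)).numpy())  # ==> [1]
print(branch(tf.constant(True)).numpy())   # ==> [2]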
Answered By - mrry