I'm having trouble recovering a tensor by name; I don't even know if it's possible.
I have a function that creates my graph:
def create_structure(tf, x, input_size, dropout):
    with tf.variable_scope("scale_1") as scope:
        W_S1_conv1 = deep_dive.weight_variable_scaling([7, 7, 3, 64], name='W_S1_conv1')
        b_S1_conv1 = deep_dive.bias_variable([64])
        S1_conv1 = tf.nn.relu(deep_dive.conv2d(x_image, W_S1_conv1, strides=[1, 2, 2, 1], padding='SAME') + b_S1_conv1, name="Scale1_first_relu")
        .
        .
        .
    return S3_conv1, regularizer
I want to access the variable S1_conv1 outside this function. I tried:
with tf.variable_scope('scale_1') as scope_conv:
    tf.get_variable_scope().reuse_variables()
    ft = tf.get_variable('Scale1_first_relu')
But that is giving me an error:
ValueError: Under-sharing: Variable scale_1/Scale1_first_relu does not exist, disallowed. Did you mean to set reuse=None in VarScope?
But this works:
with tf.variable_scope('scale_1') as scope_conv:
    tf.get_variable_scope().reuse_variables()
    ft = tf.get_variable('W_S1_conv1')
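(For reference, I assume deep_dive.weight_variable_scaling creates its weight through tf.get_variable, which would explain why this lookup succeeds while the first one doesn't. The helper below is only my guess at what it does, including the initializer:)

def weight_variable_scaling(shape, name):
    # Guessed sketch of the helper: tf.get_variable registers the variable
    # under the current variable_scope, so it can be fetched again later
    # with tf.get_variable once reuse is enabled.
    return tf.get_variable(name, shape=shape,
                           initializer=tf.truncated_normal_initializer(stddev=0.1))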
I can get around this with
return S3_conv1, regularizer, S1_conv1
but I don't want to do that.
I think my problem is that S1_conv1 is not really a variable, it's just a tensor. Is there a way to do what I want?
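The closest thing I've found is going through the graph itself rather than the variable scope; if the op really ends up named scale_1/Scale1_first_relu, something like this untested sketch might be what I'm after:

# Untested sketch: op outputs (tensors) are addressed on the graph,
# not via tf.get_variable; the ":0" suffix selects the op's first output tensor.
graph = tf.get_default_graph()
ft = graph.get_tensor_by_name("scale_1/Scale1_first_relu:0")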