
Tensorflow Initialization Gives All Ones

tensorflow 1.12.0. In the code snippet below, it seems that wrapped_rv_val and seq_rv_val should be equivalent, but they are not. Instead, seq_rv_val is correctly initialized to the expected values, while wrapped_rv_val comes out as all ones.

Solution 1:

In fact, both seq_rv_val and wrapped_rv_val will be correctly initialized to the randomly generated init_val array if you make the following change.

# change
wrapped_rv = tf.nn.softmax(tf.get_variable('wrapped_rv', initializer=init_val))
# to
wrapped_rv = tf.nn.softmax(tf.get_variable('wrapped_rv', initializer=init_val), axis=2)
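Here is a minimal sketch of the fixed version. The question's full code is not shown above, so the shape of init_val is an assumption: a (2, 3, 16, 1) array, so that axis=2 has 16 items while the default axis=-1 has only one.

import numpy as np
import tensorflow as tf

# Assumed shape: the original init_val is not shown; (2, 3, 16, 1) makes
# axis=2 the 16-item softmax axis, while the default axis=-1 has size 1.
init_val = np.random.rand(2, 3, 16, 1).astype(np.float32)

wrapped_rv = tf.nn.softmax(tf.get_variable('wrapped_rv', initializer=init_val), axis=2)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    wrapped_rv_val = sess.run(wrapped_rv)
    print(wrapped_rv_val.sum(axis=2))  # each 16-item slice sums to 1; no longer all ones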

Next I'll explain why wrapped_rv is initialized to all ones. Look at the softmax formula:

softmax(x_i) = exp(x_i) / Σ_j exp(x_j)

where the sum in the denominator runs over the chosen axis.

The denominator sums over 16 items when you set axis=2, but over only a single item when you leave the default axis=-1. With a single-item sum, the numerator equals the denominator, so exp(x) / exp(x) = 1 for every element. You can run the following example to see the problem.

import tensorflow as tf

# Each row contains a single element, so the default axis=-1
# computes softmax over just one item per row.
y1 = tf.constant([[1], [2], [3]], dtype=tf.float32)
y2 = tf.constant([[1], [3], [7]], dtype=tf.float32)

softmax_var1 = tf.nn.softmax(logits=y1)  # sums over the size-1 last axis
softmax_var2 = tf.nn.softmax(logits=y2)

with tf.Session() as sess:
    print(sess.run(softmax_var1))
    print(sess.run(softmax_var2))

[[1.]
 [1.]
 [1.]]
[[1.]
 [1.]
 [1.]]
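For contrast, here is a minimal sketch (not part of the original answer): taking the softmax over axis=0 makes all three rows share one denominator, so the output is a genuine probability distribution instead of all ones.

import tensorflow as tf

y1 = tf.constant([[1], [2], [3]], dtype=tf.float32)

# axis=0 sums over the three rows rather than the size-1 last axis
softmax_axis0 = tf.nn.softmax(logits=y1, axis=0)

with tf.Session() as sess:
    print(sess.run(softmax_axis0))  # roughly [[0.09], [0.245], [0.665]]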
