What is logits, softmax and softmax_cross_entropy_with_logits?

In machine learning, "logits" refers to the raw, unscaled output of the earlier layers of the network, i.e. values on a linear scale before any normalization. In mathematics, the logit is a function that maps probabilities ( [0, 1] ) to the real line ( (-inf, inf) ).
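
For intuition, here is a minimal sketch of the mathematical logit function (the helper name logit here is mine, not a TensorFlow API):

import numpy as np

def logit(p):
    # Map a probability in (0, 1) to the real line (-inf, inf)
    return np.log(p / (1 - p))

print(logit(0.5))  # 0.0 -- even odds map to the origin
print(logit(0.9))  # ~2.197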

tf.nn.softmax only applies the softmax function to an input tensor. Softmax "squishes" the inputs so that they sum to 1; it is simply a way of normalizing. The shape of the output is the same as that of the input; it just normalizes the values. It is used during evaluation of the model, when you compute the probabilities that the model outputs.

import numpy as np
import tensorflow as tf

a = tf.constant(np.array([[.1, .3, .5, .9]]))
with tf.Session() as s:
    print(s.run(tf.nn.softmax(a)))

[[ 0.16838508  0.205666    0.25120102  0.37474789]]

tf.nn.softmax_cross_entropy_with_logits computes the cross entropy of the result after the softmax function has been applied, fused into a single, numerically stable op. It is typically used during training, to compute the loss. Its result is conceptually similar to:

sf = tf.nn.softmax(x)

c = cross_entropy(sf)  # pseudocode: cross entropy of sf against the true labels

For example, if tf.nn.softmax_cross_entropy_with_logits is applied to logits of shape [2, 5], the output has shape [2]: one cross-entropy value per example, with the class dimension summed out.
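
As a rough sketch (TF1-style session API; the labels and values here are made up for illustration), the fused op and the manual two-step version give nearly the same per-example losses:

import numpy as np
import tensorflow as tf

logits = tf.constant(np.array([[.1, .3, .5, .9],
                               [1., 2., 3., 4.]]))
labels = tf.constant(np.array([[0., 0., 0., 1.],
                               [0., 0., 1., 0.]]))

# Fused, numerically stable op: output shape [2], one loss per example
fused = tf.nn.softmax_cross_entropy_with_logits(labels=labels, logits=logits)

# Manual two-step version it is conceptually equivalent to
sf = tf.nn.softmax(logits)
manual = -tf.reduce_sum(labels * tf.log(sf), axis=1)

with tf.Session() as s:
    print(s.run(fused))   # ~[ 0.9815  1.4402 ]
    print(s.run(manual))  # nearly identical values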

If you want to minimize the cross-entropy and you are softmaxing after your last layer, you should use tf.nn.softmax_cross_entropy_with_logits instead of doing the two steps yourself, because it covers numerically unstable corner cases.
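
A small sketch of one such corner case (same TF1-style API; the values are made up): with a very large logit, the manually computed softmax underflows to exactly 0 for the other class, log(0) is -inf, and the manual loss blows up, while the fused op stays finite:

import numpy as np
import tensorflow as tf

big_logits = tf.constant(np.array([[1000., 0.]]))
lbl = tf.constant(np.array([[0., 1.]]))

manual = -tf.reduce_sum(lbl * tf.log(tf.nn.softmax(big_logits)), axis=1)
fused = tf.nn.softmax_cross_entropy_with_logits(labels=lbl, logits=big_logits)

with tf.Session() as s:
    print(s.run(manual))  # [ inf ] -- softmax underflowed to 0, log(0) = -inf
    print(s.run(fused))   # [ 1000. ] -- computed stably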


