Is cross entropy always larger than entropy?

import numpy as np

p = np.array([0.5, 0.2, 0.2, 0.1])
q = np.array([0.4, 0.2, 0.2, 0.2])

def cross_entropy(p, q):
    # H(p, q) = -sum_i p_i * log(q_i)
    return -np.sum(p * np.log(q))

# Cross entropy H(p, q)
print(cross_entropy(p, q))   # ≈ 1.2629

# Entropy H(q), since H(q, q) = H(q)
print(cross_entropy(q, q))   # ≈ 1.3322

In the above code the cross entropy H(p, q) comes out smaller than the entropy H(q). Why?

"Cross entropy is always larger than entropy" holds only when both are taken with respect to the same distribution p. Gibbs' inequality says H(p, q) = H(p) + D_KL(p ‖ q) ≥ H(p), because the KL divergence is non-negative. It says nothing about H(p, q) versus H(q), and H(q) is what the code computes via cross_entropy(q, q). With these numbers, H(p, q) ≈ 1.2629 is indeed smaller than H(q) ≈ 1.3322, but it is still larger than H(p) = cross_entropy(p, p) ≈ 1.2206, so the inequality is not violated.

The formula in the code,

np.sum(-1*np.multiply(p, np.log(q)))

is already the correct cross entropy. Rewriting it with np.log(1/q) while keeping the leading minus sign would flip the sign and produce a negative result, so no correction to the formula is needed; only the comparison has to be made against H(p) rather than H(q).
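The decomposition behind Gibbs' inequality can be checked numerically. A minimal sketch in plain NumPy, using the arrays from the question (the variable names cross, ent_p, and kl are just for illustration):

```python
import numpy as np

p = np.array([0.5, 0.2, 0.2, 0.1])
q = np.array([0.4, 0.2, 0.2, 0.2])

# Cross entropy: H(p, q) = -sum_i p_i * log(q_i)
cross = -np.sum(p * np.log(q))

# Entropy of p: H(p) = -sum_i p_i * log(p_i)
ent_p = -np.sum(p * np.log(p))

# KL divergence: D_KL(p || q) = sum_i p_i * log(p_i / q_i), always >= 0
kl = np.sum(p * np.log(p / q))

# Identity: H(p, q) = H(p) + D_KL(p || q), hence H(p, q) >= H(p)
print(cross, ent_p, kl)
```

Because kl is non-negative, cross can never drop below ent_p; nothing in the identity, however, relates cross to the entropy of q.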


