Explain entropy

Asked by NikitaGavde in Data Science, on Nov 9, 2019
Answered by Nikita Gavde

Entropy is a measure of the impurity present in data, and the concept is derived from information theory. Entropy is zero if the sample is completely homogeneous, and it is one (for a binary problem) if the sample is equally divided between the classes. In decision trees, the predictor whose split gives the largest reduction in entropy (the highest information gain) is placed nearest the root node, and the data is split greedily in this way at each level. Entropy is computed with the formula

Entropy = -Σ (i = 1 to n) p_i * log2(p_i)

where p_i is the proportion of samples belonging to class i.
Here n is the number of classes. For a binary split, entropy is maximum with a value of 1 when the classes are evenly divided (p = 0.5), and minimum with a value of 0 at the extremes (p = 0 or p = 1). A split that produces lower entropy therefore separates the classes better.
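The formula above can be checked with a small sketch. This is plain Python with a helper name (entropy) chosen for illustration, not any particular library's API:

```python
import math

def entropy(labels):
    """Shannon entropy (base 2) of a list of class labels."""
    total = len(labels)
    counts = {}
    for label in labels:
        counts[label] = counts.get(label, 0) + 1
    # Sum of -p_i * log2(p_i) over the n observed classes.
    return sum(-(c / total) * math.log2(c / total) for c in counts.values())

print(entropy(["yes"] * 10))              # homogeneous sample -> 0.0
print(entropy(["yes"] * 5 + ["no"] * 5))  # 50/50 binary split -> 1.0
```

A decision tree would evaluate this quantity for each candidate split and prefer the one whose child nodes have the lowest weighted entropy.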

[Figure: a representation of how entropy works]