Cross-entropy is a quantity that's commonly used in Machine Learning (ML), and by extension in Deep Learning (DL), as a loss function. However, this loss measure is often introduced merely as part of the machinery of ML — use binary cross-entropy for binary classification and categorical cross-entropy for multi-class classification — without any background on why or how it applies.
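To make the two variants concrete, here is a minimal NumPy sketch of both losses (the function names and the clipping constant `eps` are illustrative choices, not from the article):

```python
import numpy as np

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    # Clip predictions away from 0 and 1 to avoid log(0).
    y_pred = np.clip(y_pred, eps, 1 - eps)
    # Average of -[y*log(p) + (1-y)*log(1-p)] over all samples.
    return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

def categorical_cross_entropy(y_true, y_pred, eps=1e-12):
    # y_true: one-hot labels, shape (n_samples, n_classes).
    # y_pred: predicted class probabilities, rows summing to 1.
    y_pred = np.clip(y_pred, eps, 1.0)
    # Average of -sum_k y_k * log(p_k) over all samples.
    return -np.mean(np.sum(y_true * np.log(y_pred), axis=1))
```

Both return values shrink toward 0 as the predicted probabilities approach the true labels, which is exactly the behavior a loss function should have.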
Check out the full article on Notion, where:
- I’ll explain the origins of cross-entropy as a loss measure.
- How it relates to important concepts in information theory such as Entropy and KL (Kullback-Leibler) Divergence.
- And its connection to Maximum Likelihood Estimation (MLE) and why it appears as a loss function there.