Cross-Entropy Loss

Date: Jun 20, 2023 05:25 AM
Field: Machine Learning
The main loss function we can use out of the box for multi-class classification with `N` samples and `C` classes is:

$$\ell(x, y) = -\frac{1}{N}\sum_{n=1}^{N} \log\frac{\exp(x_{n,y_n})}{\sum_{c=1}^{C}\exp(x_{n,c})}$$

Cross-entropy loss for a multi-class classification problem
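As a quick illustration, the sketch below evaluates this formula directly on a made-up toy batch (the logits and targets are arbitrary placeholders) and averages over the samples, matching the mean reduction in the formula:

```python
import torch

# Toy batch for illustration only: N = 4 samples, C = 3 classes.
torch.manual_seed(0)
logits = torch.randn(4, 3)            # predictions x, shape (N, C)
targets = torch.tensor([0, 2, 1, 2])  # labels y, shape (N,)

# Evaluate the formula directly: softmax over classes, then the mean
# negative log-probability assigned to each sample's true class.
probs = torch.softmax(logits, dim=1)                 # shape (N, C)
p_true = probs[torch.arange(len(targets)), targets]  # p_{n, y_n}
loss_manual = -torch.log(p_true).mean()
print(loss_manual.item())
```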

This criterion expects a batch of predictions $x$ with shape (N, C) and, as the target (label) for each of the N samples, a class index in the range $[0, C-1]$, hence a batch of labels with shape (N,). There are other optional parameters, such as class weights (`weight`) and ignored classes (`ignore_index`). Feel free to check the PyTorch documentation for more detail. Additionally, you can read up on where it is appropriate to use `CrossEntropyLoss`.
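A minimal usage sketch (the tensors below are placeholders): `nn.CrossEntropyLoss` is applied to raw, unnormalized logits of shape (N, C) and integer class indices of shape (N,).

```python
import torch
import torch.nn as nn

criterion = nn.CrossEntropyLoss()  # optional: weight=..., ignore_index=...

# Placeholder tensors: N = 4 samples, C = 3 classes.
logits = torch.randn(4, 3, requires_grad=True)  # predictions x, shape (N, C)
targets = torch.tensor([0, 2, 1, 2])            # class indices in [0, C-1], shape (N,)

loss = criterion(logits, targets)  # scalar (mean over the batch by default)
loss.backward()                    # gradients flow back to the logits
print(loss.item())
```

Note that the criterion applies log-softmax internally, so the predictions should be raw logits rather than probabilities.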
To get the cross-entropy loss of a sample $x_n$, we could first calculate $-\log(\mathrm{softmax}(x_n))$ and then take the element corresponding to the target class $y_n$ as the loss. However, for numerical stability, we implement the more stable equivalent form,