Dice Loss is designed to maximise the Dice Coefficient during training by minimising its complement, one minus the Dice Coefficient.
- Minimising Dice Loss encourages greater overlap between predicted and true masks.
- It inherently addresses class imbalance, because the loss is driven by overlap with the foreground rather than by the (usually dominant) background pixels.
$$ \text{Dice Loss} = 1 - \text{Dice Coefficient} $$
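A minimal NumPy sketch of this loss (the function name and the smoothing term `eps` are my own choices, not from any particular library):

```python
import numpy as np

def dice_loss(y_true, y_pred, eps=1e-6):
    """Dice Loss = 1 - Dice Coefficient, computed on flattened masks.

    y_true: binary ground-truth mask; y_pred: predicted probabilities in [0, 1].
    eps is a small smoothing term so empty masks don't divide by zero.
    """
    t = y_true.ravel().astype(float)
    p = y_pred.ravel().astype(float)
    intersection = np.sum(t * p)
    dice = (2.0 * intersection + eps) / (np.sum(t) + np.sum(p) + eps)
    return 1.0 - dice

# A perfect prediction gives a loss near 0; a complete miss gives a loss near 1.
mask = np.array([[1, 1], [0, 0]])
print(dice_loss(mask, mask))      # ~0.0
print(dice_loss(mask, 1 - mask))  # ~1.0
```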
Binary Cross-Entropy Loss (BCE) measures the error between predicted probabilities and actual binary labels for each pixel.
- It is used for pixel-wise classification tasks, which is how segmentation can be viewed (each pixel is classified as foreground or background).
$$ BCE=-\frac{1}{N}\sum_{i=1}^{N}\left[ y_i\log(\hat{y}_i)+(1-y_i)\log(1-\hat{y}_i) \right] $$
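The formula above translates almost line-for-line into NumPy (a hand-rolled sketch; in practice frameworks provide numerically safer versions such as PyTorch's `BCEWithLogitsLoss`):

```python
import numpy as np

def bce_loss(y_true, y_pred, eps=1e-7):
    """Binary cross-entropy averaged over all N pixels.

    Probabilities are clipped away from 0 and 1 so the logarithms stay finite.
    """
    t = y_true.ravel().astype(float)
    p = np.clip(y_pred.ravel().astype(float), eps, 1.0 - eps)
    return -np.mean(t * np.log(p) + (1.0 - t) * np.log(1.0 - p))

# Confident correct predictions give a small loss.
print(bce_loss(np.array([1, 0, 1, 0]), np.array([0.9, 0.1, 0.8, 0.2])))
```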
Combined Loss (BCE + Dice Loss) leverages the strengths of BCE Loss and Dice Loss.
- BCE guides the model to learn pixel-wise classification accurately, pushing predicted probabilities towards 0 for background and 1 for foreground. It provides a stable per-pixel learning signal.
- Dice Loss directly optimises for overlap, which is crucial for segmentation performance, especially for small or imbalanced foreground objects. It helps the model prioritise getting object boundaries right without being overwhelmed by the background.
$$ \text{Combined Loss} = \text{BCE} + \text{Dice Loss} $$
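Putting the two pieces together is a simple sum; this sketch reuses the same flattened-mask formulation (an illustrative implementation, with equal weighting of the two terms assumed):

```python
import numpy as np

def combined_loss(y_true, y_pred, eps=1e-6):
    """BCE + Dice Loss on flattened masks, weighted equally."""
    t = y_true.ravel().astype(float)
    p = np.clip(y_pred.ravel().astype(float), eps, 1.0 - eps)
    # Pixel-wise classification term.
    bce = -np.mean(t * np.log(p) + (1.0 - t) * np.log(1.0 - p))
    # Overlap term.
    dice = 1.0 - (2.0 * np.sum(t * p) + eps) / (np.sum(t) + np.sum(p) + eps)
    return bce + dice
```

Some implementations weight the two terms (e.g. `w_bce * bce + w_dice * dice`); equal weights are a common default.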
Tversky Loss extends Dice Loss by introducing two coefficients, α and β, which control the relative importance of False Positives (FP) and False Negatives (FN).
- By weighting FP and FN differently, it becomes especially useful for imbalanced segmentation tasks, where either precision or recall needs to be emphasised.
$$ \text{Tversky Index}=\frac{TP}{TP+\alpha FP+\beta FN} $$
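For soft (probabilistic) predictions, TP, FP, and FN can be computed as sums over the mask; a sketch with the commonly used recall-favouring defaults α=0.3, β=0.7 (the defaults and `eps` smoothing are my assumptions):

```python
import numpy as np

def tversky_loss(y_true, y_pred, alpha=0.3, beta=0.7, eps=1e-6):
    """1 - Tversky Index on flattened soft masks.

    beta > alpha penalises False Negatives more (emphasises recall);
    alpha > beta penalises False Positives more (emphasises precision).
    """
    t = y_true.ravel().astype(float)
    p = y_pred.ravel().astype(float)
    tp = np.sum(t * p)            # soft true positives
    fp = np.sum((1.0 - t) * p)    # soft false positives
    fn = np.sum(t * (1.0 - p))    # soft false negatives
    tversky = (tp + eps) / (tp + alpha * fp + beta * fn + eps)
    return 1.0 - tversky
```

With α = β = 0.5 the index reduces to the Dice Coefficient, since Dice = 2TP / (2TP + FP + FN) = TP / (TP + 0.5·FP + 0.5·FN).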
Focal Tversky Loss combines the ideas of Tversky Loss and Focal Loss.
- Focal Loss was introduced to address the problem of class imbalance by down-weighting the loss contribution from easy examples (well-classified instances) and focusing more on hard examples (misclassified or difficult-to-classify instances).
$$ \text{Focal Tversky Loss}=(1-\text{Tversky Index})^{\gamma }=\left(1-\frac{TP+\epsilon }{TP+\alpha FP+\beta FN+\epsilon }\right)^{\gamma } $$
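This is the Tversky loss raised to a power γ; a sketch following the formula above (the default values for α, β, and γ are my own illustrative choices — some formulations instead use an exponent of 1/γ):

```python
import numpy as np

def focal_tversky_loss(y_true, y_pred, alpha=0.3, beta=0.7,
                       gamma=2.0, eps=1e-6):
    """(1 - Tversky Index) ** gamma on flattened soft masks.

    With gamma > 1, small losses (easy examples) are shrunk further,
    so hard examples dominate the gradient.
    """
    t = y_true.ravel().astype(float)
    p = y_pred.ravel().astype(float)
    tp = np.sum(t * p)
    fp = np.sum((1.0 - t) * p)
    fn = np.sum(t * (1.0 - p))
    tversky = (tp + eps) / (tp + alpha * fp + beta * fn + eps)
    return (1.0 - tversky) ** gamma
```

Setting γ = 1 recovers plain Tversky Loss.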