multi-label loss function
The basic idea is still to decompose the problem into multiple binary classifications.
https://github.com/keras-team/keras/issues/10371
For multi-label classification tasks, one option is a tanh output plus a hinge loss, encoding each label in {-1, 1} so that a target vector looks like (1, -1, -1, -1). Alternatively, a sigmoid output combined with a Hamming loss is also feasible, with label values encoded in {0, 1}. In my experience, a sigmoid combined with focal loss on {0, 1}-encoded labels such as (1, 0, 0, 0) was effective and achieved good results.
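A minimal PyTorch sketch of the sigmoid route (the shapes and random data here are illustrative, not from the original post): each label column is treated as an independent binary problem, and BCEWithLogitsLoss applies the sigmoid and per-label binary cross-entropy internally.

import torch
import torch.nn as nn

batch_size, num_labels = 32, 8
logits = torch.randn(batch_size, num_labels)                      # raw model outputs, one score per label
targets = torch.randint(0, 2, (batch_size, num_labels)).float()   # {0, 1} encoding of each label

criterion = nn.BCEWithLogitsLoss()   # sigmoid + binary cross-entropy, averaged over all 32*8 entries
loss = criterion(logits, targets)
print(loss.item())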
For example, with a batch of 32 samples and 8 labels in the output, the task can be equivalently decomposed into 32 × 8 independent one-vs-rest binary classifications. Among these 32 × 8 sub-problems, positives and negatives are severely imbalanced: if each sample carries only 1 or 2 positive labels, Focal Loss can fully exploit its advantage (see the usage sketch after the code below).
https://www.kaggle.com/rejpalcz/focalloss-for-keras
import torch
import torch.nn as nn
import torch.nn.functional as F

class FocalLoss(nn.Module):
    def __init__(self, gamma=2):
        super().__init__()
        self.gamma = gamma

    def forward(self, input, target):
        if not (target.size() == input.size()):
            raise ValueError("Target size ({}) must be the same as input size ({})"
                             .format(target.size(), input.size()))
        # Numerically stable binary cross-entropy with logits:
        # max(x, 0) - x*y + log(1 + exp(-|x|))
        max_val = (-input).clamp(min=0)
        loss = input - input * target + max_val + \
               ((-max_val).exp() + (-input - max_val).exp()).log()
        # log(1 - p_t) = logsigmoid(-x * (2y - 1)), so the next line
        # applies the focal modulating factor (1 - p_t)^gamma
        invprobs = F.logsigmoid(-input * (target * 2.0 - 1.0))
        loss = (invprobs * self.gamma).exp() * loss
        # sum over labels, average over the batch
        return loss.sum(dim=1).mean()
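A hedged usage sketch for the FocalLoss class above, matching the 32 × 8 decomposition described earlier; the data is random and constructed so that each sample has roughly one positive label, i.e. a strong positive/negative imbalance.

# continuing from the imports above (torch, nn, F)
logits = torch.randn(32, 8)                                    # raw scores for 8 labels per sample
targets = torch.zeros(32, 8)
targets[torch.arange(32), torch.randint(0, 8, (32,))] = 1.0    # roughly 1 positive label per sample
criterion = FocalLoss(gamma=2)
loss = criterion(logits, targets)
print(loss.item())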
Code explanation
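My reading of the snippet, in the standard focal-loss notation from the reference below: the max_val and loss lines build the numerically stable binary cross-entropy with logits, max(x, 0) - x*y + log(1 + exp(-|x|)), which equals -log(p_t); the logsigmoid term gives log(1 - p_t), so exp(gamma * logsigmoid(...)) is the modulating factor (1 - p_t)^gamma. Per label, this is the standard focal loss FL(p_t) = -(1 - p_t)^gamma * log(p_t), summed over the 8 labels and averaged over the batch.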
Focal loss reference: https://zhuanlan.zhihu.com/p/32423092