
Example of CrossEntropyLoss for 3D semantic segmentation in PyTorch

I have a network performing 3D convolutions on a 5D input tensor. The output of my network is of size (1, 12, 60, 36, 60), corresponding to (BatchSize, NumClasses, x-dim, y-dim, z-dim). I need to compute a voxel-wise cross-entropy loss, but I keep running into errors.

When trying to compute the cross-entropy loss with torch.nn.CrossEntropyLoss(), I keep receiving the following error message:

RuntimeError: multi-target not supported at .../src/THCUNN/generic/ClassNLLCriterion.cu:16

Here is an excerpt from my code:

import torch
import torch.nn as nn
from torch.autograd import Variable
criterion = torch.nn.CrossEntropyLoss()
images = Variable(torch.randn(1, 12, 60, 36, 60)).cuda()
labels = Variable(torch.zeros(1, 12, 60, 36, 60).random_(2)).long().cuda()
loss = criterion(images.view(1,-1), labels.view(1,-1))

The same thing happens when I create a one-hot tensor for the labels:

import numpy as np
nclasses = 12
labels = np.random.randint(0, nclasses, (1, 60, 36, 60))  # Random labels with values in [0..11]
labels = (np.arange(nclasses) == labels[..., None]).astype(int)  # Convert labels to a one-hot tensor
a = np.transpose(labels, (0, 4, 3, 2, 1))  # Reorder dimensions to match the shape of "images" ([1, 12, 60, 36, 60])
b = Variable(torch.from_numpy(a)).cuda()
loss = criterion(images.view(1,-1), b.view(1,-1))

Any idea what I am doing wrong? Can anyone provide an example of computing cross entropy on a 5D output tensor?

2 Answers

  • 1

    Just checked some implementations (fcn) for 2D semantic segmentation and tried to apply them to 3D semantic segmentation. No guarantee that this is correct, I will have to double-check...

    import torch
    import torch.nn.functional as F
    from torch.autograd import Variable

    def cross_entropy3d(input, target, weight=None, size_average=True):
        # input: (n, c, h, w, z), target: (n, h, w, z)
        n, c, h, w, z = input.size()
        # log_p: (n, c, h, w, z)
        log_p = F.log_softmax(input, dim=1)
        # make the class dimension the last one and flatten: (n*h*w*z, c)
        log_p = log_p.permute(0, 2, 3, 4, 1).contiguous().view(-1, c)
        # keep only voxels with a valid (non-negative) label, so negative
        # labels can serve as an "ignore" marker
        mask = target >= 0
        log_p = log_p[mask.view(-1, 1).repeat(1, c)].view(-1, c)
        # target: (n*h*w*z,)
        target = target[mask]
        loss = F.nll_loss(log_p, target.view(-1), weight=weight, size_average=False)
        if size_average:
            loss /= mask.data.sum()
        return loss

    images = Variable(torch.randn(5, 3, 16, 16, 16))
    labels = Variable(torch.LongTensor(5, 16, 16, 16).random_(3))
    cross_entropy3d(images, labels, weight=None, size_average=True)
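
    A sanity check I would try, assuming a recent PyTorch build where F.cross_entropy itself accepts K-dimensional inputs of shape (N, C, d1, ..., dK): the result above should then match the built-in loss.

    # Hedged comparison with the built-in loss; only meaningful on
    # PyTorch versions that support K-dimensional cross entropy.
    reference = F.cross_entropy(images, labels)
    print(torch.allclose(cross_entropy3d(images, labels), reference))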
    
  • 0

    The docs explain this behavior (bottom line: it looks like it actually computes the sparse cross-entropy loss, so it does not need a target spanning all dimensions of the output, only the index of the required class) ... They specifically state:

    Input: (N,C), where C = number of classes
    Target: (N), where each value is 0 <= targets[i] <= C-1
    Output: scalar. If reduce is False, then (N) instead.
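
    A minimal sketch of how to fit those documented (N, C) / (N) shapes by flattening every voxel into one sample; criterion and images come from the question, while labels_3d is a hypothetical (1, 60, 36, 60) tensor of class indices in [0..11]:

    # Flatten the (1, 12, 60, 36, 60) scores to (N, C) and the class
    # indices to (N,), so each voxel becomes one classification sample.
    logits = images.permute(0, 2, 3, 4, 1).contiguous().view(-1, 12)
    targets = labels_3d.view(-1)
    loss = criterion(logits, targets)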
    

    I am not sure about your use case, but you might want to use the KL Divergence or the Binary Cross Entropy loss instead (see the sketch below). Both are defined over inputs and targets of the same size.
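
    A minimal sketch of the equal-size variant with nn.BCEWithLogitsLoss, assuming the one-hot labels built in the question (one_hot_labels is a hypothetical name for the (1, 12, 60, 36, 60) tensor b):

    # BCEWithLogitsLoss expects input and target of identical shape, so
    # the one-hot labels can be used directly, cast to float.
    bce = torch.nn.BCEWithLogitsLoss()
    loss = bce(images, one_hot_labels.float())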
