CrossEntropyLoss2d #10

Description

@kshpv

Hello!
Thank you for your work!
I am a little bit confused about the `CrossEntropyLoss2d` class, especially these lines:

```python
targets_m[mask] -= 1
loss_all = self.ce_loss(inputs, targets_m.long())
losses.append(torch.sum(torch.masked_select(loss_all, mask)) / torch.sum(mask.float()))
```

  1. Why do you subtract 1 from the masked targets?
  2. Why do you use `torch.masked_select(loss_all, mask)` to compute the loss for a single target? It seems incorrect, because in that case the background is ignored.

Thank you!
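For context, here is a minimal self-contained sketch of what these lines appear to compute, under the assumption that `self.ce_loss` is `nn.CrossEntropyLoss(reduction='none')` (so it returns a per-pixel loss map) and that `mask` marks the non-background pixels; the shapes below are made up for illustration:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Hypothetical shapes: batch of 2, 3 foreground classes, 4x4 spatial map.
inputs = torch.randn(2, 3, 4, 4)
# Labels in 0..3, where 0 is assumed to mean background.
targets = torch.randint(0, 4, (2, 4, 4))

# Assumed: per-pixel loss, no reduction.
ce_loss = nn.CrossEntropyLoss(reduction='none')

# Mimic the snippet: mask out background, then shift labels so that
# foreground class k maps to logit channel k - 1.
mask = targets > 0
targets_m = targets.clone()
targets_m[mask] -= 1

loss_all = ce_loss(inputs, targets_m.long())  # shape (2, 4, 4), one value per pixel

# Average the per-pixel loss over the masked (foreground) pixels only;
# background pixels contribute nothing to this average.
loss = torch.sum(torch.masked_select(loss_all, mask)) / torch.sum(mask.float())
```

This sketch only reproduces the mechanics of the snippet; whether excluding background pixels from the average is the intended behavior is exactly what the question above asks.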
