
MaskedBalancedBCEWithLogitsLoss

class mmocr.models.common.MaskedBalancedBCEWithLogitsLoss(reduction='none', negative_ratio=3, fallback_negative_num=0, eps=1e-06)[source]

This loss combines a Sigmoid layer and a masked balanced BCE loss in a single class. It is AMP-eligible.

Parameters
  • reduction (str, optional) – The method to reduce the loss. Options are ‘none’, ‘mean’ and ‘sum’. Defaults to ‘none’.

  • negative_ratio (float or int, optional) – Maximum ratio of negative samples to positive ones. Defaults to 3.

  • fallback_negative_num (int, optional) – When the mask contains no positive samples, the number of negative samples to be sampled. Defaults to 0.

  • eps (float, optional) – Eps to avoid zero-division error. Defaults to 1e-6.

Return type

None

forward(pred, gt, mask=None)[source]

Forward function.

Parameters
  • pred (torch.Tensor) – The prediction in any shape.

  • gt (torch.Tensor) – The learning target of the prediction in the same shape as pred.

  • mask (torch.Tensor, optional) – Binary mask in the same shape as pred, indicating positive regions to calculate the loss. The whole region is taken into account if not provided. Defaults to None.

Returns

The loss value.

Return type

torch.Tensor
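To illustrate what "balanced" means here, the following is a minimal sketch in plain PyTorch, not MMOCR's actual implementation: keep every positive pixel, keep only the hardest negative_ratio * n_pos negative pixels (hard negative mining via top-k per-pixel loss), and average over the kept pixels. The helper name masked_balanced_bce_with_logits is hypothetical.

```python
import torch
import torch.nn.functional as F

def masked_balanced_bce_with_logits(pred, gt, mask=None, negative_ratio=3, eps=1e-6):
    # Hypothetical sketch: keep all positives, keep only the hardest
    # `negative_ratio * n_pos` negatives, then average over the kept pixels.
    if mask is None:
        mask = torch.ones_like(gt)
    positive = gt * mask                      # 1 where the target is positive
    negative = (1 - gt) * mask                # 1 where the target is negative
    n_pos = int(positive.sum())
    n_neg = min(int(negative.sum()), n_pos * negative_ratio)
    loss = F.binary_cross_entropy_with_logits(pred, gt, reduction='none')
    pos_loss = (loss * positive).sum()
    # Hard negative mining: the n_neg largest per-pixel losses among negatives.
    neg_loss = torch.topk((loss * negative).reshape(-1), n_neg).values.sum()
    return (pos_loss + neg_loss) / (n_pos + n_neg + eps)

pred = torch.zeros(1, 1, 4, 4)   # raw logits; the loss applies the sigmoid itself
gt = torch.zeros(1, 1, 4, 4)
gt[0, 0, 1, 1] = 1.0             # a single positive pixel
loss_value = masked_balanced_bce_with_logits(pred, gt)
print(loss_value)                # ≈ 0.6931 = log(2): each kept term is BCE of a zero logit
```

With one positive pixel and negative_ratio=3, only the 3 hardest of the 15 negative pixels contribute, so easy background regions cannot drown out the positive signal.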
