Package ai.djl.training.loss
Class MaskedSoftmaxCrossEntropyLoss
java.lang.Object
ai.djl.training.evaluator.Evaluator
ai.djl.training.loss.Loss
ai.djl.training.loss.MaskedSoftmaxCrossEntropyLoss
MaskedSoftmaxCrossEntropyLoss is an implementation of Loss that only considers a
specific number of values in the loss computation and masks the rest according to the
given sequence.
Field Summary
Fields inherited from class ai.djl.training.evaluator.Evaluator
totalInstances
Constructor Summary
Constructors
MaskedSoftmaxCrossEntropyLoss() - Creates a new instance of MaskedSoftmaxCrossEntropyLoss with default parameters.
MaskedSoftmaxCrossEntropyLoss(String name) - Creates a new instance of MaskedSoftmaxCrossEntropyLoss with default parameters and the given name.
MaskedSoftmaxCrossEntropyLoss(String name, float weight, int classAxis, boolean sparseLabel, boolean fromLogit) - Creates a new instance of MaskedSoftmaxCrossEntropyLoss with the given parameters.
Method Summary
Methods inherited from class ai.djl.training.loss.Loss
addAccumulator, elasticNetWeightedDecay, elasticNetWeightedDecay, elasticNetWeightedDecay, elasticNetWeightedDecay, getAccumulator, hingeLoss, hingeLoss, hingeLoss, l1Loss, l1Loss, l1Loss, l1WeightedDecay, l1WeightedDecay, l1WeightedDecay, l2Loss, l2Loss, l2Loss, l2WeightedDecay, l2WeightedDecay, l2WeightedDecay, maskedSoftmaxCrossEntropyLoss, maskedSoftmaxCrossEntropyLoss, maskedSoftmaxCrossEntropyLoss, quantileL1Loss, quantileL1Loss, resetAccumulator, sigmoidBinaryCrossEntropyLoss, sigmoidBinaryCrossEntropyLoss, sigmoidBinaryCrossEntropyLoss, softmaxCrossEntropyLoss, softmaxCrossEntropyLoss, softmaxCrossEntropyLoss, updateAccumulator, updateAccumulators
Methods inherited from class ai.djl.training.evaluator.Evaluator
checkLabelShapes, checkLabelShapes, getName
Constructor Details
MaskedSoftmaxCrossEntropyLoss
public MaskedSoftmaxCrossEntropyLoss()
Creates a new instance of MaskedSoftmaxCrossEntropyLoss with default parameters.
MaskedSoftmaxCrossEntropyLoss
public MaskedSoftmaxCrossEntropyLoss(String name)
Creates a new instance of MaskedSoftmaxCrossEntropyLoss with default parameters and the given name.
Parameters:
name - the name of the loss
MaskedSoftmaxCrossEntropyLoss
public MaskedSoftmaxCrossEntropyLoss(String name, float weight, int classAxis, boolean sparseLabel, boolean fromLogit)
Creates a new instance of MaskedSoftmaxCrossEntropyLoss with the given parameters.
Parameters:
name - the name of the loss
weight - the weight to apply on the loss value, default 1
classAxis - the axis that represents the class probabilities, default -1
sparseLabel - whether labels are a 1-D integer array of [batch_size] (true) or 2-D probabilities of [batch_size, n-class] (false), default true
fromLogit - if true, the inputs are assumed to be values before softmax is applied (logits); logSoftmax will then be applied to the input, default false
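To illustrate the fromLogit parameter, here is a minimal, self-contained Java sketch (a hypothetical stand-in, independent of DJL's actual NDArray-based implementation): with fromLogit = true, the loss applies a log-softmax to raw scores, which is mathematically equivalent to taking the negative log of the already-softmaxed probability.

```java
public class FromLogitSketch {

    // Cross entropy computed from raw scores (the fromLogit = true path):
    // applies a numerically stable log-softmax internally.
    static double ceFromLogits(double[] logits, int label) {
        double max = Double.NEGATIVE_INFINITY;
        for (double v : logits) max = Math.max(max, v);
        double sumExp = 0.0;
        for (double v : logits) sumExp += Math.exp(v - max);
        return -((logits[label] - max) - Math.log(sumExp));
    }

    // Cross entropy computed from probabilities (the fromLogit = false path).
    static double ceFromProbs(double[] probs, int label) {
        return -Math.log(probs[label]);
    }

    // Standard softmax, stabilized by subtracting the max score.
    static double[] softmax(double[] logits) {
        double max = Double.NEGATIVE_INFINITY;
        for (double v : logits) max = Math.max(max, v);
        double sumExp = 0.0;
        double[] out = new double[logits.length];
        for (int i = 0; i < logits.length; i++) {
            out[i] = Math.exp(logits[i] - max);
            sumExp += out[i];
        }
        for (int i = 0; i < out.length; i++) out[i] /= sumExp;
        return out;
    }

    public static void main(String[] args) {
        double[] logits = {1.0, 2.0, 0.5};
        double a = ceFromLogits(logits, 1);            // raw scores in
        double b = ceFromProbs(softmax(logits), 1);    // probabilities in
        System.out.println(a + " vs " + b);            // the two paths agree
    }
}
```

Passing probabilities with fromLogit = true (or logits with fromLogit = false) silently computes the wrong loss, which is why the flag matters.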
Method Details
evaluate
Calculates the evaluation between the labels and the predictions. The label parameter is an NDList that contains the label and the mask sequence, in that order.
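The masking behavior can be sketched in plain Java (a simplified, hypothetical model of what the real NDArray-based implementation computes): per-step cross-entropy values are multiplied by a 0/1 mask, so padded positions contribute nothing to the loss.

```java
public class MaskedEvaluateSketch {

    /**
     * Simplified masked cross-entropy over one sequence.
     *
     * @param prediction per-step class probabilities, shape [steps][classes]
     * @param labels     per-step class indices (analogous to the first element of the label NDList)
     * @param mask       1 for real tokens, 0 for padding (analogous to the mask sequence)
     * @return the mean cross-entropy over the unmasked steps
     */
    static double evaluate(double[][] prediction, int[] labels, double[] mask) {
        double total = 0.0;
        double weight = 0.0;
        for (int t = 0; t < labels.length; t++) {
            total += mask[t] * -Math.log(prediction[t][labels[t]]); // masked steps add 0
            weight += mask[t];
        }
        return total / weight; // average over unmasked steps only
    }

    public static void main(String[] args) {
        double[][] prediction = {{0.5, 0.5}, {0.9, 0.1}, {0.2, 0.8}};
        int[] labels = {0, 0, 1};
        double[] mask = {1, 1, 0}; // the last step is padding and is ignored
        System.out.println(evaluate(prediction, labels, mask));
    }
}
```

However confident the model is at the padded position, that step cannot move the loss, which is the point of masking in sequence-to-sequence training.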