Package ai.djl.training.loss
Class SoftmaxCrossEntropyLoss
java.lang.Object
ai.djl.training.evaluator.Evaluator
ai.djl.training.loss.Loss
ai.djl.training.loss.SoftmaxCrossEntropyLoss
SoftmaxCrossEntropyLoss is a type of Loss that calculates the softmax cross
entropy loss.
If sparse_label is true (default), the label should contain integer
category indices. Then, \(L = -\sum_i \log p_{i, label_i}\). If sparse_label is
false, the label should be a one-hot encoding or a probability distribution with the
same shape as the prediction. Then, \(L = -\sum_i \sum_j {label}_{ij} \log p_{ij}\).
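To make the two formulas concrete, here is a minimal, self-contained sketch of the computation (not DJL's actual implementation; the class and method names below are illustrative only). It shows that the sparse-label form and the one-hot dense form produce the same value:

```java
import java.util.Arrays;

/** Illustrative sketch of softmax cross-entropy; not the DJL implementation. */
public class SoftmaxCESketch {

    /** Numerically stable log-softmax of a single prediction row. */
    public static double[] logSoftmax(double[] logits) {
        double max = Arrays.stream(logits).max().getAsDouble();
        double sum = 0.0;
        for (double x : logits) {
            sum += Math.exp(x - max);
        }
        double logSum = max + Math.log(sum);
        double[] out = new double[logits.length];
        for (int i = 0; i < logits.length; i++) {
            out[i] = logits[i] - logSum;
        }
        return out;
    }

    /** Sparse-label form for one sample: L = -log p_{label}. */
    public static double sparseLoss(double[] logits, int label) {
        return -logSoftmax(logits)[label];
    }

    /** Dense form for one sample: L = -sum_j label_j * log p_j. */
    public static double denseLoss(double[] logits, double[] label) {
        double[] logP = logSoftmax(logits);
        double loss = 0.0;
        for (int j = 0; j < label.length; j++) {
            loss -= label[j] * logP[j];
        }
        return loss;
    }

    public static void main(String[] args) {
        double[] logits = {2.0, 1.0, 0.1};
        // A one-hot encoding of class 0 yields the same loss as the sparse form.
        double sparse = sparseLoss(logits, 0);
        double dense = denseLoss(logits, new double[] {1.0, 0.0, 0.0});
        System.out.println(sparse + " == " + dense);
    }
}
```

The max-subtraction in logSoftmax avoids overflow for large logits; the same trick is standard in deep-learning frameworks.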
Field Summary
Fields inherited from class ai.djl.training.evaluator.Evaluator
totalInstances
Constructor Summary
Constructors
SoftmaxCrossEntropyLoss() - Creates a new instance of SoftmaxCrossEntropyLoss with default parameters.
SoftmaxCrossEntropyLoss(String name) - Creates a new instance of SoftmaxCrossEntropyLoss with default parameters.
SoftmaxCrossEntropyLoss(String name, float weight, int classAxis, boolean sparseLabel, boolean fromLogit) - Creates a new instance of SoftmaxCrossEntropyLoss with the given parameters.
Method Summary
Methods inherited from class ai.djl.training.loss.Loss
addAccumulator, elasticNetWeightedDecay, elasticNetWeightedDecay, elasticNetWeightedDecay, elasticNetWeightedDecay, getAccumulator, hingeLoss, hingeLoss, hingeLoss, l1Loss, l1Loss, l1Loss, l1WeightedDecay, l1WeightedDecay, l1WeightedDecay, l2Loss, l2Loss, l2Loss, l2WeightedDecay, l2WeightedDecay, l2WeightedDecay, maskedSoftmaxCrossEntropyLoss, maskedSoftmaxCrossEntropyLoss, maskedSoftmaxCrossEntropyLoss, quantileL1Loss, quantileL1Loss, resetAccumulator, sigmoidBinaryCrossEntropyLoss, sigmoidBinaryCrossEntropyLoss, sigmoidBinaryCrossEntropyLoss, softmaxCrossEntropyLoss, softmaxCrossEntropyLoss, softmaxCrossEntropyLoss, updateAccumulator, updateAccumulators
Methods inherited from class ai.djl.training.evaluator.Evaluator
checkLabelShapes, checkLabelShapes, getName
Constructor Details
SoftmaxCrossEntropyLoss
public SoftmaxCrossEntropyLoss()
Creates a new instance of SoftmaxCrossEntropyLoss with default parameters.
SoftmaxCrossEntropyLoss
public SoftmaxCrossEntropyLoss(String name)
Creates a new instance of SoftmaxCrossEntropyLoss with default parameters.
Parameters:
name - the name of the loss
SoftmaxCrossEntropyLoss
public SoftmaxCrossEntropyLoss(String name, float weight, int classAxis, boolean sparseLabel, boolean fromLogit)
Creates a new instance of SoftmaxCrossEntropyLoss with the given parameters.
Parameters:
name - the name of the loss
weight - the weight to apply on the loss value, default 1
classAxis - the axis that represents the class probabilities, default -1
sparseLabel - whether labels are a rank-1 integer array of shape [batch_size] (true) or a rank-2 one-hot encoding or probability distribution of shape [batch_size, n-class] (false), default true
fromLogit - if true, the inputs are assumed to be pre-softmax scores (logits), and logSoftmax will be applied to the input, default true
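The weight and fromLogit parameters can be illustrated with a small self-contained sketch, assuming the semantics described in the parameter list above (the class and method below are hypothetical names, not part of DJL):

```java
import java.util.Arrays;

/** Sketch of how weight and fromLogit affect the loss for one sample (illustrative). */
public class LossParamsSketch {

    public static double loss(double[] input, int label, float weight, boolean fromLogit) {
        double logP;
        if (fromLogit) {
            // fromLogit=true: input holds raw scores, so apply log-softmax first.
            double max = Arrays.stream(input).max().getAsDouble();
            double sum = 0.0;
            for (double x : input) {
                sum += Math.exp(x - max);
            }
            logP = input[label] - (max + Math.log(sum));
        } else {
            // fromLogit=false: input is assumed to already hold probabilities.
            logP = Math.log(input[label]);
        }
        // The weight scales the loss value linearly (default 1).
        return -weight * logP;
    }

    public static void main(String[] args) {
        double[] logits = {2.0, 1.0, 0.1};
        double base = loss(logits, 0, 1.0f, true);
        double doubled = loss(logits, 0, 2.0f, true);
        System.out.println(doubled / base);
    }
}
```

In DJL itself, instances are typically obtained through the inherited Loss.softmaxCrossEntropyLoss() factory methods listed in the method summary above rather than by calling this constructor directly.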
Method Details