Class SoftmaxCrossEntropyLoss


public class SoftmaxCrossEntropyLoss extends Loss
SoftmaxCrossEntropyLoss is a Loss that computes the softmax cross-entropy between predictions and labels.

If sparseLabel is true (the default), label should contain integer class indices, and \(L = -\sum_i \log p_{i, {label}_i}\). If sparseLabel is false, label should be a one-hot encoding or a probability distribution with the same shape as prediction, and \(L = -\sum_i \sum_j {label}_{ij} \log p_{ij}\).
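The sparse-label case can be sketched in plain Java, independent of the DJL API (the class and method names below are illustrative, not part of the library). It computes \(-\log p_{label}\) from raw logits using a numerically stable log-softmax:

```java
public class SparseSoftmaxCrossEntropy {

    // Numerically stable log-softmax:
    // log p_i = x_i - max(x) - log(sum_j exp(x_j - max(x)))
    static double[] logSoftmax(double[] logits) {
        double max = Double.NEGATIVE_INFINITY;
        for (double x : logits) max = Math.max(max, x);
        double sum = 0;
        for (double x : logits) sum += Math.exp(x - max);
        double logSum = max + Math.log(sum);
        double[] out = new double[logits.length];
        for (int i = 0; i < logits.length; i++) out[i] = logits[i] - logSum;
        return out;
    }

    // Sparse-label loss for one example: L = -log p_{label}
    static double loss(double[] logits, int label) {
        return -logSoftmax(logits)[label];
    }

    public static void main(String[] args) {
        double[] logits = {2.0, 1.0, 0.1};
        // Loss for true class 0 (approximately 0.417)
        System.out.println(loss(logits, 0));
    }
}
```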

  • Constructor Details

    • SoftmaxCrossEntropyLoss

      public SoftmaxCrossEntropyLoss()
      Creates a new instance of SoftmaxCrossEntropyLoss with default parameters.
    • SoftmaxCrossEntropyLoss

      public SoftmaxCrossEntropyLoss(String name)
      Creates a new instance of SoftmaxCrossEntropyLoss with the given name and otherwise default parameters.
      Parameters:
      name - the name of the loss
    • SoftmaxCrossEntropyLoss

      public SoftmaxCrossEntropyLoss(String name, float weight, int classAxis, boolean sparseLabel, boolean fromLogit)
      Creates a new instance of SoftmaxCrossEntropyLoss with the given parameters.
      Parameters:
      name - the name of the loss
      weight - the weight to apply on the loss value, default 1
      classAxis - the axis that represents the class probabilities, default -1
      sparseLabel - whether the labels are a rank-1 integer array of shape [batch_size] (true) or a rank-2 one-hot encoding or probability distribution of shape [batch_size, numClasses] (false), default true
      fromLogit - if true, the predictions are assumed to be raw, un-normalized values (logits), and logSoftmax is applied to them before the loss is computed, default true
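To see how sparseLabel changes the expected label shape, the following self-contained sketch (again independent of the DJL API; names are illustrative) computes both forms for a single example and shows that a one-hot dense label reproduces the sparse-index result:

```java
public class LabelModes {

    // Numerically stable log-softmax, as in the class description.
    static double[] logSoftmax(double[] x) {
        double max = Double.NEGATIVE_INFINITY;
        for (double v : x) max = Math.max(max, v);
        double sum = 0;
        for (double v : x) sum += Math.exp(v - max);
        double logSum = max + Math.log(sum);
        double[] out = new double[x.length];
        for (int i = 0; i < x.length; i++) out[i] = x[i] - logSum;
        return out;
    }

    // sparseLabel == true: label is a class index, L = -log p_label
    static double sparseLoss(double[] logits, int label) {
        return -logSoftmax(logits)[label];
    }

    // sparseLabel == false: label is one-hot or a distribution,
    // L = -sum_j label_j * log p_j
    static double denseLoss(double[] logits, double[] label) {
        double[] logP = logSoftmax(logits);
        double loss = 0;
        for (int j = 0; j < label.length; j++) loss -= label[j] * logP[j];
        return loss;
    }

    public static void main(String[] args) {
        double[] logits = {0.5, 2.0, -1.0};
        // One-hot encoding of class 1 gives the same loss as the sparse index 1.
        System.out.println(sparseLoss(logits, 1));
        System.out.println(denseLoss(logits, new double[] {0, 1, 0}));
    }
}
```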
  • Method Details

    • evaluate

      public NDArray evaluate(NDList label, NDList prediction)
      Calculates the softmax cross-entropy loss between the labels and the predictions.
      Specified by:
      evaluate in class Evaluator
      Parameters:
      label - the correct values
      prediction - the predicted values
      Returns:
      the evaluation result
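A plain-Java sketch of what evaluate computes for a batch under the default settings (sparse labels, fromLogit true). The weight scaling follows the parameter description above; the reduction to a batch mean is an assumption for illustration, not a guarantee about any particular DJL version:

```java
public class BatchLoss {

    // Numerically stable log-softmax over one row of logits.
    static double[] logSoftmax(double[] x) {
        double max = Double.NEGATIVE_INFINITY;
        for (double v : x) max = Math.max(max, v);
        double sum = 0;
        for (double v : x) sum += Math.exp(v - max);
        double logSum = max + Math.log(sum);
        double[] out = new double[x.length];
        for (int i = 0; i < x.length; i++) out[i] = x[i] - logSum;
        return out;
    }

    // Weighted mean of -log p_{i, label_i} over the batch
    // (mean reduction assumed for illustration).
    static double evaluate(double[][] logits, int[] labels, double weight) {
        double total = 0;
        for (int i = 0; i < logits.length; i++) {
            total += -logSoftmax(logits[i])[labels[i]];
        }
        return weight * total / logits.length;
    }

    public static void main(String[] args) {
        double[][] logits = {{2.0, 0.5}, {0.1, 1.5}};
        int[] labels = {0, 1};
        System.out.println(evaluate(logits, labels, 1.0));
    }
}
```

Doubling weight doubles the resulting loss value, which is how the weight parameter of the full constructor scales the evaluation result.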