Package ai.djl.nn.norm
Contains classes that define normalizing neural network operations.
Classes

- BatchNorm - In batch training (training with more than one sample per iteration), a batch normalization layer works by normalizing the values of input data to have a mean of 0 and a variance of 1.
- BatchNorm.BaseBuilder<T extends BatchNorm.BaseBuilder<T>> - The Builder to construct a BatchNorm.
- Dropout - A dropout layer benefits a network by allowing some units (neurons), and hence their respective connections, to be randomly and temporarily removed by setting their values to 0, during training only, with a specified probability \(p\), usually set to 0.5.
- GhostBatchNorm - Similar to BatchNorm, except that it splits a batch into smaller sub-batches (ghost batches), normalizes each of them individually to have a mean of 0 and a variance of 1, and finally concatenates them back into a single batch.
- GhostBatchNorm.Builder - The Builder to construct a GhostBatchNorm.
- LayerNorm - Layer normalization works by normalizing the values of input data for each input sample to have a mean of 0 and a variance of 1.
- LayerNorm.Builder - The Builder to construct a LayerNorm.
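To make the normalization these layers perform concrete, here is a minimal, self-contained Java sketch of the core computation: shifting a batch of values to mean 0 and scaling to variance 1. The real layers also learn a scale (gamma) and shift (beta), track running statistics for inference, and operate on multi-dimensional tensors; all of that is omitted here. The class and method names below are illustrative only and are not part of the DJL API.

```java
/** Illustrative sketch of batch normalization's core math; not a DJL class. */
public class BatchNormSketch {

    // Small constant added to the variance to avoid division by zero,
    // analogous to the epsilon parameter of normalization layers.
    static final double EPSILON = 1e-5;

    /** Normalizes a batch of scalar values to mean 0 and variance 1. */
    public static double[] normalize(double[] batch) {
        // Compute the batch mean.
        double mean = 0;
        for (double v : batch) {
            mean += v;
        }
        mean /= batch.length;

        // Compute the batch variance.
        double var = 0;
        for (double v : batch) {
            var += (v - mean) * (v - mean);
        }
        var /= batch.length;

        // Shift and scale each value: (x - mean) / sqrt(var + epsilon).
        double[] out = new double[batch.length];
        for (int i = 0; i < batch.length; i++) {
            out[i] = (batch[i] - mean) / Math.sqrt(var + EPSILON);
        }
        return out;
    }

    public static void main(String[] args) {
        double[] out = normalize(new double[] {1, 2, 3, 4});
        for (double v : out) {
            System.out.printf("%.4f%n", v);
        }
    }
}
```

In DJL itself these layers are typically constructed through their nested builders (for example, `BatchNorm.builder().build()`); see the individual class documentation for the available builder options. GhostBatchNorm applies the same per-sub-batch computation after splitting the batch, and LayerNorm applies it across the features of each individual sample instead of across the batch.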