Class Activation
Many networks make use of the Linear block and other similar linear
transformations. However, composing any number of linear transformations only yields
another linear transformation (\(f(x) = W_2(W_1x) = (W_2W_1)x = W_{combined}x\)). To
represent non-linear data, non-linear functions called activation functions are
interspersed between the linear transformations. This allows the network to represent non-linear
functions of increasing complexity.
See Wikipedia for more details.
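The collapse of stacked linear maps can be checked numerically. The following is a minimal plain-Java sketch (no DJL types; the 2x2 weights and the class name `LinearCollapse` are hypothetical), showing that applying \(W_1\) then \(W_2\) gives the same result as applying the single combined matrix \(W_2W_1\):

```java
import java.util.Arrays;

public class LinearCollapse {

    // y = W x for a square matrix W
    static double[] apply(double[][] w, double[] x) {
        double[] y = new double[w.length];
        for (int i = 0; i < w.length; i++)
            for (int j = 0; j < x.length; j++)
                y[i] += w[i][j] * x[j];
        return y;
    }

    // C = A B (square matrices)
    static double[][] matmul(double[][] a, double[][] b) {
        int n = a.length;
        double[][] c = new double[n][n];
        for (int i = 0; i < n; i++)
            for (int k = 0; k < n; k++)
                for (int j = 0; j < n; j++)
                    c[i][j] += a[i][k] * b[k][j];
        return c;
    }

    public static void main(String[] args) {
        double[][] w1 = {{1, 2}, {3, 4}};
        double[][] w2 = {{0, 1}, {1, 0}};
        double[] x = {5, -7};
        double[] twoSteps = apply(w2, apply(w1, x)); // W2(W1 x)
        double[] oneStep = apply(matmul(w2, w1), x); // (W2 W1) x
        System.out.println(Arrays.equals(twoSteps, oneStep)); // prints "true"
    }
}
```

Without a non-linear activation between the two layers, the second matrix adds no representational power.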
-
Method Summary
- static NDArray elu: Applies ELU (Exponential Linear Unit) activation on the input NDArray.
- static NDList elu: Applies ELU (Exponential Linear Unit) activation on the input singleton NDList.
- static Block eluBlock(float alpha): Creates a LambdaBlock that applies the ELU activation function in its forward function.
- static NDArray gelu: Applies GELU (Gaussian Error Linear Unit) activation on the input NDArray.
- static NDList gelu: Applies GELU (Gaussian Error Linear Unit) activation on the input singleton NDList.
- static Block geluBlock: Creates a LambdaBlock that applies the GELU activation function in its forward function.
- static NDArray leakyRelu: Applies Leaky ReLU activation on the input NDArray.
- static NDList leakyRelu: Applies Leaky ReLU activation on the input singleton NDList.
- static Block leakyReluBlock(float alpha): Creates a LambdaBlock that applies the Leaky ReLU activation function in its forward function.
- static NDArray mish: Applies Mish activation on the input NDArray.
- static NDList mish: Applies Mish activation on the input singleton NDList.
- static Block mishBlock: Creates a LambdaBlock that applies the Mish activation function in its forward function.
- static Block preluBlock: Returns a Prelu block.
- static NDArray relu: Applies ReLU activation on the input NDArray.
- static NDList relu: Applies ReLU activation on the input singleton NDList.
- static NDArray relu6: Applies ReLU6 activation on the input NDArray.
- static NDList relu6: Applies ReLU6 activation on the input singleton NDList.
- static Block relu6Block: Creates a LambdaBlock that applies the ReLU6 activation function in its forward function.
- static Block reluBlock: Creates a LambdaBlock that applies the ReLU activation function in its forward function.
- static NDArray selu: Applies Scaled ELU activation on the input NDArray.
- static NDList selu: Applies Scaled ELU activation on the input singleton NDList.
- static Block seluBlock: Creates a LambdaBlock that applies the SELU activation function in its forward function.
- static NDArray sigmoid: Applies Sigmoid activation on the input NDArray.
- static NDList sigmoid: Applies Sigmoid activation on the input singleton NDList.
- static Block sigmoidBlock: Creates a LambdaBlock that applies the Sigmoid activation function in its forward function.
- static NDArray softPlus: Applies softPlus activation on the input NDArray.
- static NDList softPlus: Applies softPlus activation on the input singleton NDList.
- static Block softPlusBlock: Creates a LambdaBlock that applies the softPlus(NDList) activation function in its forward function.
- static NDArray softSign: Applies softSign activation on the input NDArray.
- static NDList softSign: Applies softSign activation on the input singleton NDList.
- static Block softSignBlock: Creates a LambdaBlock that applies the softSign(NDList) activation function in its forward function.
- static NDArray swish: Applies Swish activation on the input NDArray.
- static NDList swish: Applies Swish activation on the input singleton NDList.
- static Block swishBlock(float beta): Creates a LambdaBlock that applies the Swish activation function in its forward function.
- static NDArray tanh: Applies Tanh activation on the input NDArray.
- static NDList tanh: Applies Tanh activation on the input singleton NDList.
- static Block tanhBlock: Creates a LambdaBlock that applies the Tanh activation function in its forward function.
-
Method Details
-
relu
Applies ReLU activation on the input NDArray. ReLU is defined by: \( y = max(0, x) \)
-
relu
Applies ReLU activation on the input singleton NDList. ReLU is defined by: \( y = max(0, x) \)
-
relu6
Applies ReLU6 activation on the input NDArray. ReLU6 is defined by: \( y = min(6, max(0, x)) \)
-
relu6
Applies ReLU6 activation on the input singleton NDList. ReLU6 is defined by: \( y = min(6, max(0, x)) \)
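The two definitions above can be sketched as scalar Java functions (a plain-Java illustration, not the DJL implementation; the NDArray versions apply the same rule element-wise, and the class name `ReluMath` is hypothetical):

```java
public class ReluMath {

    // ReLU: y = max(0, x)
    static double relu(double x) {
        return Math.max(0, x);
    }

    // ReLU6: y = min(6, max(0, x)) — ReLU clipped at 6
    static double relu6(double x) {
        return Math.min(6, Math.max(0, x));
    }

    public static void main(String[] args) {
        System.out.println(relu(-3.0));  // prints "0.0" (negatives are zeroed)
        System.out.println(relu(2.5));   // prints "2.5" (positives pass through)
        System.out.println(relu6(9.0));  // prints "6.0" (clipped at 6)
    }
}
```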
-
sigmoid
Applies Sigmoid activation on the input NDArray. Sigmoid is defined by: \( y = 1 / (1 + e^{-x}) \)
-
sigmoid
Applies Sigmoid activation on the input singleton NDList. Sigmoid is defined by: \( y = 1 / (1 + e^{-x}) \)
-
tanh
Applies Tanh activation on the input NDArray. Tanh is defined by: \( y = (e^x - e^{-x}) / (e^x + e^{-x}) \)
-
tanh
Applies Tanh activation on the input singleton NDList. Tanh is defined by: \( y = (e^x - e^{-x}) / (e^x + e^{-x}) \)
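Both formulas above are straightforward to sketch in plain Java (again illustrative scalar versions, not DJL code; `SigmoidTanhMath` is a hypothetical name). Sigmoid squashes inputs into (0, 1), Tanh into (-1, 1):

```java
public class SigmoidTanhMath {

    // Sigmoid: y = 1 / (1 + e^{-x}), output in (0, 1)
    static double sigmoid(double x) {
        return 1.0 / (1.0 + Math.exp(-x));
    }

    // Tanh: y = (e^x - e^{-x}) / (e^x + e^{-x}), output in (-1, 1)
    static double tanh(double x) {
        return (Math.exp(x) - Math.exp(-x)) / (Math.exp(x) + Math.exp(-x));
    }

    public static void main(String[] args) {
        System.out.println(sigmoid(0.0)); // prints "0.5"
        // The hand-written formula agrees with the library Math.tanh
        System.out.println(Math.abs(tanh(1.0) - Math.tanh(1.0)) < 1e-12); // prints "true"
    }
}
```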
-
softPlus
Applies softPlus activation on the input NDArray. softPlus is defined by: \( y = log(1 + e^x) \)
-
softPlus
Applies softPlus activation on the input singleton NDList. softPlus is defined by: \( y = log(1 + e^x) \)
-
softSign
Applies softSign activation on the input NDArray. softSign is defined by: \( y = x / (1 + |x|) \)
-
softSign
Applies softSign activation on the input singleton NDList. softSign is defined by: \( y = x / (1 + |x|) \)
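A plain-Java sketch of these two formulas (illustrative only, not the DJL implementation; `SoftMath` is a hypothetical name). softPlus is a smooth approximation of ReLU; softSign, like Tanh, maps into (-1, 1) but approaches its limits more slowly:

```java
public class SoftMath {

    // softPlus: y = log(1 + e^x), a smooth approximation of ReLU
    static double softPlus(double x) {
        return Math.log(1.0 + Math.exp(x));
    }

    // softSign: y = x / (1 + |x|), output in (-1, 1)
    static double softSign(double x) {
        return x / (1.0 + Math.abs(x));
    }

    public static void main(String[] args) {
        System.out.println(softPlus(0.0));  // log(2) ≈ 0.693
        System.out.println(softSign(3.0));  // prints "0.75"
        System.out.println(softSign(-3.0)); // prints "-0.75"
    }
}
```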
-
leakyRelu
Applies Leaky ReLU activation on the input NDArray. Leaky ReLU is defined by: \( y = x \gt 0 ? x : alpha * x \)
-
leakyRelu
Applies Leaky ReLU activation on the input singleton NDList. Leaky ReLU is defined by: \( y = x \gt 0 ? x : alpha * x \)
-
elu
Applies ELU (Exponential Linear Unit) activation on the input NDArray. ELU is defined by: \( y = x \gt 0 ? x : alpha * (e^x - 1) \)
-
elu
Applies ELU (Exponential Linear Unit) activation on the input singleton NDList. ELU is defined by: \( y = x \gt 0 ? x : alpha * (e^x - 1) \)
-
selu
Applies Scaled ELU activation on the input NDArray. Scaled ELU is defined by: \( y = lambda * (x \gt 0 ? x : alpha * (e^x - 1)) \) where \( lambda = 1.0507009873554804934193349852946 \) and \( alpha = 1.6732632423543772848170429916717 \)
-
selu
Applies Scaled ELU activation on the input singleton NDList. Scaled ELU is defined by: \( y = lambda * (x \gt 0 ? x : alpha * (e^x - 1)) \) where \( lambda = 1.0507009873554804934193349852946 \) and \( alpha = 1.6732632423543772848170429916717 \)
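The three variants above form a family: all are the identity for positive inputs and differ only in how they treat negative inputs. A plain-Java sketch (illustrative scalar versions, not DJL code; `EluFamily` is a hypothetical name, and the SELU constants are truncated to double precision):

```java
public class EluFamily {

    // Leaky ReLU: small linear slope alpha for negative inputs
    static double leakyRelu(double x, double alpha) {
        return x > 0 ? x : alpha * x;
    }

    // ELU: smooth exponential saturation toward -alpha for negative inputs
    static double elu(double x, double alpha) {
        return x > 0 ? x : alpha * (Math.exp(x) - 1);
    }

    // SELU: ELU with fixed alpha, scaled by a fixed lambda
    static final double LAMBDA = 1.0507009873554805;
    static final double ALPHA = 1.6732632423543772;

    static double selu(double x) {
        return LAMBDA * elu(x, ALPHA);
    }

    public static void main(String[] args) {
        System.out.println(leakyRelu(-2.0, 0.1)); // alpha * x = -0.2
        System.out.println(elu(5.0, 1.0));        // prints "5.0" (identity for x > 0)
        System.out.println(selu(0.0));            // prints "0.0"
    }
}
```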
-
gelu
Applies GELU (Gaussian Error Linear Unit) activation on the input NDArray.
-
gelu
Applies GELU (Gaussian Error Linear Unit) activation on the input singleton NDList.
-
swish
Applies Swish activation on the input NDArray. Swish is defined as \( y = x * sigmoid(beta * x) \)
-
swish
Applies Swish activation on the input singleton NDList. Swish is defined as \( y = x * sigmoid(beta * x) \)
-
mish
Applies Mish activation on the input NDArray. Mish is defined as \( y = x * tanh(ln(1 + e^x)) \), introduced by Diganta Misra in his paper Mish: A Self Regularized Non-Monotonic Neural Activation Function.
-
mish
Applies Mish activation on the input singleton NDList. Mish is defined as \( y = x * tanh(ln(1 + e^x)) \), introduced by Diganta Misra in his paper Mish: A Self Regularized Non-Monotonic Neural Activation Function.
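Both Swish and Mish can be written as one-liners over the earlier definitions of sigmoid, tanh, and softPlus (a plain-Java sketch, not the DJL implementation; `SwishMish` is a hypothetical name). Note that Mish is exactly \( x * tanh(softPlus(x)) \):

```java
public class SwishMish {

    static double sigmoid(double x) {
        return 1.0 / (1.0 + Math.exp(-x));
    }

    // Swish: y = x * sigmoid(beta * x); with beta = 1 this is also called SiLU
    static double swish(double x, double beta) {
        return x * sigmoid(beta * x);
    }

    // Mish: y = x * tanh(ln(1 + e^x)) = x * tanh(softPlus(x))
    static double mish(double x) {
        return x * Math.tanh(Math.log(1.0 + Math.exp(x)));
    }

    public static void main(String[] args) {
        System.out.println(swish(0.0, 1.0)); // prints "0.0"
        System.out.println(mish(0.0));       // prints "0.0"
    }
}
```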
-
reluBlock
Creates a LambdaBlock that applies the ReLU activation function in its forward function.
- Returns: the LambdaBlock that applies the ReLU activation function
-
relu6Block
Creates a LambdaBlock that applies the ReLU6 activation function in its forward function.
- Returns: the LambdaBlock that applies the ReLU6 activation function
-
sigmoidBlock
Creates a LambdaBlock that applies the Sigmoid activation function in its forward function.
- Returns: the LambdaBlock that applies the Sigmoid activation function
-
tanhBlock
Creates a LambdaBlock that applies the Tanh activation function in its forward function.
- Returns: the LambdaBlock that applies the Tanh activation function
-
softPlusBlock
Creates a LambdaBlock that applies the softPlus(NDList) activation function in its forward function.
- Returns: the LambdaBlock that applies the softPlus(NDList) activation function
-
softSignBlock
Creates a LambdaBlock that applies the softSign(NDList) activation function in its forward function.
- Returns: the LambdaBlock that applies the softSign(NDList) activation function
-
leakyReluBlock
Creates a LambdaBlock that applies the Leaky ReLU activation function in its forward function.
- Parameters: alpha - the slope for the activation
- Returns: the LambdaBlock that applies the Leaky ReLU activation function
-
eluBlock
Creates a LambdaBlock that applies the ELU activation function in its forward function.
- Parameters: alpha - the slope for the activation
- Returns: the LambdaBlock that applies the ELU activation function
-
seluBlock
Creates a LambdaBlock that applies the SELU activation function in its forward function.
- Returns: the LambdaBlock that applies the SELU activation function
-
geluBlock
Creates a LambdaBlock that applies the GELU activation function in its forward function.
- Returns: the LambdaBlock that applies the GELU activation function
-
swishBlock
Creates a LambdaBlock that applies the Swish activation function in its forward function.
- Parameters: beta - a hyper-parameter
- Returns: the LambdaBlock that applies the Swish activation function
-
mishBlock
Creates a LambdaBlock that applies the Mish activation function in its forward function.
- Returns: the LambdaBlock that applies the Mish activation function
-
preluBlock
Returns a Prelu block.
- Returns: a Prelu block
-