Package ai.djl.training.optimizer
Class Optimizer
java.lang.Object
ai.djl.training.optimizer.Optimizer
An Optimizer updates the weight parameters to minimize the loss function.
Optimizer is an abstract class that provides the base implementation for optimizers.
Nested Class Summary
Nested Classes
Optimizer.OptimizerBuilder<?> - the builder used to create an instance of Optimizer.
Field Summary
Fields
protected float clipGrad
protected float rescaleGrad
Constructor Summary
Constructors
Optimizer(Optimizer.OptimizerBuilder<?> builder)
Creates a new instance of Optimizer.
Method Summary
static Adadelta.Builder adadelta()
Returns a new instance of Adadelta.Builder that can build an Adadelta optimizer.
static Adagrad.Builder adagrad()
Returns a new instance of Adagrad.Builder that can build an Adagrad optimizer.
static Adam.Builder adam()
Returns a new instance of Adam.Builder that can build an Adam optimizer.
static AdamW.Builder adamW()
Returns a new instance of AdamW.Builder that can build an AdamW optimizer.
protected float getWeightDecay()
Gets the value of weight decay.
static Nag.Builder nag()
Returns a new instance of Nag.Builder that can build a Nag optimizer.
static RmsProp.Builder rmsprop()
Returns a new instance of RmsProp.Builder that can build an RmsProp optimizer.
static Sgd.Builder sgd()
Returns a new instance of Sgd.Builder that can build an Sgd optimizer.
abstract void update(String parameterId, NDArray weight, NDArray grad)
Updates the parameters according to the gradients.
protected int updateCount(String parameterId)
protected NDArray withDefaultState(Map<String,Map<Device,NDArray>> state, String key, Device device, Function<String,NDArray> defaultFunction)
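The static factory methods above are the usual entry point for constructing an optimizer. A short usage sketch, assuming the surrounding DJL training API (Tracker.fixed, DefaultTrainingConfig, and optOptimizer come from the wider library and are not documented on this page):

```java
import ai.djl.training.DefaultTrainingConfig;
import ai.djl.training.loss.Loss;
import ai.djl.training.optimizer.Optimizer;
import ai.djl.training.tracker.Tracker;

public class OptimizerSetup {
    public static DefaultTrainingConfig config() {
        // Adam with its default hyperparameters.
        Optimizer adam = Optimizer.adam().build();

        // SGD requires a learning-rate tracker; here a fixed rate of 0.05.
        Optimizer sgd = Optimizer.sgd()
                .setLearningRateTracker(Tracker.fixed(0.05f))
                .build();

        // Attach an optimizer to a training configuration.
        return new DefaultTrainingConfig(Loss.l2Loss()).optOptimizer(adam);
    }
}
```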
Field Details

rescaleGrad
protected float rescaleGrad

clipGrad
protected float clipGrad
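The fields rescaleGrad and clipGrad are not described on this page; in MXNet-style optimizers they conventionally rescale the raw gradient (for example by 1/batchSize) and then clip it to a fixed range when clipping is enabled. A plain-Java sketch of that convention, illustrative only and not the actual DJL implementation:

```java
// Sketch of how rescaleGrad and clipGrad are conventionally applied to a raw
// gradient before the update step (MXNet-style semantics; illustrative only).
public class GradPreprocess {
    static float preprocess(float grad, float rescaleGrad, float clipGrad) {
        float g = grad * rescaleGrad;          // e.g. 1/batchSize rescaling
        if (clipGrad > 0) {                    // clipping enabled only when positive
            g = Math.max(-clipGrad, Math.min(clipGrad, g));
        }
        return g;
    }

    public static void main(String[] args) {
        System.out.println(preprocess(8.0f, 0.5f, 3.0f)); // rescaled to 4, then clipped
        System.out.println(preprocess(2.0f, 1.0f, 0.0f)); // clipping disabled
    }
}
```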
Constructor Details

Optimizer
Optimizer(Optimizer.OptimizerBuilder<?> builder)
Creates a new instance of Optimizer.
Parameters:
builder - the builder used to create an instance of Optimizer
Method Details

sgd
static Sgd.Builder sgd()
Returns a new instance of Sgd.Builder that can build an Sgd optimizer.
Returns:
the Sgd.Builder
nag
static Nag.Builder nag()
Returns a new instance of Nag.Builder that can build a Nag optimizer.
Returns:
the Nag.Builder
adam
static Adam.Builder adam()
Returns a new instance of Adam.Builder that can build an Adam optimizer.
Returns:
the Adam.Builder
adamW
static AdamW.Builder adamW()
Returns a new instance of AdamW.Builder that can build an AdamW optimizer.
Returns:
the AdamW.Builder
rmsprop
static RmsProp.Builder rmsprop()
Returns a new instance of RmsProp.Builder that can build an RmsProp optimizer.
Returns:
the RmsProp.Builder
adagrad
static Adagrad.Builder adagrad()
Returns a new instance of Adagrad.Builder that can build an Adagrad optimizer.
Returns:
the Adagrad.Builder
adadelta
static Adadelta.Builder adadelta()
Returns a new instance of Adadelta.Builder that can build an Adadelta optimizer.
Returns:
the Adadelta.Builder
getWeightDecay
protected float getWeightDecay()
Gets the value of weight decay.
Returns:
the value of weight decay
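Weight decay penalizes large weights during the update; for plain SGD, adding weightDecay * weight to the gradient is equivalent to L2 regularization. A self-contained plain-Java sketch of that standard formulation (illustrative only, not DJL's internals):

```java
// Weight decay folds a penalty proportional to the current weight into the
// gradient: g' = g + wd * w. Equivalent to L2 regularization under plain SGD.
public class WeightDecayDemo {
    static float decayedGrad(float grad, float weight, float weightDecay) {
        return grad + weightDecay * weight;
    }

    public static void main(String[] args) {
        // grad 0.5, weight 2.0, wd 0.01 -> 0.5 + 0.02
        System.out.println(decayedGrad(0.5f, 2.0f, 0.01f));
    }
}
```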
updateCount
protected int updateCount(String parameterId)
update
abstract void update(String parameterId, NDArray weight, NDArray grad)
Updates the parameters according to the gradients.
Parameters:
parameterId - the parameter to be updated
weight - the weights of the parameter
grad - the gradients
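Each concrete optimizer implements update with its own rule for moving the weights along the gradients. A minimal stand-in on float arrays, showing the simplest such rule (plain SGD: w := w - lr * g); this is an illustrative sketch, not DJL's NDArray-based code:

```java
// Toy analogue of an Optimizer.update implementation: adjust each weight by
// its gradient scaled by a learning rate (plain SGD, in-place on arrays).
public class ToySgd {
    static void update(float[] weights, float[] grads, float lr) {
        for (int i = 0; i < weights.length; i++) {
            weights[i] -= lr * grads[i];   // w := w - lr * g
        }
    }

    public static void main(String[] args) {
        float[] w = {1.0f, -2.0f};
        float[] g = {0.5f, 0.25f};
        update(w, g, 0.1f);
        System.out.println(java.util.Arrays.toString(w));
    }
}
```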
withDefaultState
protected NDArray withDefaultState(Map<String,Map<Device,NDArray>> state, String key, Device device, Function<String,NDArray> defaultFunction)