Class Nadam
java.lang.Object
  org.tensorflow.framework.optimizers.Optimizer
    org.tensorflow.framework.optimizers.Nadam
Nadam Optimizer that implements the NAdam algorithm.
Much like Adam is essentially RMSprop with momentum, Nadam is Adam with Nesterov momentum.
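The underlying update rule can be illustrated with a scalar sketch: Adam's bias-corrected moment estimates, plus a Nesterov-style look-ahead that blends the corrected momentum with the current gradient. This is not the class's actual graph-building code; the class and method names below are hypothetical, for illustration only.

```java
// Illustrative scalar sketch of a single Nadam update step (after Dozat, 2016).
// NOT the org.tensorflow.framework.optimizers code path; names are hypothetical.
public class NadamSketch {

    /**
     * Applies one Nadam step and returns the updated parameter.
     * m and v are single-element arrays holding the running first and
     * second moment estimates; t is the 1-based step count.
     */
    public static double step(double theta, double grad, double[] m, double[] v,
                              int t, double lr, double betaOne, double betaTwo,
                              double epsilon) {
        // Adam-style exponential moving averages of gradient and squared gradient
        m[0] = betaOne * m[0] + (1 - betaOne) * grad;
        v[0] = betaTwo * v[0] + (1 - betaTwo) * grad * grad;

        // Bias correction, as in Adam
        double mHat = m[0] / (1 - Math.pow(betaOne, t));
        double vHat = v[0] / (1 - Math.pow(betaTwo, t));

        // Nesterov look-ahead: blend corrected momentum with the raw gradient
        double mBar = betaOne * mHat
                + (1 - betaOne) * grad / (1 - Math.pow(betaOne, t));

        return theta - lr * mBar / (Math.sqrt(vHat) + epsilon);
    }
}
```

With the defaults documented below (learning rate 0.001, betaOne 0.9, betaTwo 0.999, epsilon 1e-8), a single step moves the parameter a small distance against the gradient.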
Nested Class Summary
Nested classes/interfaces inherited from class Optimizer
Optimizer.GradAndVar<T>, Optimizer.Options
Field Summary
Fields
  static final float  BETA_ONE_DEFAULT
  static final float  BETA_TWO_DEFAULT
  static final float  EPSILON_DEFAULT
  static final String FIRST_MOMENT
  static final float  LEARNING_RATE_DEFAULT
  static final String MOMENTUM
  static final String SECOND_MOMENT
Fields inherited from class Optimizer
  globals, graph, tf, VARIABLE_V2
Constructor Summary
Constructors
  Nadam(Graph graph)
      Creates a Nadam Optimizer
  Nadam(Graph graph, float learningRate)
      Creates a Nadam Optimizer
  Nadam(Graph graph, float learningRate, float betaOne, float betaTwo, float epsilon)
      Creates a Nadam Optimizer
  Nadam(Graph graph, String name, float learningRate)
      Creates a Nadam Optimizer
  Nadam(Graph graph, String name, float learningRate, float betaOne, float betaTwo, float epsilon)
      Creates a Nadam Optimizer
Method Summary
Methods
  applyDense(Ops deps, Output<T> gradient, Output<T> variable)
      Generates the gradient update operations for the specific variable and gradient.
  protected void createSlots(List<Output<? extends TType>> variables)
      Performs a No-op slot creation method.
  protected Op finish(...)
      Gathers up the update operations into a single op that can be used as a run target.
  getOptimizerName()
      Get the Name of the optimizer.
  prepare(...)
      Returns a No-op prepare.
Methods inherited from class Optimizer
applyGradients, computeGradients, createName, createSlot, getSlot, getTF, minimize, minimize
Field Details

LEARNING_RATE_DEFAULT
public static final float LEARNING_RATE_DEFAULT
The default learning rate (0.001).

EPSILON_DEFAULT
public static final float EPSILON_DEFAULT
The default small constant for numerical stability (1e-8).

BETA_ONE_DEFAULT
public static final float BETA_ONE_DEFAULT
The default exponential decay rate for the 1st moment estimates (0.9).

BETA_TWO_DEFAULT
public static final float BETA_TWO_DEFAULT
The default exponential decay rate for the exponentially weighted infinity norm (0.999).

FIRST_MOMENT
public static final String FIRST_MOMENT

SECOND_MOMENT
public static final String SECOND_MOMENT

MOMENTUM
public static final String MOMENTUM
Constructor Details

Nadam
public Nadam(Graph graph)
Creates a Nadam Optimizer
Parameters:
    graph - the TensorFlow graph

Nadam
public Nadam(Graph graph, float learningRate)
Creates a Nadam Optimizer
Parameters:
    graph - the TensorFlow graph
    learningRate - the learning rate, defaults to 0.001

Nadam
public Nadam(Graph graph, float learningRate, float betaOne, float betaTwo, float epsilon)
Creates a Nadam Optimizer
Parameters:
    graph - the TensorFlow graph
    learningRate - the learning rate, defaults to 0.001
    betaOne - the exponential decay rate for the 1st moment estimates. Default is 0.9.
    betaTwo - the exponential decay rate for the exponentially weighted infinity norm. Default is 0.999.
    epsilon - a small constant for numerical stability. Default is 1e-8.

Nadam
public Nadam(Graph graph, String name, float learningRate)
Creates a Nadam Optimizer
Parameters:
    graph - the TensorFlow graph
    name - the name for this Optimizer, defaults to "Nadam"
    learningRate - the learning rate, defaults to 0.001

Nadam
public Nadam(Graph graph, String name, float learningRate, float betaOne, float betaTwo, float epsilon)
Creates a Nadam Optimizer
Parameters:
    graph - the TensorFlow graph
    name - the name for this Optimizer, defaults to "Nadam"
    learningRate - the learning rate, defaults to 0.001
    betaOne - the exponential decay rate for the 1st moment estimates. Default is 0.9.
    betaTwo - the exponential decay rate for the exponentially weighted infinity norm. Default is 0.999.
    epsilon - a small constant for numerical stability. Default is 1e-8.
Method Details

createSlots
protected void createSlots(List<Output<? extends TType>> variables)
Performs a No-op slot creation method.
Overrides:
    createSlots in class Optimizer
Parameters:
    variables - The variables to create slots for.

prepare
Returns a No-op prepare.

applyDense
Generates the gradient update operations for the specific variable and gradient.
Specified by:
    applyDense in class Optimizer
Type Parameters:
    T - The type of the variable.
Parameters:
    gradient - The gradient to use.
    variable - The variable to update.
Returns:
    An operand which applies the desired optimizer update to the variable.

finish
Gathers up the update operations into a single op that can be used as a run target. Adds the betaOne, betaTwo and mu updates to the end of the updates list.

getOptimizerName
Get the Name of the optimizer.
Specified by:
    getOptimizerName in class Optimizer
Returns:
    The optimizer name.