Class Adamax
java.lang.Object
org.tensorflow.framework.optimizers.Optimizer
org.tensorflow.framework.optimizers.Adamax
Optimizer that implements the Adamax algorithm.
It is a variant of Adam based on the infinity norm. Default parameters follow those provided in the paper. Adamax is sometimes superior to Adam, especially in models with embeddings.
- See Also:
  - Kingma et al., 2014, Adam: A Method for Stochastic Optimization
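A minimal usage sketch (assuming the tensorflow-core-api and tensorflow-framework artifacts are on the classpath; the scalar toy loss and the hyperparameter values are illustrative, not prescriptive):

import org.tensorflow.Graph;
import org.tensorflow.Operand;
import org.tensorflow.Session;
import org.tensorflow.framework.optimizers.Adamax;
import org.tensorflow.ndarray.Shape;
import org.tensorflow.op.Op;
import org.tensorflow.op.Ops;
import org.tensorflow.op.core.Assign;
import org.tensorflow.op.core.Variable;
import org.tensorflow.types.TFloat32;

public class AdamaxExample {
  public static void main(String[] args) {
    try (Graph graph = new Graph()) {
      Ops tf = Ops.create(graph);

      // One trainable scalar and a toy quadratic loss: (w - 3)^2.
      Variable<TFloat32> w = tf.variable(Shape.scalar(), TFloat32.class);
      Assign<TFloat32> initW = tf.assign(w, tf.constant(0.0f));
      Operand<TFloat32> loss = tf.math.square(tf.math.sub(w, tf.constant(3.0f)));

      // Adamax with explicit hyperparameters; new Adamax(graph) uses the defaults.
      Adamax adamax = new Adamax(graph, 0.001f, 0.9f, 0.999f, 1e-7f);

      // minimize() computes gradients and returns a single op to run per step.
      Op trainStep = adamax.minimize(loss);

      try (Session session = new Session(graph)) {
        session.runner().addTarget(initW).run();
        for (int step = 0; step < 100; step++) {
          session.runner().addTarget(trainStep).run();
        }
      }
    }
  }
}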
Nested Class Summary
Nested classes/interfaces inherited from class Optimizer
Optimizer.GradAndVar<T>, Optimizer.Options
Field Summary
Fields
Modifier and Type     Field
static final float    BETA_ONE_DEFAULT
static final float    BETA_TWO_DEFAULT
static final float    EPSILON_DEFAULT
static final String   FIRST_MOMENT
static final float    LEARNING_RATE_DEFAULT
static final String   SECOND_MOMENT

Fields inherited from class Optimizer
globals, graph, tf, VARIABLE_V2
Constructor Summary
Constructors
Constructor                                                                                         Description
Adamax(Graph graph)                                                                                 Creates an Optimizer that implements the Adamax algorithm.
Adamax(Graph graph, float learningRate)                                                             Creates an Optimizer that implements the Adamax algorithm.
Adamax(Graph graph, float learningRate, float betaOne, float betaTwo, float epsilon)                Creates an Optimizer that implements the Adamax algorithm.
Adamax(Graph graph, String name)                                                                    Creates an Optimizer that implements the Adamax algorithm.
Adamax(Graph graph, String name, float learningRate)                                                Creates an Optimizer that implements the Adamax algorithm.
Adamax(Graph graph, String name, float learningRate, float betaOne, float betaTwo, float epsilon)   Creates an Optimizer that implements the Adamax algorithm.
Method Summary
Methods
Modifier and Type                Method                                                         Description
protected <T extends TType> Op   applyDense(Ops deps, Output<T> gradient, Output<T> variable)   Generates the gradient update operations for the specific variable and gradient.
protected void                   createSlots(List<Output<? extends TType>> variables)           Performs a No-op slot creation method.
protected Op                     finish(List<Op> updateOperations, String name)                 Gathers up the update operations into a single op that can be used as a run target.
String                           getOptimizerName()                                             Get the name of the optimizer.
protected Optional<Op>           prepare(String scopeName)                                      Returns a No-op prepare.

Methods inherited from class Optimizer
applyGradients, computeGradients, createName, createSlot, getSlot, getTF, minimize, minimize
-
Field Details
-
FIRST_MOMENT
public static final String FIRST_MOMENT
- See Also:
  - Constant Field Values
-
SECOND_MOMENT
public static final String SECOND_MOMENT
- See Also:
  - Constant Field Values
-
LEARNING_RATE_DEFAULT
public static final float LEARNING_RATE_DEFAULT
- See Also:
  - Constant Field Values
-
EPSILON_DEFAULT
public static final float EPSILON_DEFAULT
- See Also:
  - Constant Field Values
-
BETA_ONE_DEFAULT
public static final float BETA_ONE_DEFAULT
- See Also:
  - Constant Field Values
-
BETA_TWO_DEFAULT
public static final float BETA_TWO_DEFAULT
- See Also:
  - Constant Field Values
-
-
Constructor Details
-
Adamax
public Adamax(Graph graph)
Creates an Optimizer that implements the Adamax algorithm.
- Parameters:
  - graph - the TensorFlow graph
-
Adamax
public Adamax(Graph graph, String name)
Creates an Optimizer that implements the Adamax algorithm.
- Parameters:
  - graph - the TensorFlow graph
  - name - name for the operations created when applying gradients. Defaults to "Adamax".
-
Adamax
public Adamax(Graph graph, float learningRate)
Creates an Optimizer that implements the Adamax algorithm.
- Parameters:
  - graph - the TensorFlow graph
  - learningRate - The learning rate.
-
Adamax
public Adamax(Graph graph, String name, float learningRate)
Creates an Optimizer that implements the Adamax algorithm.
- Parameters:
  - graph - the TensorFlow graph
  - name - name for the operations created when applying gradients. Defaults to "Adamax".
  - learningRate - The learning rate.
-
Adamax
public Adamax(Graph graph, float learningRate, float betaOne, float betaTwo, float epsilon)
Creates an Optimizer that implements the Adamax algorithm.
- Parameters:
  - graph - the TensorFlow graph
  - learningRate - The learning rate.
  - betaOne - The exponential decay rate for the 1st moment estimates.
  - betaTwo - The exponential decay rate for the exponentially weighted infinity norm.
  - epsilon - A small constant for numerical stability.
-
Adamax
public Adamax(Graph graph, String name, float learningRate, float betaOne, float betaTwo, float epsilon)
Creates an Optimizer that implements the Adamax algorithm.
- Parameters:
  - graph - the TensorFlow graph
  - name - name for the operations created when applying gradients. Defaults to "Adamax".
  - learningRate - The learning rate.
  - betaOne - The exponential decay rate for the 1st moment estimates.
  - betaTwo - The exponential decay rate for the exponentially weighted infinity norm.
  - epsilon - A small constant for numerical stability.
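For reference, these parameters enter the per-step Adamax update given in Kingma & Ba (2014); this is a sketch of the published algorithm, not necessarily the exact op decomposition used by this class. With gradient g_t, step size alpha (learningRate), and decay rates beta_1 (betaOne) and beta_2 (betaTwo):

\begin{aligned}
m_t      &= \beta_1 m_{t-1} + (1 - \beta_1)\, g_t      && \text{1st moment estimate} \\
u_t      &= \max(\beta_2\, u_{t-1},\; |g_t|)           && \text{exponentially weighted infinity norm} \\
\theta_t &= \theta_{t-1} - \frac{\alpha}{1 - \beta_1^{\,t}} \cdot \frac{m_t}{u_t + \epsilon}
\end{aligned}

Here epsilon guards against division by zero; in the paper the denominator is u_t alone.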
-
-
Method Details
-
prepare
Returns a No-op prepare.
-
createSlots
Performs a No-op slot creation method.
- Overrides:
  - createSlots in class Optimizer
- Parameters:
  - variables - The variables to create slots for.
-
applyDense
Generates the gradient update operations for the specific variable and gradient.
- Specified by:
  - applyDense in class Optimizer
- Type Parameters:
  - T - The type of the variable.
- Parameters:
  - gradient - The gradient to use.
  - variable - The variable to update.
- Returns:
  - An operand which applies the desired optimizer update to the variable.
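To make this contract concrete, here is a hypothetical minimal subclass that implements applyDense with plain gradient descent. This is a sketch only, not the Adamax implementation; the protected Optimizer(Graph) constructor and the exact applyDense signature assumed here should be checked against the version in use:

import org.tensorflow.Graph;
import org.tensorflow.Output;
import org.tensorflow.framework.optimizers.Optimizer;
import org.tensorflow.op.Op;
import org.tensorflow.op.Ops;
import org.tensorflow.types.family.TType;

// Hypothetical optimizer illustrating the applyDense contract.
public class ToyGradientDescent extends Optimizer {
  private final float learningRate;

  public ToyGradientDescent(Graph graph, float learningRate) {
    super(graph);
    this.learningRate = learningRate;
  }

  @Override
  protected <T extends TType> Op applyDense(Ops deps, Output<T> gradient, Output<T> variable) {
    // variable <- variable - learningRate * gradient, built with the deps Ops
    // instance so the update participates in the optimizer's control dependencies.
    return deps.train.applyGradientDescent(
        variable,
        deps.dtypes.cast(deps.constant(learningRate), gradient.type()),
        gradient);
  }

  @Override
  public String getOptimizerName() {
    return "ToyGradientDescent";
  }
}

The framework calls applyDense once per (gradient, variable) pair when building the update; subclasses only describe the per-variable update op and leave gradient computation and aggregation to the base class.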
-
finish
Gathers up the update operations into a single op that can be used as a run target.
getOptimizerName
Get the name of the optimizer.
- Specified by:
  - getOptimizerName in class Optimizer
- Returns:
  - The optimizer name.
-