Class AdaGrad
java.lang.Object
org.tensorflow.framework.optimizers.Optimizer
org.tensorflow.framework.optimizers.AdaGrad
Optimizer that implements the Adagrad algorithm.
Adagrad is an optimizer with parameter-specific learning rates, which are adapted relative to how frequently a parameter gets updated during training. The more updates a parameter receives, the smaller the updates.
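Concretely, each trainable variable gets a slot that accumulates the squares of its gradients. In the classic formulation the per-step update is accum := accum + grad^2 followed by var := var - learningRate * grad / sqrt(accum), so the effective step size shrinks as updates accumulate. The standalone sketch below traces that arithmetic for a single scalar parameter; the learning rate, starting accumulator value, and gradients are illustrative, not the class defaults.

// A minimal, self-contained sketch of the classic Adagrad update rule.
public class AdagradUpdateSketch {
  public static void main(String[] args) {
    float learningRate = 0.1f; // illustrative value
    float accum = 0.1f;        // starting accumulator value; must be non-negative
    float var = 1.0f;          // the parameter being trained
    // Apply the same gradient three times: the accumulator grows,
    // so each successive step taken on `var` is smaller than the last.
    for (int step = 1; step <= 3; step++) {
      float grad = 0.5f;
      accum += grad * grad;                                  // accum := accum + grad^2
      var -= learningRate * grad / (float) Math.sqrt(accum); // var := var - lr * grad / sqrt(accum)
      System.out.printf("step=%d accum=%.3f var=%.4f%n", step, accum, var);
    }
  }
}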
Nested Class Summary
Nested classes/interfaces inherited from class Optimizer
Optimizer.GradAndVar<T>, Optimizer.Options
Field Summary
Fields
static final String ACCUMULATOR
static final float INITIAL_ACCUMULATOR_DEFAULT
static final float LEARNING_RATE_DEFAULT
Fields inherited from class Optimizer
globals, graph, tf, VARIABLE_V2
Constructor Summary
Constructors
AdaGrad(Graph graph)
    Creates an AdaGrad Optimizer
AdaGrad(Graph graph, float learningRate)
    Creates an AdaGrad Optimizer
AdaGrad(Graph graph, float learningRate, float initialAccumulatorValue)
    Creates an AdaGrad Optimizer
AdaGrad(Graph graph, String name, float learningRate)
    Creates an AdaGrad Optimizer
AdaGrad(Graph graph, String name, float learningRate, float initialAccumulatorValue)
    Creates an AdaGrad Optimizer
Method Summary
protected <T extends TType> Op applyDense(Ops deps, Output<T> gradient, Output<T> variable)
    Generates the gradient update operations for the specific variable and gradient.
protected void createSlots(List<Output<? extends TType>> variables)
    Creates an accumulator slot for each of the specified variables.
String getOptimizerName()
    Get the name of the optimizer.
String toString()
Methods inherited from class Optimizer
applyGradients, computeGradients, createName, createSlot, finish, getSlot, getTF, minimize, minimize, prepare
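Most users never call applyDense directly; the inherited minimize builds the gradient computation and chains it through applyGradients into the Adagrad update. A minimal graph-mode sketch, assuming the TensorFlow Java core and framework artifacts are on the classpath and the Ops.variable helper for an initialized variable; the toy loss and numeric values are illustrative:

import org.tensorflow.Graph;
import org.tensorflow.Operand;
import org.tensorflow.framework.optimizers.AdaGrad;
import org.tensorflow.op.Op;
import org.tensorflow.op.Ops;
import org.tensorflow.op.core.Variable;
import org.tensorflow.types.TFloat32;

public class AdaGradMinimizeSketch {
  public static void main(String[] args) {
    try (Graph graph = new Graph()) {
      Ops tf = Ops.create(graph);
      // Toy problem: minimize (w - 3)^2 with respect to w.
      Variable<TFloat32> w = tf.variable(tf.constant(0.0f));
      Operand<TFloat32> loss = tf.math.square(tf.math.sub(w, tf.constant(3.0f)));
      AdaGrad optimizer = new AdaGrad(graph, 0.5f, 0.1f);
      // minimize() adds the gradient ops, the accumulator slot variables,
      // and the Adagrad update to the graph, returning the training op
      // to run once per step in a Session.
      Op trainOp = optimizer.minimize(loss);
      System.out.println(trainOp.op().name());
    }
  }
}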
-
Field Details
-
ACCUMULATOR
public static final String ACCUMULATOR
-
LEARNING_RATE_DEFAULT
public static final float LEARNING_RATE_DEFAULT
-
INITIAL_ACCUMULATOR_DEFAULT
public static final float INITIAL_ACCUMULATOR_DEFAULT
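Constructors that omit the learning rate or the initial accumulator value fall back to these defaults; for example (a hypothetical fragment, assuming an existing Graph named graph):

// Uses LEARNING_RATE_DEFAULT and INITIAL_ACCUMULATOR_DEFAULT.
AdaGrad withDefaults = new AdaGrad(graph);
// Overrides the learning rate, keeps the default initial accumulator value.
AdaGrad customRate = new AdaGrad(graph, 0.05f);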
-
-
Constructor Details
-
AdaGrad
public AdaGrad(Graph graph)
Creates an AdaGrad Optimizer using LEARNING_RATE_DEFAULT for the learning rate and INITIAL_ACCUMULATOR_DEFAULT for the initial accumulator value.
Parameters:
graph - the TensorFlow Graph
-
AdaGrad
public AdaGrad(Graph graph, float learningRate)
Creates an AdaGrad Optimizer
Parameters:
graph - the TensorFlow Graph
learningRate - the learning rate
-
AdaGrad
Creates an AdaGrad Optimizer- Parameters:
graph- the TensorFlow GraphlearningRate- the learning rateinitialAccumulatorValue- Starting value for the accumulators, must be non-negative.- Throws:
IllegalArgumentException- if initialAccumulatorValue is negative
-
AdaGrad
public AdaGrad(Graph graph, String name, float learningRate)
Creates an AdaGrad Optimizer
Parameters:
graph - the TensorFlow Graph
name - the name for this Optimizer (defaults to 'Adagrad')
learningRate - the learning rate
-
AdaGrad
public AdaGrad(Graph graph, String name, float learningRate, float initialAccumulatorValue)
Creates an AdaGrad Optimizer
Parameters:
graph - the TensorFlow Graph
name - the name for this Optimizer (defaults to 'Adagrad')
learningRate - the learning rate
initialAccumulatorValue - Starting value for the accumulators, must be non-negative.
Throws:
IllegalArgumentException - if initialAccumulatorValue is negative
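A short construction sketch covering the overloads above, assuming the framework artifact is on the classpath; the optimizer name and numeric values are illustrative:

import org.tensorflow.Graph;
import org.tensorflow.framework.optimizers.AdaGrad;

public class AdaGradConstructionSketch {
  public static void main(String[] args) {
    try (Graph graph = new Graph()) {
      // Fully specified: custom name, learning rate, and accumulator start value.
      AdaGrad named = new AdaGrad(graph, "MyAdagrad", 0.01f, 0.1f);
      System.out.println(named);
      // A negative starting accumulator value is rejected.
      try {
        new AdaGrad(graph, 0.01f, -1.0f);
      } catch (IllegalArgumentException expected) {
        System.out.println("Rejected: " + expected.getMessage());
      }
    }
  }
}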
-
-
Method Details
-
createSlots
protected void createSlots(List<Output<? extends TType>> variables)
Creates an accumulator slot for each of the specified variables.
Overrides:
createSlots in class Optimizer
Parameters:
variables - The variables to create slots for.
-
applyDense
protected <T extends TType> Op applyDense(Ops deps, Output<T> gradient, Output<T> variable)
Generates the gradient update operations for the specific variable and gradient.
Specified by:
applyDense in class Optimizer
Type Parameters:
T - The type of the variable.
Parameters:
gradient - The gradient to use.
variable - The variable to update.
Returns:
An operand which applies the desired optimizer update to the variable.
-
toString
public String toString()
Overrides:
toString in class Object
-
getOptimizerName
public String getOptimizerName()
Get the name of the optimizer.
Specified by:
getOptimizerName in class Optimizer
Returns:
The optimizer name.
-