Class Optimizer
java.lang.Object
org.tensorflow.framework.optimizers.Optimizer
- Direct Known Subclasses:
AdaDelta, AdaGrad, AdaGradDA, Adam, Adamax, Ftrl, GradientDescent, Momentum, Nadam, RMSProp
Base class for gradient optimizers.
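A typical workflow builds a graph, constructs one of the concrete subclasses listed above, and repeatedly runs the op returned by minimize in a session. The sketch below is illustrative only (toy loss, variable name, and learning rate are not part of this class) and assumes the TensorFlow Java API (org.tensorflow and tensorflow-framework).

import org.tensorflow.Graph;
import org.tensorflow.Operand;
import org.tensorflow.Session;
import org.tensorflow.framework.optimizers.GradientDescent;
import org.tensorflow.framework.optimizers.Optimizer;
import org.tensorflow.ndarray.Shape;
import org.tensorflow.op.Op;
import org.tensorflow.op.Ops;
import org.tensorflow.op.core.Assign;
import org.tensorflow.op.core.Variable;
import org.tensorflow.types.TFloat32;

public class MinimizeExample {
  public static void main(String[] args) {
    try (Graph graph = new Graph()) {
      Ops tf = Ops.create(graph);

      // A scalar variable and a toy loss with its minimum at w == 3: (w - 3)^2.
      Variable<TFloat32> w = tf.variable(Shape.scalar(), TFloat32.class);
      Assign<TFloat32> initW = tf.assign(w, tf.constant(5.0f));
      Operand<TFloat32> loss = tf.math.square(tf.math.sub(w, tf.constant(3.0f)));

      // Any concrete subclass works here; GradientDescent is the simplest.
      Optimizer optimizer = new GradientDescent(graph, 0.1f);
      Op trainStep = optimizer.minimize(loss);

      try (Session session = new Session(graph)) {
        session.runner().addTarget(initW).run();       // initialize the variable
        for (int i = 0; i < 100; i++) {
          session.runner().addTarget(trainStep).run(); // one gradient step per run
        }
      }
    }
  }
}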
-
Nested Class Summary
Nested Classes
static class Optimizer.GradAndVar<T extends TType>
    A class that holds a paired gradient and variable.
static class Optimizer.Options
    Optional attributes for Optimizer.
-
Field Summary
Fields -
Constructor Summary
Constructors -
Method Summary
protected abstract <T extends TType> Op applyDense(Ops opDependencies, Output<T> gradient, Output<T> variable)
    Generates the gradient update operations for the specific variable and gradient.
Op applyGradients(List<Optimizer.GradAndVar<? extends TType>> gradsAndVars, String name)
    Applies gradients to variables.
<T extends TType> List<Optimizer.GradAndVar<?>> computeGradients(Operand<?> loss)
    Computes the gradients based on a loss operand.
static String createName(Output<? extends TType> variable, String slotName)
    Creates a name by combining a variable name and a slot name.
protected <T extends TType> void createSlot(Output<T> variable, String slotName, Operand<T> initializer)
    Creates a slot in the graph for the specified variable with the specified name.
protected void createSlots(List<Output<? extends TType>> variables)
    Performs a no-op slot creation method.
protected Op finish(List<Op> updateOperations, String name)
    Gathers up the update operations into a single op that can be used as a run target.
abstract String getOptimizerName()
    Gets the name of the optimizer.
getSlot(Output<T> variable, String slotName)
    Gets the slot associated with the specified variable and slot name.
final Ops getTF()
    Gets the Optimizer's Ops instance.
minimize(Operand<?> loss)
    Minimizes the loss by updating the variables.
minimize(Operand<?> loss, String name)
    Minimizes the loss by updating the variables.
prepare(String name)
    Returns a no-op prepare.
-
Field Details
-
VARIABLE_V2
- See Also:
-
globals
-
graph
The Graph this optimizer is operating on. -
tf
The ops builder for the graph.
-
-
Constructor Details
-
Optimizer
Builds an optimizer for the supplied graph. Uses the name from getOptimizerName() to name the operations.
- Parameters:
  graph - The graph to optimize.
-
Optimizer
-
-
Method Details
-
createName
Creates a name by combining a variable name and a slot name.
-
getTF
Gets the Optimizer's Ops instance.
-
minimize
Minimizes the loss by updating the variables.
-
minimize
Minimizes the loss by updating the variables.
-
computeGradients
Computes the gradients based on a loss operand.
- Type Parameters:
  T - the data type of the loss, gradients and variables.
- Parameters:
  loss - the loss operation.
- Returns:
  the computed gradients
-
applyGradients
Applies gradients to variables.
- Parameters:
  gradsAndVars - the list of (gradient, variable) pairs.
  name - the name of the apply gradients operation.
- Returns:
  an Op that applies the gradients to the variables.
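When gradients need to be inspected or transformed before they are applied (for example, gradient clipping), the two-step path through computeGradients and applyGradients can replace minimize. The sketch below reuses the same assumed TensorFlow Java API as the earlier example; the toy model and the operation name "apply_toy_grads" are illustrative only.

import java.util.ArrayList;
import java.util.List;
import org.tensorflow.Graph;
import org.tensorflow.Operand;
import org.tensorflow.Session;
import org.tensorflow.framework.optimizers.GradientDescent;
import org.tensorflow.framework.optimizers.Optimizer;
import org.tensorflow.ndarray.Shape;
import org.tensorflow.op.Op;
import org.tensorflow.op.Ops;
import org.tensorflow.op.core.Assign;
import org.tensorflow.op.core.Placeholder;
import org.tensorflow.op.core.Variable;
import org.tensorflow.types.TFloat32;
import org.tensorflow.types.family.TType;

public class ManualGradientsExample {
  public static void main(String[] args) {
    try (Graph graph = new Graph()) {
      Ops tf = Ops.create(graph);

      // Toy linear model: loss = (w * x - y)^2 for a single scalar weight.
      Placeholder<TFloat32> x = tf.placeholder(TFloat32.class);
      Placeholder<TFloat32> y = tf.placeholder(TFloat32.class);
      Variable<TFloat32> w = tf.variable(Shape.scalar(), TFloat32.class);
      Assign<TFloat32> initW = tf.assign(w, tf.constant(0.5f));
      Operand<TFloat32> loss = tf.math.square(tf.math.sub(tf.math.mul(w, x), y));

      GradientDescent optimizer = new GradientDescent(graph, 0.01f);

      // Step 1: compute (gradient, variable) pairs; they could be rescaled or
      // clipped here before being handed back to the optimizer.
      List<Optimizer.GradAndVar<? extends TType>> gradsAndVars = new ArrayList<>();
      for (Optimizer.GradAndVar<?> pair : optimizer.computeGradients(loss)) {
        gradsAndVars.add(pair);
      }

      // Step 2: build the op that applies the (possibly transformed) gradients.
      Op applyOp = optimizer.applyGradients(gradsAndVars, "apply_toy_grads");

      try (Session session = new Session(graph);
           TFloat32 xVal = TFloat32.scalarOf(2.0f);
           TFloat32 yVal = TFloat32.scalarOf(4.0f)) {
        session.runner().addTarget(initW).run();
        session.runner().feed(x, xVal).feed(y, yVal).addTarget(applyOp).run();
      }
    }
  }
}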
-
getSlot
Gets the slot associated with the specified variable and slot name.
-
createSlot
protected <T extends TType> void createSlot(Output<T> variable, String slotName, Operand<T> initializer)
Creates a slot in the graph for the specified variable with the specified name. Adds the slot's initializer to the graph's initializers, and the slot to the Optimizer's slot map.
- Type Parameters:
  T - The type of the variable.
- Parameters:
  variable - The variable to create the slot for.
  slotName - The name of the slot.
  initializer - The initializer for the slot.
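As a sketch of typical subclass usage, an override of createSlots (documented below) can create one zero-filled slot per trainable variable. The fragment assumes it sits inside an Optimizer subclass with the usual imports and uses the protected tf field documented above; the slot name "accumulator" and the helper createAccumulatorSlot are illustrative, not part of the API.

// Inside a hypothetical Optimizer subclass; "accumulator" is an arbitrary slot name.
private static final String ACCUMULATOR = "accumulator";

@Override
protected void createSlots(List<Output<? extends TType>> variables) {
  // One zero-filled slot per trainable variable.
  for (Output<? extends TType> v : variables) {
    createAccumulatorSlot(v);
  }
}

private <T extends TType> void createAccumulatorSlot(Output<T> v) {
  // A zero tensor with the variable's shape and type; createSlot registers its
  // initializer with the graph and records the slot for later lookup via getSlot.
  Operand<T> zeros = tf.fill(tf.shape(v), tf.dtypes.cast(tf.constant(0.0f), v.type()));
  createSlot(v, ACCUMULATOR, zeros);
}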
-
prepare
Returns a no-op prepare.
-
createSlots
Performs a no-op slot creation method.
-
applyDense
protected abstract <T extends TType> Op applyDense(Ops opDependencies, Output<T> gradient, Output<T> variable)
Generates the gradient update operations for the specific variable and gradient.
- Type Parameters:
  T - The type of the variable.
- Parameters:
  gradient - The gradient to use.
  variable - The variable to update.
- Returns:
  An operand which applies the desired optimizer update to the variable.
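To make the contract concrete, here is a minimal sketch of a complete subclass that implements plain gradient descent through applyDense. The class name SimpleSgd and the fixed learning rate are illustrative; the update is expressed with the raw training op tf.train.applyGradientDescent, which is an assumption about the surrounding TensorFlow Java API rather than part of this class.

import org.tensorflow.Graph;
import org.tensorflow.Operand;
import org.tensorflow.Output;
import org.tensorflow.framework.optimizers.Optimizer;
import org.tensorflow.op.Op;
import org.tensorflow.op.Ops;
import org.tensorflow.types.family.TType;

/** Minimal sketch of a concrete optimizer: plain gradient descent with a fixed rate. */
public class SimpleSgd extends Optimizer {

  private final float learningRate;

  public SimpleSgd(Graph graph, float learningRate) {
    super(graph);
    this.learningRate = learningRate;
  }

  @Override
  protected <T extends TType> Op applyDense(Ops opDependencies, Output<T> gradient, Output<T> variable) {
    // variable -= learningRate * gradient, expressed with the raw training op.
    Operand<T> alpha =
        opDependencies.dtypes.cast(opDependencies.constant(learningRate), gradient.type());
    return opDependencies.train.applyGradientDescent(variable, alpha, gradient);
  }

  @Override
  public String getOptimizerName() {
    return "SimpleSgd";
  }
}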
-
finish
Gathers up the update operations into a single op that can be used as a run target.
- Parameters:
  updateOperations - The update operations.
  name - The name of the run target.
- Returns:
  A NoOp with a control dependency on each update operation.
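Conceptually, the behavior described here amounts to the following sketch built from the tf field documented above; it is an illustration of the described semantics, not necessarily the exact library source.

// Sketch: a NoOp named `name` with a control dependency on every update op.
protected Op finish(List<Op> updateOperations, String name) {
  return tf.withName(name).withControlDependencies(updateOperations).noOp();
}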
-
getOptimizerName
Gets the name of the optimizer.
- Returns:
  The optimizer name.
-