Package org.tensorflow.framework.optimizers
Class                                  Description
AdaDelta                               Optimizer that implements the Adadelta algorithm.
AdaGrad                                Optimizer that implements the Adagrad algorithm.
AdaGradDA                              Optimizer that implements the Adagrad Dual-Averaging algorithm.
Adam                                   Optimizer that implements the Adam algorithm.
Adamax                                 Optimizer that implements the Adamax algorithm.
Ftrl                                   Optimizer that implements the FTRL algorithm.
GradientDescent                        Basic stochastic gradient descent optimizer.
Momentum                               Stochastic gradient descent plus momentum, either Nesterov or traditional.
Nadam                                  Optimizer that implements the NAdam algorithm.
Optimizer                              Base class for gradient optimizers.
Optimizer.GradAndVar<T extends TType>  A class that holds a paired gradient and variable.
Optimizer.Options                      Optional attributes for Optimizer.
Optimizers                             Enumerator used to create a new Optimizer with default parameters.
RMSProp                                Optimizer that implements the RMSProp algorithm.
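A minimal usage sketch, not part of this summary page, assuming the tensorflow-framework artifact and a recent TensorFlow Java release: it builds a toy quadratic loss, attaches an Adam optimizer to the graph, and repeatedly runs the training op returned by minimize(). The constructor and method signatures shown (new Adam(graph, learningRate), Optimizer.minimize(loss), tf.init()) reflect the public API of recent versions and may differ in yours.

import org.tensorflow.Graph;
import org.tensorflow.Operand;
import org.tensorflow.Session;
import org.tensorflow.framework.optimizers.Adam;
import org.tensorflow.framework.optimizers.Optimizer;
import org.tensorflow.op.Op;
import org.tensorflow.op.Ops;
import org.tensorflow.op.core.Variable;
import org.tensorflow.types.TFloat32;

public class AdamExample {
  public static void main(String[] args) {
    try (Graph graph = new Graph()) {
      Ops tf = Ops.create(graph);

      // One trainable variable and a toy quadratic loss: (w - 3)^2.
      Variable<TFloat32> w = tf.variable(tf.constant(0f));
      Operand<TFloat32> loss = tf.math.square(tf.math.sub(w, tf.constant(3f)));

      // minimize() adds the gradient and variable-update ops to the graph
      // and returns the op that performs one training step.
      Optimizer optimizer = new Adam(graph, 0.1f); // learning rate 0.1 (assumed constructor overload)
      Op minimize = optimizer.minimize(loss);

      try (Session session = new Session(graph)) {
        session.run(tf.init()); // initialize variables and optimizer slot variables
        for (int i = 0; i < 200; i++) {
          session.run(minimize); // one Adam step toward w = 3
        }
      }
    }
  }
}

The same pattern applies to the other classes in this package; only the constructor changes (for example, new GradientDescent(graph, 0.01f) or new RMSProp(graph)), since they all inherit minimize() from the Optimizer base class.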