Class AdaGrad

java.lang.Object
org.tensorflow.framework.optimizers.Optimizer
org.tensorflow.framework.optimizers.AdaGrad

public class AdaGrad extends Optimizer
Optimizer that implements the Adagrad algorithm.

Adagrad is an optimizer with parameter-specific learning rates, which are adapted relative to how frequently a parameter gets updated during training. The more updates a parameter receives, the smaller the updates.
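A minimal usage sketch is shown below. It assumes the graph-mode framework API; the tf.variable(init) helper, tf.init(), and Session.run(Op) calls, the toy loss, and the 0.1f learning rate are illustrative choices for this sketch, not part of this class's API.

  import org.tensorflow.Graph;
  import org.tensorflow.Operand;
  import org.tensorflow.Session;
  import org.tensorflow.framework.optimizers.AdaGrad;
  import org.tensorflow.op.Op;
  import org.tensorflow.op.Ops;
  import org.tensorflow.op.core.Variable;
  import org.tensorflow.types.TFloat32;

  public class AdaGradExample {
    public static void main(String[] args) {
      try (Graph graph = new Graph()) {
        Ops tf = Ops.create(graph);

        // Toy objective: minimize (w - 3)^2 for a single scalar weight.
        Variable<TFloat32> w = tf.variable(tf.constant(0f));
        Operand<TFloat32> loss = tf.math.square(tf.math.sub(w, tf.constant(3f)));

        // Build the AdaGrad update op with an explicit learning rate.
        AdaGrad optimizer = new AdaGrad(graph, 0.1f);
        Op minimize = optimizer.minimize(loss);

        try (Session session = new Session(graph)) {
          // Initialize the variable and the optimizer's accumulator slots.
          session.run(tf.init());
          for (int step = 0; step < 100; step++) {
            session.run(minimize); // one AdaGrad step per run
          }
        }
      }
    }
  }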


  • Constructor Details

    • AdaGrad

      public AdaGrad(Graph graph)
      Creates an AdaGrad Optimizer using the default learning rate and initial accumulator value.
      Parameters:
      graph - the TensorFlow Graph
    • AdaGrad

      public AdaGrad(Graph graph, float learningRate)
      Creates an AdaGrad Optimizer using the default initial accumulator value.
      Parameters:
      graph - the TensorFlow Graph
      learningRate - the learning rate
    • AdaGrad

      public AdaGrad(Graph graph, float learningRate, float initialAccumulatorValue)
      Creates an AdaGrad Optimizer.
      Parameters:
      graph - the TensorFlow Graph
      learningRate - the learning rate
      initialAccumulatorValue - the starting value for the accumulators; must be non-negative
      Throws:
      IllegalArgumentException - if initialAccumulatorValue is negative
    • AdaGrad

      public AdaGrad(Graph graph, String name, float learningRate)
      Creates an AdaGrad Optimizer with a custom name, using the default initial accumulator value.
      Parameters:
      graph - the TensorFlow Graph
      name - the name for this Optimizer (defaults to 'Adagrad')
      learningRate - the learning rate
    • AdaGrad

      public AdaGrad(Graph graph, String name, float learningRate, float initialAccumulatorValue)
      Creates an AdaGrad Optimizer with a custom name; a brief usage sketch follows the constructor list.
      Parameters:
      graph - the TensorFlow Graph
      name - the name for this Optimizer (defaults to 'Adagrad')
      learningRate - the learning rate
      initialAccumulatorValue - the starting value for the accumulators; must be non-negative
      Throws:
      IllegalArgumentException - if initialAccumulatorValue is negative
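    The fully specified constructor can be called directly. In this minimal sketch, the operation name "TrainAdagrad", the 0.05f learning rate, and the 0.1f accumulator seed are illustrative values, not library defaults:

      Graph graph = new Graph();
      // Name the optimizer's ops, set the learning rate, and start every
      // accumulator at 0.1f instead of the default starting value.
      AdaGrad optimizer = new AdaGrad(graph, "TrainAdagrad", 0.05f, 0.1f);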
  • Method Details

    • createSlots

      protected void createSlots(List<Output<? extends TType>> variables)
      Creates an accumulator slot for each of the given variables, initialized to the initial accumulator value.
      Overrides:
      createSlots in class Optimizer
      Parameters:
      variables - The variables to create slots for.
    • applyDense

      protected <T extends TType> Op applyDense(Ops deps, Output<T> gradient, Output<T> variable)
      Generates the gradient update operations for the specified variable and gradient.
      Specified by:
      applyDense in class Optimizer
      Type Parameters:
      T - The type of the variable.
      Parameters:
      deps - the Ops used to build the update operations
      gradient - The gradient to use.
      variable - The variable to update.
      Returns:
      An operand which applies the desired optimizer update to the variable.
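      For reference, the textbook Adagrad update that this method builds for a dense variable can be written as follows (a sketch in standard notation, not copied from the implementation: g_t is the gradient, \eta the learning rate, and the accumulator a_0 starts at initialAccumulatorValue):

        a_t = a_{t-1} + g_t^{2}, \qquad \theta_t = \theta_{t-1} - \frac{\eta\, g_t}{\sqrt{a_t}}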
    • toString

      public String toString()
      Overrides:
      toString in class Object
    • getOptimizerName

      public String getOptimizerName()
      Gets the name of the optimizer.
      Specified by:
      getOptimizerName in class Optimizer
      Returns:
      The optimizer name.