Class Adamax

java.lang.Object
org.tensorflow.framework.optimizers.Optimizer
org.tensorflow.framework.optimizers.Adamax

public class Adamax extends Optimizer
Optimizer that implements the Adamax algorithm.

It is a variant of Adam based on the infinity norm. Default parameters follow those provided in the paper. Adamax is sometimes superior to Adam, especially in models with embeddings.
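
In sketch form, the per-step update Adamax performs is the following (from Section 7 of the Adam paper; the paper's rule omits epsilon, which implementations add to the denominator for numerical stability):

  m_t = \beta_1 m_{t-1} + (1 - \beta_1) g_t
  u_t = \max(\beta_2 u_{t-1}, |g_t|)
  \theta_t = \theta_{t-1} - \frac{\alpha}{1 - \beta_1^t} \cdot \frac{m_t}{u_t + \epsilon}

where g_t is the gradient, m_t the first-moment estimate (decay rate betaOne), u_t the exponentially weighted infinity norm (decay rate betaTwo), and alpha the learning rate.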

See Also:
  • Kingma et al., 2014, Adam: A Method for Stochastic Optimization (https://arxiv.org/abs/1412.6980); Section 7 introduces Adamax

  • Constructor Details

    • Adamax

      public Adamax(Graph graph)
      Creates an Optimizer that implements the Adamax algorithm.
      Parameters:
      graph - the TensorFlow graph
    • Adamax

      public Adamax(Graph graph, String name)
      Creates an Optimizer that implements the Adamax algorithm.
      Parameters:
      graph - the TensorFlow graph
      name - the name for the operations created when applying gradients. Defaults to "Adamax".
    • Adamax

      public Adamax(Graph graph, float learningRate)
      Creates an Optimizer that implements the Adamax algorithm.
      Parameters:
      graph - the TensorFlow graph
      learningRate - The learning rate.
    • Adamax

      public Adamax(Graph graph, String name, float learningRate)
      Creates an Optimizer that implements the Adamax algorithm.
      Parameters:
      graph - the TensorFlow graph
      name - the name for the operations created when applying gradients. Defaults to "Adamax".
      learningRate - The learning rate.
    • Adamax

      public Adamax(Graph graph, float learningRate, float betaOne, float betaTwo, float epsilon)
      Creates an Optimizer that implements the Adamax algorithm.
      Parameters:
      graph - the TensorFlow graph
      learningRate - The learning rate.
      betaOne - The exponential decay rate for the 1st moment estimates.
      betaTwo - The exponential decay rate for the exponentially weighted infinity norm.
      epsilon - A small constant for numerical stability.
    • Adamax

      public Adamax(Graph graph, String name, float learningRate, float betaOne, float betaTwo, float epsilon)
      Creates an Optimizer that implements the Adamax algorithm.
      Parameters:
      graph - the TensorFlow graph
      name - the name for the operations created when applying gradients. Defaults to "Adamax".
      learningRate - The learning rate.
      betaOne - The exponential decay rate for the 1st moment estimates.
      betaTwo - The exponential decay rate for the exponentially weighted infinity norm.
      epsilon - A small constant for numerical stability.
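      As a usage sketch (a hypothetical example: the hyperparameter values are illustrative rather than a claim about this class's defaults, and minimize is the inherited Optimizer API):

        import org.tensorflow.Graph;
        import org.tensorflow.framework.optimizers.Adamax;

        public class AdamaxExample {
          public static void main(String[] args) {
            try (Graph graph = new Graph()) {
              // Fully specified constructor: name, learning rate, betaOne, betaTwo, epsilon.
              Adamax adamax = new Adamax(graph, "Adamax", 0.002f, 0.9f, 0.999f, 1e-7f);
              // Build a model in the graph, then create a run target with the
              // inherited Optimizer API, e.g.: Op trainOp = adamax.minimize(loss);
            }
          }
        }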
  • Method Details

    • prepare

      protected Optional<Op> prepare(String scopeName)
      Returns a No-op prepare.
      Overrides:
      prepare in class Optimizer
      Parameters:
      scopeName - The scope name to use for any variable creations.
      Returns:
      a No-op to prepare this optimizer, or empty if none.
    • createSlots

      protected void createSlots(List<Output<? extends TType>> variables)
      Creates the first-moment and infinity-norm slots for each of the given variables.
      Overrides:
      createSlots in class Optimizer
      Parameters:
      variables - The variables to create slots for.
    • applyDense

      protected <T extends TType> Op applyDense(Ops deps, Output<T> gradient, Output<T> variable)
      Generates the gradient update operations for the specific variable and gradient.
      Specified by:
      applyDense in class Optimizer
      Type Parameters:
      T - The type of the variable.
      Parameters:
      deps - the Ops instance used to build the update, carrying any control dependencies required beforehand.
      gradient - The gradient to use.
      variable - The variable to update.
      Returns:
      An operand which applies the desired optimizer update to the variable.
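      Elementwise, the update this method builds corresponds to the following plain-Java arithmetic (an illustrative sketch over arrays, not the actual graph code; beta1Power stands for betaOne raised to the current step count):

        // Elementwise Adamax step over plain arrays (sketch only).
        public final class AdamaxStep {
          static void apply(float[] var, float[] m, float[] u, float[] grad,
                            float lr, float beta1, float beta2, float epsilon,
                            float beta1Power) {
            for (int i = 0; i < var.length; i++) {
              m[i] = beta1 * m[i] + (1 - beta1) * grad[i];               // first-moment estimate
              u[i] = Math.max(beta2 * u[i], Math.abs(grad[i]));          // weighted infinity norm
              var[i] -= lr / (1 - beta1Power) * m[i] / (u[i] + epsilon); // parameter update
            }
          }
        }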
    • finish

      protected Op finish(List<Op> updateOperations, String name)
      Gathers up the update operations into a single op that can be used as a run target.
      Overrides:
      finish in class Optimizer
      Parameters:
      updateOperations - The update operations.
      name - The name of the run target.
      Returns:
      A NoOp with a control dependency on each update operation.
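      The grouping described here can be sketched with the core Ops API (a sketch assuming a Graph-backed Ops instance; withControlDependencies and noOp are methods of org.tensorflow.op.Ops):

        import java.util.List;
        import org.tensorflow.Graph;
        import org.tensorflow.op.Op;
        import org.tensorflow.op.Ops;
        import org.tensorflow.op.core.NoOp;

        public final class GroupUpdates {
          // Returns a NoOp that executes only after every update operation
          // has run, so it can serve as a single run target.
          static NoOp group(Graph graph, List<Op> updateOperations, String name) {
            Ops tf = Ops.create(graph);
            return tf.withSubScope(name).withControlDependencies(updateOperations).noOp();
          }
        }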
    • getOptimizerName

      public String getOptimizerName()
      Gets the name of the optimizer.
      Specified by:
      getOptimizerName in class Optimizer
      Returns:
      The optimizer name.