Uses of Class
org.tensorflow.op.train.ApplyAdagradDa
Packages that use ApplyAdagradDa

org.tensorflow.op
org.tensorflow.op.train
Uses of ApplyAdagradDa in org.tensorflow.op
Methods in org.tensorflow.op that return ApplyAdagradDa

<T extends TType> ApplyAdagradDa<T>
TrainOps.applyAdagradDa(Operand<T> var, Operand<T> gradientAccumulator, Operand<T> gradientSquaredAccumulator, Operand<T> grad, Operand<T> lr, Operand<T> l1, Operand<T> l2, Operand<TInt64> globalStep, ApplyAdagradDa.Options... options)
Update '*var' according to the proximal adagrad scheme.
Uses of ApplyAdagradDa in org.tensorflow.op.train
Subclasses with type arguments of type ApplyAdagradDa in org.tensorflow.op.train

Methods in org.tensorflow.op.train that return ApplyAdagradDa

static <T extends TType> ApplyAdagradDa<T>
ApplyAdagradDa.create(Scope scope, Operand<T> var, Operand<T> gradientAccumulator, Operand<T> gradientSquaredAccumulator, Operand<T> grad, Operand<T> lr, Operand<T> l1, Operand<T> l2, Operand<TInt64> globalStep, ApplyAdagradDa.Options... options)
Factory method to create a class wrapping a new ApplyAdagradDA operation.
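As a usage illustration, the following is a minimal sketch of building the op through the `TrainOps` accessor (`tf.train`). It assumes the TensorFlow Java API (`org.tensorflow` artifact); the variable names, shapes, and hyperparameter values are illustrative only, and the gradient would normally come from automatic differentiation rather than a constant:

```java
import org.tensorflow.Graph;
import org.tensorflow.op.Ops;
import org.tensorflow.op.core.Variable;
import org.tensorflow.op.train.ApplyAdagradDa;
import org.tensorflow.types.TFloat32;

public class ApplyAdagradDaExample {
  public static void main(String[] args) {
    try (Graph g = new Graph()) {
      Ops tf = Ops.create(g);

      // Trainable variable plus the two DA accumulators, all the same shape.
      // (Illustrative initial values; accumulators typically start at zero.)
      Variable<TFloat32> var = tf.variable(tf.constant(new float[] {1f, 2f}));
      Variable<TFloat32> gradAccum = tf.variable(tf.constant(new float[] {0f, 0f}));
      Variable<TFloat32> gradSqAccum = tf.variable(tf.constant(new float[] {0f, 0f}));

      // A stand-in gradient for the current step.
      var grad = tf.constant(new float[] {0.1f, -0.2f});

      // Scalar hyperparameters: learning rate, L1 and L2 regularization strength,
      // and the int64 global training step used by the dual-averaging update.
      var lr = tf.constant(0.01f);
      var l1 = tf.constant(0.001f);
      var l2 = tf.constant(0.001f);
      var globalStep = tf.constant(1L);

      // Build the op that updates '*var' per the proximal adagrad (DA) scheme.
      ApplyAdagradDa<TFloat32> update = tf.train.applyAdagradDa(
          var, gradAccum, gradSqAccum, grad, lr, l1, l2, globalStep,
          ApplyAdagradDa.useLocking(true));

      // 'update' yields the same tensor as 'var'; running it in a Session
      // applies the update in place.
    }
  }
}
```

The equivalent static form is `ApplyAdagradDa.create(scope, ...)` with an explicit `Scope`; the `tf.train.applyAdagradDa(...)` accessor above simply forwards to it using the `Ops` instance's scope.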