Class ApplyAdagradDa<T extends TType>
java.lang.Object
org.tensorflow.op.RawOp
org.tensorflow.op.train.ApplyAdagradDa<T>
Nested Class Summary

Nested Classes:

static class  ApplyAdagradDa.Inputs<T extends TType>
static class  ApplyAdagradDa.Options
              Optional attributes for ApplyAdagradDa
Field Summary
Fields:

static final String  OP_NAME
                     The name of this op, as known by the TensorFlow core engine
Constructor Summary
Constructors
Method Summary
Output<T>  asOutput()
           Returns the symbolic handle of the tensor.

static <T extends TType> ApplyAdagradDa<T>
           create(Scope scope, Operand<T> var, Operand<T> gradientAccumulator, Operand<T> gradientSquaredAccumulator, Operand<T> grad, Operand<T> lr, Operand<T> l1, Operand<T> l2, Operand<TInt64> globalStep, ApplyAdagradDa.Options... options)
           Factory method to create a class wrapping a new ApplyAdagradDA operation.

Output<T>  out()
           Gets out.

static ApplyAdagradDa.Options
           useLocking(Boolean useLocking)
           Sets the useLocking option.
Field Details

OP_NAME

public static final String OP_NAME

The name of this op, as known by the TensorFlow core engine.
Constructor Details

ApplyAdagradDa
Method Details
create

@Endpoint(describeByClass=true)
public static <T extends TType> ApplyAdagradDa<T> create(Scope scope, Operand<T> var, Operand<T> gradientAccumulator, Operand<T> gradientSquaredAccumulator, Operand<T> grad, Operand<T> lr, Operand<T> l1, Operand<T> l2, Operand<TInt64> globalStep, ApplyAdagradDa.Options... options)

Factory method to create a class wrapping a new ApplyAdagradDA operation.

Type Parameters:
    T - the data type for the ApplyAdagradDA output and operands
Parameters:
    scope - current scope
    var - Should be from a Variable().
    gradientAccumulator - Should be from a Variable().
    gradientSquaredAccumulator - Should be from a Variable().
    grad - The gradient.
    lr - Scaling factor. Must be a scalar.
    l1 - L1 regularization. Must be a scalar.
    l2 - L2 regularization. Must be a scalar.
    globalStep - Training step number. Must be a scalar.
    options - carries optional attribute values
Returns:
    a new instance of ApplyAdagradDa
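To make the accumulator parameters above concrete, here is a plain-Java sketch of a single scalar Adagrad-DA step. It mirrors the dual-averaging update this kernel is documented to perform (accumulate grad and grad squared, soft-threshold the gradient sum by l1, then recompute var from the accumulators); the class and method names are invented for illustration, this is not the TensorFlow Java API, and the exact formula should be checked against the TensorFlow kernel sources.

```java
// Hypothetical plain-Java sketch of one scalar Adagrad-DA step; not the
// TensorFlow Java API. Names are invented for illustration only.
public class AdagradDaSketch {
    double var;          // the optimized variable (op input: var)
    double gradAccum;    // running sum of gradients (gradientAccumulator)
    double gradSqAccum;  // running sum of squared gradients (gradientSquaredAccumulator)

    // One update with learning rate lr, regularizers l1/l2, and step counter
    // globalStep, mirroring the assumed dual-averaging scheme.
    void step(double grad, double lr, double l1, double l2, long globalStep) {
        gradAccum += grad;
        gradSqAccum += grad * grad;
        // L1 soft-thresholding of the accumulated gradient; skipped when l1 <= 0.
        double tmp = (l1 > 0)
                ? Math.signum(gradAccum) * Math.max(Math.abs(gradAccum) - l1 * globalStep, 0.0)
                : gradAccum;
        // var is recomputed from the accumulators rather than updated in place.
        var = -lr * tmp / (l2 * globalStep * lr + Math.sqrt(gradSqAccum));
    }
}
```

Note the structural point the parameter list implies: unlike plain Adagrad, var is derived from the two accumulators and globalStep each step, which is why all three must be passed to create.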
useLocking

public static ApplyAdagradDa.Options useLocking(Boolean useLocking)

Sets the useLocking option.

Parameters:
    useLocking - If True, updating of the var and accum tensors will be protected by a lock; otherwise the behavior is undefined, but may exhibit less contention.
Returns:
    this Options instance.
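The trade-off useLocking describes can be illustrated in plain Java: a lock-guarded read-modify-write is race-free under concurrent updates, at the cost of contention on the lock. This is a hypothetical sketch of that general pattern, not TensorFlow's internal locking.

```java
// Hypothetical illustration of the useLocking trade-off; not TF internals.
// The guarded update is atomic with respect to other locked callers.
public class LockedAccumulator {
    private double value;
    private final Object lock = new Object();

    // Analogous to useLocking = true: the read-modify-write is protected.
    public void addLocked(double grad) {
        synchronized (lock) {
            value += grad;
        }
    }

    public double value() {
        synchronized (lock) {
            return value;
        }
    }
}
```

With the lock omitted (the useLocking = false analogue), two threads could interleave their read-modify-write steps and lose an update, which is why the doc warns the unlocked behavior is undefined.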
out

public Output<T> out()

Gets out.
asOutput
public Output<T> asOutput()

Description copied from interface: Operand

Returns the symbolic handle of the tensor. Inputs to TensorFlow operations are outputs of another TensorFlow operation. This method is used to obtain a symbolic handle that represents the computation of the input.