Package org.tensorflow.framework.activations
Class Summary

AbstractActivation: Abstract base class for Activations.
Activation: Interface for Activations.
Activations: Enumeration for creating Activations based on an activation name, using either an empty constructor or a constructor that takes a Map object containing the Activation's state.
ELU: Exponential linear unit.
Exponential: Exponential activation function.
GELU: Applies the Gaussian error linear unit (GELU) activation function.
HardSigmoid: Hard sigmoid activation.
Linear: Linear activation function (pass-through).
ReLU: Rectified Linear Unit (ReLU) activation.
SELU: Scaled Exponential Linear Unit (SELU).
Sigmoid: Sigmoid activation.
Softmax: Softmax converts a real vector to a vector of categorical probabilities.
Softplus: Softplus activation function, softplus(x) = log(exp(x) + 1).
Softsign: Softsign activation function, softsign(x) = x / (abs(x) + 1).
Swish: Swish activation function.
Tanh: Hyperbolic tangent activation function.
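For orientation, the sketch below shows one way these classes are typically used: an activation instance is constructed and then applied to an Operand within an Ops execution environment. This is a minimal, hedged example that assumes the interface-style API in which call accepts an Ops instance and the input Operand; constructor arguments and generic signatures differ between tensorflow-java releases, so treat it as illustrative rather than canonical.

import org.tensorflow.Graph;
import org.tensorflow.Operand;
import org.tensorflow.framework.activations.ReLU;
import org.tensorflow.framework.activations.Sigmoid;
import org.tensorflow.op.Ops;
import org.tensorflow.types.TFloat32;

public class ActivationSketch {
  public static void main(String[] args) {
    try (Graph graph = new Graph()) {
      Ops tf = Ops.create(graph);

      // Input with negative and positive values so the effect of each activation is visible.
      Operand<TFloat32> input = tf.constant(new float[] {-2f, -1f, 0f, 1f, 2f});

      // ReLU zeroes out negative entries: conceptually [0, 0, 0, 1, 2].
      // Assumes the no-arg constructor and a call(Ops, Operand) signature.
      ReLU relu = new ReLU();
      Operand<TFloat32> rectified = relu.call(tf, input);

      // Sigmoid squashes each value into the open interval (0, 1).
      Sigmoid sigmoid = new Sigmoid();
      Operand<TFloat32> squashed = sigmoid.call(tf, input);
    }
  }
}

The same pattern applies to the other classes in the summary above; only the constructor parameters (for example alpha or threshold values on parameterized activations) differ.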