Class GELU
java.lang.Object
org.tensorflow.framework.activations.AbstractActivation
org.tensorflow.framework.activations.GELU
- All Implemented Interfaces:
Activation
Applies the Gaussian error linear unit (GELU) activation function.
The Gaussian error linear unit (GELU) computes x * P(X <= x), where P(X) ~ N(0, 1). The GELU nonlinearity weights inputs by their value, rather than gating them by their sign as in ReLU.
For example:
Operand<TFloat32> x = tf.constant(new float[] {-3.0f, -1.0f, 0.0f, 1.0f, 3.0f});
GELU gelu = new GELU();
Operand<TFloat32> y = gelu.call(tf, x);
// output: [-0.00404951f, -0.15865529f, 0.0f, 0.8413447f, 2.9959507f]
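The exact form above weights each input by the standard normal CDF. When approximate is enabled, the commonly used tanh approximation of GELU is applied instead. The snippet below is a minimal plain-Java sketch of that approximation for reference only; the class GeluFormulaSketch and method geluApprox are illustrative names, not part of this library, and the computation does not go through TensorFlow ops.
public class GeluFormulaSketch {
  // Tanh approximation: 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
  static double geluApprox(double x) {
    return 0.5 * x * (1.0 + Math.tanh(Math.sqrt(2.0 / Math.PI) * (x + 0.044715 * x * x * x)));
  }

  public static void main(String[] args) {
    for (double x : new double[] {-3.0, -1.0, 0.0, 1.0, 3.0}) {
      System.out.printf("geluApprox(%.1f) = %.8f%n", x, geluApprox(x));
    }
    // The results are close to, but not identical to, the exact output shown above,
    // because the exact form uses the normal CDF rather than the tanh approximation.
  }
}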
Field Summary
Fields: NAME
Fields inherited from class AbstractActivation: NAME_KEY, tf
Constructor Summary
Constructors: GELU(), GELU(boolean approximate), and a constructor that creates a GELU from a configuration map.
Method Summary
call(tf, input): Gets the calculation operation for the activation.
gelu(tf, input): Applies the Gaussian error linear unit (GELU) activation function with approximate set to false.
gelu(tf, input, approximate): Applies the Gaussian error linear unit (GELU) activation function.
getConfig(): Gets a configuration map with an entry "approximate" whose value is set to approximate.
getName(): Gets the name of the activation as known by the TensorFlow Engine.
isApproximate(): Gets the flag indicating whether approximation is enabled (boolean).
Methods inherited from class AbstractActivation:
checkClassName, checkConfigKeys, getDefaultConfig, getTF, setTF
Field Details
NAME
Constructor Details
GELU
public GELU()
Creates a Gaussian error linear unit (GELU) activation.
GELU
public GELU(boolean approximate)
Creates a Gaussian error linear unit (GELU) activation.
Parameters:
approximate - whether to enable approximation.
GELU
Creates a GELU activation from a config map.
Parameters:
config - the configuration map; if the map contains an entry for "approximate", that value is used, otherwise false is used.
Throws:
IllegalArgumentException - if the configuration contains unsupported keys for this class, or if the value for the name key does not match the name for the Activation
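A minimal sketch of configuration-based construction; the exact parameter type of this constructor is not shown above, so the Map<String, Object> form below is an assumption:
java.util.Map<String, Object> config = new java.util.HashMap<>();
config.put("approximate", true);   // optional; defaults to false when absent
GELU gelu = new GELU(config);      // assumes a Map-typed constructor parameter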
Method Details
gelu
Applies the Gaussian error linear unit (GELU) activation function with approximate set to false.
Example Usage:
Operand<TFloat32> input = ...;
Operand<TFloat32> result = GELU.gelu(tf, input);
Type Parameters:
T - the data type for the input
Parameters:
tf - the TensorFlow Ops
input - the input
Returns:
the GELU activation
gelu
Applies the Gaussian error linear unit (GELU) activation function.
Example Usage:
Operand<TFloat32> input = ...;
Operand<TFloat32> result = GELU.gelu(tf, input, true);
Type Parameters:
T - the data type for the input
Parameters:
tf - the TensorFlow Ops
input - the input
approximate - whether to enable approximation.
Returns:
the GELU activation
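The static gelu helpers shown above can be exercised against a graph in the usual way. The following is a self-contained sketch assuming the standard org.tensorflow.Graph and org.tensorflow.op.Ops entry points; it is illustrative rather than taken from the library's tests.
import org.tensorflow.Graph;
import org.tensorflow.Operand;
import org.tensorflow.framework.activations.GELU;
import org.tensorflow.op.Ops;
import org.tensorflow.types.TFloat32;

public class GeluUsageSketch {
  public static void main(String[] args) {
    try (Graph graph = new Graph()) {
      Ops tf = Ops.create(graph);
      Operand<TFloat32> input = tf.constant(new float[] {-3.0f, -1.0f, 0.0f, 1.0f, 3.0f});
      // Two-argument overload: approximate is false (exact GELU).
      Operand<TFloat32> exact = GELU.gelu(tf, input);
      // Three-argument overload: enable the tanh approximation.
      Operand<TFloat32> approx = GELU.gelu(tf, input, true);
    }
  }
}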
getConfig
Gets a configuration map with an entry "approximate" whose value is set to approximate.
Specified by:
getConfig in class AbstractActivation
Returns:
the configuration map
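A short sketch of how the flag flows into the configuration map, assuming getConfig() returns a Map<String, Object> containing the "approximate" entry described above (any additional keys added by AbstractActivation are not shown):
GELU gelu = new GELU(true);
boolean approx = gelu.isApproximate();                     // true
java.util.Map<String, Object> config = gelu.getConfig();
// config contains the entry "approximate" -> true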
call
Gets the calculation operation for the activation.
getName
Gets the name of the activation as known by the TensorFlow Engine.
Specified by:
getName in class AbstractActivation
Returns:
the name of the activation as known by the TensorFlow Engine
isApproximate
public boolean isApproximate()
Gets the flag indicating whether approximation is enabled.
Returns:
the approximation flag