Class SELU
java.lang.Object
org.tensorflow.framework.activations.AbstractActivation
org.tensorflow.framework.activations.SELU
- All Implemented Interfaces:
Activation
Scaled Exponential Linear Unit (SELU).
The Scaled Exponential Linear Unit (SELU) activation function is defined as:
if x > 0: return scale * x
if x < 0: return scale * alpha * (exp(x) - 1)
where alpha and scale are pre-defined constants (alpha=1.67326324 and
scale=1.05070098).
Basically, the SELU activation function multiplies scale (> 1) with the output of
the ELU activation function to ensure a slope larger than one for positive inputs.
The values of alpha and scale are chosen so that the mean and variance of the
inputs are preserved between two consecutive layers as long as the weights are initialized
correctly (see the LeCun initializer with a Normal Distribution) and
the number of input units is "large enough".
Notes: To be used together with the LeCun initializer with Normal Distribution.
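The formula and constants above can be checked by hand with a plain-Java sketch. This is not the TensorFlow class itself, just an illustration of the documented math; the class name SeluSketch is made up for this example.

```java
// Plain-Java sketch of the documented SELU formula (not the TensorFlow class),
// using the pre-defined constants from the description above.
public class SeluSketch {
    static final double ALPHA = 1.67326324;
    static final double SCALE = 1.05070098;

    // scale * x for x > 0, scale * alpha * (exp(x) - 1) otherwise
    static double selu(double x) {
        return x > 0 ? SCALE * x : SCALE * ALPHA * (Math.exp(x) - 1);
    }

    public static void main(String[] args) {
        System.out.println(selu(1.0));   // scale * 1.0 = 1.05070098 (slope > 1 for positive inputs)
        System.out.println(selu(0.0));   // 0.0
        System.out.println(selu(-1.0));  // scale * alpha * (e^-1 - 1), a negative value bounded below
    }
}
```

Note that for large negative x the output saturates toward -scale * alpha, which is what bounds the variance of the activations.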
Field Summary
Fields
Fields inherited from class AbstractActivation:
NAME_KEY, tf
Constructor Summary
Constructors -
Method Summary
call - Gets the calculation operation for the activation.
getConfig() - Gets a configuration map.
getName() - Gets the name of the activation as known by the TensorFlow Engine.
selu - Applies the Scaled Exponential Linear Unit (SELU) activation function.
Methods inherited from class AbstractActivation:
checkClassName, checkConfigKeys, getDefaultConfig, getTF, setTF
-
Field Details
-
NAME
-
Constructor Details
-
SELU
public SELU()
Creates a Scaled Exponential Linear Unit (SELU) activation.
SELU
Creates a new SELU from a configuration Map.
Parameters:
config - the configuration map; this class does not use any of the entries in the configuration map
Throws:
IllegalArgumentException - if the configuration contains unsupported keys for this class, or if the value for the name key does not match the name for the Activation
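The validation contract described above (reject unsupported keys, reject a mismatched name) can be sketched in plain Java. This is an illustration only, not the library's implementation; the key string "name" and the class SeluConfigCheck are assumptions made for this example.

```java
import java.util.Map;
import java.util.Set;

// Illustrative sketch of the configuration-map validation described in the
// constructor's Throws clause. The "name" key mirrors AbstractActivation's
// name key by assumption; it is not taken from the library source.
public class SeluConfigCheck {
    static final String NAME_KEY = "name";          // assumed key name
    static final Set<String> SUPPORTED = Set.of(NAME_KEY);

    static void checkConfig(Map<String, Object> config, String expectedName) {
        for (String key : config.keySet()) {
            if (!SUPPORTED.contains(key)) {
                // unsupported key for this class
                throw new IllegalArgumentException("Unsupported key: " + key);
            }
        }
        Object name = config.get(NAME_KEY);
        if (name != null && !expectedName.equals(name)) {
            // value for the name key does not match the activation's name
            throw new IllegalArgumentException("Name mismatch: " + name);
        }
    }

    public static void main(String[] args) {
        checkConfig(Map.of(NAME_KEY, "selu"), "selu");   // accepted
        try {
            checkConfig(Map.of(NAME_KEY, "elu"), "selu"); // rejected
        } catch (IllegalArgumentException e) {
            System.out.println("rejected: " + e.getMessage());
        }
    }
}
```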
-
Method Details
-
selu
Applies the Scaled Exponential Linear Unit (SELU) activation function.
Example Usage:
    Operand<TFloat32> input = ...;
    Operand<TFloat32> result = SELU.selu(tf, input);
Type Parameters:
T - the data type for the input
Parameters:
tf - the TensorFlow Ops
input - the input
Returns:
an Operand with the SELU activation applied
-
call
Gets the calculation operation for the activation.
-
getConfig
Gets a configuration map.
Specified by:
getConfig in class AbstractActivation
Returns:
the configuration map
-
getName
Gets the name of the activation as known by the TensorFlow Engine.
Specified by:
getName in class AbstractActivation
Returns:
the name of the activation as known by the TensorFlow Engine
-