Class ELU
- All Implemented Interfaces:
Activation
The exponential linear unit (ELU) with alpha > 0 is:
x if x > 0 and alpha * (exp(x) - 1) if x < 0.
The ELU hyperparameter alpha controls the value to which an ELU saturates for negative
net inputs. ELUs diminish the vanishing gradient effect.
ELUs have negative values, which pushes the mean of the activations closer to zero. Mean activations that are closer to zero enable faster learning, as they bring the gradient closer to the natural gradient. ELUs saturate to a negative value when the argument gets smaller. Saturation means a small derivative, which decreases the variation and the information that is propagated to the next layer.
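For intuition, a minimal plain-Java sketch of the scalar formula above (not part of this class's API; the helper name eluScalar is hypothetical, and alpha = 2.0 matches the example below):

static double eluScalar(double x, double alpha) {
    // x if x > 0, alpha * (exp(x) - 1) otherwise
    return x > 0 ? x : alpha * (Math.exp(x) - 1);
}
// eluScalar(1.0, 2.0)  returns 1.0
// eluScalar(-1.0, 2.0) returns 2.0 * (Math.exp(-1.0) - 1), approximately -1.264
// as x goes to negative infinity, the result saturates toward -alpha = -2.0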
Example Usage:
Operand<TFloat32> input = ...;
ELU<TFloat32> elu = new ELU<>(tf, 2.0);
Operand<TFloat32> result = elu.call(input);
Field Summary
Fields inherited from class AbstractActivation
NAME_KEY, tf -
Constructor Summary
Constructors -
Method Summary
Methods inherited from class AbstractActivation
checkClassName, checkConfigKeys, getDefaultConfig, getTF, setTF
-
Field Details
-
NAME
-
-
Constructor Details
-
ELU
public ELU()
Creates a new ELU with alpha = ALPHA_DEFAULT.
-
ELU
public ELU(double alpha)
Creates a new ELU.
Parameters:
alpha - A scalar, slope of negative section. It controls the value to which an ELU saturates for negative net inputs.
-
ELU
Creates a new ELU from a configuration Map.
Parameters:
config - the configuration map; if the map contains an entry for alpha, that value is used, otherwise ALPHA_DEFAULT is used.
Throws:
IllegalArgumentException - if the configuration contains unsupported keys for this class, or if the value for the name key does not match the name for the Activation
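A hedged sketch of using this constructor: the Map<String, Object> parameter type and the literal key strings "name" and "alpha" are assumptions not confirmed by this page (the actual key for the name is the inherited NAME_KEY constant), so verify them against the class declaration.

import java.util.HashMap;
import java.util.Map;

// Assumed parameter type and key names; verify against the class declaration.
Map<String, Object> config = new HashMap<>();
config.put("name", "elu");   // must match the Activation's name, else IllegalArgumentException is thrown
config.put("alpha", 0.5);    // optional; ALPHA_DEFAULT is used when this entry is absent
ELU<TFloat32> elu = new ELU<>(config);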
-
-
Method Details
-
elu
Computes the Exponential linear unit. The exponential linear unit (ELU) with
alpha > 0 is: x if x > 0 and alpha * (exp(x) - 1) if x < 0. The ELU hyperparameter
alpha controls the value to which an ELU saturates for negative net inputs. ELUs diminish the vanishing gradient effect. ELUs have negative values, which pushes the mean of the activations closer to zero. Mean activations that are closer to zero enable faster learning, as they bring the gradient closer to the natural gradient. ELUs saturate to a negative value when the argument gets smaller. Saturation means a small derivative, which decreases the variation and the information that is propagated to the next layer.
Example Usage:
Operand<TFloat32> input = ...;
Operand<TFloat32> result = ELU.elu(tf, input, 2.0);
Type Parameters:
T - the data type for the input
Parameters:
tf - the TensorFlow Ops
input - the input
alpha - scalar, slope of negative section. alpha controls the value to which an ELU saturates for negative net inputs.
Returns:
The exponential linear unit (ELU) activation function: x if x > 0 and alpha * (exp(x) - 1) if x < 0.
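For a more concrete call, a hedged end-to-end sketch: Ops.create() and tf.constant(float[]) are standard tensorflow-java calls, while the elu signature is taken from the example above.

import org.tensorflow.Operand;
import org.tensorflow.op.Ops;
import org.tensorflow.types.TFloat32;

Ops tf = Ops.create();   // eager-mode Ops
Operand<TFloat32> input = tf.constant(new float[] {1.0f, -1.0f, -5.0f});
Operand<TFloat32> result = ELU.elu(tf, input, 2.0);
// positive entries pass through unchanged; negative entries saturate toward -alpha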
-
getConfig
Gets a configuration map.
Specified by:
getConfig in class AbstractActivation
Returns:
the configuration map
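A brief hedged sketch of retrieving the configuration from an instance; which entries the returned map contains (presumably the name and alpha) is an assumption.

// Assumes the ELU(double alpha) constructor documented above.
ELU<TFloat32> elu = new ELU<>(0.5);
Map<String, Object> config = elu.getConfig();   // expected to hold at least the name and alpha entries (assumed)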
-
call
-
getName
Get the name of the activation as known by the TensorFlow Engine.
Specified by:
getName in class AbstractActivation
Returns:
the name of the activation as known by the TensorFlow Engine
-
getAlpha
public double getAlpha()
Gets the slope of the negative section.
Returns:
the slope of the negative section.
-