Class ELU

All Implemented Interfaces:
Activation

public class ELU extends AbstractActivation
Exponential linear unit.

The exponential linear unit (ELU) with alpha > 0 is:

x if x > 0 and alpha * (exp(x) - 1) if x < 0.
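
For example, with alpha = 1.0, elu(2.0) = 2.0, while elu(-1.0) = 1.0 * (exp(-1.0) - 1) ≈ -0.632.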

The ELU hyperparameter alpha controls the value to which an ELU saturates for negative net inputs. ELUs diminish the vanishing gradient effect.

ELUs have negative values, which push the mean of the activations closer to zero. Mean activations that are closer to zero enable faster learning, as they bring the gradient closer to the natural gradient. ELUs saturate to a negative value as the argument gets smaller. Saturation means a small derivative, which decreases the variation and the information propagated to the next layer.

Example Usage:

Operand<TFloat32> input = ...;
ELU elu = new ELU(2.0);
Operand<TFloat32> result = elu.call(tf, input);

  • Constructor Details

    • ELU

      public ELU()
      Creates a new ELU with alpha=ALPHA_DEFAULT.
    • ELU

      public ELU(double alpha)
      Creates a new ELU with the specified alpha.
      Parameters:
      alpha - a scalar, the slope of the negative section. It controls the value to which an ELU saturates for negative net inputs.
    • ELU

      public ELU(Map<String,Object> config)
      Creates a new ELU from a configuration map.
      Parameters:
      config - the configuration map; if the map contains an entry for alpha, that value is used, otherwise ALPHA_DEFAULT is used.
      Throws:
      IllegalArgumentException - if the configuration contains unsupported keys for this class or if the value for the name key does not match the name for the Activation
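
      A minimal sketch of building an ELU from a configuration map, using the documented alpha entry (a name entry, if present, must match the activation's name, per the throws clause):

      Map<String, Object> config = new HashMap<>();
      config.put("alpha", 2.0);     // optional; ALPHA_DEFAULT is used if absent
      ELU elu = new ELU(config);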
  • Method Details

    • elu

      public static <T extends TNumber> Operand<T> elu(Ops tf, Operand<T> input, double alpha)
      Computes the exponential linear unit.

      The exponential linear unit (ELU) with alpha > 0 is:

      x if x > 0 and alpha * (exp(x) - 1) if x < 0.

      The ELU hyperparameter alpha controls the value to which an ELU saturates for negative net inputs. ELUs diminish the vanishing gradient effect.

      ELUs have negative values, which push the mean of the activations closer to zero. Mean activations that are closer to zero enable faster learning, as they bring the gradient closer to the natural gradient. ELUs saturate to a negative value as the argument gets smaller. Saturation means a small derivative, which decreases the variation and the information propagated to the next layer.

      Example Usage:

      Operand<TFloat32> input = ...;
      Operand<TFloat32> result = ELU.elu(tf, input, 2.0);
      
      Type Parameters:
      T - the data type for the input
      Parameters:
      tf - the TensorFlow Ops
      input - the input
      alpha - a scalar, the slope of the negative section; it controls the value to which an ELU saturates for negative net inputs.
      Returns:
      The exponential linear unit (ELU) activation function: x if x > 0 and alpha * (exp(x) - 1) if x < 0.
    • getConfig

      public Map<String,Object> getConfig()
      Gets the configuration map.
      Specified by:
      getConfig in class AbstractActivation
      Returns:
      the configuration map
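
      A minimal sketch of the round trip between getConfig and the map constructor, assuming the returned map carries the alpha entry described above:

      ELU original = new ELU(2.0);
      Map<String, Object> config = original.getConfig();
      ELU restored = new ELU(config);   // restored.getAlpha() should equal 2.0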
    • call

      public <T extends TNumber> Operand<T> call(Ops tf, Operand<T> input)
      Gets the calculation operation for the activation.
      Type Parameters:
      T - the data type of the input and the result
      Parameters:
      tf - the TensorFlow Ops
      input - the input tensor
      Returns:
      The operand for the activation
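
      For example, mirroring the class-level usage above:

      Operand<TFloat32> input = ...;
      Operand<TFloat32> result = new ELU(2.0).call(tf, input);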
    • getName

      public String getName()
      Gets the name of the activation as known by the TensorFlow Engine.
      Specified by:
      getName in class AbstractActivation
      Returns:
      the name of the activation as known by the TensorFlow Engine
    • getAlpha

      public double getAlpha()
      Gets the slope of the negative section.
      Returns:
      the slope of the negative section.