Class ReLU

java.lang.Object
  org.tensorflow.framework.activations.AbstractActivation
    org.tensorflow.framework.activations.ReLU

All Implemented Interfaces:
Activation
Rectified Linear Unit (ReLU) activation.

With default values, this returns the standard ReLU activation, max(x, 0): the element-wise maximum of 0 and the input tensor.

Modifying the default parameters allows you to use non-zero thresholds, change the max value of the activation, and use a non-zero multiple of the input for values below the threshold.
For example:

Operand<TFloat32> input = tf.constant(
    new float[] {-10f, -5f, 0.0f, 5f, 10f});

// With default parameters
ReLU relu = new ReLU();
Operand<TFloat32> result = relu.call(tf, input);
// result is [0.f, 0.f, 0.f, 5.f, 10.f]

// With alpha = 0.5
relu = new ReLU(0.5f, ReLU.MAX_VALUE_DEFAULT, ReLU.THRESHOLD_DEFAULT);
result = relu.call(tf, input);
// result is [-5.f, -2.5f, 0.f, 5.f, 10.f]

// With maxValue = 5
relu = new ReLU(ReLU.ALPHA_DEFAULT, 5f, ReLU.THRESHOLD_DEFAULT);
result = relu.call(tf, input);
// result is [0.f, 0.f, 0.f, 5.f, 5.f]

// With threshold = 5
relu = new ReLU(ReLU.ALPHA_DEFAULT, ReLU.MAX_VALUE_DEFAULT, 5f);
result = relu.call(tf, input);
// result is [-0.f, -0.f, 0.f, 0.f, 10.f]
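The results above follow an element-wise piecewise rule. The scalar sketch below is illustrative and consistent with those results, not the library's implementation; it assumes an unset maxValue (the default) means "no upper bound", represented here as NaN. Note the strict comparison at the threshold, which matches the 0.f result for the input 5f in the threshold = 5 case:

// Illustrative scalar version of the parameterized ReLU (a sketch, not the library code).
static float reluScalar(float x, float alpha, float maxValue, float threshold) {
  if (!Float.isNaN(maxValue) && x >= maxValue) {
    return maxValue;                // saturate at the max value
  }
  if (x > threshold) {
    return x;                       // pass through unchanged
  }
  return alpha * (x - threshold);   // damped by alpha below the threshold
}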
Field Summary

Fields
static final float ALPHA_DEFAULT
static final float MAX_VALUE_DEFAULT
static final String NAME
    The activation name as known by TensorFlow
static final float THRESHOLD_DEFAULT

Fields inherited from class AbstractActivation
NAME_KEY, tf
Constructor Summary

Constructors
ReLU()
    Creates a new ReLU with alpha=ALPHA_DEFAULT, maxValue=MAX_VALUE_DEFAULT, threshold=THRESHOLD_DEFAULT.
ReLU(float alpha, float maxValue, float threshold)
    Creates a new ReLU.
ReLU(Map<String, Object> config)
    Creates a ReLU activation from a config map.
Method Summary

<T extends TNumber> Operand<T> call(Ops tf, Operand<T> input)
    Gets the calculation operation for the activation.
float getAlpha()
    Gets the value that governs the slope for values lower than the threshold.
Map<String, Object> getConfig()
    Gets a configuration map with entries for alpha, max_value, and threshold.
float getMaxValue()
    Gets the saturation threshold (the largest value the function will return).
String getName()
    Gets the name of the activation as known by the TensorFlow Engine.
float getThreshold()
    Gets the threshold value of the activation function below which values will be damped or set to zero.
static <T extends TNumber> Operand<T> relu(Ops tf, Operand<T> input)
    Applies the rectified linear unit activation function with default values.
static <T extends TNumber> Operand<T> relu(Ops tf, Operand<T> input, float alpha, float maxValue, float threshold)
    Applies the rectified linear unit activation function.

Methods inherited from class AbstractActivation
checkClassName, checkConfigKeys, getDefaultConfig, getTF, setTF
Field Details

NAME
public static final String NAME
The activation name as known by TensorFlow

ALPHA_DEFAULT
public static final float ALPHA_DEFAULT

MAX_VALUE_DEFAULT
public static final float MAX_VALUE_DEFAULT

THRESHOLD_DEFAULT
public static final float THRESHOLD_DEFAULT
Constructor Details

ReLU
public ReLU()
Creates a new ReLU with alpha=ALPHA_DEFAULT, maxValue=MAX_VALUE_DEFAULT, threshold=THRESHOLD_DEFAULT.

ReLU
public ReLU(float alpha, float maxValue, float threshold)
Creates a new ReLU.
Parameters:
alpha - governs the slope for values lower than the threshold.
maxValue - sets the saturation threshold (the largest value the function will return).
threshold - the threshold value of the activation function below which values will be damped or set to zero.

ReLU
public ReLU(Map<String, Object> config)
Creates a ReLU activation from a config map:
- if the map contains an entry for alpha, that value is used, otherwise ALPHA_DEFAULT is used.
- if the map contains an entry for max_value, that value is used, otherwise MAX_VALUE_DEFAULT is used.
- if the map contains an entry for threshold, that value is used, otherwise THRESHOLD_DEFAULT is used.
Parameters:
config - the configuration map
Throws:
IllegalArgumentException - if the configuration contains unsupported keys for this class, or if the value for the name key does not match the name for the Activation
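For illustration, a minimal sketch of building a ReLU from a config map (assuming java.util.HashMap; the Map<String, Object> type mirrors the getConfig() return type, and the key names are the alpha, max_value, and threshold entries described above):

Map<String, Object> config = new HashMap<>();
config.put("alpha", 0.5f);
config.put("max_value", 5f);
// "threshold" is omitted, so THRESHOLD_DEFAULT is used.
ReLU relu = new ReLU(config);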
Method Details
relu
public static <T extends TNumber> Operand<T> relu(Ops tf, Operand<T> input)
Applies the rectified linear unit activation function with default values.
Example Usage:

Operand<TFloat32> input = ...;
Operand<TFloat32> result = ReLU.relu(tf, input);

Type Parameters:
T - the data type for the input
Parameters:
tf - the TensorFlow Ops
input - the input
Returns:
the input with the ReLU activation applied
relu
public static <T extends TNumber> Operand<T> relu(Ops tf, Operand<T> input, float alpha, float maxValue, float threshold)
Applies the rectified linear unit activation function.
Example Usage:

Operand<TFloat32> input = ...;
Operand<TFloat32> result =
    ReLU.relu(tf, input, 0.5f, ReLU.MAX_VALUE_DEFAULT, ReLU.THRESHOLD_DEFAULT);

Type Parameters:
T - the data type for the input
Parameters:
tf - the TensorFlow Ops
input - the input
alpha - governs the slope for values lower than the threshold.
maxValue - sets the saturation threshold (the largest value the function will return).
threshold - the threshold value of the activation function below which values will be damped or set to zero.
Returns:
the input with the ReLU activation applied
getConfig
public Map<String, Object> getConfig()
Gets a configuration map with entries:
- alpha, set to the value of alpha.
- max_value, set to the value of maxValue.
- threshold, set to the value of threshold.
Specified by:
getConfig in class AbstractActivation
Returns:
the configuration map
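A brief round-trip sketch pairing getConfig() with the config-map constructor described above (values are illustrative):

ReLU original = new ReLU(0.5f, 5f, 0f);
Map<String, Object> config = original.getConfig();
// config carries the alpha, max_value, and threshold entries listed above.
ReLU restored = new ReLU(config);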
call
public <T extends TNumber> Operand<T> call(Ops tf, Operand<T> input)
Gets the calculation operation for the activation.
Specified by:
call in interface Activation
getName
public String getName()
Gets the name of the activation as known by the TensorFlow Engine.
Specified by:
getName in class AbstractActivation
Returns:
the name of the activation as known by the TensorFlow Engine
getAlpha
public float getAlpha()
Gets the value that governs the slope for values lower than the threshold.
Returns:
the value that governs the slope for values lower than the threshold
getThreshold
public float getThreshold()
Gets the threshold value of the activation function below which values will be damped or set to zero.
Returns:
the threshold value of the activation function below which values will be damped or set to zero
getMaxValue
public float getMaxValue()
Gets the saturation threshold (the largest value the function will return).
Returns:
the saturation threshold (the largest value the function will return)