tf.keras.layers.ReLU has a negative_slope argument, documented as "Float >= 0. Negative slope coefficient. Defaults to 0."
tf.keras.layers.ReLU(
max_value=None,
negative_slope=0.0,
threshold=0.0,
**kwargs
)
Is this meant to turn it into a Leaky ReLU? If so, is it the same as the alpha argument of tf.keras.layers.LeakyReLU?
tf.keras.layers.LeakyReLU(alpha=0.3, **kwargs)
* alpha: Float >= 0. Negative slope coefficient. Defaults to 0.3.
Short answer:

Yes, the negative_slope parameter of tf.keras.layers.ReLU plays the same role as alpha does in tf.keras.layers.LeakyReLU. For example, tf.keras.layers.ReLU(negative_slope=0.5) and tf.keras.layers.LeakyReLU(alpha=0.5) have the same behavior. Only the defaults differ: negative_slope defaults to 0.0 (plain ReLU), while alpha defaults to 0.3.

Here is a visualization of their behavior:
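As a quick sketch of the comparison, here is a pure-NumPy emulation of the two activations (the function names relu and leaky_relu are illustrative, and the formulas follow the documented behavior of the Keras layers) showing that the outputs coincide when negative_slope equals alpha:

```python
import numpy as np

def relu(x, negative_slope=0.0, max_value=None, threshold=0.0):
    # Emulates tf.keras.layers.ReLU: x for x >= threshold,
    # negative_slope * (x - threshold) otherwise, clipped at max_value.
    y = np.where(x >= threshold, x, negative_slope * (x - threshold))
    if max_value is not None:
        y = np.minimum(y, max_value)
    return y

def leaky_relu(x, alpha=0.3):
    # Emulates tf.keras.layers.LeakyReLU: x for x >= 0, alpha * x otherwise.
    return np.where(x >= 0, x, alpha * x)

x = np.linspace(-3.0, 3.0, 7)
print(relu(x, negative_slope=0.5))   # negatives scaled by 0.5
print(leaky_relu(x, alpha=0.5))      # same values: the two coincide
```

With matching slope values the two functions produce identical outputs for every input; they differ only in their extra knobs (ReLU also exposes max_value and threshold) and in their defaults.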