# How to use ReLU activation in machine learning | tf.keras

The formula for ReLU, or Rectified Linear Unit, is `max(0, x)`: given an input tensor `x`, the `relu` activation function returns the element-wise maximum of 0 and the input values. Positive values pass through unchanged, while negative values are clamped to 0.
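The formula can be sketched in plain Python, independent of TensorFlow, to make the element-wise behavior concrete:

```python
def relu(values):
    """Rectified Linear Unit: max(0, x) applied element-wise."""
    return [max(0.0, v) for v in values]

# Negative inputs are clamped to 0; positive inputs pass through unchanged.
print(relu([-2.0, -0.5, 0.0, 1.5, 3.0]))  # → [0.0, 0.0, 0.0, 1.5, 3.0]
```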

Refer to the snippet below to use the `relu` activation with `tf.keras.activations`.

```
import tensorflow as tf

input = tf.random.normal([1, 10], mean=3.0)  # random values centered at 3.0, so mostly positive
output = tf.keras.activations.relu(input)

print("Input")
print(input)

print("Output after applying relu activation")
print(output)
```

Example output:

```
Input
tf.Tensor(
[[4.8911924 3.7609506 1.6037421 2.8501108 2.3062882 3.580803  2.5677848
3.6137307 4.663064  4.5395136]], shape=(1, 10), dtype=float32)
Output after applying relu activation
tf.Tensor(
[[4.8911924 3.7609506 1.6037421 2.8501108 2.3062882 3.580803  2.5677848
3.6137307 4.663064  4.5395136]], shape=(1, 10), dtype=float32)
```

###### Limit max output values in ReLU with the `max_value` parameter

```
import tensorflow as tf

input = tf.random.normal([1, 10], mean=1.0)  # random values centered at 1.0
output = tf.keras.activations.relu(input, max_value=2)  # cap outputs at 2

print("Input")
print(input)

print("Output after applying relu with max_value parameter")
print(output)
```

Example output:

```
Input
tf.Tensor(
[[2.8053546 1.8733189 1.9014599 2.320188  1.6549678 2.7530499 1.5154703
1.9352622 2.3958783 1.8461647]], shape=(1, 10), dtype=float32)

Output after applying relu with max_value parameter
tf.Tensor(
[[2.        1.8733189 1.9014599 2.        1.6549678 2.        1.5154703
1.9352622 2.        1.8461647]], shape=(1, 10), dtype=float32)
```
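Besides `max_value`, `tf.keras.activations.relu` also accepts an `alpha` parameter, a slope applied to values below the threshold, which turns the plain ReLU into a leaky ReLU. A minimal sketch, assuming TensorFlow 2.x is installed:

```python
import tensorflow as tf

x = tf.constant([-3.0, -1.0, 0.0, 2.0, 8.0])

# With alpha, negative inputs get a small slope instead of being clamped to 0:
# f(x) = x for x >= 0, alpha * x otherwise.
leaky = tf.keras.activations.relu(x, alpha=0.1)

print(leaky)  # negative entries become alpha * x, e.g. -3.0 becomes -0.3
```

This can be useful when you want gradients to keep flowing for negative inputs instead of going to zero.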
