In an artificial neural network (ANN), the activation function of a node defines the output of that node for a given input or set of inputs. The ELU (Exponential Linear Unit) function computes the output for a given input tensor `x` as follows:

- if `x > 0`, the output is `x`
- otherwise, the output is `alpha * (exp(x) - 1)`, where `alpha` is a scalar
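The piecewise definition above can be sketched directly with basic TensorFlow ops; `elu_manual` below is an illustrative helper (not part of the TensorFlow API) that mirrors what the built-in function computes.

```python
import tensorflow as tf

def elu_manual(x, alpha=1.0):
    # Keep x where it is positive; apply alpha * (exp(x) - 1) elsewhere.
    return tf.where(x > 0, x, alpha * (tf.exp(x) - 1.0))

x = tf.constant([-1.0, 0.0, 2.0], dtype=tf.float32)
print(elu_manual(x).numpy())  # [-0.63212055  0.  2.]
```

Note that at `x = 0` both branches agree, since `alpha * (exp(0) - 1) = 0`.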

The `tf.keras.activations` module of the `tf.keras` API provides built-in activation functions. The following code applies the ELU (Exponential Linear Unit) activation function to a tensor.

```python
import tensorflow as tf

input_tensor = tf.constant([-3.32, -1.0, -2.3, 1.0, 1.4, 0.56, 2.1], dtype=tf.float32)
output_tensor = tf.keras.activations.elu(input_tensor, alpha=1.0)
print(output_tensor)
```

**Example output:** notice how values greater than 0 remain unchanged, while values less than 0 are transformed by `alpha * (exp(x) - 1)`.

```
tf.Tensor(
[-0.96384716 -0.63212055 -0.8997412   1.          1.4         0.56
  2.1       ], shape=(7,), dtype=float32)
```
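Besides applying it to tensors directly, ELU can also serve as a layer's activation in a Keras model, either by its string name `"elu"` or by passing the function object itself. A minimal sketch (the layer sizes and input shape here are arbitrary, chosen only for illustration):

```python
import tensorflow as tf

# Build a small model that uses ELU in two equivalent ways:
# by name and by passing the activation function directly.
inputs = tf.keras.Input(shape=(4,))
hidden = tf.keras.layers.Dense(8, activation="elu")(inputs)
outputs = tf.keras.layers.Dense(1, activation=tf.keras.activations.elu)(hidden)
model = tf.keras.Model(inputs, outputs)

# A batch of 2 samples with 4 features each yields 2 outputs.
print(model(tf.zeros((2, 4))).shape)  # (2, 1)
```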
