How to use softplus activation in ANN | tf.keras

The softplus activation function takes an input x and returns the output log(exp(x) + 1). It is a smooth approximation of ReLU: for large positive x it is close to x, and for large negative x it approaches 0.
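To make the formula concrete, here is a quick check in plain Python (no TensorFlow needed); softplus(0) should equal log(2) ≈ 0.6931:

```python
import math

def softplus(x):
    # softplus(x) = log(exp(x) + 1)
    return math.log(math.exp(x) + 1.0)

print(softplus(0.0))    # log(2), about 0.6931
print(softplus(4.0))    # slightly above 4: behaves like x for large positive x
print(softplus(-4.0))   # close to 0 for large negative x
```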

Refer to the snippet below, which applies softplus via tf.keras.activations.softplus.


import tensorflow as tf

input_softplus = tf.random.normal([1, 7])   # random input tensor of shape (1, 7)
output_softplus = tf.keras.activations.softplus(input_softplus)

print("Input")
print(input_softplus)

print("Output after applying softplus activation")
print(output_softplus)

Example output (your values will differ, since the input is random):


Input
tf.Tensor(
[[ 0.39520025 -0.25938022  0.7229915  -0.5662393   0.30284765  0.44534028
   0.6976337 ]], shape=(1, 7), dtype=float32)
Output after applying softplus activation
tf.Tensor(
[[0.9101 0.5718 1.1186 0.4496 0.8560 0.9404 1.1016]], shape=(1, 7), dtype=float32)

Note that every output value is positive and each equals log(exp(x) + 1) of the corresponding input (values shown here to four decimal places).
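A side note on numerical stability: computing log(exp(x) + 1) directly overflows once exp(x) exceeds the float range, so implementations typically use the equivalent rewrite max(x, 0) + log1p(exp(-|x|)). tf.keras.activations.softplus handles large inputs for you; the sketch below (plain Python, hypothetical helper names) only illustrates the identity:

```python
import math

def naive_softplus(x):
    # Direct formula: math.exp overflows for large x (around x > 709 in float64)
    return math.log(math.exp(x) + 1.0)

def stable_softplus(x):
    # Equivalent rewrite that never exponentiates a large positive number:
    # softplus(x) = max(x, 0) + log(1 + exp(-|x|))
    return max(x, 0.0) + math.log1p(math.exp(-abs(x)))

print(stable_softplus(1000.0))   # 1000.0, no overflow
# naive_softplus(1000.0) would raise OverflowError
```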