# How to calculate BinaryCrossEntropy loss in TensorFlow

Binary cross-entropy loss is used when there are only two label classes. For example, in cats-vs-dogs image classification there are only two classes, cat or dog, so binary cross-entropy loss applies. The `tf.keras` API provides an implementation as `tf.keras.losses.BinaryCrossentropy`; let's understand it with the code snippets below.
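For intuition, binary cross-entropy for a single example averages the per-element loss `-[y*log(p) + (1-y)*log(1-p)]` over the label vector. A minimal pure-Python sketch (the helper name `binary_cross_entropy` and the sample values are illustrative, not from TensorFlow):

```python
import math

def binary_cross_entropy(y_true, y_pred):
    # Mean of -[y*log(p) + (1-y)*log(1-p)] over all elements
    return -sum(y * math.log(p) + (1 - y) * math.log(1 - p)
                for y, p in zip(y_true, y_pred)) / len(y_true)

print(binary_cross_entropy([0, 1, 0], [0.1, 0.9, 0.2]))  # ~0.1446
```

This is the same quantity `tf.keras.losses.BinaryCrossentropy` computes (it additionally clips predictions away from exact 0 and 1 for numerical stability, and averages over the batch).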

First, create example data for the actual and predicted values:

```python
import tensorflow as tf

actual_values = [[0, 1, 0, 0, 0, 0], [0, 1, 0, 0, 0, 0], [0, 1, 0, 0, 0, 0]]

predicted_values = [[.5, .7, .2, .3, .5, .6], [.5, .7, .2, .3, .5, .6], [.5, .7, .2, .3, .5, .6]]

```

`actual_values` comprises three batches of actual labels; `predicted_values` holds the corresponding predicted probabilities.

Next, instantiate a `BinaryCrossentropy` object and compute the cross-entropy loss:

```python
binary_cross_entropy = tf.keras.losses.BinaryCrossentropy()
loss = binary_cross_entropy(actual_values, predicted_values).numpy()
print(loss)

```

Output

```
0.53984624
```
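As a sanity check, the loss for a single row can be computed by hand. For the labels `[0, 1, 0, 0, 0, 0]` and predictions `[.5, .7, .2, .3, .5, .6]`, the mean per-element binary cross-entropy works out to roughly 0.5398:

```python
import math

actual = [0, 1, 0, 0, 0, 0]
predicted = [.5, .7, .2, .3, .5, .6]

# Average of -[y*log(p) + (1-y)*log(1-p)] over the six elements
loss = -sum(y * math.log(p) + (1 - y) * math.log(1 - p)
            for y, p in zip(actual, predicted)) / len(actual)
print(loss)  # ~0.5398
```

TensorFlow reports essentially the same number (up to a tiny epsilon it uses to clip predictions away from 0 and 1).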

Complete Code

```python
import tensorflow as tf

actual_values = [[0, 1, 0, 0, 0, 0], [0, 1, 0, 0, 0, 0], [0, 1, 0, 0, 0, 0]]
predicted_values = [[.5, .7, .2, .3, .5, .6], [.5, .7, .2, .3, .5, .6], [.5, .7, .2, .3, .5, .6]]

binary_cross_entropy = tf.keras.losses.BinaryCrossentropy()

loss = binary_cross_entropy(actual_values, predicted_values).numpy()
print(loss)

```
