diff --git a/subjects/ai/keras-2/README.md b/subjects/ai/keras-2/README.md
index 1263d34f5c..84fb0369e1 100644
--- a/subjects/ai/keras-2/README.md
+++ b/subjects/ai/keras-2/README.md
@@ -138,7 +138,7 @@ Let us assume we want to classify images and we know they contain either apples,
 
 ### Exercise 4: Multi classification - Optimize
 
-The goal of this exercise is to learn to optimize a multi-classification neural network. As learnt previously, the loss function used in binary classification is the log loss - also called in Keras `binary_crossentropy`. This function is defined for binary classification and can be extended to multi-classification. In Keras, the extended loss that supports multi-classification is `binary_crossentropy`. There's no code to run in that exercise.
+The goal of this exercise is to learn to optimize a multi-classification neural network. As learnt previously, the loss function used in binary classification is the log loss, called `binary_crossentropy` in Keras. This function is defined for binary classification and can be extended to multi-classification. In Keras, the extended loss that supports multi-classification is `categorical_crossentropy`. There's no code to run in this exercise.
 
 1. Fill the chunk of code below in order to optimize the neural network defined in the previous exercise. Choose the adapted loss, adam as optimizer and the accuracy as metric.
 
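For reference, a minimal sketch of the compile step the corrected paragraph describes. The layer sizes, input shape, and the three-class output are illustrative assumptions, not the architecture defined in the exercise; only the `compile` arguments (loss, optimizer, metric) follow the text above.

```python
from tensorflow import keras
from tensorflow.keras import layers

# Hypothetical multi-class model: 3 output classes (e.g. apples, bananas, oranges)
# with a softmax activation so the outputs can be read as class probabilities.
model = keras.Sequential([
    keras.Input(shape=(10,)),          # assumed input dimension, for illustration only
    layers.Dense(16, activation="relu"),
    layers.Dense(3, activation="softmax"),
])

# Compile with the extended log loss for multi-classification,
# the Adam optimizer, and accuracy as the reported metric.
model.compile(
    loss="categorical_crossentropy",
    optimizer="adam",
    metrics=["accuracy"],
)
```

Note that `categorical_crossentropy` expects one-hot encoded targets; if the labels are kept as integer class indices, `sparse_categorical_crossentropy` is the matching loss.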