_loss
`cross_entropy_loss(y_true, y_pred)`
Compute the categorical cross entropy loss from the given true labels and predicted labels.
We add \(1\mathrm{e}{-7}\) (epsilon) to the predictions to avoid taking the log of \(0\).
- Inspired by the Keras implementation, where epsilon is defined the same way.
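Concretely, assuming the standard categorical cross entropy definition with the epsilon term described above, the quantity being computed is:

\[
\mathcal{L}(y, \hat{y}) = -\sum_{i=1}^{n} y_i \log\left(\hat{y}_i + \epsilon\right), \qquad \epsilon = 1\mathrm{e}{-7}
\]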
Parameters:

Name | Type | Description | Default
---|---|---|---
`y_true` | 1d ndarray | True class labels of size 1 x n, where n is the number of data points. | required
`y_pred` | 1d ndarray | Predicted class probabilities of size 1 x n, where n is the number of data points. | required
Returns:

Type | Description
---|---
float | Cross entropy score for the given prediction.
Source code in mlproject/neural_net/_loss.py
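The source listing itself did not survive extraction. Below is a minimal sketch of the documented behavior, assuming NumPy; the function name and signature mirror this page, but the body is illustrative, not the project's actual source:

```python
import numpy as np

def cross_entropy_loss(y_true, y_pred):
    """Categorical cross entropy (illustrative sketch, not the project's source)."""
    epsilon = 1e-7  # matches the epsilon described in the docstring above
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    # epsilon keeps log() away from 0 when a predicted probability is exactly 0
    return float(-np.sum(y_true * np.log(y_pred + epsilon)))
```

For example, `cross_entropy_loss([0.0, 1.0, 0.0], [0.1, 0.8, 0.1])` evaluates to roughly \(-\log 0.8 \approx 0.223\).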