
_activations

leaky_relu(z)

Leaky relu activation function
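
In formula form (added here for clarity; the 0.01 negative slope matches the implementation below), applied elementwise:

$$
\mathrm{leaky\_relu}(z) = \begin{cases} z & \text{if } z > 0 \\ 0.01\,z & \text{otherwise} \end{cases}
$$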

Parameters:

| Name | Type | Description | Default |
| ---- | ---- | ----------- | ------- |
| `z` | 2d ndarray | input to the leaky relu activation function | required |

Returns:

| Type | Description |
| ---- | ----------- |
| 2d ndarray | leaky relu 'activated' version of the input `z` |

Source code in mlproject/neural_net/_activations.py
def leaky_relu(z):
    """Leaky relu activation function

    Parameters
    ----------
    z : 2d ndarray
        input to the leaky relu activation function

    Returns
    -------
    2d ndarray
        leaky relu 'activated' version of the input `z`
    """
    return np.where(z > 0, z, z * 0.01)
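
A minimal usage sketch, assuming the package is importable as `mlproject.neural_net._activations` (the module path shown above) and that numpy is installed:

import numpy as np
from mlproject.neural_net._activations import leaky_relu

z = np.array([[-2.0, 0.5],
              [3.0, -0.1]])
out = leaky_relu(z)
# positive entries pass through unchanged; negative entries are scaled by 0.01:
# -2.0 -> -0.02, 0.5 -> 0.5, 3.0 -> 3.0, -0.1 -> -0.001
print(out)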

leaky_relu_der(z)

Derivative of the leaky relu activation function

Parameters:

| Name | Type | Description | Default |
| ---- | ---- | ----------- | ------- |
| `z` | 2d ndarray | input to calculate the derivative of | required |

Returns:

| Type | Description |
| ---- | ----------- |
| 2d ndarray | derivative of the specific neuron with a leaky relu activation function |

Source code in mlproject/neural_net/_activations.py
def leaky_relu_der(z):
    """Derivative of the leaky relu activation function

    Parameters
    ----------
    z : 2d ndarray
        input to calculate the derivative of

    Returns
    -------
    2d ndarray
        derivative of the specific neuron with a leaky relu activation function
    """
    return np.where(z > 0, 1, 0.01)
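
As a quick sanity check (a sketch, not part of the module), the analytic derivative can be compared against a central finite difference; this assumes both functions are importable as above:

import numpy as np
from mlproject.neural_net._activations import leaky_relu, leaky_relu_der

z = np.array([[-1.5, 0.3, 2.0]])
eps = 1e-6
# central difference approximation of the derivative
numeric = (leaky_relu(z + eps) - leaky_relu(z - eps)) / (2 * eps)
analytic = leaky_relu_der(z)
print(np.allclose(numeric, analytic))  # True (away from the kink at z == 0)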

stable_softmax(z)

Numerically stable softmax activation function

Inspired by https://stackoverflow.com/a/50425683 & https://github.com/scipy/scipy/blob/v1.9.3/scipy/special/_logsumexp.py#L130-L223
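
The trick works because softmax is invariant to subtracting a constant from every entry of a row; writing $m = \max_j z_j$ for a row $z$ (this derivation is added here for clarity, it is not part of the source docstring):

$$
\operatorname{softmax}(z)_i = \frac{e^{z_i - m}}{\sum_j e^{z_j - m}} = \frac{e^{z_i}}{\sum_j e^{z_j}},
$$

since the common factor $e^{-m}$ cancels. After the shift every exponent is at most 0, so `np.exp` cannot overflow.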

Parameters:

| Name | Type | Description | Default |
| ---- | ---- | ----------- | ------- |
| `z` | 2d ndarray | input to the softmax activation function | required |

Returns:

| Type | Description |
| ---- | ----------- |
| 2d ndarray | softmax 'activated' version of the input `z` |

Source code in mlproject/neural_net/_activations.py
def stable_softmax(z):
    """Numerically stable softmax activation function

    Inspired by https://stackoverflow.com/a/50425683 &
    https://github.com/scipy/scipy/blob/v1.9.3/scipy/special/_logsumexp.py#L130-L223

    Parameters
    ----------
    z : 2d ndarray
        input to the softmax activation function

    Returns
    -------
    2d ndarray
        softmax 'activated' version of the input `z`
    """
    # keepdims=True preserves the 2d shape of the input so the result
    # broadcasts against z; axis=1 takes the maximum along the second axis,
    # i.e. one maximum per row (per sample).
    e_max = np.amax(z, axis=1, keepdims=True)
    e = np.subtract(z, e_max)
    e = np.exp(e)
    return e / np.sum(e, axis=1, keepdims=True)
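
A brief usage sketch (same import assumption as above): even logits large enough to overflow a naive `np.exp` produce valid probability rows:

import numpy as np
from mlproject.neural_net._activations import stable_softmax

z = np.array([[1000.0, 1001.0, 1002.0],
              [-5.0, 0.0, 5.0]])
p = stable_softmax(z)
print(p.sum(axis=1))      # [1. 1.] -- each row sums to 1
print(np.isnan(p).any())  # False -- no overflow thanks to the max subtraction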