This code shows how to implement a softmax function in plain Python, without using a library like numpy.

The softmax function is used to squish output neuron values into the range 0.0 to 1.0.

-2 -> 0.00000199
 4 -> 0.00080254
 9 -> 0.11910702
11 -> 0.88008845

The result is a list of probabilities that together add up to 1.0.
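
In formula form, each output value is the exponential of one input divided by the sum of the exponentials of all inputs. In LaTeX notation:

\mathrm{softmax}(x)_i = \frac{e^{x_i}}{\sum_{j=1}^{n} e^{x_j}}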

Here is the code with the softmax function.

import math

def softmax(inputs):
    # Raise e to the power of every input value.
    temp = [math.exp(v) for v in inputs]
    # Divide by the sum of all exponentials so the results add up to 1.0.
    total = sum(temp)
    return [t / total for t in temp]

act = softmax([-2, 4, 9, 11])
for a in act:
    print(f"{a:.8f}")

print(f"total: {sum(act)}")

Output:

0.00000199
0.00080254
0.11910702
0.88008845
total: 1.0
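
A note on large inputs: math.exp overflows for arguments above roughly 709, so this softmax raises an OverflowError for very large activations. A common remedy, sketched below, is to subtract the largest input before exponentiating; the shift cancels out in the division, so the probabilities stay the same.

import math

def softmax_stable(inputs):
    # Shift all inputs so the largest one becomes 0.
    # exp(0) == 1.0, so nothing can overflow, and the shift
    # cancels out in the division, leaving the result unchanged.
    m = max(inputs)
    temp = [math.exp(v - m) for v in inputs]
    total = sum(temp)
    return [t / total for t in temp]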

Why softmax?

In a neural network, each layer performs a matrix calculation using its weights and biases and the activations from the previous layer.

The result of this matrix calculation is an output column vector whose values can be any number, like -3, 7 or 651.
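
To make that concrete, here is a minimal sketch in plain Python; the weights, biases and activations are made-up numbers, just for illustration:

# Hypothetical layer: 2 inputs, 3 output neurons.
weights = [[0.5, -1.2],   # weights of output neuron 0
           [2.0,  0.3],   # weights of output neuron 1
           [-0.7, 1.1]]   # weights of output neuron 2
biases = [1.0, -2.0, 0.5]
activations = [0.8, -0.4]  # activations from the previous layer

# Each output value is a weighted sum of the activations plus a bias.
outputs = [sum(w * a for w, a in zip(row, activations)) + b
           for row, b in zip(weights, biases)]
print(outputs)  # raw values that can be any number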

When the last layer of a network classifies its input into categories like "dog" or "cat", a softmax can be used to squish the output values into the range 0.0 to 1.0.

The output with the highest activation is the predicted classification.
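
In code, the prediction is simply the label at the index of the highest probability. Reusing the softmax function from above, with hypothetical class labels:

labels = ["cat", "dog", "bird", "fish"]  # hypothetical class names
act = softmax([-2, 4, 9, 11])
# The index of the highest probability picks the predicted label.
predicted = labels[act.index(max(act))]
print(predicted)  # fish, because input 11 got the highest probability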

Written by Loek van den Ouweland on 2021-11-19.
Questions regarding this article? You can send them to the address below.