AI Building Blocks: How Sigmoid Works
Sigmoid functions can be used as activation functions in neural nets. This post explains what the Sigmoid function is, how it works, and what it looks like.
The Sigmoid function I’ll be discussing is a mathematical equation, shown here:

sigmoid(x) = 1 / (1 + e^(-x))
The Sigmoid function squashes an input, x, between two numbers (0 and 1 in this example). As x decreases, the output approaches 0, and as x increases, the output approaches 1. Here’s the Sigmoid function in Python code:
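```python
import numpy as np

def sigmoid(x):
    # Squash x into the open interval (0, 1)
    return 1 / (1 + np.exp(-x))
```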
This example uses numpy’s exp() function for e in the equation, which stands for Euler’s number, an irrational number approximately equal to 2.718281.
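As a quick sanity check, you can evaluate e itself with exp():

```python
>>> import numpy as np
>>> np.exp(1)
2.718281828459045
```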
The following code tests the sigmoid() function, exporting its output to a *.csv file:
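```python
import csv

# Pass integers -9 through 9 to sigmoid() and write each result
# to sigmoid.csv, one value per row, so the outputs land in column A.
with open("sigmoid.csv", "w", newline="") as csv_file:
    writer = csv.writer(csv_file)
    for x in range(-9, 10):
        writer.writerow([sigmoid(x)])
```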
This code passes integers -9 through 9 to sigmoid() and writes the results to the sigmoid.csv file. The purpose of sigmoid.csv is to visualize the Sigmoid function, shown in the following figure:

Though the image shows the line as nearly horizontal at the ends, 0 and 1 are limits. That is, as x decreases or increases without bound, the line approaches, but never equals, 0 or 1, respectively. You can see this behavior from the numbers in column A, and you can test it by increasing the range of numbers in the test code. The following steps produce the chart in the image above:
- Open sigmoid.csv in Excel. If you don’t have Excel, you can use Excel Online for free.
- Highlight (select) the numbers, e.g. select the first cell, press Shift, and select the last cell.
- Click the Insert tab.
- Select Recommended Charts.
- Select Scatter.
- Click OK.
Now you’ll see something similar to the figure above. Additionally, you can double-click the title to change it, move the chart around, and customize several other properties.
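If you’d rather skip Excel, here’s a minimal matplotlib sketch (assuming matplotlib is installed, plus the sigmoid() function defined above) that plots a similar chart directly from Python:

```python
import matplotlib.pyplot as plt
import numpy as np

# Plot sigmoid() over the same range used in the CSV test code.
xs = np.arange(-9, 10)
plt.scatter(xs, sigmoid(xs))
plt.title("Sigmoid")
plt.show()
```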
Now you know what the Sigmoid function is, you’ve seen Python code showing how it works, and you’ve visualized it in an Excel chart.
While Sigmoid isn’t the only activation function used in deep learning, it was one of the first to gain widespread use in the early days of neural networks. These days, Sigmoid sees less use; other activation functions, such as TanH and ReLU, have become more popular.