sigmoid function, mathematical function that graphs as a distinctive S-shaped curve. The mathematical representation of the sigmoid function is an exponential equation of the form σ(x) = 1/(1 + e^(−x)), where e is the constant that is the base of the natural logarithm function.
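The definition above can be written directly in code. This is a minimal sketch of the standard logistic sigmoid; the function name is illustrative:

```python
import math

def sigmoid(x: float) -> float:
    """Standard logistic (sigmoid) function: 1 / (1 + e^(-x))."""
    return 1.0 / (1.0 + math.exp(-x))

# The curve passes through 0.5 at x = 0, the midpoint of the "S"
print(sigmoid(0.0))  # 0.5
```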
Although there are many S-shaped, sigmoidlike curves, it is the standard form of the logistic function that is referred to as the “sigmoid.” The logistic function was first derived by Belgian mathematician Pierre-François Verhulst in the mid-1830s to describe population growth.
The sigmoid function has the behavior that for large negative values of x, σ(x) approaches 0, and for large positive values of x, σ(x) approaches 1. The derivative of the sigmoid function is dσ(x)/dx = e^(−x)/(1 + e^(−x))².
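Both the limiting behavior and the derivative formula can be checked numerically. The sketch below compares the closed-form derivative against a central finite difference; the step size and test points are arbitrary choices:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_derivative(x):
    # Closed form: e^(-x) / (1 + e^(-x))^2
    return math.exp(-x) / (1.0 + math.exp(-x)) ** 2

# Limiting behavior: sigma(x) -> 0 as x -> -inf, sigma(x) -> 1 as x -> +inf
assert sigmoid(-20.0) < 1e-8
assert sigmoid(20.0) > 1.0 - 1e-8

# The closed-form derivative agrees with a central finite difference
x, h = 0.7, 1e-6
numeric = (sigmoid(x + h) - sigmoid(x - h)) / (2 * h)
assert abs(sigmoid_derivative(x) - numeric) < 1e-8
```

Note that the derivative can also be expressed as σ(x)(1 − σ(x)), a form often used in practice because it reuses the already-computed function value.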
The sigmoid function played a key part in the evolution of neural networks and machine learning. A neural network is a computational model that operates similarly to the way neurons operate in the brain. A neuron in a neural network receives input from other neurons, and that input is passed through an activation function that determines the output.
Often the activation function was a sigmoid. Because the function’s outputs lie between 0 and 1, it was useful in problems of binary classification. Its nonlinearity was required for networks to make complex decisions when there were nonlinear relationships among the data. Because of these properties, the sigmoid function became an essential component in early neural networks, and a neuron using it was often referred to as a “sigmoid unit.”
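The neuron described above can be sketched as a weighted sum of inputs passed through the sigmoid. The weights, bias, and input values here are purely illustrative:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def neuron(inputs, weights, bias):
    """A single artificial neuron: weighted sum of inputs, then a sigmoid."""
    z = sum(i * w for i, w in zip(inputs, weights)) + bias
    return sigmoid(z)

# Illustrative values: the output always lies in (0, 1), so it can be
# read as a confidence score for a binary classification
p = neuron([0.5, -1.2, 3.0], [0.8, 0.4, 0.6], bias=-0.5)
print(round(p, 3))
```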
In modern neural networks, the traditional sigmoid function σ(x) has often been replaced by specially designed activation functions that are faster and more economical to use. Nevertheless, these new activation functions are usually created by modifying the classic sigmoid function σ(x).
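One example of such a modification is the “hard sigmoid,” a piecewise-linear approximation that avoids the exponential and is therefore cheaper to compute. The exact slope and intercept vary between libraries; the convention below is one common choice, not a universal definition:

```python
def hard_sigmoid(x: float) -> float:
    # Piecewise-linear approximation of the sigmoid: clamp a line to [0, 1].
    # Slope 0.2 and intercept 0.5 follow one common convention; other
    # libraries use slightly different constants.
    return max(0.0, min(1.0, 0.2 * x + 0.5))

print(hard_sigmoid(0.0))   # 0.5, matching the true sigmoid at x = 0
print(hard_sigmoid(10.0))  # 1.0
print(hard_sigmoid(-10.0)) # 0.0
```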