In mathematics and machine learning, the softplus function is

f(x) = ln(1 + e^x).
The names softplus[1][2] and SmoothReLU[3] are used in machine learning.
It is a smooth approximation (in fact, an analytic function) to the ramp function, which is known as the rectifier or ReLU (rectified linear unit) in machine learning. For large negative x it is roughly ln 1 = 0, so just above 0, while for large positive x it is roughly ln(e^x) = x, so just above x.
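The asymptotic behavior above can be checked numerically. The following is a minimal sketch using NumPy; the stable rewriting max(x, 0) + log1p(e^(−|x|)) is an implementation choice to avoid overflow for large x, not part of the definition.

```python
import numpy as np

def softplus(x):
    # ln(1 + e^x), evaluated stably as max(x, 0) + log1p(e^(-|x|))
    # so that exp() never overflows for large positive x.
    x = np.asarray(x, dtype=float)
    return np.maximum(x, 0.0) + np.log1p(np.exp(-np.abs(x)))

print(softplus(0.0))    # ln 2 ≈ 0.693147
print(softplus(-50.0))  # just above 0
print(softplus(50.0))   # just above 50
```

For x = −50 the result is a tiny positive number (about e^(−50)), and for x = 50 it exceeds 50 by the same tiny amount, matching the asymptotics described above.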
The derivative of softplus is the logistic sigmoid function. Since the sigmoid has a positive first derivative, its primitive (antiderivative), which we call softplus, is convex.
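The relationship between softplus and the sigmoid can be illustrated with a centered finite difference, sketched here in plain Python (the step size h and the sample points are arbitrary choices for the demonstration):

```python
import math

def softplus(x):
    # ln(1 + e^x), written stably as max(x, 0) + log1p(e^(-|x|))
    return max(x, 0.0) + math.log1p(math.exp(-abs(x)))

def sigmoid(x):
    # logistic function 1 / (1 + e^(-x))
    return 1.0 / (1.0 + math.exp(-x))

# A centered finite difference of softplus agrees with the sigmoid,
# illustrating that softplus is a primitive of the logistic function.
h = 1e-6
for x in (-2.0, 0.0, 3.0):
    fd = (softplus(x + h) - softplus(x - h)) / (2.0 * h)
    print(f"x={x:+.1f}  finite-diff={fd:.6f}  sigmoid={sigmoid(x):.6f}")
```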
Figure: Rectifier and softplus activation functions. The second is a smooth version of the first.