Softplus

In mathematics and machine learning, the softplus function is defined as softplus(x) = ln(1 + e^x).
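
A minimal sketch of this definition in Python (the function name and the use of NumPy are illustrative, not part of the article):

```python
import numpy as np

def softplus(x):
    """Softplus: ln(1 + e^x), a smooth approximation to the ReLU."""
    return np.log1p(np.exp(x))
```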

The names softplus[1][2] and SmoothReLU[3] are used in machine learning.

It is a smooth approximation (in fact, an analytic function) to the ramp function, which is known as the rectifier or ReLU in machine learning. For large negative x it is roughly ln(1) = 0, so its value is just above 0, while for large positive x it is roughly ln(e^x) = x, so its value is just above x.
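This asymptotic behaviour can be checked numerically. The sketch below (names hypothetical) uses the common overflow-safe rearrangement softplus(x) = max(x, 0) + ln(1 + e^(−|x|)), which is algebraically equivalent to the definition but avoids overflow of e^x for large x:

```python
import numpy as np

def softplus_stable(x):
    """Overflow-safe softplus: max(x, 0) + ln(1 + e^(-|x|))."""
    return np.maximum(x, 0.0) + np.log1p(np.exp(-np.abs(x)))

# For large negative x the value is just above 0;
# for large positive x the value is just above x.
print(softplus_stable(-20.0))        # ~2.06e-09, just above 0
print(softplus_stable(20.0) - 20.0)  # ~2.06e-09, just above x
```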

  1. ^ Dugas, Charles; Bengio, Yoshua; Bélisle, François; Nadeau, Claude; Garcia, René (2000-01-01). "Incorporating second-order functional knowledge for better option pricing" (PDF). Proceedings of the 13th International Conference on Neural Information Processing Systems (NIPS'00). MIT Press: 451–457. "Since the sigmoid h has a positive first derivative, its primitive, which we call softplus, is convex."
  2. ^ Xavier Glorot; Antoine Bordes; Yoshua Bengio (2011). "Deep sparse rectifier neural networks" (PDF). AISTATS. "Rectifier and softplus activation functions. The second one is a smooth version of the first."
  3. ^ "Smooth Rectifier Linear Unit (SmoothReLU) Forward Layer". Developer Guide for Intel Data Analytics Acceleration Library. 2017. Retrieved 2018-12-04.
