Overview of builtin activation functions

Note that some of these functions are scaled differently from the canonical versions you may be familiar with. The intention of the scaling is to place more of the functions’ “interesting” behavior in the region \(\left[-1, 1\right] \times \left[-1, 1\right]\).
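For instance, a sigmoid steepened so that most of its transition falls inside \(\left[-1, 1\right]\) might look like the following sketch; the 5.0 scale factor and the input clamp are illustrative assumptions, not necessarily the exact constants used:

```python
import math

def scaled_sigmoid(z):
    # Steepen the canonical sigmoid so most of its transition happens
    # for z in [-1, 1]; the 5.0 factor is an illustrative assumption.
    z = max(-60.0, min(60.0, 5.0 * z))  # clamp to keep math.exp well-behaved
    return 1.0 / (1.0 + math.exp(-z))
```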

The implementation of these functions can be found in the activations module.
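For a quick feel of the naming, here is a hedged sketch mapping a few of the names below to simple stand-in implementations; these one-liners are illustrative only, not the module's actual definitions, which may add scaling and input clamping:

```python
# Illustrative stand-ins for a few of the builtin activations; the real
# definitions live in the activations module and may differ (e.g. scaling).
ACTIVATIONS = {
    'abs': abs,
    'identity': lambda z: z,
    'relu': lambda z: max(0.0, z),
    'square': lambda z: z ** 2,
    'cube': lambda z: z ** 3,
}

for name, fn in sorted(ACTIVATIONS.items()):
    print(f"{name}: f(-0.5) = {fn(-0.5)}")
```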

abs

The absolute value function, \(f(z) = \left| z \right|\).

clamped

A linear function clamped to the range \(\left[-1, 1\right]\): \(f(z) = \max\left(-1, \min\left(1, z\right)\right)\).

cube

The cubic function, \(f(z) = z^3\).

exp

The exponential function, \(f(z) = e^z\).

gauss

A Gaussian (bell-shaped) function peaking at \(z = 0\); a sketch follows.
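A minimal sketch, assuming a bell curve that peaks at \(z = 0\) and decays to nearly zero by \(\left| z \right| = 1\); the width constant 5.0 is an assumption for illustration:

```python
import math

def gauss(z):
    # exp(-5 z^2): 1.0 at z = 0, about 0.0067 at |z| = 1.
    # The 5.0 width constant is illustrative, not necessarily exact.
    return math.exp(-5.0 * z ** 2)
```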

hat

The hat (triangular) function, \(f(z) = \max\left(0, 1 - \left| z \right|\right)\).

identity

The identity function, \(f(z) = z\).

inv

The inverse (reciprocal) function, \(f(z) = 1 / z\); see the sketch below for the guard needed at \(z = 0\).
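Because \(1 / z\) is undefined at zero, any implementation needs a guard there; one plausible sketch follows, where returning 0.0 at \(z = 0\) is an assumption and the actual module may behave differently:

```python
def inv(z):
    # Reciprocal with a guard at z = 0; returning 0.0 there is an
    # illustrative choice, not necessarily what the module does.
    try:
        return 1.0 / z
    except ZeroDivisionError:
        return 0.0
```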

log

The natural logarithm, \(f(z) = \ln z\), defined only for positive input; a guarded sketch follows.
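Since \(\ln z\) is only defined for positive \(z\), a practical version clamps its input; a sketch under that assumption, where the 1e-7 floor is illustrative:

```python
import math

def log(z):
    # Clamp input to a small positive floor so the logarithm is defined
    # for all z; the 1e-7 floor is an illustrative assumption.
    return math.log(max(1e-7, z))
```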

relu

The rectified linear function (ReLU), \(f(z) = \max\left(0, z\right)\).

sigmoid

The sigmoid (logistic) function, canonically \(f(z) = 1 / \left(1 + e^{-z}\right)\); see the sketch under the scaling note above.

sin

The sine function, \(f(z) = \sin z\), possibly rescaled per the note above.

softplus

The softplus function, a smooth approximation of ReLU; canonically \(f(z) = \ln\left(1 + e^z\right)\). A sketch follows.
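A sketch of the canonical softplus with an overflow guard; the clamp bound is illustrative, and the builtin version may also rescale the result per the note at the top:

```python
import math

def softplus(z):
    # log(1 + e^z), a smooth approximation of relu. The clamp keeps
    # math.exp from overflowing for large z; the bound is illustrative.
    z = max(-60.0, min(60.0, z))
    return math.log(1.0 + math.exp(z))
```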

square

The square function, \(f(z) = z^2\).

tanh

The hyperbolic tangent function, \(f(z) = \tanh z\), possibly rescaled per the note above; a sketch follows.
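As with sigmoid, the builtin tanh may be steepened so that its transition concentrates in \(\left[-1, 1\right]\); a sketch under that assumption, where the 2.5 factor is illustrative:

```python
import math

def scaled_tanh(z):
    # Hyperbolic tangent with an illustrative steepening factor of 2.5.
    return math.tanh(2.5 * z)
```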