Common Activation Functions
| Function | Description |
| --- | --- |
| get | Helper function to get an activation function |
| leakyrelu | Leaky ReLU activation function |
| relu | ReLU activation function |
| sigmoid | Sigmoid activation function |
| softmax | SoftMax activation function |
| tanh | Hyperbolic tangent activation function |
Helper function to get an activation function
Returns the activation function that corresponds to the given query string.
- Args
  - func: Query string for the requested activation function.
- Returns
  - The matching activation function.
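As a rough illustration only (not the library's actual implementation), such a helper is commonly written as a string-keyed lookup. The sketch below assumes Python with NumPy, lowercase query strings matching the table above, and a default slope of 0.01 for leakyrelu; all of these details are assumptions.

```python
import numpy as np

# Hypothetical sketch only: the accepted keys, default slope, and error
# handling are assumptions, not the library's actual implementation.
def get(func: str):
    """Return the activation function matching the query string."""
    table = {
        "relu": lambda x: np.maximum(0.0, x),
        "leakyrelu": lambda x, slope=0.01: np.where(x > 0, x, slope * x),
        "sigmoid": lambda x: 1.0 / (1.0 + np.exp(-x)),
        "tanh": np.tanh,
    }
    try:
        return table[func.lower()]
    except KeyError as err:
        raise ValueError(f"unknown activation function: {func!r}") from err
```

Under these assumptions, get("relu")(np.array([-1.0, 2.0])) evaluates to array([0., 2.]).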
Leaky ReLU activation function
This function was devised to address ReLU's handling of negative inputs: ReLU discards them entirely by outputting zero, whereas Leaky ReLU passes them through scaled by a small slope.
leakyReLU(x, slope) = x if x > 0; slope * x if x <= 0
Range: (-inf, inf)
- Args
  - x: The input vector to apply the activation function over.
  - slope: Slope of the line that produces the output for negative inputs.
- Returns
  - Value after leakyrelu is applied to x.
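A minimal NumPy sketch of the definition above; the 0.01 default for slope is an assumption, not necessarily the library's default:

```python
import numpy as np

def leakyrelu(x: np.ndarray, slope: float = 0.01) -> np.ndarray:
    """Element-wise Leaky ReLU: x for positive inputs, slope * x otherwise."""
    return np.where(x > 0, x, slope * x)

# Negative inputs are scaled by the slope instead of being zeroed out.
print(leakyrelu(np.array([-2.0, 0.0, 3.0]), slope=0.1))  # [-0.2  0.   3. ]
```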
ReLU activation function
Applies the ReLU activation function to the input x.
ReLU, or Rectified Linear Unit, was devised to overcome the vanishing gradient issue of the sigmoid function. It does so by placing no upper limit on its output, although this unbounded range brings problems of its own.
ReLU(x) = max(0, x)
Range: [0, inf)
- Args
  - x: The input vector to apply the activation function over.
- Returns
  - Value after relu is applied to x.
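A minimal NumPy sketch of ReLU(x) = max(0, x), offered as an illustration rather than the library's implementation:

```python
import numpy as np

def relu(x: np.ndarray) -> np.ndarray:
    """Element-wise ReLU: max(0, x)."""
    return np.maximum(0.0, x)

# Negative entries are clamped to zero; positive entries pass through unchanged.
print(relu(np.array([-2.0, 0.0, 3.0])))  # [0. 0. 3.]
```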
Sigmoid activation function
Applies the sigmoid activation function to a value. It returns a value strictly between 0 and 1, never reaching either bound. It suffers from the vanishing gradient issue for inputs of large magnitude.
sigmoid(x) = 1 / (1 + exp(-x))
Range: (0, 1)
- Args
  - x: The input vector to apply the activation function over.
- Returns
  - Value after sigmoid is applied to x.
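A minimal NumPy sketch of sigmoid(x) = 1 / (1 + exp(-x)), again only an illustration of the formula above:

```python
import numpy as np

def sigmoid(x: np.ndarray) -> np.ndarray:
    """Element-wise logistic sigmoid: 1 / (1 + exp(-x))."""
    return 1.0 / (1.0 + np.exp(-x))

# Outputs lie strictly between 0 and 1.
print(sigmoid(np.array([-4.0, 0.0, 4.0])))  # approx [0.018 0.5   0.982]
```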
SoftMax activation function
This function takes a vector as input and converts it into a probability distribution: each element is exponentiated and divided by the sum of the exponentials of all elements, so the outputs lie in (0, 1) and sum to 1.
softmax(x)_i = exp(x_i) / sum_j exp(x_j)
Range: (0, 1)
- Args
  - x: The input vector to apply the activation function over.
- Returns
  - Value after softmax is applied to x.
WARNING: Not available at the moment
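Because the library marks softmax as unavailable, the following is purely a reference sketch of the standard definition in NumPy, with the usual max-subtraction trick for numerical stability:

```python
import numpy as np

def softmax(x: np.ndarray) -> np.ndarray:
    """Map a vector to a probability distribution via exp and normalization."""
    shifted = x - np.max(x)      # subtracting the max avoids overflow in exp
    exps = np.exp(shifted)
    return exps / np.sum(exps)

# Outputs are in (0, 1) and sum to 1.
print(softmax(np.array([1.0, 2.0, 3.0])))  # approx [0.09  0.245 0.665]
```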