Activation Functions#

Choosing a suitable activation function for the task at hand can make a significant difference, depending on the dataset. To provide flexibility in this respect, Fortnet implements the following activation functions:

  • hyperbolic tangent

  • arctangent

  • sigmoid function

  • softplus function

  • Gaussian function

  • (leaky) ReLU function

  • bent identity function

  • Heaviside function

  • linear function

The activation function is selected via the Activation keyword in the Network block of the HSD input. The functions, each with an exemplary Network block, are listed below.

Note

A linear function is always used as the activation function of the output layer, so that all real-valued results can be represented.

Hyperbolic Tangent#

Network = BPNN {
  Hidden = 2 2
  Activation = tanh
}
Plot of the hyperbolic tangent.
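
For reference, the hyperbolic tangent maps its input smoothly onto the open interval (-1, 1) and is defined as

  f(x) = \tanh(x) = \frac{e^{x} - e^{-x}}{e^{x} + e^{-x}}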

Arctangent#

Network = BPNN {
  Hidden = 2 2
  Activation = atan
}
Plot of the arctangent.
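
The arctangent behaves similarly to the hyperbolic tangent but saturates more slowly, with its output bounded by ±π/2:

  f(x) = \arctan(x)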

Sigmoid#

Network = BPNN {
  Hidden = 2 2
  Activation = sigmoid
}
Plot of sigmoid activation function.
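
In the neural-network context, the sigmoid is the logistic function, which squashes its input onto the interval (0, 1):

  f(x) = \frac{1}{1 + e^{-x}}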

SoftPlus#

Network = BPNN {
  Hidden = 2 2
  Activation = softplus
}
Plot of softplus activation function.
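
The softplus function is a smooth approximation of the ReLU:

  f(x) = \ln(1 + e^{x})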

Gaussian#

Network = BPNN {
  Hidden = 2 2
  Activation = gaussian
}
Plot of gaussian activation function.
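
The Gaussian activation is a bell-shaped curve; a common form (the exact width and normalization convention used by Fortnet may differ) is

  f(x) = e^{-x^{2}}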

ReLU#

Network = BPNN {
  Hidden = 2 2
  Activation = relu
}
Plot of relu activation function.
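
The rectified linear unit (ReLU) passes positive inputs through unchanged and clips negative inputs to zero:

  f(x) = \max(0, x)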

Leaky ReLU#

Network = BPNN {
  Hidden = 2 2
  Activation = lrelu
}
Plot of leaky ReLU activation function.
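
The leaky ReLU retains a small, non-zero slope \alpha for negative inputs; the exact value of \alpha is an implementation detail of Fortnet not documented here (a common default in the literature is 0.01):

  f(x) = \begin{cases} x & x \geq 0 \\ \alpha x & x < 0 \end{cases}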

Bent Identity#

Network = BPNN {
  Hidden = 2 2
  Activation = bent
}
Plot of Bent identity activation function.
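
The bent identity is commonly defined as the unbounded, smoothly nonlinear function

  f(x) = \frac{\sqrt{x^{2} + 1} - 1}{2} + x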

Heaviside#

Network = BPNN {
  Hidden = 2 2
  Activation = heaviside
}
Plot of heaviside activation function.
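
The Heaviside step function outputs 0 for negative and 1 for positive arguments (the value assigned at x = 0 may differ between implementations):

  f(x) = \begin{cases} 0 & x < 0 \\ 1 & x \geq 0 \end{cases}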

Linear#

Network = BPNN {
  Hidden = 2 2
  Activation = linear
}
Plot of linear activation function.
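
The linear activation is simply the identity, which is also what Fortnet always uses for the output layer (see the note above):

  f(x) = x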