Module juice::layers::activation


Provides nonlinear activation methods.

Activation layers take an input tensor, apply the activation operation, and produce an output tensor. Thanks to the nonlinearity of the activation functions, we can ‘learn’ and detect nonlinearities in our (complex) datasets.

The activation operation used should depend on the task at hand. For binary classification a step function might be very useful. For more complex tasks, continuous activation functions such as Sigmoid, TanH, or ReLU should be used. In most cases ReLU might provide the best results.
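As an illustration of the functions named above, here are minimal elementwise definitions in plain Rust. These are sketches of the underlying math only, not Juice's actual layer implementations (which execute on a backend via the layer machinery):

```rust
/// Rectified Linear Unit: max(0, x).
fn relu(x: f32) -> f32 {
    x.max(0.0)
}

/// Logistic sigmoid: 1 / (1 + e^{-x}), maps inputs to (0, 1).
fn sigmoid(x: f32) -> f32 {
    1.0 / (1.0 + (-x).exp())
}

/// Hyperbolic tangent, maps inputs to (-1, 1).
fn tanh_act(x: f32) -> f32 {
    x.tanh()
}

fn main() {
    // Apply ReLU elementwise to a small input tensor (here just a slice).
    let input = [-2.0_f32, 0.0, 3.0];
    let output: Vec<f32> = input.iter().map(|&x| relu(x)).collect();
    println!("{:?}", output); // [0.0, 0.0, 3.0]
}
```

Note that each function is applied independently to every element of the tensor; the choice of function only changes the pointwise nonlinearity.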

If you supply the same blob as input and output to a layer via the LayerConfig, computations will be done in-place, requiring less memory.
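The memory saving from in-place computation can be sketched as follows. This is a simplified illustration of the idea, assuming a plain mutable buffer rather than Juice's blob and LayerConfig types:

```rust
/// Apply ReLU in-place: the input buffer is overwritten with the result,
/// so no second output buffer needs to be allocated.
fn relu_in_place(data: &mut [f32]) {
    for v in data.iter_mut() {
        *v = v.max(0.0);
    }
}

fn main() {
    let mut blob = vec![-1.5_f32, 0.5, 2.0];
    relu_in_place(&mut blob); // input and output share the same memory
    println!("{:?}", blob); // [0.0, 0.5, 2.0]
}
```

Since activations are elementwise, each output value depends only on the input value at the same position, which is what makes overwriting the input safe.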

The activation function is also sometimes called a transfer function.


pub use self::relu::ReLU;
pub use self::sigmoid::Sigmoid;
pub use self::tanh::TanH;


relu — Applies the nonlinear Rectified Linear Unit.

sigmoid — Applies the nonlinear Log-Sigmoid function.

tanh — Applies the nonlinear TanH function.