Module juice::layers


Provides the fundamental units of computation in a Neural Network.

These layers provide different types of operations to the data Blobs that flow through them. The operations provided by the layers are grouped into five categories:

  • Activation
Activation Layers provide element-wise operations and produce one top Blob of the same size as the bottom Blob. They can be seen as the layer equivalent of nonlinear Activation Functions.

  • Common
Common Layers can differ in their connectivity and behavior and are typically all network layer types which are not covered by activation or loss layers. Examples would be fully connected layers, convolutional layers, pooling layers, etc.

  • Loss
Loss Layers compare an output to a target value and assign a cost to minimize. Loss Layers are often the last layer in a network.

  • Utility
Utility Layers provide all kinds of helpful functionality which might not be directly related to machine learning and neural nets. This includes operations for normalizing, restructuring or transforming information, logging and debugging behavior, or data access. Utility Layers follow the general behavior of a layer, like the other types do.

  • Container
Container Layers take LayerConfigs and connect them on initialization, which creates a “network”. But because container layers are themselves layers, one can stack multiple container layers on top of one another and compose even bigger container layers. Container layers differ in how they connect the layers that they receive.

For more information about how these layers work together, see the documentation for the general Layer module.
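To make the categories concrete, here is a minimal, self-contained sketch in plain Rust of how they compose: an activation layer (ReLU), a common layer (fully connected Linear), a container (Sequential) that chains them, and a loss (mean squared error) applied to the final output. The `Layer`, `Sequential`, `Linear` and `ReLU` types below are illustrative stand-ins for the concepts, not the actual juice API, which operates on backend-managed Blobs rather than slices.

```rust
/// Illustrative stand-in for a layer: maps an input blob to an output blob.
trait Layer {
    fn forward(&self, input: &[f32]) -> Vec<f32>;
}

/// Activation layer: element-wise ReLU; output has the same size as input.
struct ReLU;
impl Layer for ReLU {
    fn forward(&self, input: &[f32]) -> Vec<f32> {
        input.iter().map(|&x| x.max(0.0)).collect()
    }
}

/// Common layer: fully connected (linear) layer with fixed weights.
struct Linear {
    weights: Vec<Vec<f32>>, // one row of weights per output unit
}
impl Layer for Linear {
    fn forward(&self, input: &[f32]) -> Vec<f32> {
        self.weights
            .iter()
            .map(|row| row.iter().zip(input).map(|(w, x)| w * x).sum())
            .collect()
    }
}

/// Container layer: runs its child layers in sequence,
/// feeding each layer's output into the next.
struct Sequential {
    layers: Vec<Box<dyn Layer>>,
}
impl Layer for Sequential {
    fn forward(&self, input: &[f32]) -> Vec<f32> {
        self.layers
            .iter()
            .fold(input.to_vec(), |blob, layer| layer.forward(&blob))
    }
}

/// Loss: mean squared error between the network output and a target.
fn mean_squared_error(output: &[f32], target: &[f32]) -> f32 {
    output
        .iter()
        .zip(target)
        .map(|(o, t)| (o - t) * (o - t))
        .sum::<f32>()
        / output.len() as f32
}

fn main() {
    // A tiny "network": Linear -> ReLU, wrapped in a Sequential container.
    let net = Sequential {
        layers: vec![
            Box::new(Linear {
                weights: vec![vec![1.0, -1.0], vec![0.5, 0.5]],
            }),
            Box::new(ReLU),
        ],
    };
    let output = net.forward(&[2.0, 1.0]);
    // Linear: [2*1 + 1*(-1), 2*0.5 + 1*0.5] = [1.0, 1.5]; ReLU keeps both.
    println!("output = {:?}", output);
    println!("loss = {}", mean_squared_error(&output, &[1.0, 1.0]));
}
```

Because a `Sequential` implements the same `Layer` trait as its children, containers can themselves be nested inside other containers, which is exactly the composition property described above.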


pub use self::activation::ReLU;
pub use self::activation::Sigmoid;
pub use self::activation::TanH;
pub use self::common::Convolution;
pub use self::common::ConvolutionConfig;
pub use self::common::Dropout;
pub use self::common::DropoutConfig;
pub use self::common::Linear;
pub use self::common::LinearConfig;
pub use self::common::LogSoftmax;
pub use self::common::Pooling;
pub use self::common::PoolingConfig;
pub use self::common::PoolingMode;
pub use self::common::Rnn;
pub use self::common::RnnConfig;
pub use self::common::Softmax;
pub use self::container::Sequential;
pub use self::container::SequentialConfig;
pub use self::loss::MeanSquaredError;
pub use self::loss::NegativeLogLikelihood;
pub use self::loss::NegativeLogLikelihoodConfig;
pub use self::utility::Flatten;
pub use self::utility::Reshape;
pub use self::utility::ReshapeConfig;


activation
    Provides nonlinear activation methods.

common
    Provides common neural network layers.

container
    Provides container layers.

loss
    Provides methods to calculate the loss (cost) of some output.

utility
    Provides various helpful layers, which might not be directly related to neural networks in general.