This package provides an easy and modular way to build and train simple or complex neural networks using Torch:
- Modules are the bricks used to build neural networks. Each module is itself a neural network, but modules can be combined with other networks using containers to create complex neural networks (a first example follows the list below):
  - Module: abstract class inherited by all modules;
  - Containers: container classes like `Sequential`, `Parallel` and `Concat`;
  - Transfer functions: non-linear functions like `Tanh` and `Sigmoid`;
  - Simple layers: like `Linear`, `Mean`, `Max` and `Reshape`;
  - Table layers: layers for manipulating tables, like `SplitTable`, `ConcatTable` and `JoinTable`;
  - Convolution layers: `Temporal`, `Spatial` and `Volumetric` convolutions;
- Criterions compute a gradient according to a given loss function, given an input and a target (a second example below shows a criterion in use):
  - Criterions: a list of all criterions, including `Criterion`, the abstract class;
  - MSECriterion: the Mean Squared Error criterion used for regression;
  - ClassNLLCriterion: the Negative Log Likelihood criterion used for classification;
- Additional documentation:
  - Overview of the package essentials including modules, containers and training;
  - Training: how to train a neural network using `StochasticGradient` (see the last example below);
  - Testing: how to test your modules;
  - Experimental Modules: a package containing experimental modules and criteria.
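
The following minimal sketch shows how modules and containers combine: a `Sequential` container holding two `Linear` layers and a `Tanh` transfer function. The layer sizes (10, 25 and 3) are arbitrary and chosen only for illustration.

```lua
require 'nn'

-- Assemble a small multi-layer perceptron inside a Sequential container.
local mlp = nn.Sequential()
mlp:add(nn.Linear(10, 25))   -- simple layer: fully-connected, 10 -> 25
mlp:add(nn.Tanh())           -- transfer function
mlp:add(nn.Linear(25, 3))    -- simple layer: fully-connected, 25 -> 3

-- The container is itself a module: it can be evaluated with forward().
local output = mlp:forward(torch.randn(10))
print(output)
```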
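
As a sketch of how criterions are used, `MSECriterion` below compares a prediction to a target: `forward` returns the loss and `backward` returns the gradient with respect to the prediction. The 3-dimensional tensors are illustrative.

```lua
require 'nn'

local criterion = nn.MSECriterion()

local prediction = torch.randn(3)
local target     = torch.randn(3)

local loss      = criterion:forward(prediction, target)  -- scalar loss value
local gradInput = criterion:backward(prediction, target) -- gradient w.r.t. the prediction
print(loss)
print(gradInput)
```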
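
Finally, a minimal training sketch with `StochasticGradient`: it couples a module with a criterion and iterates over a dataset that implements `size()` and indexed access to `{input, target}` pairs. The two-sample dataset, learning rate and iteration count are purely illustrative.

```lua
require 'nn'

-- Model and criterion to train.
local mlp = nn.Sequential()
mlp:add(nn.Linear(2, 1))
local criterion = nn.MSECriterion()

-- A toy dataset: StochasticGradient expects size() and {input, target} pairs.
local dataset = {}
function dataset:size() return 2 end
dataset[1] = {torch.Tensor{0, 0}, torch.Tensor{0}}
dataset[2] = {torch.Tensor{1, 1}, torch.Tensor{1}}

local trainer = nn.StochasticGradient(mlp, criterion)
trainer.learningRate = 0.01
trainer.maxIteration = 5   -- number of passes over the dataset
trainer:train(dataset)
```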