|
| ELU () |
| Create the ELU object. More...
|
|
| ELU (const double alpha) |
| Create the ELU object using the specified parameter. More...
|
|
double const & | Alpha () const |
| Get the non-zero gradient (alpha). More...
|
|
double & | Alpha () |
| Modify the non-zero gradient (alpha). More...
|
|
template<typename DataType> |
void | Backward (const DataType &input, const DataType &gy, DataType &g) |
| Ordinary feed backward pass of a neural network, computing the gradient g with respect to the input by propagating the error gy backward through f. More...
|
|
OutputDataType const & | Delta () const |
| Get the delta. More...
|
|
OutputDataType & | Delta () |
| Modify the delta. More...
|
|
bool | Deterministic () const |
| Get the value of the deterministic parameter. More...
|
|
bool & | Deterministic () |
| Modify the value of the deterministic parameter. More...
|
|
template<typename InputType, typename OutputType> |
void | Forward (const InputType &input, OutputType &output) |
| Ordinary feed forward pass of a neural network, evaluating the function f(x) by propagating the activity forward through f. More...
|
|
double const & | Lambda () const |
| Get the lambda parameter. More...
|
|
OutputDataType const & | OutputParameter () const |
| Get the output parameter. More...
|
|
OutputDataType & | OutputParameter () |
| Modify the output parameter. More...
|
|
template<typename Archive> |
void | serialize (Archive &ar, const unsigned int) |
| Serialize the layer. More...
|
|
template<typename InputDataType = arma::mat, typename OutputDataType = arma::mat>
class mlpack::ann::ELU< InputDataType, OutputDataType >
The ELU activation function is defined by

f(x) = \begin{cases} x & x > 0 \\ \alpha (e^x - 1) & x \le 0 \end{cases}

f'(x) = \begin{cases} 1 & x > 0 \\ f(x) + \alpha & x \le 0 \end{cases}
For more information, read the following paper:
@article{Clevert2015,
author = {Djork{-}Arn{\'{e}} Clevert and Thomas Unterthiner and
Sepp Hochreiter},
title = {Fast and Accurate Deep Network Learning by Exponential Linear
Units (ELUs)},
journal = {CoRR},
year = {2015},
url = {https://arxiv.org/abs/1511.07289}
}
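As a plain Armadillo sketch (illustrative only, not mlpack's implementation; the function names are hypothetical), the activation and its derivative can be computed elementwise:

#include <armadillo>
#include <cmath>

// Elementwise ELU, matching the definition above.
arma::mat EluForward(const arma::mat& x, const double alpha)
{
  arma::mat y = x;
  y.transform([alpha](double v)
      { return v > 0.0 ? v : alpha * (std::exp(v) - 1.0); });
  return y;
}

// Elementwise derivative; for x <= 0, f(x) + alpha simplifies to
// alpha * e^x.
arma::mat EluDerivative(const arma::mat& x, const double alpha)
{
  arma::mat d = x;
  d.transform([alpha](double v)
      { return v > 0.0 ? 1.0 : alpha * std::exp(v); });
  return d;
}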
The SELU activation function is defined by

f(x) = \lambda \begin{cases} x & x > 0 \\ \alpha (e^x - 1) & x \le 0 \end{cases}

where \lambda \approx 1.0507 and \alpha \approx 1.6733 are fixed parameters.
For more information, read the following paper:
@article{Klambauer2017,
author = {Gunter Klambauer and Thomas Unterthiner and
Andreas Mayr},
title = {Self-Normalizing Neural Networks},
journal = {Advances in Neural Information Processing Systems},
year = {2017},
url = {https://arxiv.org/abs/1706.02515}
}
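SELU is thus the same piecewise function scaled by \lambda, with both parameters fixed to the self-normalizing values. A minimal sketch (illustrative only; constants taken from the paper above):

#include <armadillo>
#include <cmath>

// Fixed SELU parameters from Klambauer et al. (2017).
const double seluAlpha = 1.6732632423543772;
const double seluLambda = 1.0507009873554805;

// Elementwise SELU: lambda-scaled ELU with the fixed alpha.
arma::mat Selu(const arma::mat& x)
{
  arma::mat y = x;
  y.transform([](double v)
      { return v > 0.0 ? seluLambda * v
                       : seluLambda * seluAlpha * (std::exp(v) - 1.0); });
  return y;
}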
In deterministic mode, the derivative is not computed.
- Note
- During training, deterministic should be set to false; during testing/inference, it should be set to true.
- Make sure to use the SELU activation function with normalized inputs and weights initialized with Lecun Normal Initialization (see the sketch below).
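A usage sketch, assuming the mlpack 3.x FFN API, in which the default ELU constructor selects the SELU parameter values (layer sizes here are arbitrary placeholders):

#include <mlpack/methods/ann/ffn.hpp>
#include <mlpack/methods/ann/layer/layer.hpp>
#include <mlpack/methods/ann/init_rules/lecun_normal_init.hpp>
#include <mlpack/methods/ann/loss_functions/negative_log_likelihood.hpp>

using namespace mlpack::ann;

int main()
{
  // SELU setup as recommended above: Lecun normal initialization plus
  // the default-constructed ELU layer (assumed to use SELU parameters).
  FFN<NegativeLogLikelihood<>, LecunNormalInitialization> model;
  model.Add<Linear<>>(10, 50);
  model.Add<ELU<>>();
  model.Add<Linear<>>(50, 2);
  model.Add<LogSoftMax<>>();
}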
- Template Parameters
InputDataType | Type of the input data (arma::colvec, arma::mat, arma::sp_mat or arma::cube). |
OutputDataType | Type of the output data (arma::colvec, arma::mat, arma::sp_mat or arma::cube). |
Definition at line 111 of file elu.hpp.
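For standalone use of the layer with the default arma::mat data types, a hedged sketch (assuming the accessors listed above; passing the forward output as the first Backward argument is an assumption and may differ from mlpack's internal wiring):

#include <mlpack/methods/ann/layer/elu.hpp>

using namespace mlpack::ann;

int main()
{
  // Plain ELU with alpha = 0.3 (the value is an arbitrary placeholder).
  ELU<arma::mat, arma::mat> elu(0.3);
  elu.Deterministic() = false;   // Training mode: derivative is computed.

  arma::mat input = arma::randn<arma::mat>(5, 1);
  arma::mat output;
  elu.Forward(input, output);

  arma::mat gy = arma::ones<arma::mat>(5, 1);  // Error from the next layer.
  arma::mat g;
  elu.Backward(output, gy, g);

  elu.Deterministic() = true;    // Inference mode: derivative is skipped.
}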