Implementation - Find and Fill Method for Dropout Layer #3684

Open · wants to merge 7 commits into base: master
18 changes: 18 additions & 0 deletions src/mlpack/methods/ann/layer/dropout.hpp
@@ -77,6 +77,24 @@ class DropoutType : public Layer<MatType>
*/
void Forward(const MatType& input, MatType& output);

/**
* Implementation of the forward pass of the dropout layer.
*
* @param input Input data used for evaluating the specified function.
* @param output Resulting output activation.
*/
template<typename T = MatType, typename std::enable_if_t<arma::is_arma_type<T>::value, int> = 0>
void ForwardImpl(const T& input, T& output);

/**
* General implementation of the forward pass of the dropout layer.
*
* @param input Input data used for evaluating the specified function.
* @param output Resulting output activation.
*/
template<typename T = MatType, typename std::enable_if_t<!arma::is_arma_type<T>::value, int> = 0>
void ForwardImpl(const T& input, T& output);
Comment on lines +86 to +96
Member:
@MarkFischinger
Looks better. Just use MatType directly instead of T: it is better for readability, because T does not mean much.
Also, could you break the line? The mlpack code base usually keeps lines under 80 characters.
Also, the syntax is correct, but I would remove the int and instead add a * after the >, so the signature looks as follows:

typename std::enable_if_t<!arma::is_arma_type<T>::value>* = 0>

The reason for this is to keep the same signature across the whole code base, making it easier for everyone.
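For illustration, applying those suggestions to the arma-type overload might give a declaration like the sketch below. The parameter name MatType2 is a placeholder introduced here, not taken from the PR; a defaulted method-level template parameter is still needed for SFINAE, since the class-level MatType cannot be constrained directly:

// Sketch only: the reviewer's suggested enable_if style, with the line
// broken to stay under 80 characters. MatType2 is a hypothetical name.
template<typename MatType2 = MatType,
         typename std::enable_if_t<
             arma::is_arma_type<MatType2>::value>* = 0>
void ForwardImpl(const MatType2& input, MatType2& output);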


/**
* Ordinary feed backward pass of the dropout layer.
*
24 changes: 24 additions & 0 deletions src/mlpack/methods/ann/layer/dropout_impl.hpp
@@ -74,10 +74,18 @@ DropoutType<MatType>::operator=(DropoutType&& other)
return *this;
}


Member (suggested change):
No need for an extra line 👍

Member:
Also, this one is still missing.

template<typename MatType>
void DropoutType<MatType>::Forward(const MatType& input, MatType& output)
{
// The dropout mask will not be multiplied in testing mode.
ForwardImpl(input, output);
}

template<typename MatType>
template<typename T, typename std::enable_if_t<arma::is_arma_type<T>::value, int>>
void DropoutType<MatType>::ForwardImpl(const T& input, T& output)
{
if (!this->training)
{
output = input;
@@ -92,6 +100,22 @@ void DropoutType<MatType>::Forward(const MatType& input, MatType& output)
}
}

template<typename MatType>
template<typename T, typename std::enable_if_t<!arma::is_arma_type<T>::value, int>>
void DropoutType<MatType>::ForwardImpl(const T& input, T& output)
{
if (!this->training)
{
output = input;
}
else
{
mask.randu(input.n_rows, input.n_cols);
mask = (mask > ratio);
output = input % mask * scale;
}
}
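// Editorial note, not part of the diff: scale is set to 1 / (1 - ratio) in
// this layer, so this is inverted dropout: surviving activations are scaled
// up during training so that the expected output matches the input, which
// lets testing mode simply pass the input through unchanged.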

template<typename MatType>
void DropoutType<MatType>::Backward(
const MatType& /* input */,
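For context, here is a minimal usage sketch of the refactored layer. It assumes mlpack 4.x conventions (DropoutType living in the mlpack namespace and the Layer base class exposing a Training() accessor); the matrix sizes and dropout ratio are illustrative only, not part of the PR:

#include <mlpack.hpp>

int main()
{
  arma::mat input(10, 4, arma::fill::randu);
  arma::mat output;

  // Assumed API: dropout ratio passed to the constructor,
  // Training() toggling between training and testing mode.
  mlpack::DropoutType<arma::mat> dropout(0.3);

  dropout.Training() = true;       // training: mask and scale are applied
  dropout.Forward(input, output);  // dispatches to the arma ForwardImpl

  dropout.Training() = false;      // testing: the input passes through
  dropout.Forward(input, output);

  return 0;
}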