
Using floats instead of double #1062

Closed
etano opened this issue Jul 19, 2017 · 10 comments

Comments

@etano

etano commented Jul 19, 2017

Hi, long time armadillo user, newer mlpack user (looks great!).

I'd like to be able to change the datatype in the ann implementation to float (instead of double).

I'm willing to go through and add a template parameter for the datatype as it seems pretty straightforward. But before I go through the effort, I'm just wondering if there's a possibility something like this would be accepted.

Cheers!

@rcurtin
Member

rcurtin commented Jul 19, 2017

I think actually this should not be too hard---all of the layers (or at least most of them) in src/mlpack/methods/ann/layer/ already support templated matrix types, meaning you could use floats with them. But still, I think that FFN and RNN should probably be refactored at some point to support generic matrix types. I'm not sure what the exact best way to do that will be---@zoq, any ideas in particular?

This is related to the long-standing issue #290.

@etano
Author

etano commented Jul 19, 2017

Yeah, looking at it, it seems to be as you say. I think there's a choice between templating on a matrix type, templating on the underlying datatype, or both.

Looking forward, having arbitrary types might be convenient when integrating with bandicoot: https://github.com/conradsnicta/bandicoot-code

@rcurtin
Member

rcurtin commented Jul 19, 2017

Absolutely, I want to be able to drop in bandicoot types when the time comes. I'm surprised, word is spreading fast about that project I guess... :)

I think that templating on a matrix type would be better---like you pointed out, this would allow us to accept bandicoot types and also arma::sp_mat, etc. I would think the way to do this would be to simply templatize the Train() and other related functions to accept arbitrary input types, so that you can train on whatever.

@etano
Author

etano commented Jul 19, 2017

Well I opened a pull request (not meant to be merged yet), just to get an idea of what this change might look like.

I'm realizing now, though, that it will be a bit more involved because of this LayerTypes boost::variant thing.

@zoq
Member

zoq commented Jul 20, 2017

Right, we have to find a good solution for the boost::variant part; I'll have to think about it.

@rcurtin
Member

rcurtin commented Sep 7, 2017

Do you think we could do something like this?

```cpp
template<typename ElemType>
using LayerTypes = boost::variant<
    Add<arma::Mat<ElemType>, arma::Mat<ElemType>>,
    ...
>;
```

@dheeraj135

@rcurtin Is anyone working on it? If not, I would like to look into this.

@zoq
Member

zoq commented Jan 17, 2018

Please feel free to look into the issue.

@dheeraj135

@zoq Okk..

@mlpack-bot

mlpack-bot bot commented Feb 18, 2019

This issue has been automatically marked as stale because it has not had any recent activity. It will be closed in 7 days if no further activity occurs. Thank you for your contributions! 👍
