
Model inference on ARM M4F #5

Open
gcmike opened this issue Sep 7, 2017 · 3 comments

Comments


gcmike commented Sep 7, 2017

Hi there.
Great work on the project! I have made successful progress on macOS, but I was wondering whether a trained model, say mnist-cnn.kan, could be transferred to an ARM M4F chip for inference. There would definitely be data input and output processing to handle, but apart from that, is it possible to use the trained model on ARM? Thanks in advance!

@attractivechaos
Owner

I have no experience with ARM CPUs. Is the M4F little endian or big endian? If it is big endian, a model generated on x86 won't work, though the issue should be fixable. If it is little endian, the data file should in principle work on that CPU.


gcmike commented Sep 7, 2017

I am currently targeting the nRF52832, which has a Cortex-M4F on board along with BLE. From what I found here (http://infocenter.nordicsemi.com/index.jsp?topic=%2Fcom.nordic.infocenter.nrf52832.ps.v1.1%2Fcpu.html), the last table shows that it implements little-endian byte order. Is this the correct info?

@shipleyxie

@gcmike I am interested in your work too. Have you finished it?


3 participants