
Accelerometer gesture sense that doesn't require tensorflow or other remote AI? #53

Open
frankcohen opened this issue May 20, 2023 · 0 comments

Comments

@frankcohen
ESP32-BLE-Mouse is so easy to use! Thank you x 100! I'd like to extend it to support accelerometer-based gestures. I'm working on my wristwatch project (https://github.com/frankcohen/ReflectionsOS). It has an ESP32-S3, accelerometer, compass, and GPS, but no touch screen and no buttons. It's all open-source. I'm searching for code that interprets accelerometer data as gestures: for example, tracing a big circle would register as one gesture, and shaking the watch left and right a few inches would register as another. I can't depend on the Internet being available while the watch is in operation, though I could use TensorFlow or another training-based approach while networking is available. What have you seen? Pointers please. Thanks! -Frank
