
Investigate emotion detection and write a helper class for this #9

Open
aravindsagar opened this issue Apr 8, 2018 · 1 comment

@aravindsagar (Owner)

It'd be nice if selfie capture could also be encapsulated in this.
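For concreteness, a minimal sketch of what that helper's surface could look like; every name below is hypothetical, assuming a callback-based Android design:

```java
// Hypothetical interface for the helper class; all names are illustrative only.
public interface EmotionDetector {

    /** Result of analyzing one selfie. Scores are assumed to be in [0, 1]. */
    class EmotionResult {
        public final float happiness;
        public final float sadness;
        public final float anger;
        public final float surprise;

        public EmotionResult(float happiness, float sadness, float anger, float surprise) {
            this.happiness = happiness;
            this.sadness = sadness;
            this.anger = anger;
            this.surprise = surprise;
        }
    }

    interface Callback {
        void onResult(EmotionResult result);
        void onError(Exception e);
    }

    /** Captures a selfie with the front camera, then runs emotion detection on it. */
    void captureSelfieAndDetect(Callback callback);

    /** Runs emotion detection on an already-captured image. */
    void detect(android.graphics.Bitmap selfie, Callback callback);
}
```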

@dhl013 (Collaborator) commented Apr 8, 2018

I've looked into two candidate APIs: the Microsoft Azure Emotion API and the Google Cloud Vision API.

For our needs, the Emotion API is the better fit, though the Vision API could work as well.
The problem with the Emotion API is that the image to be analyzed must first be uploaded somewhere the service can reach (Azure or another accessible cloud). That means every selfie a user takes would have to be uploaded right away, which is not what we want. Alternatively, selfies could be stored locally and uploaded once Wi-Fi access is available, but that approach is questionable.

The Vision API, on the other hand, accepts the raw bytes of the image in the request itself; obviously, a larger image means a longer transfer/upload.
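For reference, a rough sketch of a byte-based request with the google-cloud-vision Java client; this assumes that particular client library and a local file path purely for illustration (the branch may do it differently). `FACE_DETECTION` returns coarse emotion likelihoods per face:

```java
import com.google.cloud.vision.v1.AnnotateImageRequest;
import com.google.cloud.vision.v1.AnnotateImageResponse;
import com.google.cloud.vision.v1.FaceAnnotation;
import com.google.cloud.vision.v1.Feature;
import com.google.cloud.vision.v1.Image;
import com.google.cloud.vision.v1.ImageAnnotatorClient;
import com.google.protobuf.ByteString;

import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Collections;

public class VisionEmotionSketch {
    public static void main(String[] args) throws Exception {
        // Read the selfie from disk; on the device this would come from the camera.
        ByteString imageBytes = ByteString.copyFrom(Files.readAllBytes(Paths.get("selfie.jpg")));

        AnnotateImageRequest request = AnnotateImageRequest.newBuilder()
                .setImage(Image.newBuilder().setContent(imageBytes))
                .addFeatures(Feature.newBuilder().setType(Feature.Type.FACE_DETECTION))
                .build();

        // Assumes application default credentials are configured.
        try (ImageAnnotatorClient client = ImageAnnotatorClient.create()) {
            AnnotateImageResponse response = client
                    .batchAnnotateImages(Collections.singletonList(request))
                    .getResponses(0);
            for (FaceAnnotation face : response.getFaceAnnotationsList()) {
                // Vision reports coarse likelihood buckets, not numeric scores.
                System.out.println("joy: " + face.getJoyLikelihood());
                System.out.println("sorrow: " + face.getSorrowLikelihood());
                System.out.println("anger: " + face.getAngerLikelihood());
                System.out.println("surprise: " + face.getSurpriseLikelihood());
            }
        }
    }
}
```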

I've pushed a branch with basic usage of the two APIs (the Gradle dependencies are kinda messy).
I'll also see if there's a way to do this locally, but since emotion detection is not a light computation, it's unlikely there's a library that can run entirely on the device.
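That said, one limited local option is the Play Services Mobile Vision face detector, which runs on the device but only reports a smile probability rather than full emotion classification. A rough sketch, assuming the com.google.android.gms:play-services-vision dependency:

```java
import android.content.Context;
import android.graphics.Bitmap;
import android.util.SparseArray;

import com.google.android.gms.vision.Frame;
import com.google.android.gms.vision.face.Face;
import com.google.android.gms.vision.face.FaceDetector;

public class LocalSmileSketch {

    /** Returns the smile probability of the first detected face, or -1 if unavailable. */
    public static float smileProbability(Context context, Bitmap selfie) {
        FaceDetector detector = new FaceDetector.Builder(context)
                .setClassificationType(FaceDetector.ALL_CLASSIFICATIONS)
                .setTrackingEnabled(false)
                .build();
        try {
            // The native model may still be downloading on first use.
            if (!detector.isOperational()) {
                return -1f;
            }
            Frame frame = new Frame.Builder().setBitmap(selfie).build();
            SparseArray<Face> faces = detector.detect(frame);
            if (faces.size() == 0) {
                return -1f;
            }
            return faces.valueAt(0).getIsSmilingProbability();
        } finally {
            detector.release();
        }
    }
}
```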

@aravindsagar aravindsagar self-assigned this Apr 27, 2018