iOS keyboard that detects emotions based on your typing pattern. (Swift, updated Dec 8, 2016)
Affective Computing and Human Robot Interaction project to identify emotions in paintings using transfer learning
A valence and arousal mixing engine for the AudioMetaphor project.
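A valence–arousal mixing engine generally blends affect coordinates from several sources into a single point on the circumplex. A minimal stdlib-only sketch of that idea (all names are hypothetical and not taken from the AudioMetaphor codebase):

```python
# Hypothetical sketch: weighted blending of (valence, arousal) pairs,
# each component in [-1, 1], into a single point on the affect circumplex.

def mix_affects(points, weights=None):
    """Blend (valence, arousal) tuples using an optional weight per source."""
    if weights is None:
        weights = [1.0] * len(points)
    total = sum(weights)
    valence = sum(w * v for (v, _), w in zip(points, weights)) / total
    arousal = sum(w * a for (_, a), w in zip(points, weights)) / total
    # Clamp to the valid range in case of numeric drift.
    clamp = lambda x: max(-1.0, min(1.0, x))
    return clamp(valence), clamp(arousal)

# Blend a calm-positive cue with an excited-negative one, favouring the first.
print(mix_affects([(0.8, -0.2), (-0.4, 0.9)], weights=[3, 1]))
```

A real engine would likely add per-source smoothing over time, but the weighted average above is the core operation.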
This project is a proof of concept for facial expressions of emotion as an input modality. It explores the entire process of creating useful applications based on this modality, in order to uncover challenges and determine their viability.
Implementation of visualisations presented in the paper ‘Conceiving Human Interaction by Visualising Depth Data of Head Pose Changes and Emotion Recognition via Facial Expressions’
Software library with synesthetic abilities, made for Processing digital artists. Its code serves as a medium between words, emotions, and images.
A user interface for the Logro project, designed to display affective data derived from detected affects
A machine learning application for emotion recognition from speech
Machine Learning model for detecting bluffing in Poker
KaleidOk invites participants to use a new kind of interactive media tool and take part in an emerging experience that explores speech recognition, media retrieval, and visual generation in a collaborative context (between people, and between people and machines).
Song recommendations for your current mood
😄 Building a deep-learning-based affective computing platform
Some of my final files from the internship at IIT Kharagpur, 2016
Visualizing people's emotions in a city.
A comparison of different classifiers in valence and arousal detection using the DEAP database for EEG
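Such a comparison typically fits each classifier on the same train/test split of features and reports per-model accuracy. A toy stdlib-only sketch of the workflow, with synthetic two-dimensional features standing in for DEAP EEG band power (all names hypothetical):

```python
import math
import random

random.seed(0)

# Synthetic stand-in for EEG band-power features: two valence classes.
def sample(mean, n):
    return [[random.gauss(m, 0.6) for m in mean] for _ in range(n)]

X = sample([0.0, 0.0], 60) + sample([2.0, 2.0], 60)
y = [0] * 60 + [1] * 60

# Shuffled train/test split.
idx = list(range(len(X)))
random.shuffle(idx)
train, test = idx[:80], idx[80:]
Xtr, ytr = [X[i] for i in train], [y[i] for i in train]
Xte, yte = [X[i] for i in test], [y[i] for i in test]

def nearest_centroid(Xs, ys):
    """Fit per-class centroids; predict the class of the closest centroid."""
    cents = {}
    for label in set(ys):
        pts = [x for x, t in zip(Xs, ys) if t == label]
        cents[label] = [sum(c) / len(pts) for c in zip(*pts)]
    return lambda x: min(cents, key=lambda c: math.dist(x, cents[c]))

def one_nn(Xs, ys):
    """Predict the label of the single nearest training point."""
    return lambda x: min(zip(Xs, ys), key=lambda p: math.dist(x, p[0]))[1]

def accuracy(predict, Xs, ys):
    return sum(predict(x) == t for x, t in zip(Xs, ys)) / len(ys)

for name, fit in [("nearest-centroid", nearest_centroid), ("1-NN", one_nn)]:
    print(f"{name}: {accuracy(fit(Xtr, ytr), Xte, yte):.2f}")
```

A real DEAP pipeline would extract band-power features per channel and use cross-validation rather than a single split, but the fit/predict/score comparison loop is the same shape.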