
Pelios

Inspiration

Discerning each other's emotions over online platforms is tiring, since many non-verbal modes of communication are rendered useless. We reasoned that this makes communication especially difficult for people with autism, alexithymia, or other developmental disorders, and that there is therefore a need for more accessible online meetings. This project addresses the issue by using Machine Learning to interpret those non-verbal cues!

High Level Objectives

  • Help individuals better discern emotions in online meetings, improving mental health and alleviating Zoom fatigue
  • Provide a flexible system that would work on multiple different platforms
  • Provide continuous, real-time feedback to the user

What it does

To address this opportunity, the project team built PELIOS, an ML-based application that relays the user's emotions and speech as plain text to the other participants!

How we built it

The project consists of a frontend built in Flutter and a backend in Python that uses Machine Learning libraries and the Google Cloud API. Flutter communicates the recognized emotions back to the user, while Python performs the facial-emotion recognition.
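The per-frame flow described above can be sketched in Python. This is a minimal, hypothetical illustration: the real backend uses ML libraries and the Google Cloud API to produce the emotion scores, so the classifier output here is a stubbed-in dictionary, and the function names (`dominant_emotion`, `feedback_message`) and the confidence threshold are our assumptions, not the project's actual API.

```python
# Hypothetical sketch of the backend's per-frame feedback loop.
# The real classifier (ML libraries / Google Cloud API) is stubbed out
# so the plain-text message formatting is runnable on its own.

def dominant_emotion(scores: dict[str, float]) -> str:
    """Pick the highest-scoring emotion label from classifier output."""
    return max(scores, key=scores.get)

def feedback_message(scores: dict[str, float], threshold: float = 0.4) -> str:
    """Turn raw emotion scores into the plain-text line shown to users
    in the Flutter frontend. Below the confidence threshold we report
    'neutral' rather than guessing."""
    label = dominant_emotion(scores)
    if scores[label] < threshold:
        return "Emotion: neutral"
    return f"Emotion: {label} ({scores[label]:.0%})"

if __name__ == "__main__":
    # Example classifier output for one video frame (made-up numbers).
    frame_scores = {"happy": 0.72, "sad": 0.05, "angry": 0.03, "neutral": 0.20}
    print(feedback_message(frame_scores))  # Emotion: happy (72%)
```

In a real deployment, a message like this would be sent to the Flutter frontend once per processed frame, giving the continuous feedback described above.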

Challenges we ran into

As first-year students working online, it was hard to communicate and distribute tasks among ourselves. We also had little programming experience, so learning the new languages and integrations needed to finish all of our product's features was a challenge.

Accomplishments that we're proud of

We learned a lot of new programming languages and integrations! It was really cool to learn how to connect various platforms to create a versatile final product.

What we learned

From working on this project, our team of first-year students learned:

  • Flutter
  • Python libraries, including ML libraries
  • Integrations between Flutter and Python
  • Google Cloud API integrations
  • UI/UX and accessibility-design research

What's next for Pelios

  • Adding a variety of other emotion recognizers, such as voice intonation, speech patterns, and sentence structure, to cross-check the facial expressions. All of these could be added via more ML libraries.
  • Expansion to other OSes/devices, such as Android, iOS, and Linux
  • UI/UX improvements and research to better reflect users' accessibility needs
  • Moving audio recording to C++ for faster performance
  • Training a custom model using Google's AutoML
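The first roadmap item, cross-checking facial expressions against other recognizers, amounts to fusing per-modality emotion scores into one result. A simple weighted average is one way to do that; the sketch below is a hypothetical illustration, and the modality names, labels, and weights are assumptions chosen for the example rather than anything from the project.

```python
# Hypothetical sketch of fusing several emotion recognizers (face,
# voice intonation, sentence structure) via a weighted average, as a
# possible approach to the cross-checking described in the roadmap.

def fuse_scores(modalities: dict[str, dict[str, float]],
                weights: dict[str, float]) -> dict[str, float]:
    """Combine per-modality emotion scores into one normalized score
    per emotion label, weighting each modality by how much we trust it."""
    total = sum(weights[name] for name in modalities)
    fused: dict[str, float] = {}
    for name, scores in modalities.items():
        w = weights[name] / total  # normalize so weights sum to 1
        for label, score in scores.items():
            fused[label] = fused.get(label, 0.0) + w * score
    return fused

if __name__ == "__main__":
    # Illustrative scores: the face and voice recognizers disagree.
    modalities = {
        "face":  {"happy": 0.8, "sad": 0.2},
        "voice": {"happy": 0.4, "sad": 0.6},
    }
    weights = {"face": 0.7, "voice": 0.3}  # trust the face model more
    fused = fuse_scores(modalities, weights)
    print(max(fused, key=fused.get))  # happy
```

Because the weights are normalized, recognizers can be added or dropped per frame (e.g. when the microphone is muted) without changing the rest of the pipeline.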
