Object Detection toolkit based on PaddlePaddle. It supports object detection, instance segmentation, multiple object tracking and real-time multi-person keypoint detection.
Human Activity Recognition example using TensorFlow on smartphone sensors dataset and an LSTM RNN. Classifying the type of movement amongst six activity categories - Guillaume Chevalier
Convolutional Neural Network for Human Activity Recognition in TensorFlow
Python implementation of KNN and DTW classification algorithm
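None of the repositories above publish this exact code; as a minimal sketch of the KNN + DTW approach, the example below implements the classic dynamic-programming DTW distance and a 1-nearest-neighbour classifier on top of it (function names are illustrative, not taken from any listed repo):

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic Time Warping distance between two 1-D sequences."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])          # local cost
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

def knn_dtw_classify(query, train_series, train_labels):
    """1-NN classification: label of the DTW-closest training series."""
    dists = [dtw_distance(query, s) for s in train_series]
    return train_labels[int(np.argmin(dists))]
```

Because DTW aligns sequences elastically in time, 1-NN with DTW is a strong baseline for sensor time series even without feature engineering, at the cost of O(nm) per distance computation.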
Real-Time Spatio-Temporally Localized Activity Detection by Tracking Body Keypoints
MotionSense Dataset for Human Activity and Attribute Recognition (time-series data generated by the smartphone's accelerometer and gyroscope sensors) (PMC Journal) (IoTDI'19)
Using deep stacked residual bidirectional LSTM cells (RNN) with TensorFlow, we do Human Activity Recognition (HAR), classifying the type of movement amongst 6 or 18 categories on 2 different datasets.
[IJCAI-21] "Time-Series Representation Learning via Temporal and Contextual Contrasting"
Abnormal Human Behavior Detection / Road Accident Detection from Surveillance Videos / Real-World Anomaly Detection in Surveillance Videos / C3D Feature Extraction
Classifying the physical activities performed by a user based on accelerometer and gyroscope sensor data collected by a smartphone in the user’s pocket. The activities to be classified are: Standing, Sitting, StairsUp, StairsDown, Walking and Cycling.
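A common first step in this kind of smartphone-sensor classification, shared by several of the repos above, is sliding-window segmentation with simple per-channel statistics. The sketch below is illustrative only (window and step sizes are assumptions, not taken from any listed project):

```python
import numpy as np

def window_features(signal, win=128, step=64):
    """Slide a fixed-size window over (T, channels) sensor data and
    compute mean, std, and peak magnitude per channel per window."""
    feats = []
    for start in range(0, len(signal) - win + 1, step):
        w = signal[start:start + win]
        feats.append(np.concatenate([w.mean(axis=0),       # per-channel mean
                                     w.std(axis=0),        # per-channel std
                                     np.abs(w).max(axis=0)]))  # peak magnitude
    return np.stack(feats)  # shape: (num_windows, 3 * channels)
```

The resulting feature matrix can feed any standard classifier (k-NN, random forest, SVM); deep models such as the LSTMs listed above instead consume the raw windows directly.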
Unity's privacy-preserving human-centric synthetic data generator
Implementation of Action Recognition using 3D Convnet on UCF-101 dataset.
Recognizing human activities using Deep Learning
Official GitHub page of the best-paper award publication "Improving Deep Learning for HAR with shallow LSTMs" presented at the International Symposium on Wearable Computers '21 (ISWC '21)
An up-to-date, curated list of awesome IMU-based Human Activity Recognition (Ubiquitous Computing) papers, methods and resources. Note that most of the collected research is based on IMU data.
This repository provides the codes and data used in our paper "Human Activity Recognition Based on Wearable Sensor Data: A Standardization of the State-of-the-Art", where we implement and evaluate several state-of-the-art approaches, ranging from handcrafted-based methods to convolutional neural networks.
Human Activity Recognition using Channel State Information
Use an LSTM network to predict human activities from sensor signals collected from a smartphone
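Several entries above rely on LSTMs for sequence modelling. As a minimal, framework-free sketch of what one recurrent step computes (the gate layout below is the standard formulation; the weight shapes and names are assumptions, not code from any listed repo):

```python
import numpy as np

def lstm_step(x, h, c, W, U, b):
    """One LSTM cell step over input x with hidden state h and cell state c.
    W: (4H, D) input weights, U: (4H, H) recurrent weights, b: (4H,) bias,
    with the four gates stacked in the order input, forget, candidate, output."""
    z = W @ x + U @ h + b                 # stacked gate pre-activations
    H = h.size
    i = 1.0 / (1.0 + np.exp(-z[:H]))      # input gate
    f = 1.0 / (1.0 + np.exp(-z[H:2*H]))   # forget gate
    g = np.tanh(z[2*H:3*H])               # candidate cell state
    o = 1.0 / (1.0 + np.exp(-z[3*H:]))    # output gate
    c_new = f * c + i * g                 # update cell state
    h_new = o * np.tanh(c_new)            # emit hidden state
    return h_new, c_new
```

In the HAR setting, x is one timestep of a sensor window (e.g. 6 accelerometer/gyroscope channels), the step is iterated over the window, and the final h feeds a softmax over the activity classes.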
Multi Person Skeleton Based Action Recognition and Tracking
Human Activity Recognition based on WiFi Channel State Information