# AC - Physiological Signals

| Paper | Conference | Remarks |
| --- | --- | --- |
| Toward Machine Emotional Intelligence: Analysis of Affective Physiological State | IEEE TPAMI 2001 | 1. Develops a machine's ability to recognize human affective state given four physiological signals. 2. Proposes new features and algorithms and compares their performance on the day-to-day variation issue (features of different emotions on the same day cluster more closely than features of the same emotion on different days). |
| DEAP: A Database for Emotion Analysis using Physiological Signals | IEEE TAC 2012 | 1. Presents a multimodal dataset for the analysis of human affective states, comprising EEG and peripheral physiological signals of 32 participants. 2. An extensive analysis of the participants' ratings during the experiment is presented, and correlates between EEG signal frequencies and the participants' ratings are investigated. 3. Methods and results are presented for single-trial classification of arousal, valence, and like/dislike ratings using the modalities of EEG, peripheral physiological signals, and multimedia content analysis. |
| A Multimodal Database for Affect Recognition and Implicit Tagging | IEEE TAC 2012 | 1. Presents a multimodal database recorded in response to affective stimuli for emotion recognition and implicit tagging research. 2. A multimodal setup was arranged for synchronized recording of face videos, audio signals, eye gaze data, and peripheral/central nervous system physiological signals. |
| A Review on the Computational Methods for Emotional State Estimation from the Human EEG | Computational and Mathematical Methods in Medicine 2013 | 1. Reviews the computational methods that have been developed to deduce EEG indices of emotion, to extract emotion-related features, or to classify EEG signals into one of many emotional states. 2. Proposes using sequential Bayesian inference to estimate the continuous emotional state in real time. 3. Arousal, valence, dominance, and predictability, as well as emotional keywords, are recorded. |
| EEG Databases for Emotion Recognition | ICCW 2013 | 1. Presents two affective EEG databases. 2. The EEG data are rated by the participants with arousal, valence, and dominance levels, and the correlation between the power of different EEG bands and the affective ratings is studied. 3. Using a Fractal Dimension feature in combination with statistical and Higher Order Crossings (HOC) features yields the best accuracy. |
| Emotional state classification from EEG data using machine learning approach | Neurocomputing 2014 | 1. Systematically compares three kinds of existing EEG features for emotion classification, introduces an efficient feature smoothing method for removing noise unrelated to the emotion task, and proposes a simple approach to tracking the trajectory of emotion changes with manifold learning. 2. Designs a movie induction experiment that spontaneously leads subjects to real emotional states and collects an EEG data set of six subjects. 3. Results show that (a) the power spectrum feature is superior to the other two kinds of features; (b) a linear-dynamic-system-based feature smoothing method can significantly improve emotion classification accuracy (a simple smoothing sketch follows the table); and (c) the trajectory of emotion changes can be visualized by reducing subject-independent features with manifold learning. |
| Feature Extraction and Selection for Emotion Recognition from EEG | IEEE TAC 2014 | 1. Reviews feature extraction methods for emotion recognition from EEG based on 33 studies (a band-power/DE feature sketch follows the table). 2. Results are presented with respect to the performance of different feature selection methods, the usage of selected feature types, and the selection of electrode locations. |
| EEG artifact removal—state-of-the-art and guidelines | Journal of Neural Engineering 2015 | Presents an extensive review of the artifact removal algorithms used to remove the main sources of interference encountered in the electroencephalogram (EEG), specifically ocular, muscular, and cardiac artifacts. |
| Investigating Critical Frequency Bands and Channels for EEG-Based Emotion Recognition with Deep Neural Networks | IEEE Transactions on Autonomous Mental Development 2015 | 1. Introduces deep belief networks (DBNs) for constructing EEG-based emotion recognition models. 2. The critical frequency bands and channels determined from the weights of trained DBNs are consistent with existing observations. 3. Experimental results show that neural signatures associated with different emotions do exist and that they share commonality across sessions and individuals. |
| Identifying Stable Patterns over Time for Emotion Recognition from EEG | IEEE TAC 2016 | 1. Investigates stable patterns of electroencephalogram (EEG) over time for emotion recognition using a machine learning approach. 2. Systematically evaluates the performance of various popular feature extraction, feature selection, feature smoothing, and pattern classification methods on the DEAP dataset and a newly developed dataset called SEED. 3. The experimental results indicate that stable patterns exhibit consistency across sessions. |
| Learning Representations from EEG with Deep Recurrent-Convolutional Neural Networks | ICLR 2016 | 1. Proposes a novel approach for learning representations from multi-channel EEG time series by transforming them into multi-spectral images. 2. The approach is designed to preserve the spatial, spectral, and temporal structure of EEG, which leads to features that are less sensitive to variations and distortions within each dimension. |
| DREAMER: A Database for Emotion Recognition Through EEG and ECG Signals From Wireless Low-cost Off-the-Shelf Devices | IEEE JBHI 2017 | 1. Presents DREAMER, a multimodal database consisting of electroencephalogram (EEG) and electrocardiogram (ECG) signals recorded during affect elicitation by means of audio-visual stimuli. 2. A baseline for participant-wise affect recognition using EEG- and ECG-based features, as well as their fusion, was established through supervised classification experiments using support vector machines (SVMs); an SVM baseline sketch follows the table. |
| Review and Classification of Emotion Recognition Based on EEG Brain-Computer Interface System Research: A Systematic Review | Applied Sciences 2017 | 1. Reviews published articles on emotion detection, recognition, and classification using EEG. 2. The articles were classified based on a scheme consisting of two categories: research orientation and domains/applications. |
| EEG-Based Emotion Recognition via Fast and Robust Feature Smoothing | ICBI 2017 | 1. Proposes a feature smoothing method to alleviate the noise and high-dimensionality problems of EEG signals. 2. Extracts six statistical features from raw EEG signals and applies a simple yet cost-effective feature smoothing method to improve recognition accuracy. |
| Emotions Recognition Using EEG Signals: A Survey | IEEE TAC 2018 | 1. Presents a survey of the neurophysiological research performed from 2009 to 2016, providing a comprehensive overview of existing work on emotion recognition using EEG signals. 2. Focuses the analysis on the main aspects involved in the recognition process (e.g., subjects, features extracted, classifiers) and compares the works along these aspects. 3. Proposes a set of good-practice recommendations that researchers should follow to achieve reproducible, replicable, well-validated, and high-quality results. |
| A Review of Emotion Recognition Using Physiological Signals | Sensors 2018 | Presents a comprehensive review of physiological-signal-based emotion recognition, including emotion models, emotion elicitation methods, published emotional physiological datasets, features, classifiers, and the whole framework for emotion recognition based on physiological signals. |
| Deep Learning for Human Affect Recognition: Insights and New Developments | arXiv 2018 | 1. Reviews the literature on human affect recognition between 2010 and 2017, with a special focus on approaches using deep neural networks. 2. Finds that deep learning is used for learning (i) spatial feature representations, (ii) temporal feature representations, and (iii) joint feature representations for multimodal sensor data. |
| Real-Time Movie-Induced Discrete Emotion Recognition from EEG Signals | IEEE TAC 2018 | 1. Proposes a real-time movie-induced emotion recognition system for identifying an individual's emotional states through the analysis of brain waves. 2. Results demonstrate an advantage over existing state-of-the-art real-time EEG emotion recognition systems in terms of classification accuracy and the ability to recognise similar discrete emotions that are close in the valence-arousal space. |
| The Harvard Automated Processing Pipeline for Electroencephalography (HAPPE): Standardized Processing Software for Developmental and High-Artifact Data | Frontiers in Neuroscience 2018 | 1. Proposes the Harvard Automated Processing Pipeline for EEG (HAPPE) as a standardized, automated pipeline compatible with EEG recordings of variable lengths and artifact contamination levels. 2. HAPPE processes event-related and resting-state EEG data from raw files through a series of filtering, artifact rejection, and re-referencing steps to produce processed EEG suitable for time-frequency-domain analyses. HAPPE also includes a post-processing report of data quality metrics to facilitate the evaluation and reporting of data quality in a standardized manner. |
| Exploring EEG Features in Cross-Subject Emotion Recognition | Frontiers in Neuroscience 2018 | 1. Provides a more comprehensive investigation of the poor generalizability of features, covering a wider range of feature types, including 18 kinds of linear and non-linear EEG features. 2. Explores the importance of different EEG features in cross-subject emotion recognition from multiple perspectives, including different channels, brain regions, rhythms, and feature types. |
| EEG Emotion Recognition Using Dynamical Graph Convolutional Neural Networks | IEEE TAC 2018 | 1. Proposes a multichannel EEG emotion recognition method based on a novel dynamical graph convolutional neural network (DGCNN). 2. The proposed DGCNN method dynamically learns the intrinsic relationship between different electroencephalogram (EEG) channels, represented by an adjacency matrix, via training a neural network, thereby enabling more discriminative EEG feature extraction (a simplified learnable-adjacency sketch follows the table). |
| A Bi-hemisphere Domain Adversarial Neural Network Model for EEG Emotion Recognition | IEEE TAC 2018 | 1. Proposes a novel neural network model, the bi-hemisphere domain adversarial neural network (BiDANN), for electroencephalograph (EEG) emotion recognition. 2. The model contains one global and two local domain discriminators that work adversarially with a classifier to learn discriminative emotional features for each hemisphere. 3. Proposes an improved version, BiDANN-S, for the subject-independent EEG emotion recognition problem, which reduces the influence of subjects' personal information on recognition. |
| Cascade and Parallel Convolutional Recurrent Neural Networks on EEG-Based Intention Recognition for Brain Computer Interface | AAAI 2018 | Introduces both cascade and parallel convolutional recurrent neural network models for precisely identifying human intended movements and instructions by effectively learning the compositional spatio-temporal representations of raw EEG streams. |
| A Review on Nonlinear Methods Using Electroencephalographic Recordings for Emotion Recognition | IEEE TAC 2019 | 1. Summarizes the most recent works that have applied nonlinear methods to EEG signal analysis for emotion recognition. 2. Identifies some nonlinear indices that have not yet been employed in this research area. |
| Deep learning for electroencephalogram (EEG) classification tasks: a review | Journal of Neural Engineering 2019 | 1. A systematic review of the literature on deep learning applications to EEG classification, addressing the following questions: (1) Which EEG classification tasks have been explored with deep learning? (2) What input formulations have been used for training the deep networks? (3) Are there specific deep learning network structures suitable for specific types of tasks? 2. The tasks that used deep learning fell into six general groups: emotion recognition, motor imagery, mental workload, seizure detection, event-related potential detection, and sleep scoring. For each type of task, the specific input formulation, major characteristics, and end classifier recommendations found through the review are described. |
| Domain Adaptation Techniques for EEG-Based Emotion Recognition: A Comparative Study on Two Public Datasets | IEEE Transactions on Cognitive and Developmental Systems 2019 | 1. Conducts a comparative study of several state-of-the-art domain adaptation techniques on two datasets: DEAP and SEED. 2. Demonstrates that domain adaptation techniques can improve classification accuracy on both datasets, but are less effective on DEAP than on SEED. 3. Explores the efficacy of domain adaptation in a cross-dataset setting, where the data are collected under different environments using different devices and experimental protocols. 4. Proposes to apply domain adaptation to reduce inter-subject variance as well as technical discrepancies between datasets, and then train a subject-independent classifier on one dataset and test on the other. |
| Spatial–Temporal Recurrent Neural Network for Emotion Recognition | IEEE Transactions on Cybernetics 2019 | 1. Proposes a multidirectional recurrent neural network (RNN) layer to capture long-range contextual cues by traversing the spatial regions of each temporal slice along different directions; a bi-directional temporal RNN layer is further used to learn discriminative features characterizing the temporal dependencies of the sequences. 2. Imposes sparse projection onto the hidden states of the spatial and temporal domains to improve the model's discriminative ability. 3. Experimental results on public EEG and facial expression emotion datasets demonstrate that the proposed STRNN method is more competitive than state-of-the-art methods. |
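
Many of the entries above (e.g., the Neurocomputing 2014, IEEE TAC 2014/2016, and TAMD 2015 papers) rely on band-power or differential-entropy (DE) features computed per frequency band. The snippet below is a minimal sketch of both feature types, assuming a NumPy/SciPy environment and a synthetic single-channel signal; the band limits, filter order, and window length are illustrative choices, not taken from any specific paper.

```python
# Sketch of band-power and differential-entropy (DE) feature extraction.
# Band limits and window lengths are illustrative, not from any paper.
import numpy as np
from scipy.signal import welch, butter, filtfilt

FS = 128  # sampling rate in Hz (DEAP's preprocessed data is 128 Hz)
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def band_powers(x, fs=FS):
    """Average PSD within each band, estimated with Welch's method."""
    freqs, psd = welch(x, fs=fs, nperseg=fs * 2)
    return {name: psd[(freqs >= lo) & (freqs < hi)].mean()
            for name, (lo, hi) in BANDS.items()}

def differential_entropy(x, fs=FS):
    """DE per band under a Gaussian assumption: 0.5 * log(2*pi*e*var)."""
    de = {}
    for name, (lo, hi) in BANDS.items():
        b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
        filtered = filtfilt(b, a, x)
        de[name] = 0.5 * np.log(2 * np.pi * np.e * np.var(filtered))
    return de

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    eeg = rng.standard_normal(FS * 60)  # 60 s of synthetic "EEG"
    print(band_powers(eeg))
    print(differential_entropy(eeg))
```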
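
Several entries (Neurocomputing 2014, ICBI 2017) report that smoothing per-window features over time improves accuracy, on the assumption that emotional state changes slowly relative to the analysis windows. Those papers use a linear-dynamic-system (Kalman-style) smoother; the sketch below shows only the simpler moving-average variant as an illustration, with `window` as an assumed hyperparameter.

```python
# Sketch of temporal feature smoothing: emotion is assumed to change slowly,
# so per-window features are smoothed over time before classification.
# This is a moving-average stand-in for the LDS smoother used in the papers.
import numpy as np

def smooth_features(features, window=5):
    """Causal moving average over a (n_windows, n_features) feature sequence."""
    features = np.asarray(features, dtype=float)
    smoothed = np.empty_like(features)
    for t in range(len(features)):
        start = max(0, t - window + 1)
        smoothed[t] = features[start:t + 1].mean(axis=0)
    return smoothed
```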
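
The DEAP and DREAMER entries establish SVM baselines on per-trial features with binary high/low valence or arousal labels. The sketch below is a hypothetical scikit-learn stand-in with random placeholder features and labels, not the authors' protocol; the trial and feature counts are illustrative only.

```python
# Sketch of an SVM baseline on per-trial feature vectors with k-fold
# cross-validation. Features and labels are random placeholders.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.standard_normal((40, 32 * 4))  # 40 trials x (32 channels x 4 band powers)
y = rng.integers(0, 2, size=40)        # binary high/low arousal labels

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, y, cv=5)
print("mean accuracy:", scores.mean())
```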
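
The DGCNN entry (IEEE TAC 2018) learns the adjacency matrix between EEG channels jointly with the classifier. The PyTorch layer below is a simplified single-layer sketch of that idea, not the authors' Chebyshev-polynomial implementation; channel, feature, and class counts are illustrative.

```python
# Sketch of a graph convolution with a learnable adjacency over EEG channels,
# simplified from the DGCNN idea (learn channel relationships with the model).
import torch
import torch.nn as nn
import torch.nn.functional as F

class LearnableGraphConv(nn.Module):
    def __init__(self, n_channels=62, in_feats=5, out_feats=32, n_classes=3):
        super().__init__()
        # learnable adjacency over EEG channels, symmetrized and kept non-negative
        self.adj = nn.Parameter(torch.rand(n_channels, n_channels))
        self.lin = nn.Linear(in_feats, out_feats)
        self.cls = nn.Linear(n_channels * out_feats, n_classes)

    def forward(self, x):                    # x: (batch, channels, feats)
        a = F.relu(self.adj + self.adj.t())  # symmetric, non-negative adjacency
        deg = a.sum(dim=1, keepdim=True).clamp(min=1e-6)
        a_norm = a / deg                     # row-normalize
        h = F.relu(a_norm @ self.lin(x))     # propagate features over the graph
        return self.cls(h.flatten(start_dim=1))

model = LearnableGraphConv()
logits = model(torch.randn(8, 62, 5))        # 8 trials, 62 channels, 5 DE features
print(logits.shape)                          # torch.Size([8, 3])
```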

Back to index