SleepPoseNet: Multi-View Learning for Sleep Postural Transition Recognition Using UWB

This work is published in the IEEE Journal of Biomedical and Health Informatics.

Abstract

Recognizing the movements during sleep is crucial for the monitoring of patients with sleep disorders. However, the utilization of Ultra-Wideband (UWB) radar for the classification of human sleep postures has not been explored widely. This study investigates the performance of the off-the-shelf single antenna UWB in a novel application of sleep postural transition (SPT) recognition. The proposed Multi-View Learning, entitled SleepPoseNet or SPN, with time series data augmentation aims to classify four standard SPTs. SPN exhibits an ability to capture both time and frequency features, including the movement and direction of sleeping positions. The data recorded from 38 volunteers displayed that SPN with a mean accuracy of 73.7±0.8% significantly outperformed the mean accuracy of 59.9±0.7% obtained from a deep convolutional neural network (DCNN) in recent state-of-the-art work on human activity recognition using UWB. Apart from the UWB system, SPN with the data augmentation can ultimately be adopted to learn and classify time series data in various applications.

Data description

All data can be downloaded here.

Dataset I

There are 930 input samples in total.

  • Input signal: X[930,160,180]
  • Label: y[930]

Each input sample is a 16 s UWB radar recording stored in a 2D array of size 160×180, where the first axis is slow time and the second axis is range bin (fast time).
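A minimal loading sketch for Dataset I, assuming the download provides NumPy .npy files (the file names below are illustrative and may differ from the actual archive):

import numpy as np

X = np.load("X.npy")                 # (930, 160, 180): samples x slow time x range bins
y = np.load("y.npy")                 # (930,): transition label per sample
subjects = np.load("subjects.npy")   # (930,): participant id per sample

sample = X[0]                        # one 16 s recording: 160 slow-time frames x 180 range bins
print(sample.shape)                  # (160, 180)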

There are six possible labels for each input sample.

  • 0 - Supine to Left
  • 1 - Supine to Right
  • 2 - Supine to Prone
  • 3 - Left to Supine
  • 4 - Right to Supine
  • 5 - Prone to Supine

In the paper, left and right are both treated as side, so there are 4 classes in total (see the merging sketch below).
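A sketch of this 6-to-4 class merge; the mapping is inferred from the label listing above, not taken from the released code, and the assumed file name follows the loading sketch for Dataset I:

import numpy as np

y = np.load("y.npy")   # digit labels for Dataset I (assumed file name)

# Left and Right are both treated as "Side", collapsing 6 labels into 4 classes.
six_to_four = {
    0: 0,  # Supine to Left   -> Supine to Side
    1: 0,  # Supine to Right  -> Supine to Side
    2: 1,  # Supine to Prone
    3: 2,  # Left to Supine   -> Side to Supine
    4: 2,  # Right to Supine  -> Side to Supine
    5: 3,  # Prone to Supine
}
y4 = np.array([six_to_four[int(label)] for label in y])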

File descriptions:

  • X = wall-placed radar data
  • y = label as a digit
  • subjects = subject or participant label

Dataset II

  • Session1 = without a moving object
  • Session2 = with a moving object

Each sample contains only one postural transition (the change of a posture). The dimensions are #samples × slow time × range bins (fast time) = #samples × 160 × 180. Each range bin represents approximately 5 cm, so 180 range bins ≈ 900 cm. Each slow-time index represents 0.1 s, so 160 slow-time indices = 16 s.
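The index-to-physical-unit conversion implied above, as a small helper sketch (constants taken from the description; the range resolution is approximate):

RANGE_BIN_CM = 5.0     # approximate distance covered by one range bin
FRAME_PERIOD_S = 0.1   # one slow-time frame at the 10 Hz frame rate

def range_bin_to_cm(bin_index):
    """Approximate distance (cm) of a range bin from the radar."""
    return bin_index * RANGE_BIN_CM

def slow_time_to_seconds(frame_index):
    """Elapsed time (s) at a given slow-time index."""
    return frame_index * FRAME_PERIOD_S

print(range_bin_to_cm(180))         # 900.0 cm
print(slow_time_to_seconds(160))    # 16.0 s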

Label data (y_digit)

  • 0 = Supine to side
  • 1 = Side to prone
  • 2 = Prone to side
  • 3 = Side to supine
  • 4 = Supine to prone
  • 5 = Prone to supine
  • 6 = Background

In the paper, we used only classes 0, 3, 4, 5, and 6, so there are 5 classes in total.
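A sketch of this class selection and re-indexing (the file names follow the file descriptions below, and the 0..4 re-mapping order is an assumption for illustration):

import numpy as np

y_digit = np.load("y_digit.npy")    # assumed file name, see file descriptions below
X_wall = np.load("X_wall.npy")

keep = np.isin(y_digit, [0, 3, 4, 5, 6])        # drop classes 1 and 2
remap = {0: 0, 3: 1, 4: 2, 5: 3, 6: 4}          # re-index the kept classes to 0..4
X_kept = X_wall[keep]
y_kept = np.array([remap[int(label)] for label in y_digit[keep]])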

File descriptions:

  • X_wall = wall-placed radar data
  • X_ceiling = ceiling-placed radar data
  • y_digit = label as a digit
  • y_str = label as a string
  • subjects = subject or participant label
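A loading sketch for both radar views of Dataset II with a subject-wise split, so that test participants are never seen during training (file names are assumptions based on the descriptions above):

import numpy as np

X_wall = np.load("X_wall.npy")          # wall-placed radar view
X_ceiling = np.load("X_ceiling.npy")    # ceiling-placed radar view
y_digit = np.load("y_digit.npy")
subjects = np.load("subjects.npy")

held_out = np.unique(subjects)[:5]      # e.g. hold out the first 5 participants
test_mask = np.isin(subjects, held_out)
X_wall_train, X_wall_test = X_wall[~test_mask], X_wall[test_mask]
X_ceil_train, X_ceil_test = X_ceiling[~test_mask], X_ceiling[test_mask]
y_train, y_test = y_digit[~test_mask], y_digit[test_mask]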

Fig. 1. The experiment room, UWB radar, and camera positions.

Device

In this paper, we use the XeThru X4M03 with a 10 Hz frame rate.

Fig. 2 XeThru X4M03

Any enquiries can be sent to maytusp@gmail.com.

Citation

When using any part of this dataset, please cite our paper:

@article{piriyajitakonkij2020sleepposenet,
  title={SleepPoseNet: Multi-view learning for sleep postural transition recognition using UWB},
  author={Piriyajitakonkij, Maytus and Warin, Patchanon and Lakhan, Payongkit and Leelaarporn, Pitshaporn and Kumchaiseemak, Nakorn and Suwajanakorn, Supasorn and Pianpanit, Theerasarn and Niparnan, Nattee and Mukhopadhyay, Subhas Chandra and Wilaiprasitporn, Theerawit},
  journal={IEEE Journal of Biomedical and Health Informatics},
  year={2020},
  publisher={IEEE}
}