OpenPoseR

An R package that provides functions for analyzing motion-tracking data derived from video files using OpenPose.

The original motivation for creating this package was to control video stimuli in sign language and gesture research, but the provided functionality may also be useful for other purposes.

What is this?

OpenPoseR can be used to analyze motion-tracking data derived from video files using OpenPose. In other words, OpenPoseR does not provide any motion-tracking capabilities itself: you first need to install and run OpenPose on your system to perform the actual motion tracking. The OpenPoseR package then provides a variety of R functions for analyzing the output generated by OpenPose.
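To give a rough idea of what that output looks like, the following sketch (not part of OpenPoseR itself) parses a toy, OpenPose-style JSON fragment in plain R. OpenPose writes one JSON file per video frame, in which each detected person has a flat `pose_keypoints_2d` vector of (x, y, confidence) triplets; the use of the `jsonlite` package and the toy values here are assumptions for illustration.

```r
library(jsonlite)

# Toy stand-in for one frame of OpenPose output: each person carries a flat
# vector "pose_keypoints_2d" of (x, y, confidence) triplets.
frame_json <- '{
  "version": 1.3,
  "people": [
    { "pose_keypoints_2d": [320.5, 120.2, 0.91, 318.0, 180.6, 0.88, 260.4, 182.1, 0.85] }
  ]
}'

frame <- fromJSON(frame_json)

# Reshape the flat keypoint vector into an n-by-3 matrix: one row per keypoint.
keypoints <- matrix(frame$people$pose_keypoints_2d[[1]],
                    ncol = 3, byrow = TRUE,
                    dimnames = list(NULL, c("x", "y", "confidence")))
keypoints
```

In real data the keypoint vector is much longer (e.g., 25 body keypoints in OpenPose's BODY_25 model), but the reshaping logic is the same.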

OpenPose is currently among the most sophisticated tools for tracking people in video clips. The resulting motion-tracking data can be used for further quantitative analysis, allowing for the quantification of movement parameters. This is relevant to researchers working on sign language and gesture, where the bodily movements of a person take on linguistic and/or discourse functions.

[Example video | Example video with fitted body-pose model]

Example video of the German Sign Language (Deutsche Gebärdensprache, DGS) sign for "psychology" courtesy of Henrike Maria Falke, gebaerdenlernen.de (Creative Commons by-nc-sa/3.0/de).

What's it for?

The main reason for developing OpenPoseR was to create a state-of-the-art means of controlling for the bodily motion occurring in different video clips that show a human being either signing or gesturing, captured from the front (see the frame from the example video of the German Sign Language sign PSYCHOLOGY with fitted body-pose model above).

By quantifying the gross bodily movement of the person in a particular clip from this perspective, it becomes possible to compare different clips (of the same person) that, for example, represent different conditions in an experiment, and to determine their similarity and/or differences. This method may therefore be useful for quantitative stimulus control in sign language and gesture research.
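The general idea behind such quantification can be sketched in a few lines of plain R. This is an illustration of the principle only, not OpenPoseR's actual implementation: fake random-walk coordinates stand in for tracked keypoints, and frame-to-frame Euclidean displacement is summed over keypoints to yield one movement value per frame transition.

```r
# Illustration only (not OpenPoseR's implementation): quantify gross bodily
# movement from tracked keypoints. Assume a matrix with one row per video
# frame and one (x, y) column pair per keypoint.
set.seed(1)
n_frames <- 100
n_keypoints <- 5

# Fake tracking data: random-walk coordinates, frames x (2 * keypoints).
coords <- apply(matrix(rnorm(n_frames * n_keypoints * 2, sd = 2),
                       nrow = n_frames), 2, cumsum)

# Frame-to-frame displacement of each keypoint (Euclidean distance), summed
# over keypoints to yield one movement value per frame transition.
dx <- diff(coords[, seq(1, ncol(coords), by = 2)])  # x columns
dy <- diff(coords[, seq(2, ncol(coords), by = 2)])  # y columns
movement <- rowSums(sqrt(dx^2 + dy^2))

# Clips (e.g., two experimental conditions) can then be compared via summary
# statistics of their movement profiles.
summary(movement)
```

Comparing such movement profiles across clips of the same person is what makes this approach useful for stimulus control.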


OpenPoseR analysis of the data from the above clip.

It must be pointed out that this method cannot, and was not intended to, detect the subtle differences in hand configuration, path movement, and non-manual components that are linguistically meaningful and essential to sign languages. The situation may be somewhat less dim when it comes to gesture, but similar reservations apply.

For details on what OpenPoseR can (and cannot) do and how to use it, please see this demo. The demo, including examples, can be downloaded here (.zip file).

Installation

For now, OpenPoseR (current version: 1.1.0) can be installed using the following commands (you will need to have the devtools package installed):

# Install devtools from CRAN (if not already installed)
install.packages("devtools")

# Install OpenPoseR package from GitHub
devtools::install_github("trettenbrein/OpenPoseR")

Creators

The OpenPoseR package was created at the Max Planck Institute for Human Cognitive and Brain Sciences by Patrick C. Trettenbrein in collaboration with Emiliano Zaccarella under the supervision of Angela D. Friederici.

If you have found a bug, please report it here. In case you have any questions, criticism, or suggestions that do not belong into the bug tracker, please e-mail Patrick at trettenbrein@cbs.mpg.de.

Citation

If you use OpenPoseR in your own work, please reference the following paper in your manuscript(s):

Trettenbrein, P. C., & Zaccarella, E. (2021). Controlling video stimuli in sign language and gesture research: The OpenPoseR package for analyzing OpenPose motion tracking data in R. Frontiers in Psychology, 12, 628728. https://doi.org/10.3389/fpsyg.2021.628728

License

The code of this project is free for use, re-use, and modification by anyone without any liability or warranty under the conditions of the GNU General Public License v3.0.