predict360user

predict360user is a library that helps researchers reproduce and develop models that predict user behavior in 360 videos, namely trajectories (or trajects for short). It extends the Rondon 360-videos models/dataset collection and takes many design inspirations from the recommendation-systems framework RecBole. The supported datasets and models are listed below.

| dataset | users (u) | videos (v) | trajects (u*v) | integrated |
|---|---|---|---|---|
| Xu_PAMI_18 (paper) | 59 | 77 | 4,543 | yes |
| Xu_CVPR_18 (paper) | 34 | 209 | 7,106 | yes |
| Nguyen_MM_18 (paper) | 48 | 9 | 432 | yes |
| Fan_NOSSDAV_17 (paper) | 39 | 9 | 300 | yes |
| David_MMSys_18 (paper) | 57 | 19 | 1,083 | yes |
| total | | | 12,451 | yes |
| model | method | user input | integrated |
|---|---|---|---|
| pos_only (paper, code) | LSTM | position, saliency | yes |
| Xu_PAMI_18 (paper, code) | LSTM | position, saliency | no |
| Xu_CVPR_18 (paper) | LSTM | gaze, gaze RGB | yes |
| Nguyen_MM_18 (paper) | LSTM | position, tiles, saliency | no |
| Li_ChinaCom_18 (paper) | LSTM | tiles, saliency | no |
| Romero_PAMI_22 (paper, code) | LSTM | position, saliency | yes |
| DVMS_MMSYS_22 (paper, code) | LSTM | position, saliency | no |
| Chao_MMSP_21 (paper) | Transformer | position | no |
| Wu_AAAI_20 (paper) | SphericalCNN, RNN | position | no |
| Taghavi_NOSSDAV_20 (paper) | Clustering | position | no |
| Petrangeli_AIVR_18 (paper) | Spectral Clustering | position | no |

Requirements

The project requirements are in requirements.txt, which pins tensorflow 2.8. This tensorflow version requires cudatoolkit>=11.2 and cudnn=8.1.0. See below how to create a conda env with them. If your GPU supports a newer cudatoolkit version (check with nvidia-smi), install tensorflow and cuda accordingly (e.g. conda install tensorflow=2.15 cudatoolkit=12.2 for the latest).

conda create -n p3u python==3.9 -y
conda activate p3u
pip install -r requirements.txt
conda install -c conda-forge cudatoolkit=11.2 cudnn=8.1.0

To set up a WSL env, follow this tutorial and run sudo apt install cuda nvidia-cudnn.

Usage

The library's main functions are:

  • load_df_trajecs: returns user trajects in 360 videos in memory as a pandas.DataFrame. Each traject has: a dataset id; prefixed user and video ids; traces as a list of x,y,z points; actual entropy (actS); and a class name (actS_c) with labels low, medium, and high, selected using Jenks breaks over the actS value. See an example below:

    | ds | user | video | traces | actS | actS_c |
    |---|---|---|---|---|---|
    | david | david_0 | david_10_Cows | [[x,y,z],...] | 3.2 | medium |

  • BaseModel: trains and evaluates prediction models
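To illustrate the traject layout above, the snippet below mocks a one-row pandas.DataFrame with the same columns and filters it by entropy class. The values are made up and this is not the library's API, only a sketch of the data shape:

```python
import pandas as pd

# Mock of the traject layout described above (illustrative values only;
# real traces are long lists of x,y,z points on the unit sphere).
df = pd.DataFrame(
    [
        {
            "ds": "david",
            "user": "david_0",
            "video": "david_10_Cows",
            "traces": [[0.0, 0.0, 1.0], [0.1, 0.0, 0.99]],
            "actS": 3.2,
            "actS_c": "medium",
        }
    ]
)

# Typical pandas selection: keep only medium-entropy trajects.
medium = df[df["actS_c"] == "medium"]
print(medium[["ds", "user", "video", "actS", "actS_c"]])
```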

See notebooks in docs/ folder.
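The low/medium/high labels come from Jenks natural breaks over actS. The library's own implementation may differ; below is a minimal pure-Python Fisher-Jenks sketch of that classification, with a hypothetical `label_entropy` helper that is not part of the library:

```python
import bisect

def jenks_breaks(values, n_classes):
    """Fisher-Jenks natural breaks: an O(n^2 * k) dynamic program that
    minimizes the total within-class sum of squared deviations."""
    x = sorted(values)
    n = len(x)
    ps = [0.0] * (n + 1)   # prefix sums
    ps2 = [0.0] * (n + 1)  # prefix sums of squares
    for i, v in enumerate(x):
        ps[i + 1] = ps[i] + v
        ps2[i + 1] = ps2[i] + v * v

    def sse(i, j):  # sum of squared deviations of x[i..j], inclusive
        s, s2, m = ps[j + 1] - ps[i], ps2[j + 1] - ps2[i], j - i + 1
        return s2 - s * s / m

    INF = float("inf")
    # dp[k][j]: min total SSE splitting x[0..j] into k classes
    dp = [[INF] * n for _ in range(n_classes + 1)]
    cut = [[0] * n for _ in range(n_classes + 1)]
    for j in range(n):
        dp[1][j] = sse(0, j)
    for k in range(2, n_classes + 1):
        for j in range(k - 1, n):
            for i in range(k - 1, j + 1):
                c = dp[k - 1][i - 1] + sse(i, j)
                if c < dp[k][j]:
                    dp[k][j], cut[k][j] = c, i
    # Backtrack: collect the first value of every class except the lowest.
    breaks, j = [], n - 1
    for k in range(n_classes, 1, -1):
        i = cut[k][j]
        breaks.append(x[i])
        j = i - 1
    return sorted(breaks)

def label_entropy(act_s, breaks):
    # breaks = [b1, b2] -> act_s < b1: low, b1 <= act_s < b2: medium, else high
    return ["low", "medium", "high"][bisect.bisect_right(breaks, act_s)]

entropies = [0.1, 0.2, 0.15, 1.5, 1.6, 1.4, 3.0, 3.2, 3.1]
breaks = jenks_breaks(entropies, 3)
print(breaks, [label_entropy(e, breaks) for e in entropies])
```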

To illustrate usage, the command below trains and evaluates the pos_only model on the david dataset.

python -m predict360user.start_run dataset=david model=pos_only

Cite

If you use predict360user, please consider citing it as:

@misc{predict360user,
  author = {Guedes, Alan},
  title = {predict360user: library to predict user behavior in 360 videos},
  year = {2021},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/alanlivio/predict360user}}
}

A note on maintenance

This repository was born as part of the UK EPSRC SpheryStream project. Its maintenance is limited by the time and resources of a research project. Even though I would like to automate all 360 user-prediction models, I do not have the time to maintain the full body of automation that a well-maintained package deserves. Any help is very welcome. Here is a quick guide to interacting with this repository:

  • If you find a bug, please open an issue, and I will fix it as soon as possible.
  • If you want to request a new feature, please open an issue, and I will consider it as soon as possible.
  • If you want to contribute yourself, please open an issue first so we can discuss the objective and plan a proposal, then open a pull request to act on it.

If you would like to be involved further in the development of this repository, please get in touch with me directly: aguedes at ucl dot ac dot uk.
