OpenScope Credit Assignment Project Stimulus Code

This repository contains the code needed to reproduce the stimuli used in the Credit Assignment project, an Allen Institute for Brain Science OpenScope project.  

The Credit Assignment experiment was conceptualized by Joel Zylberberg (York University), Blake Richards (McGill University), Timothy Lillicrap (DeepMind) and Yoshua Bengio (Mila), and the stimuli were coded by Colleen Gillon.

The dataset for which these stimuli were used is described in Gillon, Lecoq et al., 2023, Sci Data. Analyses and results are published in Gillon, Pina et al., 2024, J Neurosci.  

NOTE: If you are looking to design your own stimuli for use in the OpenScope data collection pipeline, please refer to the Deployment in the OpenScope pipeline section, below.

Installation

Dependencies:

  • Windows OS (see Camstim package)
  • python 2.7
  • psychopy 1.82.01
  • camstim 0.2.4  

Camstim 0.2.4:

  • Built and licensed by the Allen Institute.
  • Written in Python 2 and designed for Windows OS (requires pywin32).
  • Pickled stimulus presentation logs are typically saved under user/camstim/output.  

Installation with Anaconda or Miniconda:

  1. Navigate to the repository and install the conda environment.
    conda env create -f cred_assign_stimuli.yml
  2. Activate the environment.
    conda activate cred_assign_stimuli
  3. Install the Allen Institute's camstim package in the environment.
    pip install camstim/.
  4. Revert the version of the pillow package (ignore the incompatibility warning for camstim).
    pip install pillow==2.9.0
  5. Download and install AVbin for your OS.
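
Once the environment is set up, a quick smoke test along these lines (an illustrative check only, verifying that the packages import, not that stimuli present correctly) can confirm the installation:

    python -c "import psychopy, camstim"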
     

Run

View an optical physiology session presentation (70 min), with the same parameters as used in the Credit Assignment project, by running:

python run_generate_stimuli.py

To exit the presentation at any time, press Ctrl-C or Esc.
 

Example uses of arguments:

--test_run            ->  abridged 2 min example of an optical physiology session presentation.
--test_hab            ->  abridged 22 sec example of a habituation session presentation.
--hab_duration 10     ->  10 min habituation session.


--seed 101            ->  seeds the random processes generating the stimuli to allow reproduction, e.g. with a seed value of 101.
--ca_seeds 0          ->  reproduces the stimuli presented in the first (0th) Credit Assignment session.


--fullscreen          ->  produces the presentation in fullscreen mode.
Note that the same stimulus seed used with different presentation window sizes produces different stimuli. Do not use this option if your aim is to reproduce a specific Credit Assignment session's stimuli, unless your screen is the same size (1920 x 1200 pixels).
--reproduce           ->  checks that the presentation window size is correct for reproducing the Credit Assignment experiment, and raises an error if it is not.
--warp                ->  warps the stimuli on the screen, as was done during the experiment to simulate a spherical screen on a flat screen.


--save_frames         ->  instead of presenting the stimuli, saves each new frame as an image, and produces a frame list file (see Notes on saving frames, below).
--save_directory your_directory  ->  main directory to which frames are saved, e.g. your_directory.
--save_extension png  ->  format in which to save frames as images, e.g. png.
--save_from_frame 100 ->  frame at which to start saving frames, e.g. 100 (if omitted, saving starts from the beginning).
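
For example, assuming the arguments combine as in standard command-line usage (the directory name your_frames is illustrative):

    python run_generate_stimuli.py --ca_seeds 0 --fullscreen --warp --reproduce
    python run_generate_stimuli.py --test_run --save_frames --save_directory your_frames --save_extension jpg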
 

Notes

Helper scripts under cred_assign_stims:

  • generate_stimuli.py: Generates stimuli and either projects them or saves them.
  • cred_assign_stims.py: Defines the classes used to build stimuli and to enable stimulus frames to be saved.
  • stimulus_params.py: Initializes stimuli and their parameters.
     

Saving frames:

  • The process saves each new frame as an image, along with frame_list.txt, which lists the frame image displayed at each frame throughout the entire presentation.
  • Frame saving is very slow during the Bricks stimuli (up to 10x slower), as every frame changes and must therefore be saved individually.
  • To partially compensate for the lag induced when saving frames, stimuli are not drawn to the presentation window - it remains gray.
    NOTE: This does not apply when using the warping effect, which must be drawn to apply to the saved frames.
  • File format considerations:
    • tif: fastest, lossless, produces very large files
    • jpg: slower, lossy, produces much smaller files
    • png: slowest, lossless, produces smallest files
  • For instructions on assembling the saved frames into a movie using ffmpeg, see example_videos; a rough sketch is also given below.
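
As a rough sketch only (example_videos has the authoritative instructions; the frame rate, file naming, and glob support are assumptions here, and frames repeated in frame_list.txt would need to be duplicated first for frame-accurate timing):

    ffmpeg -framerate 60 -pattern_type glob -i 'your_directory/*.png' -pix_fmt yuv420p presentation.mp4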
     

Known bugs:

  • A non-fullscreen presentation window may appear cropped, not showing the full frame image. The saved frames, however, do reflect the full frame image.
  • Lags (i.e., dropped frames) may occur during the stimulus presentation if sufficient compute resources are not available.
    NOTE: When saving frames, the saved frames and frame_list.txt will not reflect any lags.
  • On rare occasions, stimuli fail to be drawn to occupy the full presentation window, e.g. corner quadrants remain gray. Typically, this occurs if the presentation window is minimized during the presentation or frame saving. If this occurs, it is best to restart the recording.
     

Warnings/messages printed to console which can be ignored:

  • Brightness/contrast not set.
  • Git commit error.
  • Import warnings (e.g., Movie2 stim, TextBox stim).
  • TextBox Font Manager warning.
  • Monitor specification warning.  

Experimental design

 
During each session, subjects were presented with two stimulus types, in random order:

1. Sparse Gabor sequences:

  • Adapted from Homann et al., 2022, PNAS.
  • Each sequence lasted 1.5 sec and cycled through the frames: A, B, C, D, grayscreen (G).
  • For each presentation session, new positions and sizes were sampled for the 30 Gabor patches in each frame (A, B, C, and D).
  • Within a presentation session, at each sequence repetition, the orientation of each Gabor patch was resampled around the sequence mean orientation, itself sampled from 0, 45, 90 or 135 degrees (see the sketch after this list).
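
A minimal sketch of this resampling logic (not the project code; the Gaussian distribution and its spread are assumptions for illustration):

    import numpy as np

    rng = np.random.RandomState(101)  # e.g. as with --seed 101

    N_PATCHES = 30                 # Gabor patches per frame (A, B, C, D)
    MEAN_ORIS = [0, 45, 90, 135]   # candidate sequence mean orientations (deg)
    ORI_SPREAD = 10.0              # illustrative spread (assumption)

    # one mean orientation governs a block of sequence repetitions
    seq_mean = rng.choice(MEAN_ORIS)

    def resample_orientations(mean, n=N_PATCHES):
        # at each repetition, every patch orientation is redrawn around the mean
        return rng.normal(loc=mean, scale=ORI_SPREAD, size=n)

    frame_A_oris = resample_orientations(seq_mean)  # likewise for B, C and D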

2. Visual flow squares:

  • Randomly positioned squares moved right for one half of the stimulus presentation, and left for the other (direction order was random).
  • All squares moved at the same speed.
  • The visual flow squares are referred to as "Bricks" in the code.
     

Habituation sessions:

  • Lasted 10-60 min, increasing by 10 min with each successive session (6 sessions in total).
  • Presentation time was equally split between the two stimulus types, presented in random order.
     

Optical physiology sessions:

  • Lasted 70 min.
  • Presentation time was equally split between the two stimulus types, presented in random order.
  • Unexpected sequences or "surprises" were introduced, occurring around 5-7% of the time.

1. Sparse Gabor unexpected sequences:

  • Unexpected sequences lasted 3-6 sec (2-4 consecutive sequences), and occurred every 30-90 sec.
  • During unexpected sequences, the D frames were replaced with U frames.
  • Each session's U frame Gabor patches had distinct locations and sizes from the session's D frame Gabor patches.
  • U frame Gabor patch orientations were sampled around the sequence mean + 90 degrees, rather than around the sequence mean, leaving them about 90 degrees off from the rest of the sequence in which they appeared (see the sketch after this list).
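
A minimal sketch of how such surprises might be scheduled, and of the U frame orientation shift (illustrative only; the uniform sampling of gaps and durations is an assumption):

    import numpy as np

    SEQ_LEN = 1.5  # sec per Gabor sequence (A-B-C-D-G)

    def schedule_surprises(total_sec, rng):
        # sample surprise onsets (every 30-90 sec) and durations
        # (2-4 consecutive sequences, i.e. 3-6 sec)
        surprises, t = [], 0.0
        while True:
            t += rng.uniform(30, 90)           # gap until the next surprise
            dur = SEQ_LEN * rng.randint(2, 5)  # 2, 3 or 4 sequences
            if t + dur > total_sec:
                return surprises
            surprises.append((t, dur))
            t += dur

    def u_frame_mean(seq_mean):
        # during a surprise, D frames become U frames, with patch
        # orientations sampled around the sequence mean + 90 degrees
        return (seq_mean + 90) % 180

    print(schedule_surprises(70 * 60, np.random.RandomState(101)))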

2. Visual flow squares:

  • Unexpected visual flow lasted 2-4 sec, and occurred every 30-90 sec.
  • During unexpected visual flow, 25% of the squares moved in the direction opposite to the main flow (see the sketch after this list).
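
A minimal sketch of the unexpected-flow logic (illustrative; the actual Bricks implementation may assign directions differently):

    import numpy as np

    def square_directions(n_squares, main_dir, surprise, rng):
        # main_dir is +1 (rightward) or -1 (leftward); during unexpected
        # flow, 25% of the squares move opposite to the main direction
        dirs = np.full(n_squares, main_dir)
        if surprise:
            flipped = rng.choice(n_squares, size=n_squares // 4, replace=False)
            dirs[flipped] = -main_dir
        return dirs

    print(square_directions(8, 1, True, np.random.RandomState(101)))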
     

Deployment in the OpenScope pipeline

If you are looking to design your own stimuli for use in the OpenScope data collection pipeline, please note that the scripts in this repository are not the exact scripts that were deployed in the pipeline. They have been modified so that users could conveniently reproduce, view and save our project stimuli.

The actual scripts deployed for data collection in the OpenScope pipeline can be found in the cred_assign_stimuli_deployed repository, under the production_v1 commit tag.

Note that the differences, when compared to the code in this repository, are fairly minor. Mainly, for deployment:

  • All modules were integrated into a single script for each habituation day, and a single script for all the recording days.
  • The parameters which could change from day to day (e.g., stimulus block durations) were hard-coded at the top of each day's script.
  • No __main__ entry point or command-line argument parsing was implemented.
  • The frame saving option was not built in, as the scripts were intended for live use only.

Code

Code and documentation (excluding camstim) built by Colleen Gillon (colleen dot gillon at mail dot utoronto dot ca).

Citations

To cite the dataset paper:

@Article{GillonLecoq2023,
  title={Responses of pyramidal cell somata and apical dendrites in mouse visual cortex over multiple days},
  author={Gillon, Colleen J. and Lecoq, J{\'e}r{\^o}me A. and Pina, Jason E. and Ahmed, Ruweida and Billeh, Yazan and Caldejon, Shiella and Groblewski, Peter and Henley, Timothy M. and Kato, India and Lee, Eric and Luviano, Jennifer and Mace, Kyla and Nayan, Chelsea and Nguyen, Thuyanh and North, Kat and Perkins, Jed and Seid, Sam and Valley, Matthew T. and Williford, Ali and Bengio, Yoshua and Lillicrap, Timothy P. and Zylberberg, Joel and Richards, Blake A.},
  journal={Scientific Data},
  year={2023},
  date={May 2023},
  publisher={Springer Nature},
  volume={10},
  number={1},
  pages={287},
  issn={2052-4463},
  doi={10.1038/s41597-023-02214-y},
  url={https://www.nature.com/articles/s41597-023-02214-y},
}

To cite the analysis paper:

@Article{GillonPina2024,
  title={Responses to pattern-violating visual stimuli evolve differently over days in somata and distal apical dendrites},
  author={Gillon, Colleen J. and Pina, Jason E. and Lecoq, J{\'e}r{\^o}me A. and Ahmed, Ruweida and Billeh, Yazan and Caldejon, Shiella and Groblewski, Peter and Henley, Timothy M. and Kato, India and Lee, Eric and Luviano, Jennifer and Mace, Kyla and Nayan, Chelsea and Nguyen, Thuyanh and North, Kat and Perkins, Jed and Seid, Sam and Valley, Matthew T. and Williford, Ali and Bengio, Yoshua and Lillicrap, Timothy P. and Richards, Blake A. and Zylberberg, Joel},
  journal={Journal of Neuroscience},
  year={2024},
  date={Jan 2024},
  publisher={Society for Neuroscience},
  volume={44},
  number={5},
  pages={1-22},
  issn={0270-6474},
  doi={10.1523/JNEUROSCI.1009-23.2023},
  url={https://www.jneurosci.org/content/44/5/e1009232023},
}
