update readme #433

Open · wants to merge 6 commits into base: `develop`
1 change: 1 addition & 0 deletions .circleci/config.yml
@@ -49,6 +49,7 @@ commands:
 command: |
   python3 -m venv venv
   . venv/bin/activate
+  pip install --upgrade pip
   pip install pytest pytest-cov black[jupyter]
   pip install -e .
 fork_test:
28 changes: 18 additions & 10 deletions docs/notebooks/pipelines/BrainLine/README.md
@@ -1,15 +1,23 @@
1. Install [Python 3.8](https://www.python.org/downloads/)
2. Install virtualenv: `pip install virtualenv`
3. Make a virtual environment: `<python path> -m virtualenv <path to new virtual environment>`
4. Activate the virtual environment, e.g. `source <path to new virtual environment>/bin/activate`
5. Clone the repository: `git clone https://github.com/neurodata/brainlit.git`
6. Install brainlit: `pip install brainlit` or `cd brainlit && pip install -e .`
7. Install [ilastik](https://www.ilastik.org/)
8. Set up Jupyter notebooks by installing either [VSCode](https://code.visualstudio.com/download) or [Jupyter Notebooks](https://jupyter.org/install)
9. Run `soma_analysis.ipynb` or `axon_analysis.ipynb`
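The steps above can be collected into a single shell function. This is a minimal sketch, not part of the original instructions: the environment path `~/brainlit-env` is a placeholder chosen for this example, and it uses the editable-install variant of the install step.

```shell
# Sketch of the local setup steps as one function.
# ~/brainlit-env is a placeholder path; substitute your own.
setup_brainlit() {
  venv_path="${1:-$HOME/brainlit-env}"
  pip install virtualenv                                # install virtualenv
  python3 -m virtualenv "$venv_path"                    # make the environment
  . "$venv_path/bin/activate"                           # activate it
  git clone https://github.com/neurodata/brainlit.git   # clone the repository
  cd brainlit && pip install -e .                       # editable install
}
```

ilastik and a notebook front end (VSCode or Jupyter) still need to be installed separately, as in the list above.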
TACC Tutorial
-------------

In a Jupyter session on Lonestar 6:

1. In home directory (`cdh`) make python 3.9 virtual environment: `python3 -m venv venv_39`.
2. Activate virtual environment: `source venv_39/bin/activate`.
3. Update pip: `pip install --upgrade pip`.
4. In scratch directory (`cds`), download ilastik (`wget https://files.ilastik.org/ilastik-1.4.0-Linux.tar.bz2`).
5. Decompress the ilastik file (`tar -xvf ilastik-1.4.0-Linux.tar.bz2`).
6. Clone brainlit repository (`git clone https://github.com/neurodata/brainlit.git`).
7. Install brainlit from source in editable mode: `cd brainlit && pip install -e .`.
8. Install packages I use for this tutorial: `pip install matplotlib-scalebar jupyter`.
9. Go back to the home directory (`cdh`) and copy the notebook I made: `cp /home1/09423/tathey1/brainline-tacc-tutorial.ipynb .`.
10. Create jupyter kernel for this virtual environment: `ipython kernel install --name "venv_39" --user`.
11. Open the jupyter notebook and select `venv_39` as the kernel.
12. As you run the notebook, you will need to change a couple variables including `brainlit_path` and `ilastik_path` according to your scratch directory path.
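The TACC steps above can likewise be sketched as one function. This assumes the Lonestar 6 aliases `cdh` and `cds` correspond to `cd $HOME` and `cd $SCRATCH` respectively; everything else is copied from the numbered steps.

```shell
# Sketch of the TACC setup steps; assumes $SCRATCH is set (as on TACC systems).
setup_tacc_brainlit() {
  cd "$HOME"                                                   # cdh
  python3 -m venv venv_39                                      # make venv
  . venv_39/bin/activate                                       # activate it
  pip install --upgrade pip
  cd "$SCRATCH"                                                # cds
  wget https://files.ilastik.org/ilastik-1.4.0-Linux.tar.bz2   # download ilastik
  tar -xvf ilastik-1.4.0-Linux.tar.bz2                         # decompress it
  git clone https://github.com/neurodata/brainlit.git
  cd brainlit && pip install -e .                              # editable install
  pip install matplotlib-scalebar jupyter                      # tutorial extras
  cd "$HOME"                                                   # cdh
  cp /home1/09423/tathey1/brainline-tacc-tutorial.ipynb .      # copy the notebook
  ipython kernel install --name "venv_39" --user               # register kernel
}
```

After running this, open the notebook, select `venv_39` as the kernel, and update `brainlit_path` and `ilastik_path` to match your scratch directory, as described in step 12.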


Atlas
-----

If you plan on using BrainLine analysis a lot (in particular, the napari coronal section views), I recommend you download the atlas from [here](https://neurodata.io/data/allen_atlas/).
2 changes: 1 addition & 1 deletion docs/notebooks/pipelines/BrainLine/soma_analysis.ipynb
@@ -552,7 +552,7 @@
   "metadata": {},
   "source": [
    "```\n",
-    "python -m cloudreg.scripts.registration -input_s3_path precomputed://s3://smartspim-precomputed-volumes/2022_02_02/8604/Ch_561 --output_s3_path precomputed://s3://smartspim-precomputed-volumes/2022_02_02/8604/atlas_to_target --atlas_s3_path https://open-neurodata.s3.amazonaws.com/ara_2016/sagittal_50um/average_50um --parcellation_s3_path https://open-neurodata.s3.amazonaws.com/ara_2016/sagittal_10um/annotation_10um_2017 --atlas_orientation PIR -orientation ARS --rotation 0 0 0 --translation 0 0 0 --fixed_scale .7 -log_s3_path precomputed://s3://smartspim-precomputed-volumes/2022_02_02/8604/atlas_to_target --missing_data_correction True --grid_correction False --bias_correction True --regularization 5000.0 --iterations 3000 --registration_resolution 100\n",
+    "python -m cloudreg.scripts.registration -input_s3_path precomputed://s3://smartspim-precomputed-volumes/2023_04_13/MS36/Ch_561 --output_s3_path precomputed://s3://smartspim-precomputed-volumes/2023_04_13/MS36/atlas_to_target --atlas_s3_path https://open-neurodata.s3.amazonaws.com/ara_2016/sagittal_50um/average_50um --parcellation_s3_path https://open-neurodata.s3.amazonaws.com/ara_2016/sagittal_10um/annotation_10um_2017 --atlas_orientation PIR -orientation RPI --rotation 0 0 0 --translation 0 0 0 --fixed_scale 1 -log_s3_path precomputed://s3://smartspim-precomputed-volumes/2023_04_13/MS36/atlas_to_target --missing_data_correction True --grid_correction False --bias_correction True --regularization 5000.0 --iterations 3000 --registration_resolution 100\n",
    "```"
   ]
  },
2 changes: 1 addition & 1 deletion netlify.toml
@@ -11,7 +11,7 @@
 publish = "docs/_build/html/"

 # Default build command.
-command = "bash ./.aws.sh; pip install -U pip setuptools; pip install -r docs/requirements.txt; pip install -e .; cd docs; make html; cd .."
+command = "bash ./.aws.sh; pip install --upgrade pip; pip install -U pip setuptools; pip install -r docs/requirements.txt; pip install -e .; cd docs; make html; cd .."
 # sudo apt-get update; sudo apt-get install python3.8 python3-pip;

 # Directory with the serverless Lambda functions to deploy to AWS.