Merge pull request #379 from cosanlab/ejolly/dev
Entire code base reformat with black. Added github workflow for ci. Drop Python 2 artifacts.
ejolly committed Mar 25, 2021
2 parents e2f78a0 + 86534f3 commit def76a0
Showing 43 changed files with 5,249 additions and 3,437 deletions.
85 changes: 85 additions & 0 deletions .github/workflows/deploy_docs_pypi_onrelease.yml
@@ -0,0 +1,85 @@
name: Deploy Docs and PyPI

on: release

jobs:
  # Job (1): Build and deploy docs.
  docs:
    if: "!contains(github.event.head_commit.message, 'skip ci')"
    name: Build & deploy docs
    runs-on: ubuntu-latest
    steps:
      - name: Checkout Code
        uses: actions/checkout@v2

      - name: Setup Python
        uses: actions/setup-python@v2
        with:
          python-version: "3.8"

      - name: Upgrade pip
        run: |
          # install pip>=20.1 to use "pip cache dir"
          python3 -m pip install --upgrade pip

      - name: Setup pip-cache
        id: pip-cache
        run: echo "::set-output name=dir::$(pip cache dir)"

      - name: Cache deps
        uses: actions/cache@v2
        with:
          path: ${{ steps.pip-cache.outputs.dir }}
          key: ${{ runner.os }}-pip-${{ hashFiles('**/requirements.txt') }}
          restore-keys: |
            ${{ runner.os }}-pip-

      - name: Install deps
        run: |
          python3 -m pip install . -r requirements.txt
          python3 -m pip install . -r requirements-dev.txt
          python3 -m pip install . -r optional-dependencies.txt

      - name: Build docs
        run: |
          cd docs
          make clean
          make html
          touch _build/html/.nojekyll

      - name: Deploy docs
        if: success()
        uses: crazy-max/ghaction-github-pages@v2
        with:
          target_branch: gh-pages
          build_dir: docs/_build/html
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

  # Job (2): Build package and upload to pypi
  deploy:
    if: "!contains(github.event.head_commit.message, 'skip ci')"
    name: Build & deploy package
    runs-on: ubuntu-latest
    needs: docs
    steps:
      - name: Checkout Code
        uses: actions/checkout@v2

      - name: Setup Python
        uses: actions/setup-python@v2
        with:
          python-version: "3.8"

      - name: Pypa build
        run: |
          python3 -m pip install build --user

      - name: Wheel and source build
        run: |
          python3 -m build --sdist --wheel --outdir dist/

      - name: Publish to PyPI
        uses: pypa/gh-action-pypi-publish@master
        with:
          password: ${{ secrets.PYPI_API_TOKEN }}
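Both jobs above run `on: release`, so publishing any GitHub release kicks off the docs deploy followed by the PyPI upload. For example, with the GitHub CLI (the tag and notes are illustrative, not from this commit):

```bash
# Publishing a release fires the workflow above; v0.4.2 is a hypothetical tag.
gh release create v0.4.2 --title "0.4.2" --notes "Black reformat; new CI"
```

The hunks that follow edit the existing tests workflow (presumably `.github/workflows/tests_and_coverage.yml`, given the badge path in the README changes below):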
@@ -1,4 +1,4 @@
-name: nltools
+name: Tests and Coverage

on:
  push:
@@ -32,30 +32,30 @@ jobs:
        python-version: [3.7, 3.8]
        # By default everything should pass for the workflow to pass
        ok-fail: [false]
-        include:
-          # Rather than include 3.9 in the python versions, do it here so we can ignore failures on mac and windows with 3.9 (they have install issues)
-          - os: ubuntu-latest
-            python-version: 3.9
-            ok-fail: false
-          - os: macos-latest
-            python-version: 3.9
-            ok-fail: true
-          - os: windows-latest
-            python-version: 3.9
-            ok-fail: true
+        # include:
+        #   # Rather than include 3.9 in the python versions, do it here so we can ignore failures on mac and windows with 3.9 (they have install issues)
+        #   - os: ubuntu-latest
+        #     python-version: 3.9
+        #     ok-fail: false
+        #   - os: macos-latest
+        #     python-version: 3.9
+        #     ok-fail: true
+        #   - os: windows-latest
+        #     python-version: 3.9
+        #     ok-fail: true
    steps:
-      # 1. Step up miniconda
+      # Step up miniconda
      - name: Download and setup Miniconda
        uses: conda-incubator/setup-miniconda@v2
        with:
          miniconda-version: "latest"
          python-version: ${{ matrix.python-version }}

-      # 2. Check out latest code on github
+      # Check out latest code on github
      - name: Checkout Code
        uses: actions/checkout@v2

-      # 3. Install common sci-py packages via conda as well as testing packages and requirements
+      # Install common sci-py packages via conda as well as testing packages and requirements
      # TODO: unpin pandas version when deepdish adds support for 1.2: https://github.com/uchicago-cs/deepdish/issues/45
      - name: Install Dependencies
        run: |
@@ -71,14 +71,14 @@ jobs:
        run: |
          black nltools --check --diff
-      # 4. Actually run the tests with coverage
+      # Actually run the tests with coverage
      - name: Run Tests
        run: |
          conda activate test
          conda env list
          coverage run --source=nltools -m pytest -rs -n auto
-      # 5. Send coverage to coveralls.io but waiting on parallelization to finish
+      # Send coverage to coveralls.io but waiting on parallelization to finish
      # Not using the official github action in the marketplace to upload because it requires a .lcov file, which pytest doesn't generate. It's just easier to use the coveralls python library which does the same thing, but works with pytest.
      - name: Upload Coverage
        # The coveralls python package has some 422 server issues with uploads from github-actions so try both service providers, for more see:
@@ -105,10 +105,10 @@ jobs:
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
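The body of the Upload Coverage step is collapsed in the hunk above; per the surrounding comments it presumably installs and invokes the `coveralls` Python CLI rather than a marketplace action. A sketch under that assumption, not the exact hidden lines:

```yaml
- name: Upload Coverage
  run: |
    pip install coveralls
    coveralls --service=github   # the coveralls python CLI; see the 422-workaround comment above
  env:
    GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
```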

-  # Job (3): Build and deploy docs
+  # Job (3): Build docs, but don't deploy. This is effectively another layer of testing because of our sphinx-gallery auto-examples
  docs:
    if: "!contains(github.event.head_commit.message, 'skip ci')"
-    name: Build & deploy docs
+    name: Build docs and auto-examples
    runs-on: ubuntu-latest
    steps:
      - name: Checkout Code
@@ -147,21 +147,3 @@ jobs:
          cd docs
          make clean
          make html
-      # - name: Deploy docs
-      #   uses: peaceiris/actions-gh-pages@v3
-      #   with:
-      #     github_token: ${{ secrets.GITHUB_TOKEN }}
-      #     publish_dir: ./site
-
-  # Job (4): Build package and upload to conda/pypi
-  # deploy:
-  #   if: "!contains(github.event.head_commit.message, 'skip ci')"
-  #   name: Build & deploy package
-  #   runs-on: ubuntu-latest
-  #   needs: test
-  #   steps:
-  #     - name: Say Hi
-  #       shell: bash
-  #       run: |
-  #         echo "hello world. I havent been configured for package deployment yet!"
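The `ok-fail` matrix flag near the top of this file only matters if a job or step consumes it via `continue-on-error`; a minimal sketch of that pattern (the job name and layout are assumed, since the relevant hunk isn't shown):

```yaml
test:
  runs-on: ${{ matrix.os }}
  # Flagged OS/Python combos may fail without failing the whole workflow
  continue-on-error: ${{ matrix.ok-fail }}
  strategy:
    fail-fast: false
    matrix:
      os: [ubuntu-latest, macos-latest, windows-latest]
      python-version: [3.7, 3.8]
      ok-fail: [false]
```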
4 changes: 3 additions & 1 deletion .gitignore
@@ -11,7 +11,7 @@ dist/
.cache/
htmlcov
.pytest_cache/*
dev/
dev/
# Logs and databases #
######################
*.log
@@ -46,3 +46,5 @@ htmlcov/
#####
.tox
.tox/*

.pytest_cache
36 changes: 0 additions & 36 deletions .travis.yml

This file was deleted.

6 changes: 6 additions & 0 deletions .vscode/extensions.json
@@ -0,0 +1,6 @@
{
  "recommendations": [
    "kevinrose.vsc-python-indent",
    "njpwerner.autodocstring"
  ]
}
11 changes: 11 additions & 0 deletions .vscode/settings.json
@@ -0,0 +1,11 @@
{
  "editor.formatOnSave": true,
  "python.testing.pytestEnabled": true,
  "python.testing.unittestEnabled": false,
  "python.testing.nosetestsEnabled": false,
  "python.testing.pytestArgs": [
    "nltools"
  ],
  "python.testing.autoTestDiscoverOnSaveEnabled": true,
  "editor.insertSpaces": true
}
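`editor.formatOnSave` only runs black if the VS Code Python extension is told which formatter to use; a companion setting along these lines is assumed (it is not part of this commit):

```json
{
  "python.formatting.provider": "black"
}
```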
55 changes: 16 additions & 39 deletions README.md
@@ -1,62 +1,39 @@
[![Package versioning](https://img.shields.io/pypi/v/nltools.svg)](https://pypi.org/project/nltools/)
-[![Build Status](https://api.travis-ci.org/cosanlab/nltools.png)](https://travis-ci.org/cosanlab/nltools/)
+[![Build Status](https://github.com/cosanlab/nltools/workflows/tests_and_coverage/badge.svg)]
[![codecov](https://codecov.io/gh/cosanlab/nltools/branch/master/graph/badge.svg)](https://codecov.io/gh/cosanlab/nltools)
[![Codacy Badge](https://api.codacy.com/project/badge/Grade/625677967a0749299f38c2bf8ee269c3)](https://www.codacy.com/app/ljchang/nltools?utm_source=github.com&utm_medium=referral&utm_content=ljchang/nltools&utm_campaign=Badge_Grade)
-[![Documentation Status](https://readthedocs.org/projects/neurolearn/badge/?version=latest)](http://neurolearn.readthedocs.io/en/latest/?badge=latest)
+[![Documentation Status](https://github.com/cosanlab/nltools/workflows/deploy_docs_pypi_onrelease/badge.svg)
[![DOI](https://zenodo.org/badge/DOI/10.5281/zenodo.2229813.svg)](https://doi.org/10.5281/zenodo.2229813)
+![Python Versions](https://img.shields.io/badge/python-3.7%20%7C%203.8-blue)
+![Platforms](https://img.shields.io/badge/platform-linux%20%7C%20osx%20%7C%20win-blue)


# NLTools
-Python toolbox for analyzing neuroimaging data. It is particularly useful for conducting multivariate analyses. It is originally based on Tor Wager's object oriented matlab [canlab core tools](http://wagerlab.colorado.edu/tools) and relies heavily on [nilearn](http://nilearn.github.io) and [scikit learn](http://scikit-learn.org/stable/index.html). Nltools is compatible with Python 3.6+. Python 2.7 was only supported through 0.3.11. We will no longer be supporting Python2 starting with version 0.3.12.
+Python toolbox for analyzing neuroimaging data. It is particularly useful for conducting multivariate analyses. It is originally based on Tor Wager's object oriented matlab [canlab core tools](http://wagerlab.colorado.edu/tools) and relies heavily on [nilearn](http://nilearn.github.io) and [scikit learn](http://scikit-learn.org/stable/index.html). Nltools is only compatible with Python 3.7+.

-### Installation
-1. Method 1
+## Documentation
+
+Documentation and tutorials are available at https://nltools.org
+
+## Installation
+1. Method 1 (stable)

```
pip install nltools
```

-2. Method 2 (Recommended)
+2. Method 2 (bleeding edge)

```
pip install git+https://github.com/cosanlab/nltools
```

-3. Method 3
+3. Method 3 (for development)

```
git clone https://github.com/cosanlab/nltools
python setup.py install
```
or
```
-pip install -e 'path_to_github_directory'
+pip install -e nltools
```

-### Dependencies
-nltools requires several dependencies. All are available in pypi. Can use `pip install 'package'`
-- nibabel>=2.0.1
-- scikit-learn>=0.19.1
-- nilearn>=0.4
-- pandas>=0.20
-- numpy>=1.9
-- seaborn>=0.7.0
-- matplotlib>=2.1
-- scipy
-- six
-- pynv
-- joblib
-
-### Optional Dependencies
-- mne
-- requests
-- networkx
-- ipywidgets >=5.2.2
-
-### Documentation
-Current Documentation can be found at [readthedocs](http://neurolearn.readthedocs.org/en/latest).
-
-Please see our [tutorials](http://neurolearn.readthedocs.io/en/latest/auto_examples/index.html), which provide numerous examples for how to use the toolbox.
-
-### Preprocessing
-Please see our [cosanlab_preproc](https://github.com/cosanlab/cosanlab_preproc) library for nipype pipelines to perform preprocessing on neuroimaging data.
+## Preprocessing
+Nltools has minimal routines for pre-processing data. For more complete pre-processing pipelines please see our [cosanlab_preproc](https://github.com/cosanlab/cosanlab_preproc) library built with `nipype`.
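The rewritten README defers usage documentation to https://nltools.org; for orientation, a minimal sketch of the core `Brain_Data` class (the file path is illustrative):

```python
from nltools.data import Brain_Data

# Load a 4D NIfTI file into a Brain_Data object (path is illustrative)
dat = Brain_Data("sub-01_betas.nii.gz")
print(dat.shape())      # (n_images, n_voxels) after masking
mean_img = dat.mean()   # average over images; returns another Brain_Data
```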
1 change: 0 additions & 1 deletion docs/install.rst
@@ -35,7 +35,6 @@ nltools requires several dependencies. All are available in pypi. Can use *pip
- seaborn>=0.7.0
- matplotlib>=2.2.0
- scipy
-- six
- pynv
- joblib
- deepdish>=0.3.6
34 changes: 15 additions & 19 deletions nltools/__init__.py
@@ -1,25 +1,21 @@
-from __future__ import absolute_import
-
-__all__ = ['data',
-           'datasets',
-           'analysis',
-           'cross_validation',
-           'plotting',
-           'stats',
-           'utils',
-           'file_reader',
-           'mask',
-           'prefs',
-           'external',
-           '__version__']
+__all__ = [
+    "data",
+    "datasets",
+    "analysis",
+    "cross_validation",
+    "plotting",
+    "stats",
+    "utils",
+    "file_reader",
+    "mask",
+    "prefs",
+    "external",
+    "__version__",
+]

from .analysis import Roc
from .cross_validation import set_cv
-from .data import (Brain_Data,
-                   Adjacency,
-                   Groupby,
-                   Design_Matrix,
-                   Design_Matrix_Series)
+from .data import Brain_Data, Adjacency, Groupby, Design_Matrix, Design_Matrix_Series
from .simulator import Simulator
from .prefs import MNI_Template, resolve_mni_path
from .version import __version__
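The re-exports above let users import the main classes straight from the top-level package; a minimal sketch:

```python
import nltools
from nltools import Brain_Data, Design_Matrix  # re-exported in __init__.py

print(nltools.__version__)
```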
