Merge pull request #797 from yarikoptic/enh-codespell
codespell: workflow, config, some typos fixed
adelavega committed Dec 6, 2023
2 parents 7a61f13 + d45f5c8 commit 413e8ff
Showing 15 changed files with 45 additions and 19 deletions.
4 changes: 4 additions & 0 deletions .codespellrc
@@ -0,0 +1,4 @@
[codespell]
skip = .git,*.pdf,*.svg,*.min.js,xtk.js,*.min.map,*.css,*.pkl,papaya.js,ahba_data
ignore-regex = \b(FWE|TE)\b
# ignore-words-list =
22 changes: 22 additions & 0 deletions .github/workflows/codespell.yml
@@ -0,0 +1,22 @@
---
name: Codespell

on:
push:
branches: [master]
pull_request:
branches: [master]

permissions:
contents: read

jobs:
codespell:
name: Check for spelling errors
runs-on: ubuntu-latest

steps:
- name: Checkout
uses: actions/checkout@v3
- name: Codespell
uses: codespell-project/actions-codespell@v2
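The config and workflow above are all the setup this check needs, and the same scan CI performs can be reproduced locally. A minimal sketch, assuming codespell is installed from PyPI and that the installed version reads `.codespellrc` from the working directory:
```
pip install codespell
# run from the repository root; codespell picks up the [codespell] section of .codespellrc
codespell
```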
2 changes: 1 addition & 1 deletion README.md
@@ -34,7 +34,7 @@ After making changes to the code you need to restart the server (but just the uw
```
docker-compose restart nginx django worker
```
### Reseting the server
### Resetting the server
If you would like to reset the server and clean the database:
```
docker-compose stop
2 changes: 1 addition & 1 deletion neurovault/CELERY_README.md
@@ -1,6 +1,6 @@
# Setting Up Celery for NeuroVault

Celery requires a task server. They recommend rabbitmq (and I tested this for the virtual machine) however django (a database) can also be used (but it's not recommended). The other option is redis, but they don't recommend because it is "suseptible to data loss in event of power failures" This is really just like a task database or queue. I was going to do rabbit, but Gabriel said do redis, so I will do redis. Celery is python based so we install from pip.
Celery requires a task server. They recommend rabbitmq (and I tested this for the virtual machine) however django (a database) can also be used (but it's not recommended). The other option is redis, but they don't recommend because it is "susceptible to data loss in event of power failures" This is really just like a task database or queue. I was going to do rabbit, but Gabriel said do redis, so I will do redis. Celery is python based so we install from pip.

source /opt/nv_env/bin/activate
pip install -U celery[redis]
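The broker choice described above comes down to starting redis and pointing a Celery worker at the project's app. A minimal sketch, assuming redis runs locally and that the Celery app is defined in a module named `neurovault` (that module name is an illustrative assumption, not taken from this diff):
```
# assumes a local redis server and a Celery app defined in the `neurovault` module
redis-server --daemonize yes
celery -A neurovault worker --loglevel=info
```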
2 changes: 1 addition & 1 deletion neurovault/apps/main/templates/FAQ.html
@@ -49,7 +49,7 @@ <h4>2. Why would I submit anything?</h4>
decoded
using <a href="http://neurosynth.org">neurosynth.org</a>.
</p>
<h4 id="persistance">3. How do I know my data is safe and will be
<h4 id="persistence">3. How do I know my data is safe and will be
available
for the years to come?</h4>
<p> All of NeuroVault is backed up daily into off site storage.
2 changes: 1 addition & 1 deletion neurovault/apps/statmaps/migrations/0001_initial.py
@@ -71,7 +71,7 @@ class Migration(migrations.Migration):
('file', models.FileField(max_length=500, storage=neurovault.apps.statmaps.storage.DoubleExtensionStorage(), upload_to=neurovault.apps.statmaps.models.upload_img_to, verbose_name='File with the unthresholded volume map (.img, .nii, .nii.gz)')),
('surface_left_file', models.FileField(blank=True, null=True, storage=neurovault.apps.statmaps.storage.DoubleExtensionStorage(), upload_to=neurovault.apps.statmaps.models.upload_img_to, verbose_name='File with the unthresholded LEFT hemisphere fsaverage surface map (.mgh, .curv, .gii)')),
('surface_right_file', models.FileField(blank=True, null=True, storage=neurovault.apps.statmaps.storage.DoubleExtensionStorage(), upload_to=neurovault.apps.statmaps.models.upload_img_to, verbose_name='File with the unthresholded RIGHT hemisphere fsaverage surface map (.mgh, .curv, .gii)')),
('data_origin', models.CharField(blank=True, choices=[('volume', 'volume'), ('surface', 'surface')], default='volume', help_text='Was this map originaly derived from volume or surface?', max_length=200, null=True, verbose_name='Data origin')),
('data_origin', models.CharField(blank=True, choices=[('volume', 'volume'), ('surface', 'surface')], default='volume', help_text='Was this map originally derived from volume or surface?', max_length=200, null=True, verbose_name='Data origin')),
('target_template_image', models.CharField(choices=[('GenericMNI', 'Human (Generic/Unknown MNI)'), ('Dorr2008', 'Mouse (Dorr 2008 space)'), ('NMT', 'Rhesus - macacca mulatta (NMT)'), ('MNI152NLin2009cAsym', 'Human (MNI152 NLin 2009c Asym)')], default='GenericMNI', help_text='Name of target template image', max_length=200, verbose_name='Target template image')),
('subject_species', models.CharField(blank=True, default='homo sapiens', max_length=200, null=True)),
('figure', models.CharField(blank=True, help_text='Which figure in the corresponding paper was this map displayed in?', max_length=200, null=True, verbose_name='Corresponding figure')),
4 changes: 2 additions & 2 deletions neurovault/apps/statmaps/models.py
@@ -1000,7 +1000,7 @@ class Image(BaseCollectionItem):
verbose_name="File with the unthresholded RIGHT hemisphere fsaverage surface map (.mgh, .curv, .gii)",
)
data_origin = models.CharField(
help_text=("Was this map originaly derived from volume or surface?"),
help_text=("Was this map originally derived from volume or surface?"),
verbose_name="Data origin",
default="volume",
max_length=200,
@@ -1156,7 +1156,7 @@ def create(
niftiFile = File(f)
image.file.save(my_file_name, niftiFile)

# If a .img file was loaded then load the correspoding .hdr file as well
# If a .img file was loaded then load the corresponding .hdr file as well
_, ext = os.path.splitext(my_file_name)
print(ext)
if ext in [".img"]:
2 changes: 1 addition & 1 deletion neurovault/apps/statmaps/templates/pycortex/dataview.html
@@ -185,7 +185,7 @@
<div class='opt_category'>
<fieldset id="viewopt_fieldset" class='subtable'>
<legend>View</legend>
<div class='display_opt' title="Enable to show slice planes. Not available in flat view. (Hotkeys: Q/W move saggital plane, A/S move coronal plane, Z/X move axial plane)">
<div class='display_opt' title="Enable to show slice planes. Not available in flat view. (Hotkeys: Q/W move sagittal plane, A/S move coronal plane, Z/X move axial plane)">
<input id='volvis' type='checkbox'><label for='volvis'>Volume slices visible</label>
</div>
<div class='display_opt' title="Enable to show left hemisphere surface">
@@ -99,7 +99,7 @@ <h5>Regions</h5>
<table id="image-regions-datatable" class="table table-striped table-sm w-100">
<thead>
<tr>
<th>Intesity</th>
<th>Intensity</th>
<th>Name</th>
</tr>
</thead>
@@ -35,7 +35,7 @@
<h2>My metanalyses</h2>
<p>Here you can create, activate, and finalize (run inference on) your metaanalyses.
When you will have an active metaanalysis an "Add to the active metaanalysis" button
become present on comaptible (group level T or Z) maps.</p>
become present on compatible (group level T or Z) maps.</p>
<div class="table-responsive-md">
<table id="collections-table" class="table table-striped table-sm table-hover">
<thead>
@@ -200,19 +200,19 @@ <h2>{{ image.name }}</h2>
</button>
<div class="dropdown-menu" aria-labelledby="btnGroupDropAnalysis">
{% if neurosynth_compatible %}
<a class="dropdown-item" href="http://neurosynth.org/decode/?neurovault={{ api_cid }}" data-toggle="tooltip" title="Cognitive decoding using coordiante based data collected from thousands of papers.">
<a class="dropdown-item" href="http://neurosynth.org/decode/?neurovault={{ api_cid }}" data-toggle="tooltip" title="Cognitive decoding using coordinate based data collected from thousands of papers.">
Cognitive decoding (neurosynth)
</a>
{% endif %}
{% if comparison_is_possible %}
<a class="dropdown-item" href="{% url 'statmaps:find_similar' image.id %}" data-toggle="tooltip" title="Find maps with similar patters.">
<a class="dropdown-item" href="{% url 'statmaps:find_similar' image.id %}" data-toggle="tooltip" title="Find maps with similar patterns.">
Similar maps search
</a>
{% else %}
<a class="dropdown-item disabled" tabindex="-1" role="button" aria-disabled="true" href="#" data-toggle="tooltip" title="Find maps with similar patters. This function is only enabled for public group level unthresholded statistical maps.">
<a class="dropdown-item disabled" tabindex="-1" role="button" aria-disabled="true" href="#" data-toggle="tooltip" title="Find maps with similar patterns. This function is only enabled for public group level unthresholded statistical maps.">
Similar maps search
</a>
<a class="dropdown-item disabled" tabindex="-1" role="button" aria-disabled="true" href="#" data-toggle="tooltip" title="Find genes with similar expression patters. This function is only enabled for public group level unthresholded statistical maps.">
<a class="dropdown-item disabled" tabindex="-1" role="button" aria-disabled="true" href="#" data-toggle="tooltip" title="Find genes with similar expression patterns. This function is only enabled for public group level unthresholded statistical maps.">
Gene expression decoding
</a>
{% endif %}
@@ -3,7 +3,7 @@
<header>
<name>Neubert Ventral Frontal connectivity-based parcellation</name>
<shortname>Ventral Frontal CBP</shortname>
<type>Probabalistic</type>
<type>Probabilistic</type>
<images>
<imagefile>/NeubertVentralFrontalParcellation/VentralFrontal_thr25_2mm</imagefile>
<summaryimagefile>/NeubertVentralFrontalParcellation/VentralFrontal_thr75_summaryimage_2mm</summaryimagefile>
@@ -3,7 +3,7 @@
<header>
<name>Neubert Ventral Frontal connectivity-based parcellation</name>
<shortname>Ventral Frontal CBP</shortname>
<type>Probabalistic</type>
<type>Probabilistic</type>
<images>
<imagefile>/NeubertVentralFrontalParcellation/VentralFrontal_thr25_2mm</imagefile>
<summaryimagefile>/NeubertVentralFrontalParcellation/VentralFrontal_thr75_summaryimage_2mm</summaryimagefile>
6 changes: 3 additions & 3 deletions neurovault/apps/statmaps/views.py
@@ -1597,7 +1597,7 @@ class ImagesInCollectionJson(BaseDatatableView):
order_columns = ["", "pk", "name", "polymorphic_ctype.name", ""]

def get_initial_queryset(self):
# return queryset used as base for futher sorting/filtering
# return queryset used as base for further sorting/filtering
# these are simply objects displayed in datatable
# You should not filter data returned here by any filter values entered by user. This is because
# we need some base queryset to count total number of records.
@@ -1715,7 +1715,7 @@ class AtlasesAndParcellationsJson(BaseDatatableView):
order_columns = ["", "name", "polymorphic_ctype.name"]

def get_initial_queryset(self):
# return queryset used as base for futher sorting/filtering
# return queryset used as base for further sorting/filtering
# these are simply objects displayed in datatable
# You should not filter data returned here by any filter values entered by user. This is because
# we need some base queryset to count total number of records.
@@ -1766,7 +1766,7 @@ class PublicCollectionsJson(BaseDatatableView):
order_columns = ["name", "", "description", ""]

def get_initial_queryset(self):
# return queryset used as base for futher sorting/filtering
# return queryset used as base for further sorting/filtering
# these are simply objects displayed in datatable
# You should not filter data returned here by any filter values entered by user. This is because
# we need some base queryset to count total number of records.
2 changes: 1 addition & 1 deletion scripts/preparing_AHBA_data.py
@@ -41,7 +41,7 @@
urllib.request.urlretrieve(url, os.path.join(download_dir, "donor%d.zip" % (i + 1)))
zipfile.ZipFile(os.path.join(download_dir, "donor%d.zip" % (i + 1)))

# Dowloading MNI coordinates
# Downloading MNI coordinates
urllib.request.urlretrieve(
"https://raw.githubusercontent.com/chrisfilo/alleninf/master/alleninf/data/corrected_mni_coordinates.csv",
os.path.join(download_dir, "corrected_mni_coordinates.csv"))
