Merge branch 'multi-slice-epie' of https://github.com/ptycho/ptypy into multi-slice-epie
kahntm committed Mar 11, 2024
2 parents 90058d1 + 391cc4f commit 381065c
Showing 20 changed files with 712 additions and 47 deletions.
30 changes: 12 additions & 18 deletions CONTRIB.rst
@@ -26,46 +26,40 @@ Please ensure you satisfy most of PEP8_ recommendations. We are not dogmatic abo…
Testing
^^^^^^^

Not much testing exists at the time of writing this document, but we are aware that this is something that should change. If you want to contribute code, it would be very good practice to also submit related tests.
All tests live in the ``/test/`` folder and our CI pipeline runs these tests for every commit. Please note that tests which require GPUs are disabled in the CI pipeline. Make sure to supply tests for new code or for drastic changes to the existing code base. Smaller commits or bug fixes don't require an extra test.
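A minimal sketch of what such a test could look like (the class name and the call under test are placeholders, not existing |ptypy| code)::

    import unittest
    import numpy as np


    class MyNewFeatureTest(unittest.TestCase):
        """Placeholder skeleton for a test accompanying new code."""

        def test_output_shape(self):
            data = np.zeros((64, 64))
            # result = my_new_feature(data)   # hypothetical function under test
            result = data                      # stand-in so the skeleton runs
            self.assertEqual(result.shape, (64, 64))


    if __name__ == "__main__":
        unittest.main()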

Branches
^^^^^^^^

We follow the Gitflow development model (https://www.atlassian.com/git/tutorials/comparing-workflows/gitflow-workflow), where a development branch (``dev``) is merged into the master branch for every release. Individual features are developed on topic branches from the development branch and squash-merged back into it when the feature is mature.

The important permanent branches are:
- ``master``: the current cutting-edge but functional package.
- ``stable``: the latest release, recommended for production use.
- ``target``: target for a next release. This branch should stay up-to-date with ``master``, and contain planned updates that will break compatibility with the current version.
- other thematic and temporary branches will appear and disappear as new ideas are tried out and merged in.
- ``master``: (protected) the current release plus bugfixes / hotpatches.
- ``dev``: (protected) current branch for all developments. Features are branched off this branch and merged back into it upon completion.


Development cycle
^^^^^^^^^^^^^^^^^

There have been only two releases of the code up to now, so what we can tell about the *normal development cycle* for |ptypy| is rather limited. However, the plan is as follows:
- Normal development usually happens on thematic branches. These branches are merged back to master when it is clear that (1) the feature is sufficiently debugged and tested and (2) no current functionality will break.
- At regular intervals the admins will decide to freeze the development for a new stable release. During this period, development will be allowed only on feature branches, while master will accept only bug fixes. Once the stable release is done, development will continue.
|ptypy| does not follow a rigid release schedule. Releases are prepared for major events or when a set of features has reached maturity.

- Normal development usually happens on thematic branches from the ``dev`` branch. These branches are merged back to ``dev`` when it is clear that (1) the feature is sufficiently debugged and tested and (2) no current functionality will break.
- For a release, the ``dev`` branch will be merged back into ``master`` and that merge tagged as a release.


3. Pull requests
----------------

Most likely you are a member of the |ptypy| team, which gives you access to the full repository, but no right to commit changes. The proper way of contributing changes is via *pull requests*. You can read about how this is done on github's `pull requests tutorial`_.

Pull requests can be made against one of the feature branches, or against ``target`` or ``master``. In the latter cases, if your changes are deemed a bit too substantial, the first thing we will do is create a feature branch for your commits, and we will let it live for a little while, making sure that it is all fine. We will then merge it onto ``master`` (or ``target``).

In principle bug fixes can be requested on the ``stable`` branch.

3. Direct commits
-----------------

If you are one of our power-users (or power-developers), you can be given rights to commit directly to |ptypy|. This makes things much simpler of course, but with great power comes great responsibility.
Pull requests shall be made against one of the feature branches, or against ``dev`` or ``master``. For PRs against ``master`` we will only accept bugfixes or smaller changes. Every other PR should be made against ``dev``. Your PR will be reviewed and discussed amongst the core developer team. The more you touch core libraries, the more scrutiny your PR will face. However, we created two folders in the main source folder where you have more freedom to try out things. For example, if you want to provide a new reconstruction engine, place it into the ``custom/`` folder. A new ``PtyScan`` subclass that prepares data from your experiment is best placed in the ``experiment/`` folder.
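As a rough illustration only (the ``load`` signature and its return convention are assumptions inferred from the existing loaders, so check ``ptypy/core/data.py`` and the loaders in ``experiment/`` before copying), such a subclass might look like::

    import numpy as np
    from ptypy.core.data import PtyScan


    class MyBeamlineScan(PtyScan):
        """Hypothetical loader; replace the random frames with real file I/O."""

        def load(self, indices):
            # Assumed convention: return per-index dicts of raw frames,
            # scan positions and weights for the requested indices.
            raw, positions, weights = {}, {}, {}
            for i in indices:
                raw[i] = np.random.poisson(100, size=(256, 256))
            return raw, positions, weights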

To make sure that things are done cleanly, we encourage all the core developers to create thematic remote branches instead of always committing directly onto master. Merging these thematic branches will be done as a collective decision during one of the regular admin meetings.
If you develop a new feature on a topic branch, it is your responsibility to keep it current with the ``dev`` branch in order to avoid merge conflicts.


.. |ptypy| replace:: PtyPy


.. _PEP8: https://www.python.org/dev/peps/pep-0008/

.. _`pull requests tutorial`: https://help.github.com/articles/using-pull-requests/
.. _`pull requests tutorial`: https://help.github.com/articles/using-pull-requests/
1 change: 1 addition & 0 deletions archive/cuda_extension/engines/DM_gpu.py
@@ -57,6 +57,7 @@ class DMGpu(DMNpy):
default = 'linear'
type = str
help = Subpixel interpolation; 'fourier','linear' or None for no interpolation
choices = ['fourier','linear',None]
[update_object_first]
default = True
1 change: 1 addition & 0 deletions archive/cuda_extension/engines/DM_npy.py
@@ -55,6 +55,7 @@ class DMNpy(DM):
default = 'linear'
type = str
help = Subpixel interpolation; 'fourier','linear' or None for no interpolation
choices = ['fourier','linear',None]
[update_object_first]
default = True
1 change: 1 addition & 0 deletions archive/engines/DM.py
@@ -55,6 +55,7 @@ class DM(PositionCorrectionEngine):
default = 'linear'
type = str
help = Subpixel interpolation; 'fourier','linear' or None for no interpolation
choices = ['fourier','linear',None]
[update_object_first]
default = True
6 changes: 3 additions & 3 deletions ptypy/core/classes.py
@@ -81,7 +81,7 @@

# Hard-coded limit in array size
# TODO: make this dynamic from available memory.
MEGAPIXEL_LIMIT = 50
MEGAPIXEL_LIMIT = 100


class Base(object):
@@ -709,8 +709,8 @@ def reformat(self, newID=None, update=True):

megapixels = np.array(new_shape).astype(float).prod() / 1e6
if megapixels > MEGAPIXEL_LIMIT:
raise RuntimeError('Arrays larger than %dM not supported. You '
'requested %.2fM pixels.' % (MEGAPIXEL_LIMIT, megapixels))
logger.warning('Arrays larger than %dM not recommended. You '
'requested %.2fM pixels.' % (MEGAPIXEL_LIMIT, megapixels))

# Apply Nd misfit
if self.data is not None:
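For reference, exceeding the raised limit now only triggers a warning instead of a ``RuntimeError``; a standalone sketch of the size check (the storage shape below is made up)::

    import numpy as np

    MEGAPIXEL_LIMIT = 100                         # new hard-coded limit
    new_shape = (2, 12000, 12000)                 # hypothetical storage shape
    megapixels = np.array(new_shape).astype(float).prod() / 1e6
    if megapixels > MEGAPIXEL_LIMIT:
        print('Arrays larger than %dM not recommended. '
              'You requested %.2fM pixels.' % (MEGAPIXEL_LIMIT, megapixels))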
6 changes: 4 additions & 2 deletions ptypy/core/data.py
@@ -114,6 +114,7 @@ class PtyScan(object):
default = data
help = Determines what will be loaded in parallel
doc = Choose from ``None``, ``'data'``, ``'common'``, ``'all'``
choices = ['data', 'common', 'all']
[rebin]
type = int
@@ -122,7 +123,7 @@
doc = Rebinning factor for the raw data frames. ``'None'`` or ``1`` both mean *no binning*
userlevel = 1
lowlim = 1
uplim = 8
uplim = 32
[orientation]
type = int, tuple, list
Expand All @@ -139,6 +140,7 @@ class PtyScan(object):
<newline>
Alternatively, a 3-tuple of booleans may be provided ``(do_transpose,
do_flipud, do_fliplr)``
choices = [0, 1, 2, 3, 4, 5, 6, 7]
userlevel = 1
[min_frames]
@@ -797,7 +799,7 @@ def get_data_chunk(self, chunksize, start=None):
rebin = self.rebin
if rebin <= 1:
pass
elif (rebin in range(2, 6)
elif (rebin in range(2, 32+1)
and (((sh / float(rebin)) % 1) == 0.0).all()):
mask = w > 0
d = u.rebin_2d(d, rebin)
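The widened guard above only rebins when the factor is at most 32 and every frame dimension divides evenly; a standalone sketch of that check (frame shape made up)::

    import numpy as np

    sh = np.array([2048, 2048])     # hypothetical detector frame shape
    rebin = 16                      # now allowed, previously capped at 8
    ok = rebin in range(2, 32 + 1) and (((sh / float(rebin)) % 1) == 0.0).all()
    print(ok)                       # True: 2048 / 16 leaves no remainder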
8 changes: 4 additions & 4 deletions ptypy/core/geometry.py
@@ -7,7 +7,7 @@
:license: see LICENSE for details.
"""
import numpy as np
from scipy import fftpack
import scipy.fft

from .. import utils as u
from ..utils.verbose import logger
@@ -55,7 +55,7 @@ class Geo(Base):
If set to True, changes to properties like :py:meth:`energy`,
:py:meth:`lam`, :py:meth:`shape` or :py:meth:`psize` will cause
a call to :py:meth:`update`.
Default geometry parameters. See also :py:data:`.scan.geometry`
@@ -471,8 +471,8 @@ def _FFTW_fft(self):
self.ifft = lambda x: fftw_np.ifft2(x, planner_effort=pe)

def _scipy_fft(self):
self.fft = lambda x: fftpack.fft2(x).astype(x.dtype)
self.ifft = lambda x: fftpack.ifft2(x).astype(x.dtype)
self.fft = lambda x: scipy.fft.fft2(x).astype(x.dtype)
self.ifft = lambda x: scipy.fft.ifft2(x).astype(x.dtype)

def _numpy_fft(self):
self.fft = lambda x: np.ascontiguousarray(np.fft.fft2(x).astype(x.dtype))
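The switch from ``scipy.fftpack`` to ``scipy.fft`` keeps the same call pattern as ``_scipy_fft`` above; a quick standalone round-trip check::

    import numpy as np
    import scipy.fft

    x = (np.random.rand(16, 16) + 1j * np.random.rand(16, 16)).astype(np.complex64)
    y = scipy.fft.fft2(x).astype(x.dtype)      # forward transform, kept in single precision
    xr = scipy.fft.ifft2(y).astype(x.dtype)    # inverse transform
    assert np.allclose(x, xr, atol=1e-5)       # round trip recovers the input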
1 change: 1 addition & 0 deletions ptypy/core/illumination.py
@@ -130,6 +130,7 @@
- *<template>* : one of the templates inillumination module
In script, you may pass a numpy.ndarray here directly as the model. It is considered as incoming wavefront and will be propagated according to `propagation` with an optional `aperture` applied before.
choices = ['recon','stxm',None]
userlevel = 0
[photons]
4 changes: 4 additions & 0 deletions ptypy/core/manager.py
@@ -71,13 +71,15 @@ class ScanModel(object):
default = farfield
help = Propagation type
doc = Either "farfield" or "nearfield"
choices = ['farfield', 'nearfield']
userlevel = 1
[ffttype]
type = str
default = scipy
help = FFT library
doc = Choose from "numpy", "scipy" or "fftw"
choices = ['numpy', 'scipy', 'fftw']
userlevel = 1
[data]
@@ -906,6 +908,7 @@ class _Full(object):
- ``'irregular'``: no assumption
**[not implemented]**
type = str
choices = ['achromatic', 'linear', 'irregular']
userlevel = 2
[coherence.probe_dispersion]
@@ -917,6 +920,7 @@ class _Full(object):
- ``'irregular'``: no assumption
**[not implemented]**
type = str
choices = ['achromatic', 'linear', 'irregular']
userlevel = 2
[resolution]
64 changes: 60 additions & 4 deletions ptypy/core/ptycho.py
@@ -82,6 +82,7 @@ class Ptycho(Base):
- ``INSPECT``: Object Information
- ``DEBUG``: Debug
type = str, int
choices = ['CRITICAL', 'ERROR', 'WARNING', 'INFO', 'INSPECT', 'DEBUG']
userlevel = 0
[data_type]
@@ -90,6 +91,7 @@ class Ptycho(Base):
doc = Reconstruction floating number precision (``'single'`` or
``'double'``)
type = str
choices = ['single', 'double']
userlevel = 1
[run]
@@ -155,7 +157,8 @@
doc = Choose a reconstruction file format for after engine completion.
- ``'minimal'``: Bare minimum of information
- ``'dls'``: Custom format for Diamond Light Source
choices = 'minimal','dls'
- ``'used_params'``: Same as minimal but including all used parameters
choices = 'minimal','dls','used_params'
[io.interaction]
default = None
@@ -248,6 +251,7 @@ class Ptycho(Base):
help = Options for default plotter or template name
doc = Flexible layout for default plotter is not implemented yet. Please choose one of the
templates ``'default'``,``'black_and_white'``,``'nearfield'``, ``'minimal'`` or ``'weak'``
choices = ['default', 'black_and_white', 'nearfield', 'minimal', 'weak']
userlevel = 2
[io.autoplot.dump]
@@ -269,6 +273,7 @@ class Ptycho(Base):
help = Produce timings for benchmarking the performance of data loaders and engines
doc = Switch to get timings and save results to a json file in p.io.home
Choose ``'all'`` for timing data loading, engine_init, engine_prepare, engine_iterate and engine_finalize
choices = ['all', 'loading', 'engine_init', 'engine_prepare', 'engine_iterate', 'engine_finalize']
userlevel = 2
[scans]
@@ -351,6 +356,7 @@ def __init__(self, pars=None, level=2, **kwargs):
self.mask = None
self.model = None
self.new_data = None
self.state_dict = dict()

# Communication
self.interactor = None
@@ -985,9 +991,14 @@ def save_run(self, alt_file=None, kind='minimal', force_overwrite=True):
if len(self.runtime.iter_info) > 0:
dump.runtime.iter_info = [self.runtime.iter_info[-1]]

if self.record_positions:
dump.positions = {}
for ID, S in self.obj.storages.items():
dump.positions[ID] = np.array([v.coord for v in S.views if v.pod.pr_view.layer==0])

content = dump

elif kind == 'minimal' or kind == 'dls':
elif kind in ('minimal', 'dls', 'used_params'):
# if self.interactor is not None:
# self.interactor.stop()
logger.info('Generating shallow copies of probe, object and '
@@ -1002,7 +1013,7 @@ def save_run(self, alt_file=None, kind='minimal', force_overwrite=True):
defaults_tree['ptycho'].validate(self.p) # check the parameters are actually able to be read back in
except RuntimeError:
logger.warning("The parameters we are saving won't pass a validator check!")
minimal.pars = self.p.copy() # _to_dict(Recursive=True)
minimal.pars = self.p.copy(depth=99) # _to_dict(Recursive=True)
minimal.runtime = self.runtime.copy()

content = minimal
@@ -1016,6 +1027,13 @@ def save_run(self, alt_file=None, kind='minimal', force_overwrite=True):
for ID, S in self.obj.storages.items():
content.obj[ID]['grids'] = S.grids()

if kind == 'used_params':
for name, engine in self.engines.items():
content.pars.engines[name] = engine.p
for name, scan in self.model.scans.items():
content.pars.scans[name] = scan.p
content.pars.scans[name].data = scan.ptyscan.p

if kind in ['minimal', 'dls'] and self.record_positions:
content.positions = {}
for ID, S in self.obj.storages.items():
@@ -1098,7 +1116,45 @@ def plot_overview(self, fignum=100):
cmap='gray')
fignum += 1



def copy_state(self, name="baseline", overwrite=False):
"""
Store a copy of the current state of object/probe
Warning: This feature is under development and syntax might change!
"""
if name in self.state_dict:
logger.warning("A state with name {:s} exists already".format(name))
if overwrite:
logger.warning("Overwrite {:s} state".format(name))
del self.state_dict[name]
else:
return
self.state_dict[name] = {}
self.state_dict[name]["ob"] = self.obj.copy()
self.state_dict[name]["pr"] = self.probe.copy()
self.state_dict[name]["runtime"] = self.runtime.copy(depth=99)
logger.info("Saved a copy of object and probe as the {:s} state".format(name))

def restore_state(self, name="baseline", reformat_exit=True):
"""
Restore object/probe based on a previously saved copy
Warning: This feature is under development and syntax might change!
"""
if name in self.state_dict:
for ID,S in self.probe.storages.items():
S.data[:] = self.state_dict[name]["pr"].storages[ID].data
for ID,S in self.obj.storages.items():
S.data[:] = self.state_dict[name]["ob"].storages[ID].data
self.runtime = self.state_dict[name]["runtime"]

# Reformat/Recalculate exit waves
if reformat_exit:
self.exit.reformat()
for scan in self.model.scans.values():
scan._initialize_exit(list(self.pods.values()))

def _redistribute_data(self, div = 'rect', obj_storage=None):
"""
This function redistributes data among nodes, so that each
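A short usage sketch for the new state snapshots and the ``'used_params'`` save kind introduced above (``p`` must be filled with a complete reconstruction parameter tree; the calls themselves follow the methods shown in this diff)::

    from ptypy import utils as u
    from ptypy.core import Ptycho

    p = u.Param()                       # fill in a complete reconstruction parameter tree here
    P = Ptycho(p, level=5)              # prepare and run the configured engines

    P.copy_state(name="baseline")       # snapshot object, probe and runtime
    # ... further experimentation, e.g. running another engine ...
    P.restore_state(name="baseline")    # roll back and recompute exit waves

    P.save_run(kind='used_params')      # minimal output plus all used parameters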
1 change: 1 addition & 0 deletions ptypy/core/sample.py
@@ -36,6 +36,7 @@
processed according to `process` in order to *simulate* a sample from e.g. a thickness
profile.
type = str, array
choices = ['recon', 'stxm', 'None']
userlevel = 0
[fill]
