feat: Add a functional API for optimization #98

Open
wants to merge 5 commits into master
Conversation

engintoklu
Collaborator

This pull request introduces an alternative API for EvoTorch that follows the functional programming paradigm. The functional API is compatible with `torch.func.vmap`, and can therefore be used to optimize not just a single population, but a batch of populations simultaneously.

The main improvements are:

  • Functional counterparts of the cross-entropy method (`cem`) and policy gradients with parameter-based exploration (`pgpe`) are implemented. These implementations can be used with `vmap`, or can be given batches of starting points (`center_init` arguments) so that they generate batches of populations centered around those points.
  • Functional counterparts of the gradient-based optimizers `adam`, `clipup`, and `sgd` are implemented. Their interfaces are similar to those of the functional `cem` and `pgpe`, so the user can switch back and forth between an evolutionary approach and a gradient-based approach with a minimal amount of code change (see the sketch after this list).
  • The decorator `@expects_ndim` is introduced. It declares how many dimensions are expected for each positional argument of the decorated function. Upon receiving tensors whose number of dimensions is greater than expected, the decorated function interprets them as batched arguments, applies `vmap` to itself, and performs its operations across the batch dimensions.
  • The decorator `@rowwise` is introduced. It declares that a function is written under the assumption that its argument is a vector. If the function receives a tensor with 2 or more dimensions, it applies `vmap` to itself and performs its operations across the extra batch dimensions (also illustrated in the second sketch below).

@engintoklu engintoklu added the enhancement New feature or request label Jan 17, 2024
@engintoklu engintoklu self-assigned this Jan 17, 2024
Co-authored-by: Rupesh K Srivastava <rupesh@nnaisense.com>

codecov bot commented Feb 3, 2024

Codecov Report

Attention: 113 lines in your changes are missing coverage. Please review.

Comparison is base (5c58566) 76.74% compared to head (8224905) 77.36%.

Files Patch % Lines
src/evotorch/distributions.py 80.79% 34 Missing ⚠️
src/evotorch/core.py 31.42% 24 Missing ⚠️
src/evotorch/algorithms/functional/misc.py 67.50% 13 Missing ⚠️
src/evotorch/tools/misc.py 78.43% 11 Missing ⚠️
src/evotorch/algorithms/functional/funcpgpe.py 89.53% 9 Missing ⚠️
src/evotorch/algorithms/functional/funccem.py 87.50% 8 Missing ⚠️
src/evotorch/tools/constraints.py 87.30% 8 Missing ⚠️
src/evotorch/algorithms/functional/funcclipup.py 92.50% 3 Missing ⚠️
src/evotorch/decorators.py 96.25% 3 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##           master      #98      +/-   ##
==========================================
+ Coverage   76.74%   77.36%   +0.61%     
==========================================
  Files          49       57       +8     
  Lines        7509     8213     +704     
==========================================
+ Hits         5763     6354     +591     
- Misses       1746     1859     +113     


flukeskywalker and others added 3 commits February 3, 2024 23:11
This commit adds a classmethod with the signature
`functional_sample(num_solutions, params)`
to the classes `SeparableGaussian` and
`SymmetricSeparableGaussian`. These classmethods
provide alternative, completely stateless
implementations for generating samples from
their distributions.

Functional samplers created via the utility
function `make_functional_sampler(...)` now check
whether the wrapped distribution defines the
classmethod `functional_sample(...)` (the other
case being `functional_sample = NotImplemented`).
If the classmethod exists, the functional sampler
uses it to generate the samples.

The goal of this new mechanism is to let
distributions offer alternative sampling
mechanisms that are better suited to the pure
functional programming paradigm.
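
A hedged sketch of the dispatch described above; the overall structure of `make_functional_sampler` (its other parameters, return value, and fallback path) is assumed for illustration and based only on this commit description:

```python
def make_functional_sampler(distribution_class):
    # Hypothetical outline: only the `functional_sample` check is taken from
    # the description above; everything else is assumed for illustration.
    def sample(num_solutions, params):
        if distribution_class.functional_sample is not NotImplemented:
            # The distribution offers a completely stateless sampler.
            return distribution_class.functional_sample(num_solutions, params)
        # Otherwise, fall back to the previous (stateful) sampling path.
        raise NotImplementedError("fallback path omitted in this sketch")

    return sample
```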