====================================================================
John P. Cunningham, Columbia University
Copyright (C) 2015 John P. Cunningham

Matrix manifold optimization as used for linear 
dimensionality reduction.

This code implements the algorithms and experiments discussed in 
Cunningham and Ghahramani (2015), Journal of Machine Learning Research
(see ref below).  

This program is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.

This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.

You should have received a copy of the GNU General Public License
along with this program. If not, see <http://www.gnu.org/licenses/>.
====================================================================

====================================================================
Getting started

This code is implemented in MATLAB.  Either open MATLAB in this 
directory, or, once MATLAB is open, make this directory the current
working directory *and run startup.m*.  A number of calls to util
functions and the like require startup.m to have been run.
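
For example (a minimal sketch; the path below is hypothetical):

> cd ~/code/ldr     % or wherever this repository lives
> startup           % sets up the path for the util functions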

Update 2016: please note that a Python package for much of the underlying
optimization is now available at https://pymanopt.github.io/ .  You
may prefer that library if you wish to work in Python.
====================================================================

====================================================================
Basic Usage Example

The following MATLAB code is a simple example of how to run this project:

> % first set up parameters
> parms = struct('show_fig',1,'save_fig',0,'randseed',0);
> % now run MAF
> test_maf( 8 , 2 , parms);
> % note that red dimensions are smoother than green (the first two
> % dimensions of the data) and black (the heuristic solver).

Note that this will run MAF (maximum autocorrelation factors), which 
maximizes the correlation between temporally adjacent points in an effort 
to find the dimensions of the data that are most temporally smooth.  There 
is also a heuristic version, from the older literature, that implements an
LDA-like eigenvector solution; it serves simply as a comparison.
The code will produce a figure with each cardinal dimension of the
data plotted in green, and the ordered MAF dimensions plotted in black
(for the LDA-like method) and red (for the Stiefel method).  Data are 
plotted against the temporal dimension (the horizontal axis).  The results
look similar for these simple examples, though the Stiefel method is
numerically superior.  For this result and further explanation, see
the paper (reference below).
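
For intuition, the quantity MAF maximizes can be sketched as follows
(a hedged sketch with illustrative variable names; the actual
implementation lives in f_maf.m, and its normalization details may differ):

> % X is d-by-T time series data; M is d-by-r with orthonormal columns.
> Xc = X - repmat(mean(X,2), 1, size(X,2));  % center each dimension
> S  = (Xc*Xc')/(size(X,2)-1);               % empirical covariance
> St = Xc(:,1:end-1)*Xc(:,2:end)';           % lag-one cross-covariance
> St = (St + St')/(2*(size(X,2)-1));         % symmetrized
> f  = trace( (M'*S*M) \ (M'*St*M) );        % autocorrelation captured by M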

Another simple example is:

> test_pca( d , r , parms)

where d is the data dimension and r is the dimension of the
low-dimensional projection (r < d).

> test_pca( 120 , 4 , parms)

is also perfectly reasonable.  NOTE: these PCA examples are entirely 
confirmatory, because we know that standard PCA (computed with the svd)
is globally optimal.  Any departure here from the standard PCA result is
due to numerical optimization issues.  The error here should be 0 or tiny;
if not, increase the iteration count or decrease the convergence tolerance.
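
That error check might look like the following (a hedged sketch; X, M,
and r are illustrative names, not the actual variables used in test_pca):

> [U,~,~] = svd(cov(X'));            % standard PCA basis (X is d-by-n)
> Ur  = U(:,1:r);                    % top r principal directions
> err = norm(M*M' - Ur*Ur', 'fro');  % subspace error; should be ~0

Comparing the projection matrices, rather than the bases themselves,
removes the ambiguity of rotation within the subspace.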

To run several methods and reproduce figure results from the paper, run:

> test_project( 1 , 'r_sweep' , 1 , 0);
> % make sure your editor pastes those quotation marks in a MATLAB-acceptable form!

Doing so will produce exactly figures 2B, 3B, and 3D from the paper (ref below).
You can then do a 'd_sweep', and can then run test_iter, test_lda, and
test_convergence to reproduce the remaining figures from the paper.  Note that
these last three require a bit of mucking with the code; see those .m files
for details.

====================================================================

====================================================================
Understanding This Code Computationally

Functions named test_* and run_* are simply wrappers that handle various
function calls.  Other than instantiating test data (test_*) and setting
initial points (run_*), these methods contain no substantive computation.
All real computation is in:

f_pca
f_maf
minimize_...
project_stiefel

The f_* functions are where the objective and gradient of the optimization
are calculated.  These typically account for the vast majority of the
computation in this method, because they operate on the data structures
themselves.
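
As a concrete illustration, the PCA objective and its free gradient can be
sketched as follows (a hedged sketch; see f_pca.m for the actual signature,
and note that f_pca_sketch and S are hypothetical names):

> function [f, gradf] = f_pca_sketch(M, S)  % S is the d-by-d data covariance
>   f     = -trace(M'*S*M);                 % negative variance captured by M
>   gradf = -2*S*M;                         % free (Euclidean) gradient
> end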

minimize_stiefel_sd can be seen as a simple steepest descent method (with 
linesearch), with a few Stiefel tweaks.  Similarly, minimize_grassmann_trust
performs a second-order trust region optimization over the Grassmann
manifold, and minimize_grassmann_mosd does steepest descent using the manopt
library (hence "mo").  Any minimize_* routine calls feval multiple times on
an f_* function to get the current f and gradf (objective and free gradient).
There are also some alternative inner products (rather than the Euclidean
inner product) which add a bit of computation and warrant consideration.
These routines also call project_stiefel, a simple svd-based method that
projects any matrix to its closest point on the Stiefel manifold: calling
project_stiefel( Q ) for any matrix Q returns the closest matrix to Q of the
same size that has orthonormal columns.  That projection carries its own
computational cost.
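
In code, the projection can be sketched in two lines (a hedged sketch;
see project_stiefel.m for the actual implementation):

> [U, ~, V] = svd(Q, 'econ');  % thin svd of the d-by-r matrix Q
> Qst = U*V';                  % closest matrix to Q with orthonormal columns

Replacing the singular values of Q with ones is what makes U*V' the
nearest matrix with orthonormal columns in the Frobenius norm.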
====================================================================

====================================================================
The BibTeX citation for the primary paper for this project is below; see
also the references therein.  Note that this entry will be updated upon
publication.

@article{cunningham2015ldr,
  title={Linear dimensionality reduction: survey, insights, and generalizations},
  author={J.P. Cunningham and Z. Ghahramani},
  year={2015},
  journal={Journal of Machine Learning Research}
}

====================================================================
