AcceleratedCVonMLR_Python

AcceleratedCVonMLR_Python is a Python module for approximate leave-one-out cross-validation for multinomial logistic regression with an elastic net (L1 and L2) penalty. The computation is based on an analytical approximation, which avoids re-optimization for each left-out sample and greatly reduces the computational time. A MATLAB version is available at https://github.com/T-Obuchi/Accele…

This is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License, version 3 or later. See LICENSE.txt for details.

This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.

INSTALL

  • Install from source:
git clone https://github.com/T-Obuchi/AcceleratedCVonMLR_python.git
cd AcceleratedCVonMLR_python
python setup.py install
  • Currently, pip installation is not supported.

DESCRIPTION

Given the weight vectors wV estimated from the feature data X and the class matrix Ycode for multinomial logistic regression penalized by an elastic net regularizer, this program computes and returns the approximate leave-one-out estimator (LOOE) of the predictive likelihood and its standard error. All required modules are in the "accelerated_cv_on_mlr" package. Note that this program does not itself contain a solver to obtain wV; please use other distributed programs for that purpose.
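
As an illustration of the expected input format only (this helper is ours, not part of the package, and assumes integer class labels running from 0 to Np-1), the "class representative binary matrix" Ycode is a one-hot encoding of the class labels:

import numpy as np

def to_ycode(labels, Np):
    # Build the (M, Np)-shape np.int64 class representative binary matrix
    # from integer class labels in {0, ..., Np-1}.
    labels = np.asarray(labels, dtype=np.int64)
    return np.eye(Np, dtype=np.int64)[labels]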

Requirement

USAGE

multinomial case

For multinomial logistic regression with Np (>2) classes, call acv_mlr as follows; a self-contained sketch with synthetic inputs follows the argument list below.

import accelerated_cv_on_mlr as acv
LOOE, ERR = acv.acv_mlr(wV, X, Ycode, Np, lambda2)
  • Arguments and Returns
    • Arguments:

      • wV: weight vectors ((p, N)-shape np.float64 array)
      • X: input feature matrix ((M, N)-shape np.float64 array)
      • Ycode: class representative binary matrix ((M, p)-shape np.int64 array)
      • Np: number of classes (int value)
      • lambda2: coefficient of the l2 regularization term (float value)
    • Returns:

      • LOOE: Approximate value of the leave-one-out estimator
      • ERR: Approximate standard error of the leave-one-out estimator
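
A minimal end-to-end sketch of the multinomial call (the sizes and the random wV are placeholders chosen only to illustrate the expected shapes; in practice wV comes from an external elastic-net solver):

import numpy as np
import accelerated_cv_on_mlr as acv

M, N, Np = 1000, 50, 3                      # samples, features, classes (illustrative values)
lambda2 = 0.01                              # l2 regularization coefficient (illustrative value)
X = np.random.randn(M, N)                   # (M, N) feature matrix
labels = np.random.randint(0, Np, size=M)   # integer class labels in 0..Np-1
Ycode = np.eye(Np, dtype=np.int64)[labels]  # (M, Np) class representative binary matrix
wV = np.random.randn(Np, N)                 # (Np, N) weights (p = Np); random here only to show the call, use a fitted estimate in practice
LOOE, ERR = acv.acv_mlr(wV, X, Ycode, Np, lambda2)
print(LOOE, ERR)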

binomial case

For binomial logistic regression (logit model), call acv_logit as follows; a self-contained sketch with synthetic inputs appears at the end of this section.

import accelerated_cv_on_mlr as acv
LOOE, ERR = acv.acv_logit(w, X, Ycode, lambda2)
  • Arguments and Returns

    • Arguments:
      • w: weight vector ((1, N)-shape np.float64 array)
      • X: input feature matrix ((M, N)-shape np.float64 array)
      • Ycode: binary matrix representing the class to which the corresponding feature vector belongs ((M, 2)-shape np.int64 array)
      • lambda2: coefficient of the l2 regularization term (float value)
    • Returns:
      • LOOE: Approximate value of the leave-one-out estimator
      • ERR: Approximate standard error of the leave-one-out estimator
  • For more details, see the docstrings (e.g., in IPython or Jupyter):

acv.acv_mlr?
acv.acv_logit?
  • or see the api-documentation:
    • (PATH_TO_AcceleratedCVonMLR_python)/docs/_build/_html/index.html

(Screenshot of the API documentation)
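
A corresponding minimal sketch for the binomial case (again with random placeholders standing in for a weight vector estimated elsewhere):

import numpy as np
import accelerated_cv_on_mlr as acv

M, N = 1000, 50                             # samples and features (illustrative values)
lambda2 = 0.01                              # l2 regularization coefficient (illustrative value)
X = np.random.randn(M, N)                   # (M, N) feature matrix
labels = np.random.randint(0, 2, size=M)    # binary class labels, 0 or 1
Ycode = np.eye(2, dtype=np.int64)[labels]   # (M, 2) binary class matrix
w = np.random.randn(1, N)                   # (1, N) weight vector; random here only to show the call
LOOE, ERR = acv.acv_logit(w, X, Ycode, lambda2)
print(LOOE, ERR)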

DEMONSTRATION

In the "sample" folder, demonstration Jupyter notebooks for the multinomial and binomial logistic regressions, sample_logit.ipynb and sample_mlr.ipynb, respectively, are available.

REFERENCE

Tomoyuki Obuchi and Yoshiyuki Kabashima: "Accelerating Cross-Validation in Multinomial Logistic Regression with $\ell_1$-Regularization", arXiv:1711.05420
