
Releases: BayesWitnesses/m2cgen

v0.10.0

25 Apr 18:51
  • Python 3.6 is no longer supported.
  • Added support for Python 3.9 and 3.10.
  • Trained models can now be transpiled into Rust and Elixir 🎉 (see the sketch after this list).
  • Model support:
    • Added support for SGDRegressor from the lightning package.
    • Added support for extremely randomized trees in the LightGBM package.
    • Added support for OneClassSVM from the scikit-learn package.
  • Various improvements to handle the latest versions of the supported models.
  • Various CI/CD improvements, including the migration from coveralls to codecov, automated generation of code examples, and automated GitHub Release creation.
  • Minor codebase cleanup.
  • Significantly reduced the number of redundant parentheses and return statements in the generated code.
  • The latest Dart language versions are now supported.
  • Target languages can now provide native implementations of the sigmoid and softmax functions.
  • Improved code generation speed by adding new lines at the end of generated code.
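
A minimal sketch of the new capabilities, assuming scikit-learn and m2cgen are installed; OneClassSVM stands in for the newly supported models:

```python
import m2cgen as m2c
from sklearn.datasets import make_blobs
from sklearn.svm import OneClassSVM

# OneClassSVM support and the Rust/Elixir targets are new in this release.
X, _ = make_blobs(n_samples=200, centers=1, random_state=0)
model = OneClassSVM(gamma="auto").fit(X)

rust_code = m2c.export_to_rust(model)      # transpile to Rust
elixir_code = m2c.export_to_elixir(model)  # transpile to Elixir
print(rust_code)
```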

v0.9.0

18 Sep 19:17
  • Python 3.5 is no longer supported.
  • Trained models can now be transpiled into F# 🎉 (see the sketch after this list).
  • Model support:
    • Added support for GLM models from the scikit-learn package.
    • Introduced support for a variety of objectives in LightGBM models.
    • The cauchy function is now supported for GLM models.
  • Improved conversion of floating-point numbers into string literals. This improves the accuracy of results returned by the generated code.
  • Improved handling of missing values in LightGBM models. Kudos to our first-time contributor @Aulust 🎉
  • Various improvements to the code generation runtime.
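
A hedged sketch combining the new scikit-learn GLM support with the new F# target. PoissonRegressor is one of scikit-learn's GLM estimators; the exporter name export_to_f_sharp is assumed from m2cgen's naming scheme:

```python
import m2cgen as m2c
import numpy as np
from sklearn.linear_model import PoissonRegressor  # a scikit-learn GLM

# Synthetic count data suitable for a Poisson GLM.
rng = np.random.RandomState(0)
X = rng.uniform(size=(200, 3))
y = rng.poisson(lam=np.exp(X @ np.array([0.5, -0.2, 0.1])))

model = PoissonRegressor().fit(X, y)
print(m2c.export_to_f_sharp(model))  # F# is the new target in this release
```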

v0.8.0

18 Jun 14:52
  • This release is the last one to support Python 3.5. The next release will require Python >= 3.6.
  • Trained models can now be transpiled into Haskell and Ruby 🎉
  • Various improvements to the code generation runtime:
    • Introduced caching of the interpreter handler names.
    • A string buffer is now used to store generated code.
    • Moved away from using string.Template.
  • The numpy dependency is no longer required at runtime for the generated Python code.
  • Improved model support:
    • Enabled multiclass support for XGBoost Random Forest models.
    • Added support for Boosted Random Forest models from the XGBoost package.
    • Added support for GLM models from the statsmodels package.
  • Introduced fallback expressions, built from simpler language constructs, for a variety of functions. This should simplify the implementation of new interpreters, since it reduces the number of functions that must be provided by the standard library or by the developer of a given interpreter. Note that fallback expressions are optional and can be overridden by a hand-written implementation or by a corresponding function from the standard library. Functions for which fallback AST expressions have been introduced include abs, tanh, sqrt, exp, sigmoid and softmax; the sketch below illustrates the idea.
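
To illustrate the idea, here is a plain-Python sketch (not m2cgen's actual AST classes): a fallback expresses sigmoid and softmax through simpler constructs such as exp and arithmetic, so a target language needs neither function natively:

```python
import math

def sigmoid(x):
    # Fallback built only from exp, addition and division.
    return 1.0 / (1.0 + math.exp(-x))

def softmax(xs):
    # Fallback built from exp, max, sum and division.
    # Subtracting the max keeps exp from overflowing.
    m = max(xs)
    exps = [math.exp(v - m) for v in xs]
    total = sum(exps)
    return [e / total for e in exps]
```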

Kudos to @StrikerRUS, who's responsible for all these amazing updates 💪

v0.7.0

07 Apr 16:47
  • Bug fixes:
    • Thresholds for XGBoost trees are now forced to be float32 (#168).
    • Fixed support for newer versions of XGBoost, in which the default value for the base_score parameter became None (#182).
  • Models can now be transpiled into the Dart language. Kudos to @MattConflitti for this great addition 🎉
  • Support for the following models has been introduced:
    • Models from the statsmodels package are now supported. The list of added models includes: GLS, GLSAR, OLS, ProcessMLE, QuantReg and WLS.
    • Models from the lightning package: AdaGradRegressor/AdaGradClassifier, CDRegressor/CDClassifier, FistaRegressor/FistaClassifier, SAGARegressor/SAGAClassifier, SAGRegressor/SAGClassifier, SDCARegressor/SDCAClassifier, SGDClassifier, LinearSVR/LinearSVC and KernelSVC.
    • RANSACRegressor from the scikit-learn package.
  • The name of the scoring function can now be changed via a parameter (see the sketch after this list). Thanks @mrshu 💪
  • The SubroutineExpr expression has been removed from the AST. The logic for splitting generated code into subroutines now lives entirely in interpreters and has been removed from assemblers.
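
A sketch of the renaming parameter, shown with the newly supported RANSACRegressor and the new Dart target; the parameter name function_name is assumed from m2cgen's exporter signatures:

```python
import m2cgen as m2c
from sklearn.datasets import make_regression
from sklearn.linear_model import RANSACRegressor

X, y = make_regression(n_samples=200, n_features=3, noise=5.0, random_state=0)
model = RANSACRegressor(random_state=0).fit(X, y)

# Rename the generated scoring function from the default to "predict".
dart_code = m2c.export_to_dart(model, function_name="predict")
print(dart_code)
```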

v0.6.0

17 Feb 03:52
  • Trained models can now be transpiled into R, PowerShell and PHP. Major effort delivered solely by @StrikerRUS.
  • The Java interpreter now splits code into methods based on heuristics, without relying on SubroutineExpr from the AST.
  • Added support for LightGBM and XGBoost Random Forest models (see the sketch after this list).
  • XGBoost linear models are now supported.
  • The LassoLarsCV, Perceptron and PassiveAggressiveClassifier estimators from the scikit-learn package are now supported.
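
A sketch combining two of these items, assuming the xgboost package provides XGBRFClassifier: an XGBoost Random Forest model exported to the newly added R target (binary classification only here; multiclass support arrived in v0.8.0):

```python
import m2cgen as m2c
import xgboost as xgb
from sklearn.datasets import make_classification

X, y = make_classification(n_samples=300, n_features=5, random_state=0)
model = xgb.XGBRFClassifier(n_estimators=50, random_state=0).fit(X, y)

print(m2c.export_to_r(model))  # R is one of the targets added in this release
```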

v0.5.0

01 Dec 18:38

Quite a few awesome updates in this release. Many thanks to @StrikerRUS and @chris-smith-zocdoc for making this release happen.

  • Visual Basic and C# joined the list of supported languages (see the sketch after this list). Thanks @StrikerRUS for all the hard work!
  • The numpy dependency is no longer required for generated Python code when no linear algebra is involved. Thanks @StrikerRUS for this update.
  • Fixed a bug where generated Java code exceeded the JVM method size constraints when individual estimators of a GBT model contained a large number of leaves. Kudos to @chris-smith-zocdoc for discovering and fixing this issue.
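
A sketch of the first two points on a single decision tree, a model that involves no linear algebra:

```python
import m2cgen as m2c
from sklearn.datasets import make_regression
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=200, n_features=4, random_state=0)
model = DecisionTreeRegressor(max_depth=3).fit(X, y)

csharp_code = m2c.export_to_c_sharp(model)  # C# is a new target in this release
python_code = m2c.export_to_python(model)
# A single tree is just nested if/else comparisons, so per this release
# the generated Python module should no longer import numpy.
print(python_code)
```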

v0.4.0

28 Sep 22:49
  • JavaScript is now among the supported languages. Kudos to @bcampbell-prosper for this contribution.

v0.3.1

15 Aug 16:03
  • Fixed code generation for XGBoost models when feature names are not specified in the model object (#93). Thanks @akhvorov for contributing the fix.

v0.3.0

21 May 19:04
  • Added support for the following SVM implementations from scikit-learn: SVC, NuSVC, SVR and NuSVR.
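
A sketch with NuSVR, exported here to C, one of the targets m2cgen already supported at the time:

```python
import m2cgen as m2c
from sklearn.datasets import make_regression
from sklearn.svm import NuSVR

X, y = make_regression(n_samples=100, n_features=4, random_state=0)
model = NuSVR().fit(X, y)
print(m2c.export_to_c(model))
```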

v0.2.1

17 Apr 18:08
  • Added support for the best_ntree_limit attribute in XGBoost models, which limits the number of estimators used during prediction. Thanks @arshamg for helping with that.
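
A sketch of how best_ntree_limit typically comes into play, assuming an xgboost version contemporary with this release (early_stopping_rounds as a fit() argument; newer xgboost moved it to the constructor). With early stopping, the fitted booster carries best_ntree_limit, and per this note m2cgen should emit only that many trees:

```python
import m2cgen as m2c
import xgboost as xgb
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, random_state=1)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=1)

clf = xgb.XGBClassifier(n_estimators=500)
clf.fit(X_tr, y_tr, eval_set=[(X_val, y_val)],
        early_stopping_rounds=10, verbose=False)

# The fitted model now has best_ntree_limit set; the exported code
# should include only the trees used up to that limit.
java_code = m2c.export_to_java(clf)
```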