
v0.4.0

julia-tagbot released this 11 Sep 14:27
99ef588
  • (Enhancement) Update to MLJBase 0.5.0 and MLJModels 0.4.0. The
    following new scikit-learn models are thereby made available:
  • ScikitLearn.jl
  • SVM: SVMClassifier, SVMRegressor, SVMNuClassifier,
    SVMNuRegressor, SVMLClassifier, SVMLRegressor
  • Linear Models (regressors): ARDRegressor,
    BayesianRidgeRegressor, ElasticNetRegressor,
    ElasticNetCVRegressor, HuberRegressor, LarsRegressor,
    LarsCVRegressor, LassoRegressor, LassoCVRegressor,
    LassoLarsRegressor, LassoLarsCVRegressor,
    LassoLarsICRegressor, LinearRegressor,
    OrthogonalMatchingPursuitRegressor,
    OrthogonalMatchingPursuitCVRegressor,
    PassiveAggressiveRegressor, RidgeRegressor,
    RidgeCVRegressor, SGDRegressor, TheilSenRegressor
  • (New feature) The macro @pipeline allows one to construct linear
    (non-branching) pipeline composite models with a single line of
    code. One may include static transformations (ordinary functions)
    in the pipeline, as well as a target transformation in the
    supervised case (when one component model is supervised); see the
    sketch below.
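
    A minimal, hedged sketch of the syntax (the component models, the
    keyword names, and the use of a positional function for the static
    transformation are illustrative assumptions; do ?@pipeline for the
    exact form):

    ```julia
    using MLJ

    # Sketch only: defines a composite model type MyPipe and returns an
    # instance. The anonymous function is a static transformation, and
    # `target=...` specifies a target transformation for the supervised
    # case (keyword names assumed).
    pipe = @pipeline MyPipe(X -> coerce(X, :x1 => Continuous),
                            hot = OneHotEncoder(),
                            knn = KNNRegressor(K=3),
                            target = UnivariateStandardizer())
    ```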

  • (Breaking) Source nodes (type Source) now have a kind field,
    which is either :input, :target or :other, with :input the
    default value in the source constructor. When building a learning
    network that is to be exported as a standalone model, it is now
    necessary to tag the source nodes accordingly, as in
    Xs = source(X) and ys = source(y, kind=:target); see the sketch
    below.
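
    For example, a brief sketch grounded in the above:

    ```julia
    using MLJ

    X = MLJ.table(rand(100, 3))    # any table
    y = rand(100)

    Xs = source(X)                 # kind=:input is the default
    ys = source(y, kind=:target)   # tag the target source for later export
    ```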

  • (Breaking) By virtue of the preceding change, the syntax for
    exporting a learning network is simplified. Do ?@from_network for
    details. Also, one now uses fitresults(N) in place of
    fitresults(N, X, y) and fitresults(N, X) when exporting a learning
    network N "by hand"; see the updated manual for details.

  • (Breaking) One must now explicitly declare a supervised learning
    network being exported with @from_network as probabilistic by
    adding is_probabilistic=true to the macro expression. Previously,
    this information was unreliably inferred from the network.

  • (Enhancement) Add a macro-free method for loading model code into
    an arbitrary module. Do ?load for details.

  • (Enhancement) @load now returns a model instance with default
    hyperparameters (instead of nothing), as in
    tree_model = @load DecisionTreeRegressor.

  • (Breaking) info("PCA") now returns a named-tuple, instead of a
    dictionary, of the properties of a the model named "PCA"
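
    For example (the field names shown are typical metadata properties
    and are an assumption):

    ```julia
    using MLJ

    meta = info("PCA")
    meta.is_pure_julia    # properties are now accessed as named-tuple fields
    meta.package_name     # e.g. the registered package name (assumed field)
    ```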

  • (Breaking) The list returned by models(conditional) is now a list
    of complete metadata entries (named-tuples, as returned by info).
    An entry proxy appears in the list exactly when
    conditional(proxy) == true. Model queries are thereby simplified;
    for example, models() do model model.is_supervised &&
    model.is_pure_julia end finds all pure-Julia supervised models
    (written out below).
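
    The same query, as a runnable snippet (assuming MLJ is loaded):

    ```julia
    using MLJ

    models() do model
        model.is_supervised && model.is_pure_julia
    end
    ```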

  • (Bug fix) Introduce new private methods to avoid relying on
    MLJBase type piracy (MLJBase #30).

  • (Enhancement) If composite is a learning network exported as a
    model, and m = machine(composite, args...), then report(m)
    returns the reports for each machine in the learning network, and
    similarly for fitted_params(m); see the sketch below.
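
    A hedged sketch, reusing the pipe instance from the @pipeline
    sketch above as the composite model (an assumption; any exported
    learning network would do):

    ```julia
    X = MLJ.table(rand(100, 3))
    y = rand(100)

    m = machine(pipe, X, y)   # `pipe` as defined in the earlier sketch
    fit!(m)
    report(m)          # reports for each machine in the underlying network
    fitted_params(m)   # likewise, the fitted parameters per machine
    ```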

  • (Enhancement) MLJ.table, vcat and hcat are now overloaded for
    AbstractNode, so that they can be used immediately in defining
    learning networks. For example, if X = source(rand(20,3)) and
    y = source(rand(20)), then MLJ.table(X) and vcat(y, y) both make
    sense and define new nodes, as shown below.
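
    For instance:

    ```julia
    using MLJ

    X = source(rand(20, 3))
    y = source(rand(20))

    Xt = MLJ.table(X)   # a node delivering the tabular form of X
    yy = vcat(y, y)     # also a node
    ```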

  • (Enhancement) pretty(X) prints a pretty version of any table X,
    complete with type and scitype annotations. Do ?pretty for
    options. A wrapper around pretty_table from PrettyTables.jl.

  • (Enhancement) std is re-exported from Statistics.

  • (Enhancement) The manual and MLJ cheatsheet have been updated.

  • Performance measures have been migrated to MLJBase, while the
    model registry and model load/search facilities have migrated to
    MLJModels. As the relevant methods are re-exported to MLJ, this is
    unlikely to affect many users.