[FR] Why is there no 'log_model()' function in mlflow.client? #7392
Comments
I think we can add a method
@dbczumar @harupy @BenWilson2 WDYT?
Can we just add
@WeichenXu123
Also, exceptions were handled for type check errors that occurred in the
I tested it and the above settings work.
This is a bit of using MlflowClient, because MlflowClient might use another @jaehyeongAN
@WeichenXu123 Got it.
This approach sounds good to me.
Sounds good to me as well!
We have consensus to add an API like:
Would you contribute this new API? Thank you!
@WeichenXu123 Cool, I want to do that.
@BenWilson2 @dbczumar @harupy @WeichenXu123 Please assign a maintainer and start triaging this issue.
@jaehyeongAN @WeichenXu123 any updates on this? This would be really nice to have :)
@jaehyeongAN I assigned the task to you. :)
@jaehyeongAN @WeichenXu123 I understand when the plate gets full in life and work. I can try to take this on as it's applicable to my work, if that's okay with y'all. I just may need some guidance, as it would be my first time contributing to MLflow. On a related note, not only is
I had totally forgotten about this issue until recently... Since there has been no activity, I went ahead and created a WIP PR here: #11906
@WeichenXu123 Following up on the above WIP PR, which may warrant discussion before I proceed further. Thanks!
Willingness to contribute
Yes. I would be willing to contribute this feature with guidance from the MLflow community.
Proposal Summary
I want to use MLflow with parallel runs, but the fluent API is not thread-safe: when using `mlflow.start_run()`, the active run ID becomes a global variable, so parallel runs crash. I found the `log_artifact()` method on `MlflowClient`, but it does not log the model and the model's metadata to the MLflow tracking server; it only uploads the file to the artifact store. I would like a method on `MlflowClient`, like `mlflow.sklearn.log_model()`, that saves the model to the registry and logs it to the MLflow tracking server. Is that possible?
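To make the thread-safety complaint concrete, here is a minimal, self-contained sketch (plain Python, not MLflow code — all names are hypothetical) contrasting the two API styles: a fluent style that keeps the "active run" in module-level state, versus a client style that takes the run ID as an explicit argument.

```python
import threading

logged = {}   # run_id -> {metric_name: value}
lock = threading.Lock()

# Fluent style: like mlflow.start_run(), the "active run" lives in
# module-level state shared by every thread.
active_run = None

def fluent_start_run(run_id):
    global active_run
    active_run = run_id  # concurrent threads overwrite each other here

def fluent_log_metric(name, value):
    with lock:
        # Logs against whichever run was set *last*, by any thread --
        # metrics can land in another thread's run.
        logged.setdefault(active_run, {})[name] = value

# Client style: like MlflowClient, the run_id is an explicit argument,
# so there is no shared mutable "active run" to race on.
def client_log_metric(run_id, name, value):
    with lock:
        logged.setdefault(run_id, {})[name] = value

def worker(run_id):
    client_log_metric(run_id, "acc", float(run_id))

threads = [threading.Thread(target=worker, args=(i,)) for i in range(8)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Every run received its own metric -- no cross-thread clobbering.
print(all(logged[i]["acc"] == float(i) for i in range(8)))  # True
```

This is why a full `log_model()` on the client side would help: the fluent flavor APIs (`mlflow.sklearn.log_model()` etc.) implicitly depend on the shared active run.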
Motivation
When using `MlflowClient`, there is no way to log a model to the MLflow tracking server. The `log_artifact()` method on `MlflowClient` only uploads the file; it does not log the model. This feature is necessary for users who want to use MLflow in parallel. Right now I'm using `mlflow.onnx.log_model()`, so parallel runs crash.
Details
No response
What component(s) does this bug affect?
- area/artifacts: Artifact stores and artifact logging
- area/build: Build and test infrastructure for MLflow
- area/docs: MLflow documentation pages
- area/examples: Example code
- area/model-registry: Model Registry service, APIs, and the fluent client calls for Model Registry
- area/models: MLmodel format, model serialization/deserialization, flavors
- area/recipes: Recipes, Recipe APIs, Recipe configs, Recipe Templates
- area/projects: MLproject format, project running backends
- area/scoring: MLflow Model server, model deployment tools, Spark UDFs
- area/server-infra: MLflow Tracking server backend
- area/tracking: Tracking Service, tracking client APIs, autologging

What interface(s) does this bug affect?
- area/uiux: Front-end, user experience, plotting, JavaScript, JavaScript dev server
- area/docker: Docker use across MLflow's components, such as MLflow Projects and MLflow Models
- area/sqlalchemy: Use of SQLAlchemy in the Tracking Service or Model Registry
- area/windows: Windows support

What language(s) does this bug affect?
- language/r: R APIs and clients
- language/java: Java APIs and clients
- language/new: Proposals for new client languages

What integration(s) does this bug affect?
- integrations/azure: Azure and Azure ML integrations
- integrations/sagemaker: SageMaker integrations
- integrations/databricks: Databricks integrations