Add warning message in sdk/cli to consume new llm tools #3196
Conversation
promptflow SDK CLI Azure E2E Test Result (user/yalu4/llm_warning): 4 files, 4 suites, 4m 11s ⏱️. Results for commit b3779ec.
SDK CLI Global Config Test Result (user/yalu4/llm_warning): 6 tests, 6 ✅, 1m 13s ⏱️. Results for commit b3779ec.
promptflow-core test result: 0 tests, 0 ✅, 0s ⏱️. Results for commit b3779ec.
Executor Unit Test Result (user/yalu4/llm_warning): 792 tests, 792 ✅, 3m 46s ⏱️. Results for commit b3779ec.
Executor E2E Test Result (user/yalu4/llm_warning): 243 tests, 238 ✅, 5m 32s ⏱️. Results for commit b3779ec.
SDK CLI Test Result (user/yalu4/llm_warning): 4 files, 4 suites, 1h 2m 16s ⏱️. Results for commit b3779ec.
flow = Flow.load(source, **kwargs)
if check_legacy_llm_in_flow_nodes(flow._data):
    logger.warning(
        "Please upgrade to the latest version of promptflow-tools to consume new LLM tools. "
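The helper's behavior can be sketched as follows. This is a hypothetical reconstruction of what a check like `check_legacy_llm_in_flow_nodes` might do, inferred from the diff context; the function body and the flow-data shape are assumptions, not the actual promptflow implementation.

```python
def check_legacy_llm_in_flow_nodes(flow_data: dict) -> bool:
    """Return True if any node in the flow DAG still uses the legacy 'llm' tool type.

    Assumed shape: flow_data is the parsed flow YAML, with a top-level
    'nodes' list where each node carries a 'type' field.
    """
    for node in flow_data.get("nodes", []):
        if node.get("type") == "llm":
            return True
    return False


# A flow using the legacy node type triggers the warning path:
legacy_flow = {"nodes": [{"name": "chat", "type": "llm"}]}
print(check_legacy_llm_in_flow_nodes(legacy_flow))  # True
```

A flow whose nodes already use `custom_llm` (or no LLM nodes at all) would return `False` and skip the warning.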
Need to wait for the new tool PyPI release.
Should specify which tool version is required.
@@ -20,6 +20,7 @@
     PROMPTY_EXTENSION,
 )
 from promptflow._core._errors import MetaFileNotFound, MetaFileReadError
+from promptflow._core.tool import ToolType
@@ -14,17 +14,17 @@ outputs:
   is_chat_output: true
 nodes:
 - name: chat
-  type: llm
+  type: custom_llm
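The template change above can be expressed as a one-step migration: any node still declaring the legacy `llm` type is rewritten to `custom_llm`. This is an illustrative sketch based only on the diff; `migrate_node_type` is a hypothetical helper, not part of promptflow.

```python
def migrate_node_type(node: dict) -> dict:
    """Rewrite a legacy 'llm' node to the new 'custom_llm' tool type.

    Nodes of any other type are returned unchanged. The input dict is
    not mutated; a shallow copy is returned for the migrated case.
    """
    if node.get("type") == "llm":
        return {**node, "type": "custom_llm"}
    return node


print(migrate_node_type({"name": "chat", "type": "llm"}))
# {'name': 'chat', 'type': 'custom_llm'}
```

Applying this to every entry under `nodes:` in a flow template would produce the same edit the diff shows for the `chat` node.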
This feels risky to modify now; it may cause local users to fail when running newly created flows.
@@ -14,17 +14,17 @@ outputs:
   is_chat_output: true
Why not change the other flow templates as well?
flow = Flow.load(source, **kwargs)
if check_legacy_llm_in_flow_nodes(flow._data):
    logger.warning(
        "Please upgrade to the latest version of promptflow-tools to consume new LLM tools. "
Besides upgrading the local promptflow-tools version, I think there are other steps people need to take to dismiss this warning.
Please consider this thoughtfully.
No PR description?
Hi, thank you for your interest in helping to improve the prompt flow experience and for your contribution. We've noticed that there hasn't been recent engagement on this pull request. If this is still an active work stream, please let us know by pushing some changes or leaving a comment.
Hi, thank you for your contribution. Since there has not been recent engagement, we are going to close this out. Feel free to reopen if you'd like to continue working on these changes. Please be sure to remove the
Description
Please add an informative description that covers the changes made by the pull request and link all relevant issues.
All Promptflow Contribution checklist:
General Guidelines and Best Practices
Testing Guidelines