[fix] stream field in llmconfig not work #1258
Conversation
PR Description updated to latest commit (8f209bf)
Codecov Report
Attention: Patch coverage is

@@ Coverage Diff @@
##              main    #1258      +/-   ##
==========================================
- Coverage    70.21%   70.19%   -0.02%
==========================================
  Files          316      316
  Lines        18866    18862       -4
==========================================
- Hits         13246    13241       -5
- Misses        5620     5621       +1

View full report in Codecov by Sentry.
User description
Fix
- #1257

Bug description
The settings in ~/.metagpt/config2.yaml do not affect how llm.aask is called.
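For context, a minimal sketch of the config file in question. The exact schema is defined by MetaGPT; the surrounding keys shown here are illustrative, only the stream field is the one this PR is about:

```yaml
# ~/.metagpt/config2.yaml (illustrative fragment; keys other than
# `stream` are examples, not a complete or authoritative schema)
llm:
  api_type: openai   # illustrative value
  stream: false      # before this fix, aask() ignored this setting
```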
PR Type
Bug fix, Enhancement
Description
Propagated the stream configuration from config2.yaml into the aask method to respect the user's settings.

Changes walkthrough 📝
base_llm.py
Integrate stream configuration in aask method
metagpt/provider/base_llm.py
- Imported config from metagpt.config2.
- Changed the default value of the stream parameter in the aask method to None.
- Set stream based on the llm.stream configuration if stream is None.
.ranker.py
Refactor and improve import statements in ranker factory
metagpt/rag/factories/ranker.py
FlagEmbeddingReranker
.base.py
Minor formatting cleanup
metagpt/rag/benchmark/base.py