
[fix] stream field in llmconfig not work #1258

Merged

merged 4 commits into geekan:main on May 17, 2024

Conversation

@Wei-Jianan (Contributor) commented May 9, 2024

User description

Fixes #1257.

Bug description

Setting stream in ~/.metagpt/config2.yaml:

llm:
  stream: False

has no effect on calls to llm.aask; the configured value is ignored.
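
For context, a minimal script that exercises the reported behavior (a sketch assuming MetaGPT's metagpt.llm.LLM entry point; the prompt is illustrative):

    # With llm.stream: False in ~/.metagpt/config2.yaml, aask() should
    # make a non-streaming call; before this fix the config value was
    # ignored and the method's hardcoded default was used instead.
    import asyncio

    from metagpt.llm import LLM

    async def main():
        llm = LLM()
        # Expected: respects stream: False from config2.yaml.
        # Observed before the fix: the config setting was ignored.
        print(await llm.aask("Say hello."))

    asyncio.run(main())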


PR Type

Bug fix, Enhancement


Description

  • Integrated the stream configuration from config2.yaml into the aask method to respect the user's settings.
  • Minor formatting improvements in benchmark and ranker factory files to enhance code readability and maintainability.

Changes walkthrough 📝

Relevant files

Enhancement

base_llm.py: Integrate stream configuration in aask method (+4/-1)
metagpt/provider/base_llm.py
  • Imported config from metagpt.config2.
  • Changed the default value of the stream parameter in the aask method to None.
  • Added a conditional that sets stream from the llm.stream configuration when stream is None (a sketch of this change follows the walkthrough).

ranker.py: Refactor and improve import statements in ranker factory (+6/-4)
metagpt/rag/factories/ranker.py
  • Reordered import statements for ranker configurations.
  • Improved import formatting for FlagEmbeddingReranker.

Formatting

base.py: Minor formatting cleanup (+1/-1)
metagpt/rag/benchmark/base.py
  • Removed extra whitespace.
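
A minimal sketch of the base_llm.py change described above (the surrounding signature is simplified here; only the stream handling mirrors the diff):

    from metagpt.config2 import config  # import added by this PR

    class BaseLLM:
        async def aask(self, msg: str, stream=None) -> str:
            # Default changed to None so the llm.stream setting in
            # ~/.metagpt/config2.yaml takes effect whenever the caller
            # does not pass stream explicitly.
            if stream is None:
                stream = config.llm.stream
            ...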

💡 PR-Agent usage: Comment /help on the PR to get a list of all available PR-Agent tools and their descriptions.

PR Description updated to latest commit (8f209bf)

PR Review 🔍

⏱️ Estimated effort to review [1-5]

2, because the changes are localized to specific methods and files with clear modifications. The changes are not extensive and involve straightforward logic adjustments and minor formatting improvements.

🧪 Relevant tests

No

⚡ Possible issues

Possible Bug: The conditional check if stream is None: in the aask method might not handle cases where stream is explicitly set to False by the caller, as it will override it with the config setting. This could lead to unexpected behavior if the caller expects stream=False to be respected regardless of the config.
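
For reference, a standalone sketch of the None-sentinel pattern under discussion (a hypothetical helper, not the actual method):

    CONFIG_STREAM = True  # stand-in for config.llm.stream

    def resolve_stream(stream=None):
        # The fallback fires only when the argument is unset (None);
        # an explicit stream=False from the caller is left unchanged.
        if stream is None:
            stream = CONFIG_STREAM
        return stream

    assert resolve_stream() is True         # unset -> config value
    assert resolve_stream(False) is False   # explicit False preserved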

🔒 Security concerns

No

PR Code Suggestions ✨

Enhancement

Add error handling for unset configuration values.

It's recommended to explicitly handle the case where stream might not be set in the configuration, to avoid potential runtime errors. Instead of directly assigning stream = config.llm.stream, consider adding a default value or a more robust error-handling mechanism.

metagpt/provider/base_llm.py [150-151]

 if stream is None:
-    stream = config.llm.stream
+    stream = getattr(config.llm, 'stream', True)  # Default to True if not set

Provide detailed error messages for missing dependencies.

To avoid runtime errors and improve the user experience, consider providing a more specific error message when the required package is not found, possibly including guidance on how to resolve the dependency issue.

metagpt/rag/factories/ranker.py [67-69]

 raise ImportError(
-    "`llama-index-postprocessor-flag-embedding-reranker` package not found, please run `pip install llama-index-postprocessor-flag-embedding-reranker`"
+    "`llama-index-postprocessor-flag-embedding-reranker` package not found. Ensure it is installed by running `pip install llama-index-postprocessor-flag-embedding-reranker`, or check your environment settings."
 )

Best practice

Ensure explicit parameter passing for clarity and maintainability.

To ensure that the stream parameter is always explicitly passed to the acompletion_text method, consider removing the default None assignment in the method signature and handling the default assignment within the method body.

metagpt/provider/base_llm.py [136]

-stream=None,
+stream,

Handle mutable default arguments safely.

To avoid potential issues with mutable default arguments and to ensure that the function behaves as expected across multiple calls, consider using an immutable default argument or handling the default assignment within the function body.

metagpt/provider/base_llm.py [133]

 format_msgs: Optional[list[dict[str, str]]] = None,
+if format_msgs is None:
+    format_msgs = []

Maintainability

Consolidate imports from the same module into a single line.

To improve the readability and maintainability of the import statements, consider grouping imports from the same module together in a single line.

metagpt/rag/factories/ranker.py [63-65]

-from llama_index.postprocessor.flag_embedding_reranker import (
-    FlagEmbeddingReranker,
-)
+from llama_index.postprocessor.flag_embedding_reranker import FlagEmbeddingReranker

@codecov-commenter commented May 9, 2024

Codecov Report

Attention: Patch coverage is 75.00000% with 1 line in your changes missing coverage. Please review.

Project coverage is 70.19%. Comparing base (a32e238) to head (1c9214e).
Report is 38 commits behind head on main.

Current head 1c9214e differs from pull request most recent head cda6451.
Please upload reports for the commit cda6451 to get more accurate results.

Files                             Patch %   Lines
metagpt/rag/factories/ranker.py   0.00%    1 Missing ⚠️

❗ Your organization needs to install the Codecov GitHub app to enable full functionality.

Additional details and impacted files

@@            Coverage Diff             @@
##             main    #1258      +/-   ##
==========================================
- Coverage   70.21%   70.19%   -0.02%     
==========================================
  Files         316      316              
  Lines       18866    18862       -4     
==========================================
- Hits        13246    13241       -5     
- Misses       5620     5621       +1     

☔ View full report in Codecov by Sentry.
📢 Have feedback on the report? Share it here.

Resolved review threads (outdated):
  • metagpt/provider/base_llm.py
  • metagpt/rag/benchmark/base.py

@geekan merged commit c20073e into geekan:main on May 17, 2024
0 of 2 checks passed
@Wei-Jianan deleted the fix/cstream_not_work branch May 17, 2024 07:42