
LLM not following description in config #1123

Open

credelosa2022 opened this issue Apr 18, 2024 · 3 comments

Comments

@credelosa2022

credelosa2022 commented Apr 18, 2024

System Info

Os version: Google Colab
Python version: Google Colab
Pandasai version: 2.0.34

🐛 Describe the bug


import pandas as pd

from pandasai import SmartDataframe
from pandasai.llm import GooglePalm

llm = GooglePalm(api_key="xxxx")

data_df = pd.read_csv("Loan payments data.csv")  # This is from your example on GitHub

sdf = SmartDataframe(
    data_df,
    config={
        "llm": llm,
        "description": "You are a data analysis agent. Your main goal is to help non-technical users to analyze data. When you provide visualizations, make sure you include the axis labels.",
        "seed": 2024,
    },
)

sdf.chat("In the same graph, show the age distribution where each histogram corresponds to a gender category.")

(screenshot attached)

Is there a better way to do this?

@seanshanker

I'm a newbie in this area myself, so take what I say with a pinch of salt. I didn't realize you could tell it to plot with axis labels in the description; I've usually done it in the `sdf.chat` prompt with decent success.

@ArslanSaleem
Collaborator

ArslanSaleem commented Apr 18, 2024

@credelosa2022 instead of config, pass the description to the Agent class, like:

agent = Agent(
    [data_df],
    config=config,
    description="You are a data analysis agent. Your main goal is to help non-technical users to analyze data. When you provide visualizations, make sure you include the axis labels.",
)

agent.chat...
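To make the distinction concrete, here is a minimal stand-in sketch (plain Python, not pandasai itself; `StubAgent` and its `KNOWN_CONFIG_KEYS` are hypothetical) of why a `description` key buried inside `config` can be silently ignored, while the same text passed as a constructor argument is picked up:

```python
# Hypothetical stand-in illustrating the API shape discussed above;
# this is NOT pandasai code, just a sketch of the reported behavior.
class StubAgent:
    # Config keys the agent actually consumes; "description" is not
    # among them, so a "description" entry in config is dropped.
    KNOWN_CONFIG_KEYS = {"llm", "seed"}

    def __init__(self, dfs, config=None, description=None):
        self.dfs = dfs
        # Only recognized keys are kept from the config dict.
        self.config = {
            k: v for k, v in (config or {}).items()
            if k in self.KNOWN_CONFIG_KEYS
        }
        # The description must arrive as its own constructor argument.
        self.description = description


# Description inside config: it never reaches the agent.
a = StubAgent([], config={"llm": "fake-llm", "description": "be helpful", "seed": 2024})
print(a.description)  # None

# Description as a constructor argument: it is used.
b = StubAgent([], config={"llm": "fake-llm", "seed": 2024}, description="be helpful")
print(b.description)  # be helpful
```

This mirrors the suggestion above: keep `llm`/`seed` in `config`, and pass `description=` directly to `Agent`.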

Let me know if it helps

Thanks

@adri0

adri0 commented Apr 19, 2024

I noticed that since version 2.0.34, my tests using the Agent with FakeLLM started breaking.

4 participants