Feature: Added safety_settings parameter for gemini #13568

Conversation

@akumar-b6i (Contributor) commented on May 17, 2024

Description

Currently, the Vertex integration does not support passing safety_settings to Gemini. This update enables users to configure safety_settings when initializing the Vertex module.

Vertex AI GenerativeModel reference: https://cloud.google.com/python/docs/reference/aiplatform/latest/vertexai.generative_models.GenerativeModel

Fixes #10788

New Package?

Did I fill in the tool.llamahub section in the pyproject.toml and provide a detailed README.md for my new integration or package?

  • No

Version Bump?

Did I bump the version in the pyproject.toml file of the package I am updating? (Except for the llama-index-core package)

  • No
    Need to confirm the right place to update; see the comments below.

Type of Change

Please delete options that are not relevant.

  • New feature (non-breaking change which adds functionality)

How Has This Been Tested?

Please describe the tests that you ran to verify your changes. Provide instructions so we can reproduce them. Please also list any relevant details for your test configuration.

  • I stared at the code and made sure it makes sense
  • Ran the test file in the respective folder and it passed
  • Ran the test code below to confirm:
import pytest
from llama_index.llms.vertex import Vertex


def test_vertex_initialization():
    safety_settings = {
        "HarmCategory.HARM_CATEGORY_HATE_SPEECH": "HarmBlockThreshold.BLOCK_LOW_AND_ABOVE",
        "HarmCategory.HARM_CATEGORY_DANGEROUS_CONTENT": "HarmBlockThreshold.BLOCK_LOW_AND_ABOVE",
    }

    # Create an instance of the Vertex class
    llm = Vertex(
        model="gemini-pro",
        project="GCP_PROJECT_ID",
        safety_settings=safety_settings,
        location="GCP_PROJECT_LOCATION",
    )

    # Check that safety_settings is set correctly
    assert llm.safety_settings == safety_settings

    # Check that safety_settings is of the correct type
    assert isinstance(llm.safety_settings, dict)

    # Check that the safety_settings keys are correct
    assert set(llm.safety_settings.keys()) == {
        "HarmCategory.HARM_CATEGORY_HATE_SPEECH",
        "HarmCategory.HARM_CATEGORY_DANGEROUS_CONTENT",
    }

    # Check that the safety_settings values are correct
    assert set(llm.safety_settings.values()) == {"HarmBlockThreshold.BLOCK_LOW_AND_ABOVE"}

Result: PASS
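
Note: the test above passes the harm categories and block thresholds as plain strings. In actual use the keys and values would typically be the HarmCategory and HarmBlockThreshold enums from the vertexai SDK, which is what the GenerativeModel reference linked above documents. A minimal, hypothetical sketch of that enum-based usage (not part of this PR's tests), assuming a recent google-cloud-aiplatform install and placeholder project/location values:

# Hedged sketch, not taken from this PR: enum-based safety_settings.
# Assumes google-cloud-aiplatform (vertexai SDK) and llama-index-llms-vertex are
# installed; the project id and location below are placeholders.
from vertexai.generative_models import HarmBlockThreshold, HarmCategory
from llama_index.llms.vertex import Vertex

safety_settings = {
    HarmCategory.HARM_CATEGORY_HATE_SPEECH: HarmBlockThreshold.BLOCK_LOW_AND_ABOVE,
    HarmCategory.HARM_CATEGORY_DANGEROUS_CONTENT: HarmBlockThreshold.BLOCK_LOW_AND_ABOVE,
}

llm = Vertex(
    model="gemini-pro",
    project="my-gcp-project",   # placeholder GCP project id
    location="us-central1",     # placeholder GCP region
    safety_settings=safety_settings,
)

# The parameter added by this PR is stored on the LLM instance.
print(llm.safety_settings)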

Suggested Checklist:

  • I have performed a self-review of my own code

@akumar-b6i (Contributor, Author) commented:

Feature request: #10788

akumar-b6i marked this pull request as ready for review on May 17, 2024, 23:12
dosubot added the size:XS label (This PR changes 0-9 lines, ignoring generated files) on May 17, 2024
@logan-markewich (Collaborator) left a comment:


This looks good to me. If you bump the version of the gemini llm integration package, it will auto publish on merge

@akumar-b6i (Contributor, Author) commented on May 17, 2024

> This looks good to me. If you bump the version of the gemini llm integration package, it will auto publish on merge

I am a bit confused by the wording above; the change is in llama-index-llms-vertex, so should the version update go here:

authors = ["Your Name <you@example.com>"]
description = "llama-index llms vertex integration"
exclude = ["**/BUILD"]
license = "MIT"
name = "llama-index-llms-vertex"
readme = "README.md"
version = "0.1.5"

Or in the gemini folder here?

@logan-markewich (Collaborator) commented:

Whoops, in the vertex package (for some reason I assumed this change was in the gemini package lol)

akumar-b6i force-pushed the feature_request_10788_safety_settings branch from 7ece91c to b432080 on May 19, 2024, 01:54
dosubot added the size:S label (This PR changes 10-29 lines, ignoring generated files) and removed the size:XS label on May 19, 2024
@akumar-b6i (Contributor, Author) commented:

> Whoops, in the vertex package (for some reason I assumed this change was in the gemini package lol)

Done, let me know if it looks good.

dosubot added the lgtm label (This PR has been approved by a maintainer) on May 20, 2024
akumar-b6i force-pushed the feature_request_10788_safety_settings branch 2 times, most recently from 4a74679 to cd25c1a on May 22, 2024, 15:28
akumar-b6i force-pushed the feature_request_10788_safety_settings branch from cd25c1a to 424fd9f on May 23, 2024, 21:14
logan-markewich merged commit 19c0852 into run-llama:main on May 24, 2024
8 checks passed