Importing metrics issue since there is not a way to pass the model path if stored locally #230

Open
Starignus opened this issue Feb 7, 2024 · 3 comments

@Starignus

If you want to import a metric or the metrics module, for example:

from langkit import toxicity
from langkit import llm_metrics

By default, the models are downloaded from the Hugging Face Hub when you import the module. The issue arises when your organisation blocks connections for downloading large files but hosts the models itself in a secure internal location. For reference, see this related issue on the Transformers repository.
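
For context, the underlying transformers library already accepts a local directory in place of a Hub model id; a minimal sketch, where the local path is a placeholder for wherever the organisation stores the model:

from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Passing a local directory to from_pretrained avoids any network call;
# setting the TRANSFORMERS_OFFLINE=1 environment variable additionally
# guarantees that no download is attempted.
local_path = "/secure/models/toxicity-classifier"  # placeholder path
tokenizer = AutoTokenizer.from_pretrained(local_path)
model = AutoModelForSequenceClassification.from_pretrained(local_path)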

I searched the LangKit documentation for a way for the user to indicate the path to the models, but I could not find anything. Besides, it is impossible to pass a variable to a module when importing it. The problem could be solved by letting the user provide a path in a configuration file (e.g. JSON) that overrides the default model location; in the toxicity module, for instance, I can see where such an option could be applied (see the sketch below).
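
A hypothetical sketch of the proposed override, purely to illustrate the idea; the config file name, its key, the resolve_model_path helper, and the default model id are all invented for this example and are not part of LangKit:

import json
import os

# Illustrative default: the Hub model id a metric module would otherwise
# download at import time (placeholder, not LangKit's actual id).
DEFAULT_MODEL = "org/toxicity-model"

def resolve_model_path(config_file="langkit_config.json"):
    # If the user provides a config file, its entry overrides the Hub
    # default, so the module loads from disk and never hits the network.
    if os.path.exists(config_file):
        with open(config_file) as f:
            return json.load(f).get("toxicity_model_path", DEFAULT_MODEL)
    return DEFAULT_MODEL

# langkit_config.json (illustrative):
# {"toxicity_model_path": "/secure/models/toxicity-classifier"}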

This is a potential blocker: an organisation that wants to try the package may be unable to because of these security constraints. Supporting locally stored models would be a good enhancement.

@Starignus Starignus changed the title Importing metrics do not have the option to pass the model path if stored locally Importing metrics issue since there is not a way to pass the model path if stored locally Feb 7, 2024
@FelipeAdachi FelipeAdachi self-assigned this Feb 7, 2024
@FelipeAdachi
Contributor

FelipeAdachi commented Feb 7, 2024

Hi @Starignus ,

Thank you for reporting this!
We are working on this and will post updates in this thread.

@Starignus
Author

@FelipeAdachi Many thanks; I would appreciate the update!

@Starignus
Author

@FelipeAdachi, do you think it will also be possible to call other models hosted locally, e.g. Llama Guard?
