
Models and weights for ConaCLIP #327

Open
justlike-prog opened this issue Aug 4, 2023 · 5 comments
Comments

@justlike-prog

Hi, is there any way you could release the ConaCLIP models and weights soon? I'm referring to this paper: ConaCLIP.

@jpWang
Collaborator

jpWang commented Aug 6, 2023

Hi, thanks for your attention to our work. The ConaCLIP weights have been released as alibaba-pai/pai-conaclip-text2L-vit-small-patch16, alibaba-pai/pai-conaclip-text4L-vit-small-patch16, and alibaba-pai/pai-conaclip-text6L-vit-small-patch16. You can refer to, for example, Here to load and use them.

@justlike-prog
Author

Is there a direct way to access the weights? Could you also share the size in MB of the vision encoder in EC-ConaCLIP-2L, so I can check whether it fits my requirements?
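For reference, the requested size can be roughly estimated if the vision encoder is a standard ViT-Small/16 (embedding dim 384, 12 transformer layers, 16x16 patches on 224x224 inputs); that configuration is an assumption inferred from the model names, not something the maintainers confirmed. A minimal back-of-the-envelope sketch:

```python
# Rough parameter/size estimate for a ViT-Small/16 vision encoder.
# Assumption: embed dim 384, 12 transformer layers, 16x16 patches on
# 224x224 inputs; the actual ConaCLIP encoder config may differ.

def vit_param_estimate(d=384, layers=12, patch=16, image=224):
    # Each transformer block: ~4*d^2 (attention) + ~8*d^2 (MLP) = 12*d^2
    block_params = 12 * d * d
    patch_embed = patch * patch * 3 * d + d   # patch projection + bias
    num_patches = (image // patch) ** 2       # 14 * 14 = 196
    pos_embed = (num_patches + 1) * d         # +1 for the CLS token
    return layers * block_params + patch_embed + pos_embed

params = vit_param_estimate()
mb_fp32 = params * 4 / 2**20  # 4 bytes per weight in fp32
mb_fp16 = params * 2 / 2**20  # 2 bytes per weight in fp16
print(f"~{params / 1e6:.1f}M params, ~{mb_fp32:.0f} MiB fp32, ~{mb_fp16:.0f} MiB fp16")
```

This lands near the commonly cited ~22M parameters for ViT-Small, i.e. roughly 80 MiB on disk in fp32 or 40 MiB in fp16; the exact checkpoint size depends on the stored precision and any extra projection heads.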

@susht3

susht3 commented Aug 14, 2023

> Hi, thanks for your attention to our work. ConaCLIP weights have been released as alibaba-pai/pai-conaclip-text2L-vit-small-patch16, alibaba-pai/pai-conaclip-text4L-vit-small-patch16 and alibaba-pai/pai-conaclip-text6L-vit-small-patch16. You can refer to, for example, Here, to load and use them.

I can't find these models at https://huggingface.co/alibaba-pai. How can I download them?

@susht3

susht3 commented Aug 14, 2023

[screenshot of https://huggingface.co/alibaba-pai] Only these models are listed there.

@Nashihikari

> Hi, thanks for your attention to our work. ConaCLIP weights have been released as alibaba-pai/pai-conaclip-text2L-vit-small-patch16, alibaba-pai/pai-conaclip-text4L-vit-small-patch16 and alibaba-pai/pai-conaclip-text6L-vit-small-patch16. You can refer to, for example, Here, to load and use them.

Hi, how can I get the code or model for the ConaCLIP method directly? I can't find it in the EasyNLP framework.
