vivien000/clip-demo

Minimal user-friendly demo of OpenAI's CLIP

Screenshot

OpenAI's CLIP is a deep learning model that estimates the similarity between an image and a piece of text. This makes it possible to search for images matching a natural-language query even when your image corpus has no titles, descriptions, or keywords.
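In practice, CLIP maps images and texts into a shared embedding space, and a search simply ranks precomputed image embeddings by their cosine similarity to the query's text embedding. The toy sketch below illustrates that ranking step with NumPy; the 4-dimensional vectors are stand-ins for CLIP's real embeddings, and `search` is a hypothetical helper, not code from this repository.

```python
import numpy as np

def cosine_sim(query_emb, image_embs):
    # Normalize both sides so the dot product equals cosine similarity.
    q = query_emb / np.linalg.norm(query_emb)
    imgs = image_embs / np.linalg.norm(image_embs, axis=1, keepdims=True)
    return imgs @ q

def search(query_emb, image_embs, top_k=3):
    # Rank images by descending similarity to the query embedding.
    sims = cosine_sim(query_emb, image_embs)
    return np.argsort(-sims)[:top_k]

# Toy 4-dim "embeddings" standing in for CLIP's higher-dimensional vectors.
image_embs = np.array([
    [1.0, 0.0, 0.0, 0.0],   # image 0: very close to the query
    [0.0, 1.0, 0.0, 0.0],   # image 1: orthogonal to the query
    [0.9, 0.1, 0.0, 0.0],   # image 2: fairly close to the query
])
query_emb = np.array([1.0, 0.0, 0.0, 0.0])

print(search(query_emb, image_embs))  # → [0 2 1]
```

Because the image embeddings can be computed once and stored, query time only involves embedding the text and a single matrix-vector product, which is what keeps demos like this one responsive.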

This repository includes a simple demo built on 25,000 Unsplash images and 7,685 movie images.

Usage

Click here to try the included notebook in Google Colab. Sign in with your Google account and run the first (and only) code cell.

Alternatively, an equivalent demo is available as a Hugging Face Space (no account required).

Acknowledgements

This project was inspired by Unsplash Image Search by Vladimir Haltakov and Alph, The Sacred River by Travis Hoppe.

About

Minimal user-friendly demo of OpenAI's CLIP for semantic image search
