Spaces: Runtime error

Commit af7ba6c by kawa (parent: bae727c): changed description

app.py CHANGED
@@ -64,6 +64,7 @@ with gr.Blocks() as clip_demo:
     * CLIP combines large language models with images to form an unified embedding space.
     * Embeddings of CLIP can be used to compare two images, to compare two text prompts and to compare text prompt to image.
     * It can be used for e.g. zero-shot classification, image retrieval.
+    If you want to learn more, have a look at the [original paper](https://arxiv.org/abs/2103.00020) or the [post by openai](https://openai.com/blog/clip/).
     """)

     with gr.Row():