Runtime error
Update app.py
app.py
CHANGED
@@ -25,7 +25,7 @@ semantic_search = load_model()
 description = '''
 # Multilingual Semantic Search
 
-**Search images in 100 languages (list [here](https://github.com/pytorch/fairseq/blob/main/examples/xlmr/README.md#introduction)) powered by MultiLingual CLIP
+**Search images in 100 languages (list [here](https://github.com/pytorch/fairseq/blob/main/examples/xlmr/README.md#introduction)) powered by [MultiLingual CLIP](https://huggingface.co/gzomer/clip-multilingual).**
 
 MultiLingual CLIP is a custom model built using OpenAI's [CLIP](https://openai.com/blog/clip/) and [XMLRoBERTa](https://huggingface.co/xlm-roberta-base) models, trained using 16 [Habana](https://habana.ai/) accelerators with PyTorch Lightning, Distributed Data Parallel, Mixed precision and using [COCO](https://cocodataset.org/) and [Google Conceptual Captions](https://ai.google.com/research/ConceptualCaptions) as training datasets.
 
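The semantic search described in the diff boils down to nearest-neighbour lookup in a shared embedding space: a multilingual text query is embedded, then compared against precomputed image embeddings by cosine similarity. A minimal NumPy sketch of that ranking step, with toy vectors standing in for real MultiLingual CLIP embeddings and `search` as a hypothetical helper name (not from the Space's code):

```python
import numpy as np

def search(query_emb, image_embs, top_k=2):
    # Normalize so that the dot product equals cosine similarity
    q = query_emb / np.linalg.norm(query_emb)
    imgs = image_embs / np.linalg.norm(image_embs, axis=1, keepdims=True)
    scores = imgs @ q
    # Indices of the most similar images, best match first
    return np.argsort(scores)[::-1][:top_k]

# Toy 3-dim "embeddings" for four images
image_embs = np.array([[1.0, 0.0, 0.0],
                       [0.0, 1.0, 0.0],
                       [0.5, 0.5, 0.0],
                       [0.0, 0.0, 1.0]])
query = np.array([1.0, 0.0, 0.0])
print(search(query, image_embs))  # closest images first
```

In the real Space, `query_emb` would come from the XLM-RoBERTa text tower and `image_embs` from the CLIP vision tower, projected into a common space during training.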