ryanyip7777 committed
Commit 75a125c
1 Parent(s): a9bb83c

Update README.md

Files changed (1): README.md (+2 −2)
README.md CHANGED
@@ -12,7 +12,7 @@ should probably proofread and complete it, then remove this comment. -->
 
 # clip-vit-l-14-pmc-finetuned
 
-This model is a fine-tuned version of [openai/clip-vit-large-patch14](https://huggingface.co/openai/clip-vit-large-patch14) on an [pmc_oa](.https://huggingface.co/datasets/axiong/pmc_oa) dataset.
+This model is a fine-tuned version of [openai/clip-vit-large-patch14](https://huggingface.co/openai/clip-vit-large-patch14) on the **pmc_oa** dataset (https://huggingface.co/datasets/axiong/pmc_oa).
 It achieves the following results on the evaluation set:
 - Loss: 1.0125
 
@@ -53,7 +53,7 @@ The following hyperparameters were used during training:
 - Tokenizers 0.13.3
 
 
-### finetune this model use the script from run_clip.py (.https://github.com/huggingface/transformers/tree/main/examples/pytorch/contrastive-image-text)
+### Fine-tune this model with the *run_clip.py* script (https://github.com/huggingface/transformers/tree/main/examples/pytorch/contrastive-image-text)
 ```shell
 
 python -W ignore run_clip.py --model_name_or_path openai/clip-vit-large-patch14 \
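For context on what run_clip.py optimizes: CLIP is trained with a contrastive objective in which image and text embeddings are L2-normalized and compared by a scaled dot product, and matching image-caption pairs are pushed toward high similarity. A minimal numpy sketch of that scoring step is below; the function name, the toy 4-dimensional embeddings, and the fixed `logit_scale` are illustrative assumptions, not part of the model's API.

```python
import numpy as np

def clip_similarity(image_embs, text_embs, logit_scale=100.0):
    """CLIP-style logits: L2-normalize both sides, then take a
    scaled dot product (logit_scale plays the role of the learned
    temperature's exponential)."""
    img = image_embs / np.linalg.norm(image_embs, axis=-1, keepdims=True)
    txt = text_embs / np.linalg.norm(text_embs, axis=-1, keepdims=True)
    return logit_scale * img @ txt.T

# Toy embeddings: image 0 aligns with caption 0, image 1 with caption 1.
images = np.array([[1.0, 0.0, 0.0, 0.0],
                   [0.0, 1.0, 0.0, 0.0]])
texts = np.array([[0.9, 0.1, 0.0, 0.0],
                  [0.1, 0.9, 0.0, 0.0]])

logits = clip_similarity(images, texts)          # shape (2, 2)
# Softmax over captions gives, for each image, a distribution over texts.
probs = np.exp(logits) / np.exp(logits).sum(axis=-1, keepdims=True)
```

Here `probs[i]` is the probability the model assigns each caption for image `i`; with aligned toy embeddings the diagonal entries dominate.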