HugoLaurencon (HF staff) committed on
Commit 697ab43
1 Parent(s): 7ec80bb

Update README.md

Files changed (1): README.md (+5 -5)
README.md CHANGED
@@ -9,7 +9,7 @@ datasets:
  - wikipedia
  - facebook/pmd
  - laion/laion2B-en
- - HuggingFaceM4/OBELISC
+ - HuggingFaceM4/OBELICS
  ---
 
 
@@ -372,7 +372,7 @@ Besides, we also computed the classification accuracy on FairFace for both the b
 
  # License
 
- The model is built on top of of two pre-trained models: [laion/CLIP-ViT-H-14-laion2B-s32B-b79K](https://huggingface.co/laion/CLIP-ViT-H-14-laion2B-s32B-b79K) and [huggyllama/llama-65b](https://huggingface.co/huggyllama/llama-65b). The first was released under an MIT license, while the second was released under a specific noncommercial license focused on research purposes. As such, users should comply with that license by applying directly to [Meta's form](https://docs.google.com/forms/d/e/1FAIpQLSfqNECQnMkycAp2jP4Z9TFX0cGR4uf7b_fBxjY_OjhJILlKGA/viewform).
+ The model is built on top of two pre-trained models: [laion/CLIP-ViT-H-14-laion2B-s32B-b79K](https://huggingface.co/laion/CLIP-ViT-H-14-laion2B-s32B-b79K) and [huggyllama/llama-65b](https://huggingface.co/huggyllama/llama-65b). The first was released under an MIT license, while the second was released under a specific noncommercial license focused on research purposes. As such, users should comply with that license by applying directly to [Meta's form](https://docs.google.com/forms/d/e/1FAIpQLSfqNECQnMkycAp2jP4Z9TFX0cGR4uf7b_fBxjY_OjhJILlKGA/viewform).
 
  We release the additional weights we trained under an MIT license.
 
@@ -381,8 +381,8 @@ We release the additional weights we trained under an MIT license.
  **BibTeX:**
 
  ```bibtex
- @misc{laurençon2023obelisc,
- title={OBELISC: An Open Web-Scale Filtered Dataset of Interleaved Image-Text Documents},
+ @misc{laurencon2023obelics,
+ title={OBELICS: An Open Web-Scale Filtered Dataset of Interleaved Image-Text Documents},
  author={Hugo Laurençon and Lucile Saulnier and Léo Tronchon and Stas Bekman and Amanpreet Singh and Anton Lozhkov and Thomas Wang and Siddharth Karamcheti and Alexander M. Rush and Douwe Kiela and Matthieu Cord and Victor Sanh},
  year={2023},
  eprint={2306.16527},
@@ -397,4 +397,4 @@ Victor, Stas, XXX
 
  # Model Card Contact
 
- Please open a discussion on the Community tab!
+ Please open a discussion on the Community tab!