mrm8488 committed
Commit 1d9373c
1 parent: b0647b4

Update README.md

Files changed (1):
  1. README.md (+7 -8)
README.md CHANGED
@@ -12,7 +12,6 @@ license: mit
 ---
 # Spanish GPT-2 trained on BETO's corpus (large_spanish_corpus)
 
-# BERTIN
 This is a Spanish GPT-2 model trained from scratch on the [large_spanish_corpus](https://huggingface.co/datasets/viewer/?dataset=large_spanish_corpus) aka BETO's corpus with [Flax](https://github.com/google/flax)
 This is part of the
 [Flax/Jax Community Week](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104), organised by [HuggingFace](https://huggingface.co/) and TPU usage sponsored by Google.
@@ -27,15 +26,15 @@ The dataset is about 20 GB. 95% of the data was used for training and the rest 5
 ## Team members
 - Manuel Romero ([mrm8488](https://huggingface.co/mrm8488))
 - María Grandury ([mariagrandury](https://huggingface.co/))
-- Eduardo González ([edugp](https://huggingface.co/edugp))
-- Paulo Villegas ([paulo](https://huggingface.co/paulo))
 - Pablo González de Prado ([Pablogps](https://huggingface.co/Pablogps))
-- Manu Romero ([mrm8488](https://huggingface.co/))
+- Daniel Vera ([daveni](https://huggingface.co/daveni))
+- Sri Lakshmi ([srisweet](https://huggingface.co/srisweet))
+- José Posada ([jdposa](https://huggingface.co/jdposa))
+- Santiago Hincapie ([shpotes](https://huggingface.co/shpotes))
+- Jorge ([jorgealro](https://huggingface.co/jorgealro))
+
 
 ## Useful links
 - [Community Week timeline](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104#summary-timeline-calendar-6)
 - [Community Week README](https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/README.md)
-- [Community Week thread](https://discuss.huggingface.co/t/bertin-pretrain-roberta-large-from-scratch-in-spanish/7125)
-- [Community Week channel](https://discord.com/channels/858019234139602994/859113060068229190)
-- [Masked Language Modelling example scripts](https://github.com/huggingface/transformers/tree/master/examples/flax/language-modeling)
-- [Model Repository](https://huggingface.co/flax-community/bertin-roberta-large-spanish/)
+- [Community Week thread](https://discuss.huggingface.co/t/pretrain-gpt2-from-scratch-in-spanish/7086/8)