Text Generation
Transformers
PyTorch
Spanish
gptj
causal-lm
Inference Endpoints
versae committed on
Commit f43ae24
1 Parent(s): c70e489

Update README.md

Files changed (1)
  1. README.md +15 -0
README.md CHANGED
@@ -99,6 +99,21 @@ We still have to find proper datasets to evaluate the model, so help is welcome!
 
 To cite this model:
 ```bibtex
+@inproceedings{BERTIN-GPT,
+    author = {Javier De la Rosa and Andres Fernández},
+    editor = {Manuel Montes-y-Gómez and Julio Gonzalo and Francisco Rangel and Marco Casavantes and Miguel Ángel Álvarez-Carmona and Gemma Bel-Enguix and Hugo Jair Escalante and Larissa Freitas and Antonio Miranda-Escalada and Francisco Rodríguez-Sánchez and Aiala Rosá and Marco Antonio Sobrevilla-Cabezudo and Mariona Taulé and Rafael Valencia-García},
+    title = {Zero-shot Reading Comprehension and Reasoning for Spanish with {BERTIN} {GPT-J-6B}},
+    date = {2022-09},
+    booktitle = {Proceedings of the Iberian Languages Evaluation Forum (IberLEF 2022)},
+    booktitleaddon = {Co-located with the Conference of the Spanish Society for Natural Language Processing (SEPLN 2022)},
+    eventdate = {2022-09-20/2022-09-25},
+    venue = {A Coru\~{n}a, Spain},
+    publisher = {CEUR Workshop Proceedings},
+}
+```
+
+To cite the data used to train it:
+```bibtex
 @article{BERTIN,
     author = {Javier De la Rosa and Eduardo G. Ponferrada and Manu Romero and Paulo Villegas and Pablo González de Prado Salas and María Grandury},
     title = {{BERTIN}: Efficient Pre-Training of a Spanish Language Model using Perplexity Sampling},
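
The card's tags (Text Generation, Transformers, PyTorch, causal-lm) indicate the model is meant to be loaded through the standard `transformers` causal-LM classes. Below is a minimal usage sketch, assuming the repository id `bertin-project/bertin-gpt-j-6B` and the sampling settings shown, neither of which is stated in this commit:

```python
# Minimal sketch of loading a BERTIN GPT-J checkpoint for Spanish text generation.
# The repository id below is an assumption; it is not stated in this commit.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "bertin-project/bertin-gpt-j-6B"  # assumed repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
# float16 keeps the ~6B-parameter checkpoint within a single modern GPU's memory
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float16)

inputs = tokenizer("La lectura es", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30, do_sample=True, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```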