ahb committed
Commit 6fc488b
1 parent: b3bbc96

Update README.md

Files changed (1):
  1. README.md (+23 -6)

README.md CHANGED
@@ -20,18 +20,35 @@ widget:
 
 **Albertina PT-*** is a foundation, large language model for the **Portuguese language**.
 
-It is an **encoder** of the BERT family, based on a Transformer architecture, developed over the DeBERTa model, with highly competitive performance for this language.
-
-It has different versions that were trained for different variants of Portuguese (PT), namely the European variant from Portugal (PT-PT) and the American variant from Brazil (PT-BR), and it is distributed free of charge and under a most permissive license.
-
-It was developed by a joint team from the University of Lisbon and the University of Porto, Portugal. For further details, check the respective publication:
-
-Rodrigues, João António, Luís Gomes, João Silva, António Branco, Rodrigo Santos, Henrique Lopes Cardoso, Tomás Osório, 2023, Advancing Neural Encoding of Portuguese with Transformer Albertina PT-*, arXiv ###.
-
+It is an **encoder** of the BERT family, based on a Transformer architecture,
+developed over the DeBERTa model, with highly competitive performance for this language.
+It has different versions that were trained for different variants of Portuguese (PT),
+namely the European variant from Portugal (PT-PT) and the American variant from Brazil (PT-BR),
+and it is distributed free of charge and under a most permissive license.
+
+**Albertina PT-PT** is the version for European Portuguese from Portugal,
+and to the best of our knowledge, at the time of its initial distribution,
+it was the first competitive encoder for this language and variant
+that had been made publicly available and distributed for reuse.
+
+It was developed by a joint team from the University of Lisbon and the University of Porto, Portugal.
+For further details, check the respective publication:
+
+```latex
+@misc{albertina-pt,
+  title={Advancing Neural Encoding of Portuguese with Transformer Albertina PT-*},
+  author={João Rodrigues and Luís Gomes and João Silva and António Branco and Rodrigo Santos and Henrique Lopes Cardoso and Tomás Osório},
+  year={2023},
+  eprint={?},
+  archivePrefix={arXiv},
+  primaryClass={cs.CL}
+}
+```
 
 Please use the above canonical reference when using or citing this model.
 
-teste
-
+<br>
+<br>
 
 
 ## Model Description
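Since the README introduced in this diff describes an encoder of the BERT family, the model's natural entry point is masked-token prediction. A minimal sketch with the Hugging Face `transformers` `fill-mask` pipeline follows; the repo id `PORTULAN/albertina-ptpt` is an assumption for illustration, so check the model card for the actual id.

```python
# Minimal sketch: masked-token prediction with an Albertina encoder.
# The repo id below is an ASSUMPTION for illustration; check the model card.
from transformers import pipeline

MODEL_ID = "PORTULAN/albertina-ptpt"  # assumed Hugging Face repo id


def fill_mask(sentence: str, model_id: str = MODEL_ID) -> list:
    """Return the top predictions for the [MASK] token in `sentence`.

    Downloads the model on first use; DeBERTa-style tokenizers use "[MASK]".
    """
    unmasker = pipeline("fill-mask", model=model_id)
    return unmasker(sentence)


if __name__ == "__main__":
    # Each prediction carries the filled token and its score.
    for pred in fill_mask("A capital de Portugal é [MASK].")[:3]:
        print(pred["token_str"], pred["score"])
```

The pipeline wraps tokenization, the encoder forward pass, and decoding of the most probable tokens at the masked position, which is the standard way to probe a BERT-family encoder before fine-tuning it on a downstream task.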