jarodrigues committed
Commit • 347e432
1 Parent(s): 466756e

Update README.md

README.md CHANGED
@@ -6,23 +6,10 @@ tags:
 - gervasio-pt*
 - gervasio-ptpt
 - gervasio-ptbr
-- gervasio-ptpt-base
-- gervasio-ptbr-base
 - gervasio-7b-portuguese-ptpt-decoder
 - gervasio-7b-portuguese-ptbr-decoder
 - portulan
 - albertina-pt*
-- albertina-ptpt
-- albertina-ptbr
-- albertina-ptbr-nobrwac
-- albertina-ptpt-base
-- albertina-ptbr-base
-- albertina-100m-portuguese-ptpt-encoder
-- albertina-100m-portuguese-ptbr-encoder
-- albertina-900m-portuguese-ptpt-encoder
-- albertina-900m-portuguese-ptbr-encoder
-- albertina-1b5-portuguese-ptpt-encoder
-- albertina-1b5-portuguese-ptbr-encoder
 - clm
 - gpt
 - portuguese
@@ -48,8 +35,9 @@ datasets:
 **Gervásio PT-*** is a **fully open** decoder for the **Portuguese language**.
 
 
-It is a **decoder** of the LLaMA family, based on the neural architecture Transformer and developed over the LLaMA
-Its further improvement through additional training was done over language resources that include new instruction data sets of Portuguese prepared for this purpose
+It is a **decoder** of the LLaMA family, based on the neural architecture Transformer and developed over the LLaMA-2 7B model.
+Its further improvement through additional training was done over language resources that include new instruction data sets of Portuguese prepared for this purpose ([extraGLUE-Instruct
+](https://huggingface.co/datasets/PORTULAN/extraglue-instruct)).
 
 It has different versions that were trained for different variants of Portuguese (PT),
 namely for the European variant, spoken in Portugal ([**gervasio-7b-portuguese-ptpt-decoder**](https://huggingface.co/PORTULAN/gervasio-7b-portuguese-ptpt-decoder)), and for the American variant, spoken in Brazil ([**gervasio-7b-portuguese-ptbr-decoder**](https://huggingface.co/PORTULAN/gervasio-7b-portuguese-ptbr-decoder)).
@@ -139,8 +127,8 @@ This involves repurposing the tasks in various ways, such as generation of answe
 | Model                    | MRPC (F1)      | RTE (F1)       | COPA (F1) |
 |--------------------------|----------------|----------------|-----------|
 | **Gervásio 7B PT-PT**    | **0.7273**     | **0.8291**     | **0.5459**|
-| **LLaMA
-| **LLaMA
+| **LLaMA-2**              | 0.0328         | 0.0482         | 0.3844    |
+| **LLaMA-2 Chat**         | 0.5703         | 0.4697         | 0.4737    |
 <br>
 
 # How to use
The updated sections read as follows.

- gervasio-pt*
- gervasio-ptpt
- gervasio-ptbr
- gervasio-7b-portuguese-ptpt-decoder
- gervasio-7b-portuguese-ptbr-decoder
- portulan
- albertina-pt*
- clm
- gpt
- portuguese

**Gervásio PT-*** is a **fully open** decoder for the **Portuguese language**.

It is a **decoder** of the LLaMA family, based on the neural architecture Transformer and developed over the LLaMA-2 7B model.
Its further improvement through additional training was done over language resources that include new instruction data sets of Portuguese prepared for this purpose ([extraGLUE-Instruct](https://huggingface.co/datasets/PORTULAN/extraglue-instruct)).

It has different versions that were trained for different variants of Portuguese (PT),
namely for the European variant, spoken in Portugal ([**gervasio-7b-portuguese-ptpt-decoder**](https://huggingface.co/PORTULAN/gervasio-7b-portuguese-ptpt-decoder)), and for the American variant, spoken in Brazil ([**gervasio-7b-portuguese-ptbr-decoder**](https://huggingface.co/PORTULAN/gervasio-7b-portuguese-ptbr-decoder)).

| Model                    | MRPC (F1)      | RTE (F1)       | COPA (F1) |
|--------------------------|----------------|----------------|-----------|
| **Gervásio 7B PT-PT**    | **0.7273**     | **0.8291**     | **0.5459**|
| **LLaMA-2**              | 0.0328         | 0.0482         | 0.3844    |
| **LLaMA-2 Chat**         | 0.5703         | 0.4697         | 0.4737    |
<br>

# How to use
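The body of the "How to use" section falls outside the context lines captured in this diff. As a minimal sketch, loading either variant follows the standard Hugging Face `transformers` text-generation pipeline; the prompt and `max_new_tokens` value below are illustrative assumptions, not taken from the model card.

```python
# Sketch: loading the PT-PT variant via the transformers text-generation
# pipeline. Downloading the 7B checkpoint requires significant disk and memory.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="PORTULAN/gervasio-7b-portuguese-ptpt-decoder",
)

# Illustrative Portuguese prompt; generation settings are assumptions.
output = generator("A biblioteca mais antiga de Portugal fica em", max_new_tokens=30)
print(output[0]["generated_text"])
```

The PT-BR variant is loaded the same way by substituting the model identifier `PORTULAN/gervasio-7b-portuguese-ptbr-decoder`.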