jarodrigues committed
Commit 8bc75ad
1 Parent(s): 40305be

Update README.md

Files changed (1):
1. README.md (+2 -1)
README.md CHANGED
@@ -84,7 +84,8 @@ Please use the above cannonical reference when using or citing this model.
 # Model Description
 
 **This model card is for Gervásio 7B PT-BR**, with 7 billion parameters, a hidden size of 4096 units, an intermediate size of 11,008 units, 32 attention heads, 32 hidden layers, and a tokenizer obtained using the Byte-Pair Encoding (BPE) algorithm implemented with SentencePiece, featuring a vocabulary size of 32,000.
-Gervásio-7B-PTPT-Decoder is distributed under an [MIT license](https://huggingface.co/PORTULAN/albertina-ptpt/blob/main/LICENSE).
+
+Gervásio-7B-PTBR-Decoder is distributed under an [MIT license](https://huggingface.co/PORTULAN/albertina-ptpt/blob/main/LICENSE).
 
 
 <br>
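The hyperparameters quoted in the diff (hidden size 4096, intermediate size 11,008, 32 heads, 32 layers, 32,000-token vocabulary) are consistent with a LLaMA-style 7B decoder. A back-of-the-envelope parameter count can be sketched as follows — assuming a LLaMA-2-style architecture (gated SwiGLU MLP, untied input/output embeddings, RMSNorm), which is our assumption, not something the diff states:

```python
# Rough parameter count for the configuration quoted above.
# Architecture assumptions (gated MLP, untied embeddings, RMSNorm)
# are ours; the diff only states the sizes below.
hidden = 4096   # hidden size
inter = 11008   # intermediate (MLP) size
layers = 32     # hidden layers (32 attention heads fold into hidden)
vocab = 32000   # SentencePiece BPE vocabulary size

embed = vocab * hidden                  # token embedding matrix
attn = 4 * hidden * hidden              # Q, K, V, O projections
mlp = 3 * hidden * inter                # gate, up, down projections
norms = 2 * hidden                      # two RMSNorms per layer
per_layer = attn + mlp + norms

total = 2 * embed + layers * per_layer + hidden  # + final norm
print(f"{total:,}")  # prints 6,738,415,616, i.e. ~6.7B, the "7B" class
```

Under these assumptions the count lands at roughly 6.7 billion parameters, matching the "7 billion parameters" figure in the model description.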