pablo-rf committed
Commit 8a1b811
1 Parent(s): a9e7a5c

Update README.md

Files changed (1)
  1. README.md +6 -5
README.md CHANGED
@@ -91,13 +91,13 @@ pipeline_tag: text-generation
 library_name: transformers
 ---
 
-# Llama3.1-Carballo
+# Llama-3.1-Carballo
 
 ## Table of Contents
 <details>
 <summary>Click to expand</summary>
 
-- [Llama3.1-Carballo](#llama31-carballo)
+- [Llama-3.1-Carballo](#llama31-carballo)
 - [Table of Contents](#table-of-contents)
 - [Model description](#model-description)
 - [Intended uses and limitations](#intended-uses-and-limitations)
@@ -117,12 +117,13 @@ library_name: transformers
 
 ## Model description
 
-**Llama3.1-Carballo** is an 8B-parameter transformer-based causal language model for Galician, Portuguese, Spanish, Catalan and English.
+**Llama-3.1-Carballo** is an 8B-parameter transformer-based causal language model for Galician, Portuguese, Spanish, Catalan and English.
 It is the result of continual pretraining of [meta-llama/Llama-3.1-8B](https://huggingface.co/meta-llama/Llama-3.1-8B) on a multilingual corpus of almost 20B tokens, with an emphasis on Galician texts.
 
+This model is part of the **Carballo family**, a family of LLMs specialized in Galician. Smaller models can be found [here](https://huggingface.co/collections/proxectonos/text-models-65d49fa54e358ce02a9699c8).
 ## Intended uses and limitations
 
-The **Llama3.1-Carballo** model is ready to use only for causal language modeling.
+The **Llama-3.1-Carballo** model is ready to use only for causal language modeling.
 It can perform text-generation tasks and be fine-tuned for specific scenarios.
 
 ## How to use
@@ -132,7 +133,7 @@ from transformers import pipeline, AutoTokenizer, AutoModelForCausalLM
 
 input_text = "Hoxe fai un bo día. O sol "
 
-model_id = "proxectonos/Llama3.1-Carballo"
+model_id = "proxectonos/Llama-3.1-Carballo"
 tokenizer = AutoTokenizer.from_pretrained(model_id)
 model = AutoModelForCausalLM.from_pretrained(model_id)
 generator = pipeline(
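
The "How to use" snippet in the diff is truncated at the hunk boundary right after `generator = pipeline(`. For context, here is a minimal sketch of how the renamed `model_id` would be used end to end with the Transformers `pipeline` API; the generation arguments (`max_new_tokens`, `do_sample`) and the final `print` are illustrative assumptions, not part of the commit:

```python
# Minimal sketch completing the truncated "How to use" snippet.
# The generation settings below (max_new_tokens, do_sample) are
# illustrative assumptions, not taken from the commit itself.
from transformers import pipeline, AutoTokenizer, AutoModelForCausalLM

input_text = "Hoxe fai un bo día. O sol "

model_id = "proxectonos/Llama-3.1-Carballo"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Build a text-generation pipeline around the loaded model and tokenizer.
generator = pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
)

# Continue the Galician prompt; the pipeline returns a list of dicts
# whose "generated_text" field contains the prompt plus the continuation.
outputs = generator(input_text, max_new_tokens=50, do_sample=True)
print(outputs[0]["generated_text"])
```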