gonzalez-agirre committed
Commit: 90217fa
Parent: 82e163e

Update README.md

Files changed (1)
  1. README.md +7 -7
README.md CHANGED
@@ -51,7 +51,7 @@ This model card corresponds to the 2B version.
 To visit the model cards of other Salamandra versions, please refer to the [Model Index](#model-index).
 
 The entire Salamandra family is released under a permissive [Apache 2.0 license]((https://www.apache.org/licenses/LICENSE-2.0)).
-Along with the open weights, all training scripts and configuration files are made publicly available in [this GitHub repository](https://github.com/projecte-aina/salamandra).
+Along with the open weights, all training scripts and configuration files are made publicly available in [this GitHub repository](https://github.com/BSC-LT/salamandra).
 
 ---
 
@@ -64,7 +64,7 @@ The pre-training corpus contains text in 35 European languages and code.
 
 ### Hyperparameters
 
-The full list of hyperparameters for each model can be found [here](https://github.com/projecte-aina/salamandra/tree/main/configs).
+The full list of hyperparameters for each model can be found [here](https://github.com/langtech-bsc/salamandra/tree/main/configs).
 
 ### Architecture
 
@@ -150,7 +150,7 @@ pip install transformers torch accelerate sentencepiece protobuf
 ```python
 from transformers import pipeline, set_seed
 
-model_id = "projecte-aina/salamandra-2b"
+model_id = "BSC-LT/salamandra-2b"
 
 # Sample prompts
 prompts = [
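The diff only touches the opening lines of the README's `pipeline` example, so for context here is a minimal, self-contained sketch of how the updated `model_id` would typically be used with the `transformers` text-generation pipeline. The prompt, seed, and generation settings below are illustrative assumptions, not the README's exact values.

```python
# Minimal sketch (assumed settings) of text generation with the updated repo id.
from transformers import pipeline, set_seed

model_id = "BSC-LT/salamandra-2b"  # repo id introduced by this commit

set_seed(42)  # seed chosen for reproducibility; not taken from the README

generator = pipeline("text-generation", model=model_id, device_map="auto")

# Sample prompts (illustrative; the README's list may differ)
prompts = [
    "El mercat del barri és",
]

for prompt in prompts:
    output = generator(prompt, max_new_tokens=25, do_sample=True)
    print(output[0]["generated_text"])
```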
@@ -197,7 +197,7 @@ pip install transformers torch accelerate sentencepiece protobuf
 from transformers import AutoTokenizer, AutoModelForCausalLM
 import torch
 
-model_id = "projecte-aina/salamandra-2b"
+model_id = "BSC-LT/salamandra-2b"
 
 # Input text
 text = "El mercat del barri és"
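Likewise, the lower-level `AutoModelForCausalLM` example appears here only as a fragment. A minimal runnable sketch with the new repo id could look as follows; the dtype, device placement, and generation length are assumptions rather than the README's settings.

```python
# Minimal sketch (assumed settings) of greedy generation with the updated repo id.
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "BSC-LT/salamandra-2b"  # repo id introduced by this commit

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",           # requires accelerate (installed above)
    torch_dtype=torch.bfloat16,  # dtype is an assumption, not taken from the README
)

# Input text (same prompt as in the diff context)
text = "El mercat del barri és"
inputs = tokenizer(text, return_tensors="pt").to(model.device)

with torch.no_grad():
    outputs = model.generate(**inputs, max_new_tokens=25)

print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```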
@@ -241,7 +241,7 @@ pip install vllm
 ```python
 from vllm import LLM, SamplingParams
 
-model_id = "projecte-aina/salamandra-2b"
+model_id = "BSC-LT/salamandra-2b"
 
 # Sample prompts
 prompts = [
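For the vLLM example the diff again shows only the first lines. A minimal offline-inference sketch with the updated repo id might look like this; the sampling parameters are illustrative assumptions, not the README's values.

```python
# Minimal sketch (assumed sampling settings) of offline inference with vLLM.
from vllm import LLM, SamplingParams

model_id = "BSC-LT/salamandra-2b"  # repo id introduced by this commit

# Sample prompts (illustrative; the README's list may differ)
prompts = [
    "El mercat del barri és",
]

sampling_params = SamplingParams(temperature=0.8, top_p=0.95, max_tokens=25)

llm = LLM(model=model_id)
outputs = llm.generate(prompts, sampling_params)

for output in outputs:
    print(output.prompt, output.outputs[0].text)
```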
@@ -747,6 +747,6 @@ Technical report and paper coming soon.
 ## Model Index
 |Model|Base|Instruct|
 |:---:|:---:|:---:|
-|2B| [Link](https://huggingface.co/projecte-aina/salamandra-2b) | [Link](https://huggingface.co/projecte-aina/salamandra-2b-instruct) |
-|7B| [Link](https://huggingface.co/projecte-aina/salamandra-7b) | [Link](https://huggingface.co/projecte-aina/salamandra-7b-instruct) |
+|2B| [Link](https://huggingface.co/BSC-LT/salamandra-2b) | [Link](https://huggingface.co/BSC-LT/salamandra-2b-instruct) |
+|7B| [Link](https://huggingface.co/BSC-LT/salamandra-7b) | [Link](https://huggingface.co/BSC-LT/salamandra-7b-instruct) |
 |40B| WiP | WiP |
 