---
license: apache-2.0
base_model: bertin-project/Gromenauer-7B
datasets:
- bertin-project/bonanza-hf
- bertin-project/zenobia-instruct-hf
language:
- es
- ca
pipeline_tag: text-generation
---

# Gromenauer-7B-Instruct

<div align="center">
<img alt="gromenauer-7B logo" src="https://huggingface.co/bertin-project/Gromenauer-7B/resolve/main/images/gromenauer.png" width="200px">
</div>

## Overview

Gromenauer-7B-Instruct is an instruction-tuned version of [bertin-project/Gromenauer-7B](https://huggingface.co/bertin-project/Gromenauer-7B), fine-tuned on the [bertin-project/bonanza-hf](https://huggingface.co/datasets/bertin-project/bonanza-hf) and [bertin-project/zenobia-instruct-hf](https://huggingface.co/datasets/bertin-project/zenobia-instruct-hf) datasets.
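
Both fine-tuning datasets are public on the Hugging Face Hub and can be inspected directly. A minimal sketch using the `datasets` library (split names and column layout are whatever the dataset repositories define, nothing is assumed here beyond the repository ids listed above):

```python
from datasets import load_dataset

# Instruction datasets used to fine-tune Gromenauer-7B into Gromenauer-7B-Instruct
bonanza = load_dataset("bertin-project/bonanza-hf")
zenobia = load_dataset("bertin-project/zenobia-instruct-hf")

print(bonanza)
print(zenobia)
```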

## Usage examples

### Multinomial sampling example:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

# System prompt: "You are a model that is an expert in Spanish poetry."
# User prompt: "Write a poem, in the form of a couplet, about the loss of a beloved car."
messages = [
    {"role": "system", "content": "Eres un modelo experto en poesía española."},
    {"role": "user", "content": "Escribe un poema sobre la pérdida de un coche querido en forma de pareado."},
]

# Multinomial sampling: sample from the model's token distribution at each step
generate_kwargs = {
    "do_sample": True,
    "temperature": 0.7,
    "max_new_tokens": 150,
}

pipe = pipeline("text-generation", model="bertin-project/Gromenauer-7B-Instruct")
pipe(messages, **generate_kwargs)
```

Output:

```
<|system|>
Eres un modelo experto en poesía española.</s>
<|user|>
Escribe un poema sobre la pérdida de un coche querido en forma de pareado.</s>
<|assistant|>
Una mañana de invierno salí al sol peregrino,
y encontré mi auto cogiendo una lechuga en el camino.</s>
```
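
The prompt format in the output above (`<|system|>`, `<|user|>`, `<|assistant|>`, `</s>`) comes from the chat template bundled with the tokenizer. If you prefer to work below the pipeline level, a minimal sketch with the same sampling settings follows; the `bfloat16` dtype and `device_map="auto"` are illustrative choices, not settings taken from this card:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "bertin-project/Gromenauer-7B-Instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16, device_map="auto")

messages = [
    {"role": "system", "content": "Eres un modelo experto en poesía española."},
    {"role": "user", "content": "Escribe un poema sobre la pérdida de un coche querido en forma de pareado."},
]

# apply_chat_template renders the <|system|>/<|user|>/<|assistant|> format shown above
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Multinomial sampling with the same settings as the pipeline example
output_ids = model.generate(input_ids, do_sample=True, temperature=0.7, max_new_tokens=150)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```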

### Contrastive search example:

```python
from transformers import pipeline

# System prompt: "You are an assistant in Spanish. Answer precisely and concisely."
# User prompt: "Why is Seville famous?"
messages = [
    {"role": "system", "content": "Eres un asistente en español. Responde de manera exacta y concisa."},
    {"role": "user", "content": "¿Por qué es famosa Sevilla?"},
]

# Contrastive search: a positive penalty_alpha (combined with the default top_k)
# selects this deterministic decoding strategy instead of greedy search
generate_kwargs = {
    "penalty_alpha": 0.6,
    "max_new_tokens": 300,
}

pipe = pipeline("text-generation", model="bertin-project/Gromenauer-7B-Instruct")
pipe(messages, **generate_kwargs)
```

Output:

```
<|system|>
Eres un asistente en español. Responde de manera exacta y concisa.</s>
<|user|>
¿Por qué es famosa Sevilla?</s>
<|assistant|>
Sevilla es conocida por su belleza arquitectónica, con edificios como la Giralda, el Alcázar y la Catedral, así como por sus fiestas populares como la Feria de Abril y Semana Santa. Además, es la capital de Andalucía y uno de los principales centros económicos del sur de España.</s>
```
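
The same strategy can be invoked directly through `generate()`, where contrastive search is selected by a positive `penalty_alpha` together with a `top_k` candidate pool. A short sketch; the explicit `top_k=4` is a common choice for contrastive search and is an assumption, not a setting from this card:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "bertin-project/Gromenauer-7B-Instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16, device_map="auto")

messages = [
    {"role": "system", "content": "Eres un asistente en español. Responde de manera exacta y concisa."},
    {"role": "user", "content": "¿Por qué es famosa Sevilla?"},
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Contrastive search: deterministic decoding that trades off model confidence
# against repetition, controlled by penalty_alpha and the top_k candidate pool
output_ids = model.generate(input_ids, penalty_alpha=0.6, top_k=4, max_new_tokens=300)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```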

## Model Details

- **Model Type**: Mistral
- **Sequence Length**: 8192
- **Hidden Dimension**: 4096
- **Intermediate Dimension**: 14336
- **Number of Layers**: 32
- **Number of Attention Heads**: 32
- **Number of Key-Value Heads**: 8
- **Activation Function**: SiLU
- **Initializer Range**: 0.02
- **Layer Norm Epsilon**: 1.0e-05
- **Use Flash Attention**: Yes
- **Gradient Checkpointing**: Enabled (Block Size: 5)
- **Sliding Window Attention**: 4096
- **Use Bias**: No
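
The architecture values above can be cross-checked against the published configuration. A quick sketch, assuming the standard `MistralConfig` field names:

```python
from transformers import AutoConfig

config = AutoConfig.from_pretrained("bertin-project/Gromenauer-7B-Instruct")

# Standard MistralConfig fields corresponding to the list above
print(config.model_type)           # model type
print(config.hidden_size)          # hidden dimension
print(config.intermediate_size)    # intermediate dimension
print(config.num_hidden_layers)    # number of layers
print(config.num_attention_heads)  # number of attention heads
print(config.num_key_value_heads)  # number of key-value heads
print(config.hidden_act)           # activation function
print(config.rms_norm_eps)         # norm epsilon
print(config.sliding_window)       # sliding window attention
```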

## Training Details

- **Tokenizer**: [HuggingFaceH4/zephyr-7b-beta](https://huggingface.co/HuggingFaceH4/zephyr-7b-beta)
- **Batch Size**: 512
- **Learning Rate**: 1e-5
- **Optimizer**: Adam with beta1=0.9, beta2=0.95, epsilon=1e-8
- **Weight Decay**: 0.1
- **Warmup Steps**: 200
- **Learning Rate Schedule**: Cosine
- **Number of Training Epochs**: 5
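
The original training code is not reproduced in this card. As a rough sketch of how these hyperparameters map onto a standard PyTorch/transformers setup (using `AdamW` for the decoupled weight decay and a placeholder total step count, both of which are assumptions rather than details from this card):

```python
import torch
from transformers import AutoModelForCausalLM, get_cosine_schedule_with_warmup

model = AutoModelForCausalLM.from_pretrained("bertin-project/Gromenauer-7B")

# Placeholder: the real value depends on dataset size, the batch size of 512 and the 5 epochs
num_training_steps = 1000

optimizer = torch.optim.AdamW(
    model.parameters(),
    lr=1e-5,
    betas=(0.9, 0.95),
    eps=1e-8,
    weight_decay=0.1,
)
scheduler = get_cosine_schedule_with_warmup(
    optimizer,
    num_warmup_steps=200,
    num_training_steps=num_training_steps,
)
```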