---
base_model: HuggingFaceTB/SmolLM-1.7B
language:
- en
- bg
license: apache-2.0
tags:
- text-generation-inference
- transformers
- llama
- trl
datasets:
- petkopetkov/oasst1_bg
---
|
|
|
# SmolLM-1.7B-Bulgarian
|
|
|
- **Developed by:** petkopetkov
- **License:** apache-2.0
- **Finetuned from model:** HuggingFaceTB/SmolLM-1.7B-Instruct
|
|
|
SmolLM-1.7B finetuned on the OASST1 dataset translated into Bulgarian.
|
|
|
### Usage
|
|
|
First, install the Transformers library with:

```sh
pip install -U transformers
```
|
|
|
#### Run with the `pipeline` API
|
|
|
```python
import torch
from transformers import pipeline

# Load the finetuned model; device_map="auto" places it on GPU when available.
pipe = pipeline(
    "text-generation",
    model="petkopetkov/SmolLM-1.7B-bg",
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

prompt = "Колко е 2 + 2?"  # "What is 2 + 2?"

print(pipe(prompt)[0]["generated_text"])
```
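
For finer control over decoding, the model and tokenizer can also be loaded directly with `AutoModelForCausalLM` and `AutoTokenizer`. This is a minimal sketch: the model id is taken from the pipeline example above, while the generation settings (`max_new_tokens`, `temperature`, `top_p`) are illustrative values, not tuned for this model.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "petkopetkov/SmolLM-1.7B-bg"

# Load tokenizer and model; device_map="auto" places weights on GPU when available.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

prompt = "Колко е 2 + 2?"  # "What is 2 + 2?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Sampling parameters below are example values; adjust to taste.
outputs = model.generate(
    **inputs,
    max_new_tokens=128,
    do_sample=True,
    temperature=0.7,
    top_p=0.9,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```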