---
license: apache-2.0
datasets:
- dev2bit/es2bash
language:
- es
pipeline_tag: text2text-generation
tags:
- code
- bash
widget:
- text: Muestra el contenido de file.py que se encuentra en ~/project/
example_title: cat
- text: Lista los 3 primeros archivos en /bin
example_title: ls
- text: Por favor, cambia al directorio /home/user/project/
example_title: cd
- text: Lista todos los átomos del universo
example_title: noCommand
- text: ls -lh
example_title: literal
- text: file.txt
example_title: simple
---
# es2bash-mt5: Spanish to Bash Model
<p align="center">
<img width="460" height="300" src="https://dev2bit.com/wp-content/themes/lovecraft_child/assets/icons/dev2bit_monitor2.svg">
</p>
Developed by dev2bit, es2bash-mt5 is a transformer language model that predicts an appropriate Bash command from a natural-language request written in Spanish, providing a natural-language interface to Unix operating system commands.
## About the Model
es2bash-mt5 is a fine-tuned version of mt5-small. It was trained on the dev2bit/es2bash dataset, which specializes in translating Spanish natural language into Bash commands.
The model is optimized for requests involving the following commands:
* `cat`
* `ls`
* `cd`
## Usage
Below is an example of how to use es2bash-mt5 with the Hugging Face Transformers library:
```python
from transformers import pipeline

# Load the model as a text-to-text translation pipeline
translator = pipeline('translation', model='dev2bit/es2bash-mt5')

# A natural-language request in Spanish
request = "listar los archivos en el directorio actual"
translated = translator(request, max_length=512)
print(translated[0]['translation_text'])
```
This will print the Bash command corresponding to the given Spanish request.
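If you prefer not to use the `pipeline` API, the model can also be loaded directly. Below is a minimal sketch using the generic `AutoTokenizer` and `AutoModelForSeq2SeqLM` classes; it assumes the same example request as above:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Load the tokenizer and seq2seq model directly
tokenizer = AutoTokenizer.from_pretrained('dev2bit/es2bash-mt5')
model = AutoModelForSeq2SeqLM.from_pretrained('dev2bit/es2bash-mt5')

# Tokenize the Spanish request and generate the Bash command
request = "listar los archivos en el directorio actual"
inputs = tokenizer(request, return_tensors='pt')
outputs = model.generate(**inputs, max_length=512)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```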
## Contributions
We appreciate your contributions! You can help improve es2bash-mt5 in various ways, including:
* Testing the model and reporting any issues or suggestions in the Issues section.
* Improving the documentation.
* Providing usage examples.
---
This model is a fine-tuned version of [google/mt5-small](https://huggingface.co/google/mt5-small) on the es2bash dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0919
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the sketch after this list for how they map onto Transformers training arguments):
- learning_rate: 0.1
- train_batch_size: 8
- eval_batch_size: 1
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 28
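For reference, this is a minimal sketch of how the values above map onto Hugging Face `Seq2SeqTrainingArguments`; the `output_dir` is a hypothetical placeholder, and the model/dataset wiring is omitted:

```python
from transformers import Seq2SeqTrainingArguments

# Reported hyperparameters expressed as Seq2SeqTrainingArguments.
# The Adam betas=(0.9, 0.999) and epsilon=1e-08 listed above are the
# library defaults, so they need no explicit arguments here.
training_args = Seq2SeqTrainingArguments(
    output_dir="es2bash-mt5",        # hypothetical output path
    learning_rate=0.1,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=1,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=28,
)
```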
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| 21.394 | 1.0 | 672 | 1.7470 |
| 2.5294 | 2.0 | 1344 | 0.6350 |
| 0.5873 | 3.0 | 2016 | 0.2996 |
| 0.3802 | 4.0 | 2688 | 0.2142 |
| 0.2951 | 5.0 | 3360 | 0.1806 |
| 0.225 | 6.0 | 4032 | 0.1565 |
| 0.2065 | 7.0 | 4704 | 0.1461 |
| 0.1944 | 8.0 | 5376 | 0.1343 |
| 0.174 | 9.0 | 6048 | 0.1281 |
| 0.1647 | 10.0 | 6720 | 0.1207 |
| 0.1566 | 11.0 | 7392 | 0.1140 |
| 0.1498 | 12.0 | 8064 | 0.1106 |
| 0.1382 | 13.0 | 8736 | 0.1076 |
| 0.1393 | 14.0 | 9408 | 0.1042 |
| 0.1351 | 15.0 | 10080 | 0.1019 |
| 0.13 | 16.0 | 10752 | 0.0998 |
| 0.1292 | 17.0 | 11424 | 0.0983 |
| 0.1265 | 18.0 | 12096 | 0.0973 |
| 0.1255 | 19.0 | 12768 | 0.0969 |
| 0.1216 | 20.0 | 13440 | 0.0956 |
| 0.1216 | 21.0 | 14112 | 0.0946 |
| 0.123 | 22.0 | 14784 | 0.0938 |
| 0.113 | 23.0 | 15456 | 0.0931 |
| 0.1185 | 24.0 | 16128 | 0.0929 |
| 0.1125 | 25.0 | 16800 | 0.0927 |
| 0.1213 | 26.0 | 17472 | 0.0925 |
| 0.1153 | 27.0 | 18144 | 0.0921 |
| 0.1134 | 28.0 | 18816 | 0.0919 |
### Framework versions
- Transformers 4.29.2
- Pytorch 2.0.1+cu117
- Datasets 2.12.0
- Tokenizers 0.13.3