---
license: apache-2.0
datasets:
  - ecastera/wiki_fisica
  - ecastera/filosofia-es
  - bertin-project/alpaca-spanish
language:
  - es
  - en
tags:
  - mistral
  - spanish
  - español
  - lora
  - int4
  - multilingual
---

# ecastera-eva-westlake-7b-spanish

Mistral-7b-based model fine-tuned in Spanish for high-quality Spanish text generation.

- Exported in GGUF format with INT4 quantization.
- A refined version of my previous models, with new training data and methodology. This should produce more natural responses in Spanish.
- Base model: Mistral-7b.
- Based on the excellent work of senseable/WestLake-7B-v2 and Eric Hartford's cognitivecomputations/WestLake-7B-v2-laser.
- Fine-tuned in Spanish on a collection of poetry, books, Wikipedia articles, philosophy texts, and the alpaca-es datasets.
- Trained using LoRA and PEFT with INT8 quantization on 2 GPUs for several days.

## Usage

Use with llama.cpp or any other framework that supports the GGUF format.
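For example, with the llama.cpp CLI (the GGUF filename below is an assumption; use the actual file from this repository):

```shell
# Run the model interactively with llama.cpp.
# -m: path to the downloaded GGUF file, -p: prompt, -n: max tokens to generate
./llama-cli \
  -m ecastera-eva-westlake-7b-spanish.gguf \
  -p "Escribe un poema corto sobre el mar." \
  -n 128
```

Frameworks such as llama-cpp-python, Ollama, or text-generation-webui can load the same GGUF file.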