---
license: apache-2.0
datasets:
  - Open-Orca/SlimOrca
  - argilla/distilabel-intel-orca-dpo-pairs
language:
  - en
  - de
  - fr
  - it
  - es
library_name: adapter-transformers
pipeline_tag: text-generation
---

# Model Card for Swisslex/Mixtral-8x7b-DPO-v0.1

## Model Details

### Model Description

A fine-tuned version of mistralai/Mixtral-8x7B-v0.1, trained with supervised fine-tuning (SFT) followed by Direct Preference Optimization (DPO).

- **Developed by:** Swisslex
- **Language(s) (NLP):** English, German, French, Italian, Spanish
- **License:** apache-2.0
- **Finetuned from model:** mistralai/Mixtral-8x7B-v0.1
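## Usage

A minimal text-generation sketch with the Hugging Face transformers library. The prompt, the use of a chat template, and the sampling parameters are illustrative assumptions; check the repository's tokenizer configuration for the exact prompt format this fine-tune expects.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Swisslex/Mixtral-8x7b-DPO-v0.1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # Mixtral-8x7B is large; half precision and device_map="auto" spread it across available GPUs
    device_map="auto",
)

# Assumes the tokenizer ships a chat template; otherwise build the prompt string manually.
messages = [{"role": "user", "content": "Summarize the Mixtral architecture in two sentences."}]
input_ids = tokenizer.apply_chat_template(messages, return_tensors="pt").to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```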