---
license: llama3
base_model: meta-llama/Meta-Llama-3-8B-Instruct
datasets:
- argilla/distilabel-capybara-dpo-7k-binarized
language:
- en
library_name: transformers
tags:
- moe
---
# Experimental mixture of 2 experts Llama3-8b-Instruct

Meow. This is an experimental mixture-of-experts model with just 2 experts, based on Llama 3 Instruct. It combines the plain Meta-Llama-3-8B-Instruct model with a finetune of it: one expert is the unmodified instruct model, and the other is a finetune trained on the Argilla Capybara dataset.

Built with Llama 3
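A 2-expert merge of this shape can be described with a mergekit MoE config. The following is a minimal sketch, assuming the `mergekit-moe` tooling; the local finetune path `./llama3-8b-capybara-ft` is a placeholder for illustration, not the actual checkpoint used here:

```yaml
# Hypothetical mergekit-moe config for a 2-expert Llama 3 mixture.
base_model: meta-llama/Meta-Llama-3-8B-Instruct
gate_mode: random        # routes without prompt-based gate calibration
dtype: bfloat16
experts:
  - source_model: meta-llama/Meta-Llama-3-8B-Instruct  # plain instruct expert
  - source_model: ./llama3-8b-capybara-ft              # placeholder: Capybara finetune
```

With a config like this, `mergekit-moe config.yaml ./merged-model` would assemble the mixture; other `gate_mode` settings (such as `hidden`) require `positive_prompts` per expert.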