Contextual-Obedient-MoE-3x8B-Llama3-RAG / generation_config.json
Llama 3 Context-Obedient models in a 3x MoE configuration. The three experts are split by role: understanding and summarizing the input, following the provided format, and outputting only context-relevant answers.
2ed3e35 verified
{
  "_from_model_config": true,
  "bos_token_id": 128000,
  "eos_token_id": 128003,
  "transformers_version": "4.40.2"
}
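As a minimal sketch of what this file configures: the BOS and EOS token IDs tell the generation loop where text begins and when to stop sampling. In practice the file is consumed automatically by `transformers` (e.g. via `GenerationConfig.from_pretrained`), but the snippet below parses the same JSON with only the standard library to show the fields involved; the interpretation of 128003 as a repurposed Llama 3 reserved special token is an assumption, not stated in the file.

```python
import json

# The generation_config.json contents shown above.
raw = """
{
  "_from_model_config": true,
  "bos_token_id": 128000,
  "eos_token_id": 128003,
  "transformers_version": "4.40.2"
}
"""

config = json.loads(raw)

# 128000 is Llama 3's <|begin_of_text|> ID; 128003 is assumed here to be
# a reserved special token repurposed as the end-of-sequence marker.
print("BOS token id:", config["bos_token_id"])
print("EOS token id:", config["eos_token_id"])
```

Note that `_from_model_config` marks this file as having been derived from the model's main `config.json` rather than written by hand, and `transformers_version` records the library version that serialized it.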