A llama-2-7b base model fine-tuned on a dataset of 13,000 questions and answers about the Incan Empire (drawn from its Wikipedia entry). Answers are designed to be snarky, concise, end with an [END] token, and be in UPPERCASE. A minimal usage sketch is shown below.
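The sketch below assumes the `transformers` library and uses a placeholder repo id (`your-username/llama-2-7b-inca-qa`) and an assumed "Question:/Answer:" prompt format; substitute the actual model path and prompt template. It shows how the generated text can be trimmed at the [END] marker.

```python
# Minimal usage sketch -- model id and prompt format are assumptions, not the
# card's confirmed values; adjust both to match the actual repository.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-username/llama-2-7b-inca-qa"  # placeholder repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

prompt = "Question: Who founded the Incan Empire?\nAnswer:"  # assumed prompt format
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=100, do_sample=False)

# Decode only the newly generated tokens, then keep the text before [END].
generated = outputs[0][inputs["input_ids"].shape[1]:]
text = tokenizer.decode(generated, skip_special_tokens=True)
answer = text.split("[END]")[0].strip()
print(answer)  # expected to be snarky, concise, and in UPPERCASE
```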