---
library_name: transformers
tags:
- SFT
- Mistral
- Mistral 7B Instruct
license: apache-2.0
---
# Radiantloom Mistral 7B Fusion

Radiantloom Mistral 7B Fusion is a large language model (LLM) developed by Radiantloom AI with approximately 7 billion parameters. It is a fine-tune of a base model produced by merging a set of Mistral models. With a context length of 4096 tokens, this model is suitable for commercial use.

In informal "vibes-check" evaluations, Radiantloom Mistral 7B Fusion performs well across a range of applications, including creative writing, multi-turn conversation, in-context learning through Retrieval Augmented Generation (RAG), and coding tasks. Its out-of-the-box performance is already impressive, particularly on writing tasks: it produces longer-form content and provides detailed explanations of its actions. To maximize its potential, consider further instruction tuning or Reinforcement Learning from Human Feedback (RLHF). Alternatively, you can use the model in its current form.
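Since the model descends from Mistral 7B Instruct, multi-turn prompts are typically wrapped in Mistral's `[INST]` chat template. Whether this merged model expects exactly the standard template is an assumption; the sketch below shows how such a prompt can be assembled by hand:

```python
def build_prompt(turns):
    """Build a Mistral-instruct-style prompt string.

    turns: list of (user, assistant) pairs; the assistant reply of the
    final turn may be None, signalling that the model should generate it.
    """
    prompt = "<s>"
    for user, assistant in turns:
        # Each user message is wrapped in [INST] ... [/INST].
        prompt += f"[INST] {user} [/INST]"
        if assistant is not None:
            # Completed assistant turns are closed with the </s> EOS token.
            prompt += f" {assistant}</s>"
    return prompt

print(build_prompt([("Hello!", "Hi there."), ("Tell me a story.", None)]))
```

In practice, `tokenizer.apply_chat_template` from the `transformers` library applies the template stored with the tokenizer and is the safer choice when the exact format is uncertain.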