---
license: apache-2.0
---
# Mamba-2.8B-SlimOrca

## Overview
Mamba-2.8B-SlimOrca is a fine-tune of the original pretrained Mamba 2.8B model on the SlimOrca dataset. At this point, it is still fairly rough.
## Usage
The easiest way to try out this model is to use `chat.py` from [Mamba-Chat](https://github.com/havenhq/mamba-chat).
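If you would rather call the model directly, the snippet below is a minimal sketch of loading the weights with the `mamba_ssm` package and generating a completion. It assumes `mamba-ssm` and `causal-conv1d` are installed, a CUDA GPU is available, the GPT-NeoX-20B tokenizer used by the base Mamba checkpoints applies here as well, and the placeholder repo ID is replaced with this model's actual Hub ID. The prompt format is also an assumption, since no chat template is pinned down yet.

```python
import torch
from transformers import AutoTokenizer
from mamba_ssm.models.mixer_seq_simple import MambaLMHeadModel

# Placeholder: replace with this model's actual Hugging Face Hub ID.
MODEL_ID = "<username>/mamba-2.8b-slimorca"

device = "cuda"
# The base Mamba checkpoints use the GPT-NeoX-20B tokenizer; assumed here too.
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")
model = MambaLMHeadModel.from_pretrained(MODEL_ID, device=device, dtype=torch.float16)

# Prompt format is an assumption; adjust to whatever template works best.
prompt = "You are a helpful assistant.\n\nUser: What is the capital of France?\nAssistant:"
input_ids = tokenizer(prompt, return_tensors="pt").input_ids.to(device)

out = model.generate(
    input_ids=input_ids,
    max_length=256,
    temperature=0.7,
    top_k=50,
    top_p=0.9,
)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```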
## Future Plans
- Hugging Face Space integration (planned). Contributions are welcome.
## Model Training Details
- Dataset used: SlimOrca (see the loading sketch after this list)
- Training duration: a single epoch so far
- Current status: training is ongoing; further iterations will be uploaded if they improve on this one.
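For context, SlimOrca is hosted on the Hub as `Open-Orca/SlimOrca` in a ShareGPT-style conversation format. The sketch below shows one way to load and inspect it with the `datasets` library; it reflects the public dataset's fields and is not specific to this model's training code.

```python
from datasets import load_dataset

# Load the SlimOrca training split (ShareGPT-style "conversations" records).
ds = load_dataset("Open-Orca/SlimOrca", split="train")

# Each record is a list of turns with "from" (system/human/gpt) and "value" fields.
for turn in ds[0]["conversations"]:
    print(f'{turn["from"]}: {turn["value"][:80]}')
```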