---
library_name: transformers
license: mit
---
# Arcanum-12b

Arcanum-12b is a merged large language model created by combining TheDrummer/Rocinante-12B-v1.1 and MarinaraSpaghetti/NemoMix-Unleashed-12B using the TIES merging method.
## Model Details
- Developed by: Xclbr7
- Model type: Causal Language Model
- Language(s): English (primarily), may support other languages
- License: MIT
- Repository: https://huggingface.co/Xclbr7/Arcanum-12b
## Model Architecture
- Base model: MarinaraSpaghetti/NemoMix-Unleashed-12B
- Parameter count: ~12 billion
- Architecture specifics: Transformer-based language model
## Training & Merging
Arcanum-12b was created by merging two existing 12B models with the settings below (a reproduction sketch follows the parameter list):
- **TheDrummer/Rocinante-12B-v1.1**
  - Density parameters: [1.0, 0.8, 0.6]
  - Weight: 0.7
- **MarinaraSpaghetti/NemoMix-Unleashed-12B**
  - Density parameters: [0.5, 0.7, 0.9]
  - Weight: 0.8
Merging method: TIES

Additional parameters:
- Normalization: True
- Int8 mask: True
- Data type: float16
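
The card does not state which tool performed the merge. Assuming mergekit (a common toolkit for TIES merges), the parameters above would map roughly onto the config sketched below; the file name, output path, and `--cuda` flag are illustrative and not taken from the card.

```python
# Hypothetical reproduction of the merge with mergekit; the exact tool and
# config used for Arcanum-12b are assumptions. Only the parameter values
# mirror the card above.
import subprocess
from pathlib import Path

config = """\
models:
  - model: TheDrummer/Rocinante-12B-v1.1
    parameters:
      density: [1.0, 0.8, 0.6]
      weight: 0.7
  - model: MarinaraSpaghetti/NemoMix-Unleashed-12B
    parameters:
      density: [0.5, 0.7, 0.9]
      weight: 0.8
merge_method: ties
base_model: MarinaraSpaghetti/NemoMix-Unleashed-12B
parameters:
  normalize: true
  int8_mask: true
dtype: float16
"""

Path("arcanum_ties.yml").write_text(config)

# mergekit-yaml <config> <output_dir>; --cuda runs the merge on a GPU if one is available.
subprocess.run(["mergekit-yaml", "arcanum_ties.yml", "./Arcanum-12b", "--cuda"], check=True)
```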
## Intended Use
Conversational use, particularly chat and role-play with different personas.
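
A minimal loading-and-chat sketch with the transformers library is shown below; the persona prompt and sampling settings are illustrative, and it assumes the repo's tokenizer ships a chat template that accepts a system message (adjust the roles if it does not).

```python
# Minimal sketch of persona-style chat with Arcanum-12b (illustrative settings,
# not recommendations from the model card).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Xclbr7/Arcanum-12b"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

messages = [
    {"role": "system", "content": "You are Arcanum, a wry court wizard."},
    {"role": "user", "content": "What do you make of the comet over the keep?"},
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=200, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```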
## Performance and Limitations
The model has not yet been formally evaluated; no benchmark results are available.
## Ethical Considerations
As a merged model based on existing language models, Arcanum-12b may inherit biases and limitations from its parent models. Users should be aware of potential biases in generated content and use the model responsibly.
## Acknowledgments
We acknowledge the contributions of the original model creators:
- TheDrummer for Rocinante-12B-v1.1
- MarinaraSpaghetti for NemoMix-Unleashed-12B
Their work formed the foundation for Arcanum-12b.