---
license: apache-2.0
library_name: transformers
pipeline_tag: text-generation
tags:
- merge
- text
---
# MaziyarPanahi/TheTop-7B-DPO-S2-v0.2
A merge of top-performing 7B models, produced with the SLERP method in mergekit.
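To make the merge method concrete, the snippet below is a minimal sketch of spherical linear interpolation (SLERP) applied to a pair of weight tensors; the tensor names and interpolation factor `t` are illustrative, not the exact configuration used for this model.

```python
import torch

def slerp(w0: torch.Tensor, w1: torch.Tensor, t: float, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors.

    Flattens the tensors, interpolates along the great circle between them,
    and falls back to plain linear interpolation when they are nearly colinear.
    """
    v0, v1 = w0.flatten().float(), w1.flatten().float()
    # Cosine of the angle between the two weight vectors.
    cos_theta = torch.dot(v0, v1) / (v0.norm() * v1.norm() + eps)
    theta = torch.acos(cos_theta.clamp(-1.0, 1.0))
    if theta.abs() < eps:
        # Nearly parallel vectors: SLERP degenerates to plain LERP.
        merged = (1.0 - t) * v0 + t * v1
    else:
        sin_theta = torch.sin(theta)
        merged = (torch.sin((1.0 - t) * theta) / sin_theta) * v0 \
               + (torch.sin(t * theta) / sin_theta) * v1
    return merged.reshape(w0.shape).to(w0.dtype)

# Illustrative use on a single layer's weights (hypothetical state dicts and key):
# merged = slerp(state_a["model.layers.0.mlp.up_proj.weight"],
#                state_b["model.layers.0.mlp.up_proj.weight"], t=0.5)
```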
> mergekit is a toolkit for merging pre-trained language models. mergekit uses an out-of-core approach to perform unreasonably elaborate merges in resource-constrained situations. Merges can be run entirely on CPU or accelerated with as little as 8 GB of VRAM. Many merging algorithms are supported, with more coming as they catch my attention.
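Since the card declares `library_name: transformers` and `pipeline_tag: text-generation`, the usual causal-LM loading pattern should apply; the snippet below is a minimal sketch, with device placement and generation settings chosen for illustration rather than taken from this repository.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "MaziyarPanahi/TheTop-7B-DPO-S2-v0.2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# device_map="auto" assumes `accelerate` is installed; adjust to your hardware.
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

prompt = "Explain what model merging is in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```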