# Nous Capybara 57B

## Model Details
- A result of interleaving layers of NousResearch/Nous-Capybara-34B with itself.
- The resulting model has 100 layers and approximately 57 billion parameters.
- See mergekit-config.yml for details on the merge method used; a rough sketch of such a configuration is shown below.
Warning: This model can produce NSFW content!
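The exact recipe lives in mergekit-config.yml. As a minimal sketch only, assuming the merge uses mergekit's `passthrough` method and that the layer ranges shown here are purely illustrative (the real ranges may differ), an interleaved self-merge that reaches 100 layers could look like this:

```yaml
# Illustrative sketch only -- the actual mergekit-config.yml in this repo may
# use different layer ranges. Assumes mergekit's "passthrough" method, which
# stacks the listed layer ranges without averaging any weights.
slices:
  - sources:
      - model: NousResearch/Nous-Capybara-34B
        layer_range: [0, 20]    # layers 0-19
  - sources:
      - model: NousResearch/Nous-Capybara-34B
        layer_range: [10, 30]   # overlaps the previous slice, duplicating layers 10-19
  - sources:
      - model: NousResearch/Nous-Capybara-34B
        layer_range: [20, 40]
  - sources:
      - model: NousResearch/Nous-Capybara-34B
        layer_range: [30, 50]
  - sources:
      - model: NousResearch/Nous-Capybara-34B
        layer_range: [40, 60]   # 5 slices x 20 layers = 100 layers total
merge_method: passthrough
dtype: bfloat16
```

If the actual setup follows this pattern, a config like the above would typically be applied with `mergekit-yaml mergekit-config.yml <output-dir>`.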
## Results
Follows instructions better than the original, with no looping, and is uncensored like the original. It also feels smarter than the original. All comments are greatly appreciated; download it, test it, and if you appreciate my work, consider buying me my fuel: