Quantization made by Richard Erkhov.
saqr-7b-merged - bnb 4bits
- Model creator: https://huggingface.co/Menouar/
- Original model: https://huggingface.co/Menouar/saqr-7b-merged/
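A bnb 4-bit checkpoint like this one can be loaded with transformers plus bitsandbytes. The sketch below applies 4-bit quantization on the fly to the original Menouar/saqr-7b-merged weights linked above; the exact quantization settings used to produce this repo are not stated, so the NF4/compute-dtype values shown are common defaults, not a record of how these files were made. Loading the pre-quantized files here works the same way, with this repo's id passed to `from_pretrained`.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

# Original full-precision weights (see the link above). Substitute this
# quantized repo's id to load the pre-quantized files directly.
model_id = "Menouar/saqr-7b-merged"

# Assumed 4-bit settings; the repo does not document its exact bnb config.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",
)

prompt = "Explain LoRA in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```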
Original model description:
---
library_name: transformers
tags:
- saqr-7b-instrcut
- Pytorch
license: apache-2.0
datasets:
- HuggingFaceH4/ultrachat_200k
- openbmb/UltraFeedback
- gsm8k
language:
- en
pipeline_tag: text-generation
---
saqr-7b-merged
This model is saqr-7b-instruct with its LoRA adapters merged into the base weights; a sketch of this kind of merge follows.
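For context, a merge like this is typically produced with peft's `merge_and_unload()`, which folds the LoRA weights into the base model so the result is a plain transformers checkpoint with no peft dependency at inference time. The sketch below is an assumption about the workflow, not the author's recorded procedure; the base model and adapter repo ids are illustrative placeholders.

```python
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "path/to/base-model"           # placeholder: base model the adapters were trained on
adapter_id = "Menouar/saqr-7b-instruct"  # assumption: repo holding the LoRA adapters

# Load the base model, attach the LoRA adapters, then fold them in.
base = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype="auto")
model = PeftModel.from_pretrained(base, adapter_id)
merged = model.merge_and_unload()

# Save a standalone checkpoint equivalent in spirit to saqr-7b-merged.
merged.save_pretrained("saqr-7b-merged")
AutoTokenizer.from_pretrained(adapter_id).save_pretrained("saqr-7b-merged")
```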