
Quantization made by Richard Erkhov.

- Github
- Discord
- Request more models

saqr-7b-merged - bnb 8bits

Original model description:

library_name: transformers
tags:
- saqr-7b-instrcut
- Pytorch
license: apache-2.0
datasets:
- HuggingFaceH4/ultrachat_200k
- openbmb/UltraFeedback
- gsm8k
language:
- en
pipeline_tag: text-generation

saqr-7b-merged

This model is saqr-7b-instruct with its LoRA adapters merged into the base weights.
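The merge described above can be sketched with the `peft` library's `merge_and_unload`, which folds LoRA adapter deltas into the base model's weights. This is a minimal sketch, not the author's exact procedure; the base-model and adapter repo ids passed in are placeholders the caller must supply.

```python
def merge_lora(base_model_id: str, adapter_id: str, output_dir: str) -> str:
    """Merge LoRA adapter weights into a base model and save the result.

    Imports are kept inside the function so that defining it does not
    require transformers/peft to be installed.
    """
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer
    from peft import PeftModel

    # Load the base model, then attach the LoRA adapter on top of it.
    base = AutoModelForCausalLM.from_pretrained(base_model_id, torch_dtype=torch.float16)
    model = PeftModel.from_pretrained(base, adapter_id)

    # Fold the adapter deltas into the base weights and drop the adapter layers.
    merged = model.merge_and_unload()

    merged.save_pretrained(output_dir)
    AutoTokenizer.from_pretrained(base_model_id).save_pretrained(output_dir)
    return output_dir
```

The merged checkpoint loads like any plain `transformers` model, with no `peft` dependency at inference time.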

Saqr Logo
Downloads last month: 9
Model size: 6.92B params (Safetensors)
Tensor types: F32, FP16, I8
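Since this repo is a bitsandbytes 8-bit quantization, a minimal loading sketch with `transformers` looks like the following. The repo id is a placeholder; substitute this model's actual Hub id.

```python
def load_8bit(repo_id: str):
    """Load a causal LM in 8-bit precision via bitsandbytes.

    Imports are inside the function so defining it does not require
    transformers/bitsandbytes to be installed.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

    quant = BitsAndBytesConfig(load_in_8bit=True)
    tokenizer = AutoTokenizer.from_pretrained(repo_id)
    model = AutoModelForCausalLM.from_pretrained(
        repo_id,
        quantization_config=quant,
        device_map="auto",  # place layers on available GPU(s) automatically
    )
    return tokenizer, model
```

8-bit loading requires a CUDA GPU and the `bitsandbytes` package; it roughly halves memory versus FP16 at a small quality cost.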