This is just SOTA 2- and 3-bit quants for laserxtral. Not much more to it. Meow.
The importance matrix, which was generated from `group_10_merged.txt`, is included in this repo as `imatrix_laserxtral.dat`.
UPDATE 2/11/2024: The models have been reuploaded using a new importance matrix (generated from `group_10_merged.txt` rather than `20k_random_data.txt`), which should in theory provide better performance. I'm not an expert, don't quote me on that.
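
For anyone curious how quants like these are produced, here's a rough sketch of the llama.cpp workflow using the included importance matrix. The tool names, flags, and the full-precision GGUF filename below are assumptions based on recent llama.cpp builds (they've changed between versions), so check your build's help output before relying on them.

```python
# Rough sketch of imatrix-assisted quantization via llama.cpp's CLI tools.
# Tool names and flags are assumptions (they vary across llama.cpp versions),
# and "laserxtral-f16.gguf" is a hypothetical filename for the unquantized model.
import subprocess

F16_MODEL = "laserxtral-f16.gguf"       # hypothetical full-precision GGUF
CALIB_DATA = "group_10_merged.txt"      # calibration text used for this repo's imatrix
IMATRIX = "imatrix_laserxtral.dat"      # the importance matrix included in this repo

# 1) Generate the importance matrix from calibration data (already done for this repo).
subprocess.run(
    ["./llama-imatrix", "-m", F16_MODEL, "-f", CALIB_DATA, "-o", IMATRIX],
    check=True,
)

# 2) Quantize to a SOTA 2-bit type (e.g. IQ2_XS), guided by the importance matrix.
subprocess.run(
    ["./llama-quantize", "--imatrix", IMATRIX, F16_MODEL, "laserxtral-iq2_xs.gguf", "IQ2_XS"],
    check=True,
)
```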
System Prompt
Alpaca format
```
### Instruction:
...
### Input:
...
### Response:
```
If you use LM Studio, this repo has a `model_config.json` you can import which has that pre-configured.
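
If you're not on LM Studio, a minimal way to try one of these quants with the Alpaca format is via llama-cpp-python. This is only a sketch: the GGUF filename, context size, and generation settings below are assumptions, not part of this repo.

```python
# Minimal sketch: running a quant from this repo with llama-cpp-python,
# formatted with the Alpaca template shown above.
# The filename "laserxtral-iq2_xs.gguf" and the settings are assumptions.
from llama_cpp import Llama

llm = Llama(
    model_path="laserxtral-iq2_xs.gguf",  # hypothetical 2-bit quant filename
    n_ctx=4096,
    n_gpu_layers=-1,  # offload all layers if you have the VRAM
)

prompt = (
    "### Instruction:\n"
    "Summarize the following text in one sentence.\n\n"
    "### Input:\n"
    "The quick brown fox jumps over the lazy dog, repeatedly, for no clear reason.\n\n"
    "### Response:\n"
)

result = llm(prompt, max_tokens=256, stop=["### Instruction:"])
print(result["choices"][0]["text"].strip())
```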
Base model: cognitivecomputations/laserxtral