Low-rank decomposition of valine/OpenSnark, extracted against teknium/OpenHermes-2.5-Mistral-7B as the base model.
Created using LoRD (Low-Rank Decomposition).
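The general idea behind this kind of extraction is to take the weight difference between the fine-tuned model and the base model and approximate it with a truncated SVD, yielding the familiar LoRA `B @ A` factorization. The sketch below illustrates that idea only; the function name, rank, and demo tensors are illustrative assumptions, not the actual LoRD implementation.

```python
# Illustrative sketch of LoRA extraction via truncated SVD (not the exact LoRD code).
import torch

def extract_lora(base_weight: torch.Tensor, finetuned_weight: torch.Tensor, rank: int = 32):
    """Approximate (finetuned - base) with a rank-`rank` product B @ A."""
    delta = (finetuned_weight - base_weight).float()          # weight delta to decompose (fp32 for a stable SVD)
    U, S, Vh = torch.linalg.svd(delta, full_matrices=False)   # SVD of the delta
    U, S, Vh = U[:, :rank], S[:rank], Vh[:rank, :]            # keep the top-`rank` components
    lora_B = U * S.sqrt()                                     # (out_features, rank)
    lora_A = S.sqrt().unsqueeze(1) * Vh                       # (rank, in_features)
    return lora_A, lora_B

# Toy check: W_ft ≈ W_base + lora_B @ lora_A when the delta is genuinely low-rank.
W_base = torch.randn(1024, 1024)
W_ft = W_base + torch.randn(1024, 8) @ torch.randn(8, 1024) * 0.01
A, B = extract_lora(W_base, W_ft, rank=8)
print(torch.norm(W_ft - (W_base + B @ A)) / torch.norm(W_ft - W_base))  # should be close to 0
```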
Model tree for thomasgauthier/OpenSnark-LoRD:
- Base model: mistralai/Mistral-7B-v0.1
- Finetuned: teknium/OpenHermes-2.5-Mistral-7B
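Assuming the repository contains a standard PEFT-format LoRA adapter (an assumption, not confirmed here), it could be attached to teknium/OpenHermes-2.5-Mistral-7B to approximate the behavior of valine/OpenSnark. The prompt and generation settings below are placeholders.

```python
# Hypothetical usage sketch, assuming this repo ships a standard PEFT LoRA adapter.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "teknium/OpenHermes-2.5-Mistral-7B"   # model the adapter was extracted against
adapter_id = "thomasgauthier/OpenSnark-LoRD"

tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype="auto", device_map="auto")
model = PeftModel.from_pretrained(model, adapter_id)  # attach the extracted low-rank adapter

prompt = "Tell me about low-rank adapters."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```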