basemodel:
- mistral-7b
rank: 128
alpha: 16

Model Details

Description: This model is a further-pretrained model based on Mistral-7B-v0.1, trained on various translation datasets from AI Hub.
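The rank and alpha values above determine how strongly the LoRA adapter perturbs the frozen base weights: the low-rank update is scaled by alpha / r. A minimal sketch of that arithmetic, using the card's r=128 and alpha=16 (the hidden size and target weight here are illustrative stand-ins, not taken from the adapter itself):

```python
import numpy as np

r, alpha = 128, 16           # values from the adapter config above
scaling = alpha / r          # LoRA scales the low-rank update by alpha / r

d = 4096                     # Mistral-7B hidden size (illustrative)
W = np.zeros((d, d))         # stand-in for a frozen base weight matrix
A = np.random.randn(r, d) * 0.01  # low-rank factor learned during training
B = np.zeros((d, r))         # B is initialized to zero, so the initial update is zero

W_adapted = W + scaling * (B @ A)  # effective weight seen at inference
```

With these values the update is scaled by 16 / 128 = 0.125, so a high-rank adapter like this one contributes a comparatively damped per-direction change.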

Downloads last month: 1,663
Safetensors model size: 7.24B params
Tensor type: BF16

Model tree for madatnlp/mist-enko-lora-2950

- Adapters: 1 model
- Quantizations: 1 model

Spaces using madatnlp/mist-enko-lora-2950: 6