Mistral-Data-r128-LoRA
This is a LoRA adapter extracted from a language model using mergekit.
LoRA Details
This LoRA adapter was extracted from RLHFlow/Llama3.1-8B-PRM-Mistral-Data and uses unsloth/Meta-Llama-3.1-8B-Instruct as a base.
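To apply the adapter, load the base model and attach this LoRA on top of it with PEFT. The snippet below is a minimal sketch, not an official usage recipe: the adapter path "Mistral-Data-r128-LoRA" is a placeholder for this repository's ID or a local directory, and the dtype setting is an assumption you should adjust to your hardware.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# Base model the LoRA was extracted against.
base = AutoModelForCausalLM.from_pretrained(
    "unsloth/Meta-Llama-3.1-8B-Instruct",
    torch_dtype=torch.bfloat16,  # assumption: pick a dtype suited to your hardware
)
tokenizer = AutoTokenizer.from_pretrained("unsloth/Meta-Llama-3.1-8B-Instruct")

# "Mistral-Data-r128-LoRA" is a placeholder for this adapter's repo ID or local path.
model = PeftModel.from_pretrained(base, "Mistral-Data-r128-LoRA")
```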
Parameters
The following command was used to extract this LoRA adapter:
```sh
mergekit-extract-lora RLHFlow/Llama3.1-8B-PRM-Mistral-Data unsloth/Meta-Llama-3.1-8B-Instruct OUTPUT_PATH --no-lazy-unpickle --skip-undecomposable --rank=128 --extend-vocab --model_name=Mistral-Data-r128-LoRA --verbose
```
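Because the adapter is a rank-128 decomposition, applying it to the base model only approximately reconstructs RLHFlow/Llama3.1-8B-PRM-Mistral-Data. As a hedged sketch, continuing from the loading example above, PEFT's merge_and_unload can fold the LoRA weights back into the base model; the output path is an assumption.

```python
# Fold the LoRA weights into the base model so it can be used or saved
# as a standalone checkpoint approximating the original fine-tuned model.
merged = model.merge_and_unload()
merged.save_pretrained("reconstructed-prm")      # output directory is an assumption
tokenizer.save_pretrained("reconstructed-prm")
```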