dtype: bfloat16
merge_method: passthrough
models:
  - model: unsloth/Meta-Llama-3.1-8B-Instruct+PJMixers-Dev/L3.1-Instruct-gemini-2.0-flash-thinking-exp-1219-v0.1-8B-QDoRA
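
This config uses mergekit's `+` syntax to apply the QDoRA adapter on top of the base Instruct model, with `passthrough` emitting the result unchanged in bfloat16. A minimal sketch of running the merge with the mergekit CLI (assumes mergekit is installed; `config.yaml` and `./merged` are placeholder paths, not from the source):

```shell
# Placeholder paths: save the YAML above as config.yaml,
# then write the merged model to ./merged
mergekit-yaml config.yaml ./merged
```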