I merged huggyllama/llama-30b with kaiokendev/SuperCOT-LoRA.

I forgot to increase the shard size, which is why there are so many .bin files. I am currently quantizing it to 4-bit 128g at the time of writing.
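For reference, here is a minimal sketch of how a merge like this can be done with the `peft` library, assuming the adapter loads directly from kaiokendev/SuperCOT-LoRA (this is not necessarily the exact procedure used here; the output directory name is illustrative). The `max_shard_size` argument to `save_pretrained` is what controls how many .bin shards the merged model is split into:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# Load the fp16 base model and apply the LoRA adapter on top of it.
base = AutoModelForCausalLM.from_pretrained(
    "huggyllama/llama-30b", torch_dtype=torch.float16
)
model = PeftModel.from_pretrained(base, "kaiokendev/SuperCOT-LoRA")

# Bake the LoRA weights into the base model so it can be saved standalone.
model = model.merge_and_unload()

tokenizer = AutoTokenizer.from_pretrained("huggyllama/llama-30b")

# A larger max_shard_size (e.g. "10GB") produces fewer, larger .bin shards;
# a smaller default is what leads to many small shard files.
model.save_pretrained("llama-30b-supercot", max_shard_size="10GB")
tokenizer.save_pretrained("llama-30b-supercot")
```

(In "4bit 128g", the "128g" is the quantization group size; 4-bit with group size 128 is a common GPTQ setting.)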