SuperHOT for 7B model has been released & I need orca_mini_7B-GPTQ merged with SuperHOT

#2 opened by SilverJim

Hello, TheBloke

SuperHOT for 7B model has been released: https://huggingface.co/kaiokendev/superhot-7b-8k-no-rlhf-test

I need orca_mini_7B-GPTQ with an 8k context size, but I am not sure whether SuperHOT works on orca_mini_7B (since SuperHOT is a LoRA for LLaMA, while orca_mini_7B is based on OpenLLaMA).

If it works, could you help me create orca_mini_7B-SuperHOT-8K-GPTQ?
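
For reference, I think the merge step itself would look roughly like this with PEFT, if the LoRA is compatible at all (a rough sketch; the orca_mini base repo ID is my guess, and as I understand it SuperHOT also needs the scaled-RoPE inference patch to actually reach 8k context):

```python
# Rough sketch of merging the SuperHOT LoRA into orca_mini before GPTQ
# quantisation -- the base repo ID below is a guess, not confirmed.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "psmathur/orca_mini_7b"                   # assumed orca_mini base repo
lora_id = "kaiokendev/superhot-7b-8k-no-rlhf-test"  # SuperHOT 7B LoRA

base = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype="auto")
model = PeftModel.from_pretrained(base, lora_id)
model = model.merge_and_unload()  # bake the LoRA deltas into the base weights

# Save the merged FP16 model; GPTQ quantisation would then run on this output.
# Note: using the full 8k context still requires the RoPE-scaling patch at inference.
model.save_pretrained("orca_mini_7b-superhot-8k")
AutoTokenizer.from_pretrained(base_id).save_pretrained("orca_mini_7b-superhot-8k")
```

Of course, if OpenLLaMA's weights diverge too much from LLaMA's, applying the LoRA deltas this way might just degrade quality, which is exactly my worry.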

Thank you!

