|
Don't be upsetti, here, have some spaghetti! Att: A'eala <3
|
|
|
<p><strong><font size="5">Information</font></strong></p> |
|
<p>GPT4-X-Alpasta-30b works with Oobabooga's Text Generation WebUI and KoboldAI.</p>
|
<p>This is an attempt at improving Open Assistant's performance as an instruct while retaining its excellent prose. The merge consists of <a href="https://huggingface.co/chansung/gpt4-alpaca-lora-30b">Chansung's GPT4-Alpaca Lora</a> and <a href="https://huggingface.co/OpenAssistant/oasst-sft-6-llama-30b-xor">Open Assistant's native fine-tune</a>.</p> |
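A LoRA adapter like the one merged here adds a low-rank update on top of the base weights, which can be folded in permanently. The sketch below illustrates that merge arithmetic with NumPy; the shapes, rank, and scaling factor are hypothetical stand-ins, not values from the actual checkpoints, and a real merge applies this per layer across the full model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: real LLaMA-30B layers are far larger.
d_out, d_in, r, alpha = 8, 8, 2, 4

W = rng.standard_normal((d_out, d_in))  # base (fine-tuned) weight
A = rng.standard_normal((r, d_in))      # LoRA down-projection
B = rng.standard_normal((d_out, r))     # LoRA up-projection

# Folding the adapter into the base weight: W' = W + (alpha / r) * B @ A
W_merged = W + (alpha / r) * (B @ A)

# The merged layer gives the same output as base + adapter applied separately.
x = rng.standard_normal(d_in)
y_split = W @ x + (alpha / r) * (B @ (A @ x))
y_merged = W_merged @ x
print(np.allclose(y_split, y_merged))
```

Once merged this way, the combined weights ship as a single checkpoint with no adapter needed at inference time.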
|
|
|
<p><strong><font size="5">Benchmarks</font></strong></p> |
|
|
|
<p><strong><font size="4">FP16</font></strong></p> |
|
|
|
<p><strong>Wikitext2</strong>: 4.6077961921691895</p>

<p><strong>Ptb-New</strong>: 9.41549301147461</p>

<p><strong>C4-New</strong>: 6.98392915725708</p>
|
|
|
<p>Benchmarks brought to you by A'eala</p> |
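The scores above are perplexities: the exponential of the mean per-token negative log-likelihood over the evaluation corpus (lower is better). A minimal sketch of that computation, using made-up per-token losses rather than real model outputs:

```python
import math

def perplexity(token_nlls):
    """Perplexity = exp of the mean per-token negative log-likelihood (in nats)."""
    return math.exp(sum(token_nlls) / len(token_nlls))

# Hypothetical per-token NLLs; a real evaluation averages over the whole dataset.
nlls = [1.2, 1.8, 1.5, 1.6]
print(perplexity(nlls))
```

A real Wikitext2 run feeds the dataset through the model in fixed-length windows and averages the cross-entropy loss over all tokens before exponentiating.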
|
|
|
|
|
|
<p><strong><font size="4">GPTQ 4Bit Act-order True-Sequential</font></strong></p> |
|
|
|
<p><strong>Wikitext2</strong>: 4.981262683868408</p>

<p><strong>Ptb-New</strong>: 9.737570762634277</p>

<p><strong>C4-New</strong>: 7.332120418548584</p>
|
|
|
<p>Benchmarks brought to you by Askmyteapot</p> |