---
tags:
- text-generation
- 8bit
- 8-bit
- quantization
- compression
inference: false
license: apache-2.0
---
# ethzanalytics/gpt-j-8bit-daily_dialogues_1E
This is a version of `hivemind/gpt-j-6B-8bit` fine-tuned on the Wizard of Wikipedia dataset for 10k steps on an A100. It can be used as a chatbot.
_NOTE: as with all 8-bit models, this model must be loaded via the special patching technique outlined in the `hivemind/gpt-j-6B-8bit` model card._
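
Until the README is filled in, here is a minimal loading sketch. It assumes the 8-bit monkey-patch from the `hivemind/gpt-j-6B-8bit` model card (which replaces GPT-J's linear layers with frozen 8-bit versions) has already been run in the session; the patch itself is not reproduced here, and the prompt format shown is illustrative rather than the exact format used in training.

```python
# Sketch only: run the 8-bit patching code from the hivemind/gpt-j-6B-8bit
# model card (or the demo notebook linked below) BEFORE executing this.
import torch
from transformers import AutoTokenizer, GPTJForCausalLM

model_name = "ethzanalytics/gpt-j-8bit-daily_dialogues_1E"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = GPTJForCausalLM.from_pretrained(model_name, low_cpu_mem_usage=True)
model = model.to("cuda" if torch.cuda.is_available() else "cpu")

# Example prompt format (assumption, for illustration only)
prompt = "Person Alpha: Hi, how are you today?\nPerson Beta:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

with torch.no_grad():
    outputs = model.generate(
        **inputs,
        max_new_tokens=64,
        do_sample=True,
        top_p=0.95,
        temperature=0.7,
    )
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```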
TODO: rest of README
---
[original demo link](https://colab.research.google.com/gist/pszemraj/76c0a80c9eacfb2c31e21c4cceb344a0/ai-msgbot-gpt-j-6b-8bit-chatbot-demo.ipynb)