---
language:
- de
pipeline_tag: text-generation
tags:
- bloom
- lora
- LLM
---
GitHub: https://github.com/abdullahalzubaer/bloom-6b4-clp-german-lora-inference

Dataset used to train the adapter:
- yizhongw/self_instruct [translated to German]: https://huggingface.co/datasets/yizhongw/self_instruct

See this thread for more details: https://huggingface.co/asprenger/bloom-6b4-clp-german-instruct-lora/discussions/2
This LoRA adapter is from https://huggingface.co/asprenger/bloom-6b4-clp-german-instruct-lora. Thanks for the adapter! I did not train it.

I initially thought I was uploading the complete bloom-6b4-clp-german model with the adapter merged in, but after pushing I realized it was only the adapter. I am still exploring how PEFT works with LoRA :)
Strict requirement for peft:

`peft==0.2.0`

Installation:

`pip install transformers accelerate bitsandbytes peft==0.2.0`

The latest peft releases have breaking changes with bloom-6b4-clp-german and this LoRA adapter; the only way to get them working together is (I think) to retrain either the base model or the adapter (I am not sure yet).
References:
- https://github.com/linhduongtuan/BLOOM-LORA/issues/5
- https://github.com/huggingface/peft/issues/276