---
license: apache-2.0
datasets:
- sahil2801/CodeAlpaca-20k
- ehartford/leet10k-alpaca
- yahma/alpaca-cleaned
tags:
- alpaca
- llama
---
This repository contains a LoRA checkpoint, built on the Alpaca-LoRA implementation and fine-tuned to improve performance on coding tasks. The checkpoint can be used with the `generate.py` script from the Alpaca-LoRA repository for text generation and other instruction-following tasks.
## Setup
1. Clone the Alpaca-LoRA repository:
   ```bash
   git clone https://github.com/tloen/alpaca-lora.git
   ```
2. Install the required packages:
   ```bash
   pip install -r requirements.txt
   ```
## Usage
Use the `generate.py` script from the Alpaca-LoRA repository to run the model with the provided LoRA checkpoint:
```bash
python generate.py --load_8bit --base_model 'decapoda-research/llama-7b-hf' --lora_weights 'vihangd/leet-coding-alpaca-lora'
```
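
Alternatively, if you prefer to load the checkpoint without `generate.py`, the following is a minimal sketch using `transformers` and `peft` with the same base model and adapter IDs as above. The prompt template mirrors the standard Alpaca instruction format, and the example instruction is only illustrative:

```python
import torch
from peft import PeftModel
from transformers import LlamaForCausalLM, LlamaTokenizer

BASE_MODEL = "decapoda-research/llama-7b-hf"
LORA_WEIGHTS = "vihangd/leet-coding-alpaca-lora"

# Load the base LLaMA model in 8-bit and attach the LoRA adapter.
tokenizer = LlamaTokenizer.from_pretrained(BASE_MODEL)
model = LlamaForCausalLM.from_pretrained(
    BASE_MODEL,
    load_in_8bit=True,
    torch_dtype=torch.float16,
    device_map="auto",
)
model = PeftModel.from_pretrained(model, LORA_WEIGHTS, torch_dtype=torch.float16)
model.eval()

# Standard Alpaca-style prompt template; the instruction is just an example.
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\nWrite a Python function that reverses a linked list.\n\n"
    "### Response:\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```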
### Troubleshooting
If you encounter an error like `AttributeError: 'NoneType' object has no attribute 'device'`, modify the `PeftModel.from_pretrained` call in `generate.py` as follows (see [issue #21](https://github.com/tloen/alpaca-lora/issues/21)):
```python
model = PeftModel.from_pretrained(
    model,
    lora_weights,
    torch_dtype=torch.float16,
    device_map={'': 0}
)
```
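The explicit `device_map={'': 0}` places every module, including the adapter weights, on GPU 0 rather than leaving them without an assigned device, which appears to be what triggers the error above.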