---
license: apache-2.0
datasets:
- sahil2801/CodeAlpaca-20k
- ehartford/leet10k-alpaca
- argilla/alpaca_data_cleaned
tags:
- alpaca
- llama
---

This repository contains a LoRA checkpoint for LLaMA, trained with the [Alpaca-LoRA](https://github.com/tloen/alpaca-lora) implementation to improve performance on coding tasks. The checkpoint can be used with the `generate.py` script from the Alpaca-LoRA repository to generate text or perform other NLP tasks.

## Setup

1. Clone the Alpaca-LoRA repository:

   ```bash
   git clone https://github.com/tloen/alpaca-lora.git
   ```

2. Install the required packages from inside the cloned directory:

   ```bash
   cd alpaca-lora
   pip install -r requirements.txt
   ```

## Usage

Use the `generate.py` script from the Alpaca-LoRA repository to run the model with the provided LoRA checkpoint:

```bash
python generate.py --load_8bit --base_model 'decapoda-research/llama-7b-hf' --lora_weights 'vihangd/leet-coding-alpaca-lora'
```
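
If you prefer to load the checkpoint programmatically rather than through `generate.py`, a minimal sketch with `transformers` and `peft` might look like the following. This is an illustration, not code from the upstream repository: it assumes recent versions of `transformers`, `peft`, and `bitsandbytes` are installed, and the prompt template follows the standard Alpaca instruction format.

```python
import torch
from peft import PeftModel
from transformers import LlamaForCausalLM, LlamaTokenizer

base_model = "decapoda-research/llama-7b-hf"
lora_weights = "vihangd/leet-coding-alpaca-lora"

tokenizer = LlamaTokenizer.from_pretrained(base_model)
model = LlamaForCausalLM.from_pretrained(
    base_model,
    load_in_8bit=True,        # mirrors the --load_8bit flag above
    torch_dtype=torch.float16,
    device_map="auto",
)
# Apply the LoRA adapter on top of the base model
model = PeftModel.from_pretrained(model, lora_weights, torch_dtype=torch.float16)
model.eval()

# Alpaca-style instruction prompt (template assumed from the Alpaca-LoRA repo)
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\nWrite a Python function that checks whether a string is a palindrome.\n\n"
    "### Response:\n"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```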

### Troubleshooting

If you encounter an error like:

```
AttributeError: 'NoneType' object has no attribute 'device'
```

Modify the `PeftModel.from_pretrained` call in `generate.py` as follows (see [issue #21](https://github.com/tloen/alpaca-lora/issues/21)):

```python
model = PeftModel.from_pretrained(
    model,
    lora_weights,
    torch_dtype=torch.float16,
    device_map={'': 0},  # pin all modules to the first GPU
)
```
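
The `device_map={'': 0}` argument maps the root module (the empty-string key) to GPU 0, placing every weight on a concrete device; this appears to avoid the case where some modules are left without an assigned device, which is what triggers the `AttributeError` above.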