---
library_name: peft
base_model: meta-llama/Llama-2-13b-hf
language: en
license: mit
---
# Llama-2-13b-ocr
This model is released as part of the paper [Leveraging LLMs for Post-OCR Correction of Historical Newspapers](https://aclanthology.org/2024.lt4hala-1.14/) and is designed to correct text produced by OCR. [Llama 2 13B](https://huggingface.co/meta-llama/Llama-2-13b-hf) is instruction-tuned for post-OCR correction of historical English using [BLN600](https://aclanthology.org/2024.lrec-main.219/), a parallel corpus of 19th-century newspaper text with machine and human transcriptions.
## Usage
```python
from peft import AutoPeftModelForCausalLM
from transformers import AutoTokenizer, BitsAndBytesConfig
import torch

# 4-bit NF4 quantisation config to reduce GPU memory usage
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_use_double_quant=True,
    bnb_4bit_quant_type='nf4',
    bnb_4bit_compute_dtype=torch.bfloat16,
)

# Load the PEFT adapter together with the quantised Llama 2 13B base model
model = AutoPeftModelForCausalLM.from_pretrained(
    'pykale/llama-2-13b-ocr',
    quantization_config=bnb_config,
    low_cpu_mem_usage=True,
    torch_dtype=torch.float16,
)
tokenizer = AutoTokenizer.from_pretrained('pykale/llama-2-13b-ocr')

# OCR output containing errors to be corrected
ocr = "The defendant wits'fined £5 and costs."

# Instruction prompt in the format used for fine-tuning
prompt = f"""### Instruction:
Fix the OCR errors in the provided text.

### Input:
{ocr}

### Response:
"""

input_ids = tokenizer(prompt, max_length=1024, return_tensors='pt', truncation=True).input_ids.cuda()

# Generate the corrected text
with torch.inference_mode():
    outputs = model.generate(input_ids=input_ids, max_new_tokens=1024, do_sample=True, temperature=0.7, top_p=0.1, top_k=40)

# Decode and strip the prompt, keeping only the model's correction
pred = tokenizer.batch_decode(outputs.detach().cpu().numpy(), skip_special_tokens=True)[0][len(prompt):].strip()
print(pred)
```
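When correcting several passages, the prompt-building and generation steps above can be wrapped into a small helper. The sketch below is a minimal example that reuses the `model` and `tokenizer` loaded above; the `correct_ocr` function and the example `passages` list are illustrative assumptions, not part of the released code.

```python
def correct_ocr(ocr: str) -> str:
    """Correct a single OCR passage using the prompt format shown above."""
    prompt = f"""### Instruction:
Fix the OCR errors in the provided text.

### Input:
{ocr}

### Response:
"""
    input_ids = tokenizer(prompt, max_length=1024, return_tensors='pt', truncation=True).input_ids.cuda()
    with torch.inference_mode():
        outputs = model.generate(input_ids=input_ids, max_new_tokens=1024, do_sample=True, temperature=0.7, top_p=0.1, top_k=40)
    decoded = tokenizer.batch_decode(outputs.detach().cpu().numpy(), skip_special_tokens=True)[0]
    # Drop the echoed prompt, keeping only the corrected text
    return decoded[len(prompt):].strip()

# Hypothetical example: correct a few noisy passages one at a time
passages = [
    "The defendant wits'fined £5 and costs.",
    "The prisoner was remanded for a weok.",
]
for p in passages:
    print(correct_ocr(p))
```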
## Citation
```
@inproceedings{thomas-etal-2024-leveraging,
title = "Leveraging {LLM}s for Post-{OCR} Correction of Historical Newspapers",
author = "Thomas, Alan and Gaizauskas, Robert and Lu, Haiping",
editor = "Sprugnoli, Rachele and Passarotti, Marco",
booktitle = "Proceedings of the Third Workshop on Language Technologies for Historical and Ancient Languages (LT4HALA) @ LREC-COLING-2024",
month = "may",
year = "2024",
address = "Torino, Italia",
publisher = "ELRA and ICCL",
url = "https://aclanthology.org/2024.lt4hala-1.14",
pages = "116--121",
}
```