---
license: mit
---

## How to use

First, clone the forked `transformers` repository and install its dependencies:

```bash
git clone https://github.com/ddevaul/transformers transformers_outer
cd transformers_outer
pip install -r requirements.txt
cd ..
```

Then, in your own script, add the fork to the import path, configure the tokenizers, and build the model:

```python
import sys
import torch

# Make the forked library importable as `transformers2`.
sys.path.append('./transformers_outer/src')
from transformers2 import BertConfig, BertTokenizer
from transformers2.models.bert import BertForMaskedLM

# Use a GPU if one is available.
device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

# Character-level and wordpiece tokenizers used by the model.
preload_path = 'cabrooks/character-level-logion'
char_tokenizer = BertTokenizer.from_pretrained(preload_path)
wordpiece_tokenizer = BertTokenizer.from_pretrained("cabrooks/LOGION-50k_wordpiece")

config = BertConfig()
config.word_piece_vocab_size = 50000
config.vocab_size = char_tokenizer.vocab_size
config.char_tokenizer = char_tokenizer
config.wordpiece_tokenizer = wordpiece_tokenizer
config.max_position_embeddings = 1024
config.device2 = device

model = BertForMaskedLM(config).to(device)
```

Download the weights file `my_custom_model.pth` from this repository, then load them into the model:

```python
model.load_state_dict(torch.load('my_custom_model.pth', map_location=torch.device('cpu')))
```

You are now ready to use the model.
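
As a quick sanity check, here is a minimal sketch of masked-character prediction. It assumes the forked `BertForMaskedLM` keeps the standard Hugging Face masked-LM interface (a `logits` field on the model output); the input text is only a placeholder, so substitute your own Greek passage.

```python
model.eval()

# Placeholder input: mask one character of a Greek word (illustrative only).
text = "παρα" + char_tokenizer.mask_token + "ειγμα"
inputs = char_tokenizer(text, return_tensors='pt').to(device)

with torch.no_grad():
    logits = model(**inputs).logits

# Take the highest-scoring token at the masked position.
mask_pos = (inputs['input_ids'] == char_tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_id = logits[0, mask_pos].argmax(dim=-1)
print(char_tokenizer.decode(predicted_id))
```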

## Cite

If you use this model in your research, please cite the paper:

```bibtex
@misc{logion-base,
      title={Logion: Machine Learning for Greek Philology},
      author={Cowen-Breen, C. and Brooks, C. and Haubold, J. and Graziosi, B.},
      year={2023},
      eprint={2305.01099},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```