|
---
license: mit
---
|
|
|
|
|
## How to use |
|
|
|
First, clone the repository and install its requirements:
|
|
|
```bash
git clone https://github.com/ddevaul/desformers
cd desformers
pip install -r requirements.txt
cd ..
```
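
Optionally, you can verify the setup by checking that the bundled `transformers2` package imports. This is a quick sanity check (not part of the original instructions); run it from the directory above `desformers/`:

```python
# Optional sanity check: the repo vendors a `transformers2` package under
# desformers/src; it should import without error after installation.
import sys
sys.path.append('./desformers/src')
import transformers2
print('transformers2 loaded from', transformers2.__file__)
```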
|
|
|
Now, in your own Python script, add the following setup code:
|
|
|
```python
import sys
import torch

# Make the vendored `transformers2` package importable.
sys.path.append('./desformers/src')

from transformers2 import BertConfig, BertTokenizer
from transformers2.models.bert import BertForMaskedLM

device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

# Load the character-level and wordpiece tokenizers.
char_tokenizer = BertTokenizer.from_pretrained('cabrooks/character-level-logion')
wordpiece_tokenizer = BertTokenizer.from_pretrained('cabrooks/LOGION-50k_wordpiece')

# Configure the model with both vocabularies and tokenizers.
config = BertConfig()
config.word_piece_vocab_size = 50000
config.vocab_size = char_tokenizer.vocab_size
config.char_tokenizer = char_tokenizer
config.wordpiece_tokenizer = wordpiece_tokenizer
config.max_position_embeddings = 1024
config.device2 = device

model = BertForMaskedLM(config).to(device)
```
|
Download the model weights file `my_custom_model.pth` from this repository.
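
If you prefer to fetch the file programmatically, `huggingface_hub` can download it. This is a sketch; the repo id below is a placeholder for this model's actual repository id:

```python
from huggingface_hub import hf_hub_download

# Placeholder repo id -- substitute this model's actual repository id
# on the Hugging Face Hub.
weights_path = hf_hub_download(repo_id='<model-repo-id>', filename='my_custom_model.pth')
```

If you download the file this way, pass `weights_path` to `torch.load` in the next step.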
|
Load these weights into the model: |
|
|
|
```python
# map_location='cpu' lets the weights load even on machines without a GPU.
model.load_state_dict(torch.load('my_custom_model.pth', map_location=torch.device('cpu')))
```
|
|
|
You are now ready to use the model. |
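
For example, here is a minimal masked-prediction sketch. It assumes the custom `BertForMaskedLM` keeps the standard Hugging Face masked-LM call signature and returns `.logits`; the Greek input is purely illustrative:

```python
# A minimal masked-prediction sketch. Assumes the model follows the standard
# Hugging Face masked-LM interface; the input text is purely illustrative.
model.eval()
text = f"τον {char_tokenizer.mask_token}ογον"  # mask a single character
inputs = char_tokenizer(text, return_tensors='pt').to(device)

with torch.no_grad():
    logits = model(**inputs).logits

# Find the masked position and report the top 5 character predictions.
mask_positions = (inputs['input_ids'][0] == char_tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
top_ids = logits[0, mask_positions[0]].topk(5).indices.tolist()
print(char_tokenizer.convert_ids_to_tokens(top_ids))
```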
|
|
|
## Author
|
This model was developed by Desmond DeVaul for his senior thesis at Princeton University. |
|
|
|
It was built on the work of the Logion team at Princeton: https://www.logionproject.princeton.edu. |