---
license: mit
---


## How to use

First, clone the repository and install its requirements:

```bash
git clone https://github.com/ddevaul/desformers
cd desformers
pip install -r requirements.txt
cd ..
```

Then, in your Python script, add the following:

```python
import sys
import torch

# Make the modified transformers fork importable
sys.path.append('./desformers/src')
from transformers2 import BertConfig, BertTokenizer
from transformers2.models.bert import BertForMaskedLM

device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

preload_path = 'cabrooks/character-level-logion'
char_tokenizer = BertTokenizer.from_pretrained(preload_path)
wordpiece_tokenizer = BertTokenizer.from_pretrained("cabrooks/LOGION-50k_wordpiece")

config = BertConfig()
config.word_piece_vocab_size = 50000
config.vocab_size = char_tokenizer.vocab_size
config.char_tokenizer = char_tokenizer
config.wordpiece_tokenizer = wordpiece_tokenizer
config.max_position_embeddings = 1024
config.device2 = device

model = BertForMaskedLM(config).to(device)
```
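The config above wires in two tokenizers: a character-level one (which also supplies `vocab_size`) and a 50k-entry wordpiece one. As a toy illustration of the distinction — not the actual `BertTokenizer` behavior — a character-level tokenizer emits one token per character rather than per subword:

```python
def char_tokenize(text):
    # Toy character-level tokenization: one token per character
    return list(text)

print(char_tokenize("λόγος"))  # → ['λ', 'ό', 'γ', 'ο', 'ς']
```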
Download the weights file `my_custom_model.pth` and load it into the model:

```python
model.load_state_dict(torch.load('my_custom_model.pth', map_location=torch.device('cpu')))
model.eval()  # disable dropout before running inference
```

You are now ready to use the model.
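The snippet below sketches how top-k candidate tokens can be read off the logits of a masked-LM forward pass. It uses a dummy tensor in place of the real model output, since the actual forward signature of the `transformers2` fork may differ from stock `transformers`:

```python
import torch

# Dummy stand-in for the model's output logits: a real forward pass
# returns a (batch, sequence_length, vocab_size) tensor.
vocab_size = 30
seq_len = 8
logits = torch.randn(1, seq_len, vocab_size)

mask_index = 3  # position of the [MASK] token in the input

# Convert the logits at the masked position into probabilities
probs = torch.softmax(logits[0, mask_index], dim=-1)

# Top-5 candidate token ids, most likely first
top_probs, top_ids = probs.topk(5)
print(top_ids.tolist())
```

The ids returned by `topk` can then be mapped back to characters with the tokenizer's vocabulary.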

## Author
This model was developed by Desmond DeVaul for his senior thesis at Princeton University.

It was built on the work of the Logion team at Princeton: https://www.logionproject.princeton.edu.