---
datasets:
- EleutherAI/pile
language:
- en
tags:
- Text Generation
- pytorch
- causal-lm
---
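
A Pythia-160m model trained on the Pile with token order reversed, so it generates text right-to-left. Example usage below: flip the prompt's token ids before calling `generate`, then flip the generated sequence back before decoding.
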
```python
import torch as t
from transformers import GPTNeoXForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(
    "afterless/reverse-pythia-160m"
)
model = GPTNeoXForCausalLM.from_pretrained(
    "afterless/reverse-pythia-160m"
)

inputs = tokenizer(
    "but I told him, the cheese was the best",
    return_token_type_ids=False,
    return_tensors="pt"
)
# The model generates right-to-left, so reverse the prompt's
# token ids before generation ...
inputs["input_ids"] = t.flip(inputs.input_ids, (1,))
# ... then reverse the generated sequence back before decoding
tokens = t.flip(model.generate(**inputs), (1,))
print(tokenizer.decode(tokens[0]))
```
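
Because generation runs right-to-left, the prompt acts as the ending of the passage: the decoded output is the model's continuation of the text that comes *before* it.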