---
datasets:
- laion/laion2B-en-joined
---
## Inference
```python
from transformers import T5ForConditionalGeneration, AutoTokenizer

# Load the ByT5 tokenizer and the fine-tuned URL classification model
tokenizer = AutoTokenizer.from_pretrained("google/byt5-small")
model = T5ForConditionalGeneration.from_pretrained("marianna13/byt5-small-NSFW-image-urls")

def get_label(text):
    # Tokenize the input URL(s), generate, and decode the predicted label(s)
    input_ids = tokenizer(text, return_tensors="pt", padding=True).input_ids
    outputs = model.generate(input_ids)
    label = tokenizer.batch_decode(outputs, skip_special_tokens=True)
    return label
```
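A minimal usage sketch: `get_label` accepts a single string or a list of URL strings and returns a list with one decoded label per input. The URL below is a hypothetical example, and the exact label strings the model emits are not documented here.

```python
# Hypothetical input URL; the returned label strings depend on the
# labels used during fine-tuning and are an assumption here.
urls = ["https://example.com/images/photo_001.jpg"]

labels = get_label(urls)
print(labels)  # list with one predicted label per input URL
```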