
## Inference

```python
from transformers import T5ForConditionalGeneration, AutoTokenizer

# ByT5 operates directly on UTF-8 bytes, so the base ByT5 tokenizer is reused
tokenizer = AutoTokenizer.from_pretrained("google/byt5-small")
model = T5ForConditionalGeneration.from_pretrained("marianna13/byt5-small-NSFW-image-urls")

def get_label(text):
    # Tokenize the URL(s); padding allows a batch of variable-length inputs
    input_ids = tokenizer(text, return_tensors="pt", padding=True).input_ids
    # Generate the predicted label token ids and decode them back to strings
    outputs = model.generate(input_ids)
    label = tokenizer.batch_decode(outputs, skip_special_tokens=True)
    return label
```
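Because ByT5 is byte-level, it needs no learned vocabulary for URLs: per the ByT5 documentation, each UTF-8 byte maps to the token id `byte + 3`, reserving ids 0 (pad), 1 (EOS), and 2 (unk) for special tokens. A minimal sketch of that mapping, without loading the tokenizer (the function name `byt5_byte_ids` is illustrative, not part of the library):

```python
def byt5_byte_ids(text):
    # ByT5 tokenizes raw UTF-8 bytes; each id is the byte value offset by 3,
    # since ids 0, 1, 2 are reserved for pad, EOS, and unk special tokens.
    return [b + 3 for b in text.encode("utf-8")] + [1]  # append EOS (id 1)

byt5_byte_ids("http")  # → [107, 119, 119, 115, 1]
```

This is why the model handles arbitrary URL strings, including unusual characters, without out-of-vocabulary issues.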