
Den4ikAI/FRED-T5-LARGE_text_qa

The model is trained to answer questions based on a provided text passage.

Wandb: link

Usage

import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM, GenerationConfig

use_cuda = torch.cuda.is_available()
device = torch.device("cuda" if use_cuda else "cpu")
generation_config = GenerationConfig.from_pretrained("Den4ikAI/FRED-T5-LARGE_text_qa")
tokenizer = AutoTokenizer.from_pretrained("Den4ikAI/FRED-T5-LARGE_text_qa")
model = AutoModelForSeq2SeqLM.from_pretrained("Den4ikAI/FRED-T5-LARGE_text_qa").to(device)
model.eval()

def generate(prompt):
    # Tokenize the prompt and move the tensors to the same device as the model.
    data = tokenizer(prompt, return_tensors="pt").to(model.device)
    output_ids = model.generate(
        **data,
        generation_config=generation_config
    )[0]
    # Echo the prompt as the model sees it, then decode the generated answer.
    print(tokenizer.decode(data["input_ids"][0].tolist()))
    out = tokenizer.decode(output_ids.tolist())
    return out

while True:
    text = input('Текст: ')
    question = input('Вопрос: ')
    # Prompt format expected by the model: the <SC6> prefix, the source text,
    # the question, and the answer sentinel.
    prompt = f"<SC6>Текст: {text}\nВопрос: {question}\nОтвет: <extra_id_0>"
    print(generate(prompt))
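
Because the example decodes the raw output ids without skip_special_tokens, the returned string may still contain T5 service tokens such as <pad>, </s> and the <extra_id_0> sentinel. Below is a minimal post-processing sketch; the exact set of tokens to strip is an assumption, so check the tokenizer's actual output:

def extract_answer(raw_output):
    # Strip T5 service tokens left in the decoded string (assumed token set).
    for token in ("<pad>", "</s>", "<extra_id_0>"):
        raw_output = raw_output.replace(token, "")
    return raw_output.strip()

# Example: print(extract_answer(generate(prompt)))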
Model size: 820M parameters, BF16 weights (Safetensors).
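
Since the published weights are stored in bfloat16, the checkpoint can optionally be loaded directly in that dtype to reduce memory use. A minimal sketch using the standard torch_dtype argument of from_pretrained:

import torch
from transformers import AutoModelForSeq2SeqLM

# Optional: load the weights in bfloat16 to roughly halve the memory footprint.
model = AutoModelForSeq2SeqLM.from_pretrained(
    "Den4ikAI/FRED-T5-LARGE_text_qa",
    torch_dtype=torch.bfloat16,
).to("cuda" if torch.cuda.is_available() else "cpu")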
