## BLEURT

PyTorch version of the original BLEURT models from the ACL paper "BLEURT: Learning Robust Metrics for Text Generation" by Thibault Sellam, Dipanjan Das and Ankur P. Parikh of Google Research.

The code for the model conversion originated from this notebook mentioned here.

## Usage Example

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer
import torch

tokenizer = AutoTokenizer.from_pretrained("Elron/bleurt-base-128")
model = AutoModelForSequenceClassification.from_pretrained("Elron/bleurt-base-128")
model.eval()

references = ["hello world", "hello world"]
candidates = ["hi universe", "bye world"]

with torch.no_grad():
    # Each (reference, candidate) pair is encoded together; the model
    # returns a single regression logit per pair as the BLEURT score.
    scores = model(**tokenizer(references, candidates, return_tensors='pt'))[0].squeeze()

print(scores)  # tensor([0.3598, 0.0723])
```
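
The two pairs above happen to tokenize to equal-length sequences, so they can be stacked into a batch without padding. For arbitrary inputs, enabling padding and truncation keeps the batch rectangular. The sketch below is not part of the original card; it assumes the `128` in the model name refers to a maximum sequence length of 128 tokens.

```python
# Minimal sketch: scoring variable-length (reference, candidate) pairs in one batch.
# Assumption: max_length=128 matches the "128" in the model name.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Elron/bleurt-base-128")
model = AutoModelForSequenceClassification.from_pretrained("Elron/bleurt-base-128")
model.eval()

references = ["the cat sat on the mat", "a quick brown fox"]
candidates = ["a cat was sitting on a mat", "the fox is quick and brown"]

inputs = tokenizer(
    references,
    candidates,
    padding=True,       # pad to the longest pair in the batch
    truncation=True,    # clip pairs longer than max_length
    max_length=128,
    return_tensors="pt",
)

with torch.no_grad():
    scores = model(**inputs).logits.squeeze(-1)  # one score per pair

print(scores.tolist())
```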