
# T5-base data-to-text model specialized for Finance NLG

Complete version.

## Usage (HuggingFace Transformers)

Load the model and tokenizer, and prepare an input:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("yseop/FNP_T5_D2T_complete")
model = AutoModelForSeq2SeqLM.from_pretrained("yseop/FNP_T5_D2T_complete")

text = ["Group profit | valIs | € 115.7 million && € 115.7 million | dTime | in 2019"]
```
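The input linearizes `subject | relation | object` triples, joined by `&&` when there is more than one. As a sketch of how such strings can be assembled from structured data (the `linearize` helper below is a hypothetical name, not part of this model's API):

```python
# Hypothetical helper: join (subject, relation, object) triples
# into the "a | r | b && c | r | d" format the model expects.
def linearize(triples):
    return " && ".join(" | ".join(t) for t in triples)

triples = [
    ("Group profit", "valIs", "€ 115.7 million"),
    ("€ 115.7 million", "dTime", "in 2019"),
]
print(linearize(triples))
# Group profit | valIs | € 115.7 million && € 115.7 million | dTime | in 2019
```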

Choose a generation method. Option 1 uses top-p/top-k sampling:
```python
# Option 1: top-p / top-k sampling
input_ids = tokenizer.encode(": {}".format(text[0]), return_tensors="pt")

p = 0.82
k = 90

outputs = model.generate(input_ids,
                         do_sample=True,
                         top_p=p,
                         top_k=k)

print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
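For intuition on what `top_k` and `top_p` do: sampling is restricted to the `k` most probable tokens, and then to the smallest prefix of those whose cumulative probability reaches `p`. A minimal illustrative sketch of that filtering step (not the actual Transformers implementation):

```python
def top_k_top_p_filter(probs, k, p):
    """Keep the k most probable tokens, then the smallest prefix whose
    cumulative probability reaches p; return surviving (index, prob) pairs."""
    ranked = sorted(enumerate(probs), key=lambda x: x[1], reverse=True)[:k]
    kept, cum = [], 0.0
    for idx, pr in ranked:
        kept.append((idx, pr))
        cum += pr
        if cum >= p:
            break
    return kept

probs = [0.5, 0.3, 0.15, 0.05]  # toy next-token distribution
print(top_k_top_p_filter(probs, k=3, p=0.82))
# [(0, 0.5), (1, 0.3), (2, 0.15)]
```

The surviving probabilities are renormalized before the next token is drawn, so low-probability tails never get sampled.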

```python
# Option 2: beam search
input_ids = tokenizer.encode(": {}".format(text[0]), return_tensors="pt")

outputs = model.generate(input_ids,
                         max_length=200,
                         num_beams=2,
                         repetition_penalty=2.5,
                         top_k=50,
                         top_p=0.98,
                         length_penalty=1.0,
                         early_stopping=True)

print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Created by: Yseop, a pioneer in Natural Language Generation (NLG) technology, scaling human expertise through NLG.
