INT8 ONNX version of philschmid/flan-t5-base-samsum, for use with Transformers.js.
Example usage:

```js
import { pipeline } from '@xenova/transformers';

const generator = await pipeline('text2text-generation', 'Felladrin/onnx-flan-t5-base-samsum');

const output = await generator(
  "Val: it's raining! Candy: I know, just started... Val: r we going? we will be wet Candy: maybe wait a little? see if stops Val: ok. let's wait half h and than see Candy: god idea, I call u then Val: great :)",
  { add_special_tokens: true, max_new_tokens: 60, repetition_penalty: 1.2 },
);

console.log(output); // It's raining. Val and Candy will wait half an hour and then see if...
```
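The model takes the whole dialogue as a single string of `Speaker: utterance` turns, as in the example above. A small helper for assembling that string from structured turns might look like this (a sketch; `formatDialogue` is a hypothetical name, not part of Transformers.js):

```js
// Hypothetical helper: joins [speaker, utterance] pairs into the
// "Speaker: text Speaker: text ..." format passed to the generator above.
function formatDialogue(turns) {
  return turns.map(([speaker, text]) => `${speaker}: ${text}`).join(' ');
}

// Example:
const dialogue = formatDialogue([
  ['Val', "it's raining!"],
  ['Candy', 'I know, just started...'],
]);
console.log(dialogue); // Val: it's raining! Candy: I know, just started...
```

The resulting string can then be passed directly as the first argument to `generator(...)`.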
Model tree for Felladrin/onnx-flan-t5-base-samsum

- Base model: philschmid/flan-t5-base-samsum