# TinyStories-1M (ONNX format)

ONNX export of [roneneldan/TinyStories-1M](https://huggingface.co/roneneldan/TinyStories-1M), for use with [Transformers.js](https://huggingface.co/docs/transformers.js/index).

```js
import { pipeline } from "@huggingface/transformers";

// Load the text-generation pipeline with this model
const pipe = await pipeline(
  "text-generation",
  "mkly/TinyStories-1M-ONNX",
);

// Generate a continuation of the prompt
const response = await pipe(
  "Some example text",
  {
    max_new_tokens: 500,
    temperature: 0.9,
    do_sample: true, // temperature only takes effect when sampling is enabled
  },
);
console.log(response[0].generated_text);
```
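To run the snippet above in Node.js, the Transformers.js package must be installed first (assuming the current package name `@huggingface/transformers`; older releases were published as `@xenova/transformers`):

```shell
# Install Transformers.js (Node.js 18+ recommended)
npm install @huggingface/transformers
```

The model weights are downloaded from the Hugging Face Hub and cached locally on first use.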