# ONNX format

ONNX weights generated from roneneldan/TinyStories-1M, for use with Transformers.js.
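Transformers.js needs to be installed before running the snippet below. A minimal setup, assuming the current npm distribution of the library (published as `@huggingface/transformers`; earlier versions were published as `@xenova/transformers`):

```shell
# Install Transformers.js from npm
npm install @huggingface/transformers
```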

```js
import { pipeline } from "@huggingface/transformers";

// Load the text-generation pipeline with this model
const pipe = await pipeline(
  "text-generation",
  "mkly/TinyStories-1M-ONNX",
);

// Generate a continuation of the prompt
const response = await pipe(
  "Some example text",
  {
    max_new_tokens: 500,
    temperature: 0.9,
  },
);
console.log(response[0].generated_text);
```