Instructions to use Felladrin/onnx-tinyllama-15M with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
  - Transformers.js
How to use Felladrin/onnx-tinyllama-15M with Transformers.js:

```js
// npm i @huggingface/transformers
import { pipeline } from '@huggingface/transformers';

// Allocate the text-generation pipeline
const pipe = await pipeline('text-generation', 'Felladrin/onnx-tinyllama-15M');
```
An INT8-quantized ONNX version of nickypro/tinyllama-15M, for use with Transformers.js.
- Downloads last month: 7
Model tree for Felladrin/onnx-tinyllama-15M
- Base model: nickypro/tinyllama-15M