Instructions for using huggingworld/Llama-3.2-1B-Instruct with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
  - Transformers.js
How to use huggingworld/Llama-3.2-1B-Instruct with Transformers.js:
```javascript
// npm i @huggingface/transformers
import { pipeline } from '@huggingface/transformers';

// Allocate a text-generation pipeline
const pipe = await pipeline('text-generation', 'huggingworld/Llama-3.2-1B-Instruct');
```