# SLM Question Generator - SFT (114M)
A 114-million-parameter Small Language Model fine-tuned to generate educational questions from a given passage of text.
## Model Details
- Architecture: Decoder-only Transformer
- Parameters: 114.1 M
- Layers: 12
- Vocabulary: 50,260 tokens (tiktoken `r50k_base` + 3 special tokens)
## Tokenizer Note
This model uses the tiktoken library with the `r50k_base` encoding, extended with three special tokens: `<|im_start|>`, `<|im_end|>`, and `<|pad|>`.