Model Card for mt5-small-indo-bloom-3e

Model Details

This model is a fine-tuned version of google/mt5-small designed for pedagogical Automatic Question Generation (AQG) in Indonesian. Unlike standard AQG models, which are 'pedagogically blind,' this model was trained with cognitive control based on Bloom's Taxonomy (Level C1: Remembering and Level C2: Understanding), so the requested cognitive level is part of the prompt.

  • Developed by: Firmansyah Ibrahim (Universitas Negeri Malang / UIN Alauddin Makassar)
  • Model type: Text-to-Text Transformer (Seq2Seq)
  • Language: Indonesian (id)
  • Finetuned from model: google/mt5-small

Uses

Prompt Format: generate indonesian question level C[1_or_2]: [Your Reading Context]

Example: generate indonesian question level C2: Budi pergi ke pasar untuk membeli apel dan jeruk.
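The prompt format above can be used with the standard transformers Seq2Seq API. A minimal sketch (the generation settings such as `max_new_tokens` are illustrative choices, not prescribed by this card):

```python
def build_prompt(level: int, context: str) -> str:
    """Build the prompt in the format the model expects:
    'generate indonesian question level C[1_or_2]: [context]'."""
    return f"generate indonesian question level C{level}: {context}"


def generate_question(level: int, context: str, max_new_tokens: int = 64) -> str:
    """Load the model and generate one question for the given context."""
    # Imported lazily so build_prompt stays usable without transformers installed.
    from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

    name = "Firmansyah-Ibrahim/mt5-small-indo-bloom-3e"
    tokenizer = AutoTokenizer.from_pretrained(name)
    model = AutoModelForSeq2SeqLM.from_pretrained(name)

    inputs = tokenizer(build_prompt(level, context), return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


# The C2 (Understanding) example from this card:
prompt = build_prompt(2, "Budi pergi ke pasar untuk membeli apel dan jeruk.")
```

A C1 prompt is built the same way, only with `level=1`.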

Citation

BibTeX:

@misc{ibrahim2026mt5bloom,
  author = {Ibrahim, Firmansyah},
  title = {mt5-small-indo-bloom-3e: A Pedagogically Controlled AQG Model for Indonesian},
  year = {2026},
  publisher = {Hugging Face},
  howpublished = {\url{https://huggingface.co/Firmansyah-Ibrahim/mt5-small-indo-bloom-3e}}
}