---
language: en
license: mit
tags:
  - multimodal
  - question-answering
  - mmlu
  - qwen2
  - pytorch
  - transformers
model-index:
  - name: MMLU Qwen2.5-1.5B
    results:
      - task:
          type: question-answering
        dataset:
          type: mmlu
          name: MMLU (Massive Multitask Language Understanding)
        metrics:
          - type: accuracy
            value: 60
            name: Accuracy
---

# MMLU Multimodal AI Model

This model is fine-tuned for MMLU (Massive Multitask Language Understanding) question answering tasks.

## Model Details

- **Model type:** Qwen2.5-1.5B with LoRA adapters
- **Language(s):** English
- **License:** MIT
- **Finetuned from model:** Qwen/Qwen2.5-1.5B

## Training Data

The model was fine-tuned on the MMLU dataset, which covers 57 subjects across STEM, humanities, and social sciences.

## Intended Use

This model is intended for multiple-choice question answering tasks, particularly for academic and educational applications.
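
For multiple-choice use, the question and its options are typically flattened into a single prompt. The exact template this model was fine-tuned with is not documented here, so the A/B/C/D format below is an assumption; a minimal sketch:

```python
def format_mmlu_prompt(question, choices):
    """Render an MMLU-style question as a single multiple-choice prompt.

    Note: this A/B/C/D template is an assumed format, not necessarily
    the one the model was fine-tuned with.
    """
    letters = "ABCD"
    lines = [f"Question: {question}"]
    for letter, choice in zip(letters, choices):
        lines.append(f"{letter}. {choice}")
    lines.append("Answer:")
    return "\n".join(lines)


prompt = format_mmlu_prompt(
    "What is the capital of France?",
    ["Berlin", "Madrid", "Paris", "Rome"],
)
print(prompt)
```

Keeping the prompt construction in one helper makes it easy to swap templates if evaluation shows the model prefers a different format.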

## Performance

- **MMLU Accuracy:** ~60%
- **Inference Speed:** the 1.5B-parameter base keeps latency low enough for interactive use
- **Memory Usage:** LoRA adapters add only a small fraction of trainable parameters on top of the base model


## Usage

```python
from transformers import pipeline

# Qwen2.5 is a causal language model, so load it with the
# text-generation pipeline rather than the extractive
# question-answering pipeline (which expects a span-prediction head).
qa_pipeline = pipeline("text-generation", model="fariasultanacodes/magic")

# Example usage: pose the question together with its context in the prompt
prompt = (
    "Context: France is a country in Europe. Its capital is Paris.\n"
    "Question: What is the capital of France?\n"
    "Answer:"
)
result = qa_pipeline(prompt, max_new_tokens=16)
print(result[0]["generated_text"])
```

## Limitations

- Designed specifically for multiple-choice questions
- May not perform well on open-ended generation tasks
- Requires careful prompt formatting for optimal results
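
Because the model generates free text, a small post-processing step to pull out the chosen letter makes results easier to score. A hedged sketch, assuming an A-D multiple-choice format was used in the prompt:

```python
import re


def extract_answer_letter(generated):
    """Pull the first standalone A-D choice letter out of generated text.

    Assumes the model was prompted with A-D multiple-choice options;
    returns None if no choice letter is found. The word boundaries keep
    letters inside words (e.g. the "A" in "Answer") from matching.
    """
    match = re.search(r"\b([ABCD])\b", generated)
    return match.group(1) if match else None


print(extract_answer_letter("Answer: C. Paris"))  # prints C
```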

## Citation

If you use this model, please cite:

```bibtex
@misc{mmlu-multimodal-ai,
  title={MMLU Multimodal AI Model},
  author={Fariasultanacodes},
  year={2024},
  howpublished={Hugging Face Hub}
}
```