---
base_model: facebook/vit-mae-base
library_name: transformers
pipeline_tag: image-classification
tags:
- probex
- model-j
- weight-space-learning
---
# Model-J: MAE Model (model_idx_0511)

This model is part of the Model-J dataset, introduced in:

**Learning on Model Weights using Tree Experts (CVPR 2025)** by Eliahu Horwitz\*, Bar Cavia\*, Jonathan Kahana\*, Yedid Hoshen

🌐 Project | 📃 Paper | 💻 GitHub | 🤗 Dataset
## Model Details
| Attribute | Value |
|---|---|
| Subset | MAE |
| Split | train |
| Base Model | facebook/vit-mae-base |
| Dataset | CIFAR100 (50 classes) |
## Training Hyperparameters
| Parameter | Value |
|---|---|
| Learning Rate | 0.0001 |
| LR Scheduler | cosine_with_restarts |
| Epochs | 7 |
| Max Train Steps | 2331 |
| Batch Size | 64 |
| Weight Decay | 0.009 |
| Seed | 511 |
| Random Crop | False |
| Random Flip | False |
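The step counts in the table are internally consistent: 2331 total optimizer steps over 7 epochs is exactly 333 steps per epoch, which at batch size 64 also bounds the effective training-set size. A minimal sanity-check sketch in plain Python, assuming the dataloader does not drop the last partial batch (an assumption, not stated in the card):

```python
epochs = 7
max_train_steps = 2331
batch_size = 64

# Steps per epoch implied by the table: 2331 / 7 = 333, dividing evenly.
steps_per_epoch = max_train_steps // epochs
assert steps_per_epoch * epochs == max_train_steps

# Assuming no dropped last batch, ceil(n_samples / batch_size) == steps_per_epoch,
# so the number of training samples seen per epoch lies in this inclusive range.
lo = (steps_per_epoch - 1) * batch_size + 1   # 21249
hi = steps_per_epoch * batch_size             # 21312

print(steps_per_epoch, lo, hi)  # → 333 21249 21312
```

Note that this implied sample count (at most 21312) is below the 25000 images that 50 CIFAR100 classes provide, consistent with part of the training split being held out for the validation accuracy reported below.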
## Performance
| Metric | Value |
|---|---|
| Train Accuracy | 0.9999 |
| Val Accuracy | 0.8896 |
| Test Accuracy | 0.8872 |
## Training Categories
The model was fine-tuned on the following 50 CIFAR100 classes:
dinosaur, forest, sweet_pepper, rose, kangaroo, lizard, possum, dolphin, plate, baby, caterpillar, sunflower, trout, porcupine, castle, turtle, bed, man, otter, train, fox, flatfish, beetle, couch, mountain, telephone, seal, aquarium_fish, bus, bowl, road, crab, skyscraper, shrew, tank, snail, house, palm_tree, apple, mushroom, tiger, cockroach, hamster, raccoon, lobster, poppy, whale, wolf, cup, clock
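To interpret the classifier's output logits you need the index-to-name mapping over these 50 classes. A minimal sketch, assuming (as is common for Hugging Face fine-tunes, but not confirmed by this card) that indices are assigned over the alphabetically sorted class names; the authoritative mapping is the `id2label` field in the checkpoint's `config.json`:

```python
# The 50 CIFAR100 classes this checkpoint was fine-tuned on (from the card above).
classes = [
    "dinosaur", "forest", "sweet_pepper", "rose", "kangaroo", "lizard",
    "possum", "dolphin", "plate", "baby", "caterpillar", "sunflower",
    "trout", "porcupine", "castle", "turtle", "bed", "man", "otter",
    "train", "fox", "flatfish", "beetle", "couch", "mountain", "telephone",
    "seal", "aquarium_fish", "bus", "bowl", "road", "crab", "skyscraper",
    "shrew", "tank", "snail", "house", "palm_tree", "apple", "mushroom",
    "tiger", "cockroach", "hamster", "raccoon", "lobster", "poppy",
    "whale", "wolf", "cup", "clock",
]
assert len(classes) == 50 and len(set(classes)) == 50

# Assumed convention: label ids follow the alphabetically sorted names.
# Verify against the checkpoint's config.json before relying on this.
id2label = {i: name for i, name in enumerate(sorted(classes))}
label2id = {name: i for i, name in id2label.items()}

print(id2label[0])  # → apple
```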
