---
base_model: facebook/vit-mae-base
library_name: transformers
pipeline_tag: image-classification
tags:
- probex
- model-j
- weight-space-learning
---

# Model-J: MAE Model (model_idx_0161)

This model is part of the **Model-J** dataset, introduced in:

**Learning on Model Weights using Tree Experts** (CVPR 2025)
by Eliahu Horwitz*, Bar Cavia*, Jonathan Kahana*, Yedid Hoshen

🌐 Project | 📃 Paper | 💻 GitHub | 🤗 Dataset

![ProbeX](https://raw.githubusercontent.com/eliahuhorwitz/ProbeX/main/imgs/poster.png)

## Model Details

| Attribute | Value |
|---|---|
| **Subset** | MAE |
| **Split** | val |
| **Base Model** | `facebook/vit-mae-base` |
| **Dataset** | CIFAR100 (50 classes) |

## Training Hyperparameters

| Parameter | Value |
|---|---|
| Learning Rate | 0.0005 |
| LR Scheduler | cosine |
| Epochs | 6 |
| Max Train Steps | 1998 |
| Batch Size | 64 |
| Weight Decay | 0.01 |
| Seed | 161 |
| Random Crop | False |
| Random Flip | True |

## Performance

| Metric | Value |
|---|---|
| Train Accuracy | 0.7753 |
| Val Accuracy | 0.5163 |
| Test Accuracy | 0.5276 |

## Training Categories

The model was fine-tuned on the following 50 CIFAR100 classes:

`raccoon`, `dolphin`, `bottle`, `lawn_mower`, `orange`, `cattle`, `snail`, `beetle`, `lobster`, `bicycle`, `pine_tree`, `caterpillar`, `hamster`, `sweet_pepper`, `castle`, `porcupine`, `can`, `skyscraper`, `rabbit`, `shark`, `kangaroo`, `maple_tree`, `whale`, `turtle`, `mountain`, `lion`, `palm_tree`, `tank`, `forest`, `elephant`, `mushroom`, `crab`, `chimpanzee`, `bus`, `shrew`, `television`, `wolf`, `streetcar`, `otter`, `pear`, `table`, `leopard`, `keyboard`, `rose`, `chair`, `wardrobe`, `train`, `lizard`, `camel`, `boy`
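Since the classifier head covers only this 50-class subset, mapping its output logits back to class names requires the subset's label list. Below is a minimal, dependency-free sketch of that decoding step. The class list is taken from this card in the order shown above; note that the authoritative `id2label` ordering lives in the checkpoint's `config.json`, and the `decode_top1` helper is illustrative, not part of the released code.

```python
import math

# The 50 CIFAR100 classes this checkpoint was fine-tuned on (from this card).
# Assumption: indices follow the order listed above; check the checkpoint's
# config.json (id2label) for the authoritative mapping.
CLASSES = [
    "raccoon", "dolphin", "bottle", "lawn_mower", "orange",
    "cattle", "snail", "beetle", "lobster", "bicycle",
    "pine_tree", "caterpillar", "hamster", "sweet_pepper", "castle",
    "porcupine", "can", "skyscraper", "rabbit", "shark",
    "kangaroo", "maple_tree", "whale", "turtle", "mountain",
    "lion", "palm_tree", "tank", "forest", "elephant",
    "mushroom", "crab", "chimpanzee", "bus", "shrew",
    "television", "wolf", "streetcar", "otter", "pear",
    "table", "leopard", "keyboard", "rose", "chair",
    "wardrobe", "train", "lizard", "camel", "boy",
]

def decode_top1(logits):
    """Return (class_name, probability) for the highest-scoring of 50 logits."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]  # numerically stable softmax
    total = sum(exps)
    best = max(range(len(logits)), key=lambda i: logits[i])
    return CLASSES[best], exps[best] / total
```

For example, feeding the model's raw logits for a single image into `decode_top1` yields the predicted class name and its softmax confidence.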