---
base_model: facebook/vit-mae-base
library_name: transformers
pipeline_tag: image-classification
tags:
- probex
- model-j
- weight-space-learning
---

# Model-J: MAE Model (model_idx_0105)

This model is part of the **Model-J** dataset, introduced in the paper **Learning on Model Weights using Tree Experts** (CVPR 2025) by Eliahu Horwitz\*, Bar Cavia\*, Jonathan Kahana\*, and Yedid Hoshen.
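As a quick orientation, the sketch below builds a ViT image classifier with a 50-way head matching this card's setup. It uses the standard `transformers` ViT classes with their ViT-Base defaults (768 hidden size, 12 layers, 224×224 input, 16×16 patches), which correspond to the `facebook/vit-mae-base` encoder; for actual inference you would instead call `from_pretrained` on this checkpoint's repository.

```python
import torch
from transformers import ViTConfig, ViTForImageClassification

# ViT-Base defaults match the facebook/vit-mae-base encoder;
# num_labels=50 mirrors the 50-class CIFAR100 subset used here.
config = ViTConfig(num_labels=50)
model = ViTForImageClassification(config)
model.eval()

# Dummy batch: one 224x224 RGB image.
pixel_values = torch.randn(1, 3, 224, 224)
with torch.no_grad():
    logits = model(pixel_values).logits  # shape: (1, 50)
```

Note that this constructs a randomly initialized model of the right shape; the fine-tuned weights come from the checkpoint itself.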
## Model Details

| Attribute | Value |
|---|---|
| **Subset** | MAE |
| **Split** | train |
| **Base Model** | `facebook/vit-mae-base` |
| **Dataset** | CIFAR100 (50 classes) |

## Training Hyperparameters

| Parameter | Value |
|---|---|
| Learning Rate | 3e-05 |
| LR Scheduler | cosine_with_restarts |
| Epochs | 7 |
| Max Train Steps | 2331 |
| Batch Size | 64 |
| Weight Decay | 0.009 |
| Seed | 105 |
| Random Crop | True |
| Random Flip | True |

## Performance

| Metric | Value |
|---|---|
| Train Accuracy | 0.9521 |
| Val Accuracy | 0.8843 |
| Test Accuracy | 0.8808 |

## Training Categories

The model was fine-tuned on the following 50 CIFAR100 classes:

`raccoon`, `dolphin`, `lizard`, `woman`, `pear`, `mushroom`, `plate`, `shrew`, `bottle`, `camel`, `pine_tree`, `hamster`, `dinosaur`, `apple`, `beaver`, `keyboard`, `tank`, `snake`, `clock`, `leopard`, `fox`, `motorcycle`, `wolf`, `seal`, `turtle`, `trout`, `aquarium_fish`, `poppy`, `otter`, `table`, `orange`, `cloud`, `spider`, `forest`, `lamp`, `television`, `oak_tree`, `bridge`, `castle`, `telephone`, `ray`, `tulip`, `caterpillar`, `beetle`, `baby`, `road`, `elephant`, `worm`, `house`, `maple_tree`
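If you are reproducing the fine-tuning or decoding predictions, you need an index-to-name mapping for these 50 classes. The sketch below builds one from the list above; note the ordering simply follows the card's listing and is an assumption — the checkpoint's authoritative `id2label` mapping lives in its `config.json`.

```python
# The 50 CIFAR100 classes from this card, in listed order (ordering assumed).
CIFAR100_SUBSET = [
    "raccoon", "dolphin", "lizard", "woman", "pear", "mushroom", "plate",
    "shrew", "bottle", "camel", "pine_tree", "hamster", "dinosaur", "apple",
    "beaver", "keyboard", "tank", "snake", "clock", "leopard", "fox",
    "motorcycle", "wolf", "seal", "turtle", "trout", "aquarium_fish", "poppy",
    "otter", "table", "orange", "cloud", "spider", "forest", "lamp",
    "television", "oak_tree", "bridge", "castle", "telephone", "ray", "tulip",
    "caterpillar", "beetle", "baby", "road", "elephant", "worm", "house",
    "maple_tree",
]

# Mappings in the format transformers configs expect.
id2label = {i: name for i, name in enumerate(CIFAR100_SUBSET)}
label2id = {name: i for i, name in id2label.items()}
```

These dictionaries can be passed to a `ViTConfig` (via `id2label=...` and `label2id=...`) so that predicted indices decode to class names.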