---
base_model: facebook/vit-mae-base
library_name: transformers
pipeline_tag: image-classification
tags:
- probex
- model-j
- weight-space-learning
---

# Model-J: MAE Model (model_idx_0563)

This model is part of the **Model-J** dataset, introduced in:

**Learning on Model Weights using Tree Experts** (CVPR 2025)
by Eliahu Horwitz\*, Bar Cavia\*, Jonathan Kahana\*, Yedid Hoshen
## Model Details

| Attribute | Value |
|---|---|
| **Subset** | MAE |
| **Split** | val |
| **Base Model** | `facebook/vit-mae-base` |
| **Dataset** | CIFAR100 (50 classes) |

## Training Hyperparameters

| Parameter | Value |
|---|---|
| Learning Rate | 7e-05 |
| LR Scheduler | constant |
| Epochs | 6 |
| Max Train Steps | 1998 |
| Batch Size | 64 |
| Weight Decay | 0.005 |
| Seed | 563 |
| Random Crop | True |
| Random Flip | False |

## Performance

| Metric | Value |
|---|---|
| Train Accuracy | 0.9751 |
| Val Accuracy | 0.8576 |
| Test Accuracy | 0.8538 |

## Training Categories

The model was fine-tuned on the following 50 CIFAR100 classes:

`seal`, `forest`, `plain`, `otter`, `fox`, `cup`, `mouse`, `oak_tree`, `clock`, `cattle`, `tank`, `dinosaur`, `mushroom`, `elephant`, `butterfly`, `bicycle`, `skyscraper`, `wolf`, `couch`, `bowl`, `snail`, `pickup_truck`, `rabbit`, `man`, `lobster`, `raccoon`, `willow_tree`, `hamster`, `lamp`, `keyboard`, `snake`, `woman`, `trout`, `dolphin`, `turtle`, `girl`, `table`, `bottle`, `tulip`, `palm_tree`, `sweet_pepper`, `apple`, `telephone`, `flatfish`, `bee`, `sea`, `lizard`, `motorcycle`, `baby`, `whale`
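Since this checkpoint only covers 50 of the 100 CIFAR100 classes, decoding a prediction index requires the restricted label list above. The sketch below shows a minimal, dependency-free decode step under one stated assumption: that the checkpoint's `id2label` mapping follows the class order listed in this card (verify against the model's config before relying on it).

```python
# Sketch: map a prediction index back to one of the 50 CIFAR100 class
# names this checkpoint was fine-tuned on. ASSUMPTION: the label order
# below (taken from this card) matches the checkpoint's id2label config.
CLASSES = [
    "seal", "forest", "plain", "otter", "fox", "cup", "mouse", "oak_tree",
    "clock", "cattle", "tank", "dinosaur", "mushroom", "elephant",
    "butterfly", "bicycle", "skyscraper", "wolf", "couch", "bowl", "snail",
    "pickup_truck", "rabbit", "man", "lobster", "raccoon", "willow_tree",
    "hamster", "lamp", "keyboard", "snake", "woman", "trout", "dolphin",
    "turtle", "girl", "table", "bottle", "tulip", "palm_tree",
    "sweet_pepper", "apple", "telephone", "flatfish", "bee", "sea",
    "lizard", "motorcycle", "baby", "whale",
]

def decode(logits):
    """Return the class name with the highest logit (plain argmax)."""
    best = max(range(len(logits)), key=lambda i: logits[i])
    return CLASSES[best]

# Example with dummy logits: index 4 ("fox") carries the largest value.
logits = [0.0] * len(CLASSES)
logits[4] = 1.0
print(decode(logits))  # → fox
```

In practice the logits would come from the fine-tuned classifier's forward pass (e.g. via `transformers`); the dummy list here just keeps the example self-contained.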