---
license: mit
tags:
  - image-classification
  - cifar100
  - geometric-learning
  - fractal-encoding
  - trained
  - no-attention
  - no-cross-entropy
datasets:
  - cifar100
metrics:
  - accuracy
library_name: pytorch
pipeline_tag: image-classification
model-index:
  - name: geo-beatrix-resnet18
    results:
      - task:
          type: image-classification
          name: Image Classification
        dataset:
          name: CIFAR-100
          type: cifar100
        metrics:
          - type: accuracy
            value: 52.97
            name: Test Accuracy
            verified: false
---
# geo-beatrix-resnet18

**Geometric Basin Classification for CIFAR-100**

**Training Complete**

Final Status: Epoch 200/200

## Current Performance
| Metric | Value |
|---|---|
| Best Test Accuracy | 52.97% |
| Best Epoch | 200 |
| Current Train Accuracy | 69.87% |
| Current Test Accuracy | 52.97% |
| Current α (Cantor param) | 0.4452 |
| Total Parameters | 11,952,641 |
| Training Time | 0:30:56 |
## All Training Runs

| Timestamp | Status | Best Epoch | Test Acc | Train Acc | α |
|---|---|---|---|---|---|
| 20251010_185133 | ✅ | 200 | 52.97% | 69.87% | 0.4452 |
## Comparison to State-of-the-Art
| Model | Accuracy | Status |
|---|---|---|
| geo-beatrix (this model) | 52.97% | ✅ Complete |
| vit-beatrix-dualstream | 66.0% | Vision Transformer + Cross-Entropy |
**Current target:** beat vit-beatrix-dualstream (66.0%). Current gap: 13.03 points.
## Architecture

- Base: ResNet18 (torchvision)
- Pretrained: no (trained from scratch)
- Features: 512-dim from ResNet18
- Positional Encoding: Devil's Staircase (Cantor function, 1883)
- PE Levels: 18
- PE Features/Level: 100
- Classification: Geometric Basin Compatibility (NO cross-entropy)
- Attention Mechanisms: NONE
- Mixing: Fractal (triadic multi-patch)
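The Devil's Staircase encoding is built on the Cantor function. This card does not spell out how the 18 levels and 100 features per level are combined into the model's PE module, but the underlying 1883 function itself can be sketched in plain Python (an illustration of the mathematics, not the model's actual implementation):

```python
def cantor_function(x: float, levels: int = 18) -> float:
    """Approximate the Cantor ("devil's staircase") function on [0, 1]
    by walking `levels` digits of the triadic (base-3) expansion of x."""
    if x <= 0.0:
        return 0.0
    if x >= 1.0:
        return 1.0
    y, scale = 0.0, 0.5
    for _ in range(levels):
        x *= 3.0
        digit = int(x)   # next base-3 digit: 0, 1, or 2
        x -= digit
        if digit == 1:
            # x fell into a removed middle third: the staircase is flat here
            return y + scale
        y += scale * (digit // 2)  # triadic digit 0 or 2 -> binary digit 0 or 1
        scale *= 0.5
    return y

# The function is flat on each removed middle third, e.g. [1/3, 2/3] maps to 1/2
print(cantor_function(0.5))  # 0.5
```

The function is continuous and monotone but constant almost everywhere, which is what gives the "staircase" its self-similar, base-3 structure; the 18-level truncation mirrors the `pe_levels: 18` setting below.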
## Training Configuration

```json
{
  "model_name": "geo-beatrix-resnet18",
  "model_type": "geometric_basin_classifier",
  "num_classes": 100,
  "batch_size": 512,
  "num_epochs": 200,
  "base_learning_rate": 0.002,
  "weight_decay": 0.05,
  "warmup_epochs": 10,
  "pe_levels": 18,
  "pe_features_per_level": 100,
  "dropout": 0.1,
  "pretrained_resnet": false,
  "a100_optimizations": {
    "mixed_precision": true,
    "torch_compile": false,
    "channels_last": true,
    "gradient_checkpointing": false
  },
  "alphamix": {
    "enabled": true,
    "fractal_mode": true,
    "range": [0.3, 0.7],
    "spatial_ratio": 0.25,
    "curriculum_start": 0.0,
    "curriculum_end": 0.5,
    "fractal_steps": [1, 3],
    "fractal_scales": [0.3333333333333333, 0.1111111111111111, 0.037037037037037035]
  },
  "architecture": "ResNet18 + Devil's Staircase PE",
  "loss_function": "Geometric Basin Compatibility",
  "cross_entropy": false,
  "attention_mechanisms": false,
  "timestamp": "20251010_185133"
}
```
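As a quick sanity check on the fractal settings above, the `fractal_scales` are successive powers of 1/3, matching the triadic (base-3) construction of the Cantor set. A minimal sketch (the embedded JSON fragment is copied from the config on this card):

```python
import json

# The fractal portion of the alphamix block from the training configuration
config = json.loads("""
{
  "fractal_steps": [1, 3],
  "fractal_scales": [0.3333333333333333,
                     0.1111111111111111,
                     0.037037037037037035]
}
""")

# Each scale is 3^-k for k = 1, 2, 3 (i.e. 1/3, 1/9, 1/27)
for k, scale in enumerate(config["fractal_scales"], start=1):
    assert abs(scale - 3.0 ** -k) < 1e-15
```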
## Files Structure

```
├── model.pt                  (BEST overall model, easy access)
├── model.safetensors         (BEST overall model, easy access)
├── best_model_info.json      (which epoch/run the best model came from)
├── runs_history.json         (all training runs and their results)
├── README.md
├── weights/geo-beatrix-resnet18/20251010_185133/
│   ├── model.pt              (best from this training run)
│   ├── model.safetensors     (best from this training run)
│   ├── config.json
│   ├── training_log.txt
│   └── checkpoints/
│       ├── checkpoint_epoch_50.safetensors
│       ├── checkpoint_epoch_100.safetensors
│       ├── checkpoint_epoch_150.safetensors
│       └── ... (snapshots every 10 epochs)
└── runs/geo-beatrix-resnet18/20251010_185133/
    ├── events.out.tfevents.* (TensorBoard logs)
    └── metrics.csv           (training metrics)
```

**Note:** The root `model.pt` and `model.safetensors` always contain the best model across all training runs.
## Usage

```python
import json

from huggingface_hub import hf_hub_download
from safetensors.torch import load_file

# EASIEST: download the BEST overall model from the repo root (recommended)
model_path = hf_hub_download(
    repo_id="AbstractPhil/geo-beatrix-resnet",
    filename="model.safetensors",
)
state_dict = load_file(model_path)
# model.load_state_dict(state_dict)

# Check which epoch/run the best model came from
info_path = hf_hub_download(
    repo_id="AbstractPhil/geo-beatrix-resnet",
    filename="best_model_info.json",
)
with open(info_path) as f:
    best_info = json.load(f)
print(f"Best model: epoch {best_info['epoch']}, {best_info['test_accuracy']:.2f}%")

# Or download the best model from a specific training run
model_path = hf_hub_download(
    repo_id="AbstractPhil/geo-beatrix-resnet",
    filename="weights/geo-beatrix-resnet18/20251010_185133/model.safetensors",
)

# Or download a specific epoch checkpoint
epoch_checkpoint = hf_hub_download(
    repo_id="AbstractPhil/geo-beatrix-resnet",
    filename="weights/geo-beatrix-resnet18/20251010_185133/checkpoints/checkpoint_epoch_100.safetensors",
)
```
## Training History

### Best Checkpoint
- Epoch: 200
- Train Acc: 69.87%
- Test Acc: 52.97%
- Alpha: 0.4452
- Loss: 0.6860
### Latest 5 Epochs

- Epoch 196: Train 65.82%, Test 0.00%, α=0.4452, Loss=0.6758
- Epoch 197: Train 66.76%, Test 0.00%, α=0.4453, Loss=0.6885
- Epoch 198: Train 67.59%, Test 0.00%, α=0.4453, Loss=0.6682
- Epoch 199: Train 63.85%, Test 0.00%, α=0.4452, Loss=0.6657
- Epoch 200: Train 69.87%, Test 52.97%, α=0.4452, Loss=0.6860
### Training Milestones

- 50% accuracy reached at epoch 95
- α ≥ 0.40 reached at epoch 10
- α ≥ 0.44 (near triadic equilibrium) reached at epoch 66
## Innovation

- ✅ NO attention mechanisms
- ✅ NO cross-entropy loss
- ✅ Fractal positional encoding (Cantor function, 1883)
- ✅ Geometric compatibility classification
- ✅ ResNet18 backbone (proven CNN architecture)
- ✅ Triadic fractal mixing (base-3 aligned)
**Repository:** https://huggingface.co/AbstractPhil/geo-beatrix-resnet
**Author:** AbstractPhil
**Framework:** PyTorch