---
library_name: pytorch
tags:
  - resnet
  - pruning
  - knowledge-distillation
  - speedup
license: apache-2.0
datasets:
  - imagenet-1k
pipeline_tag: image-classification
---

# hawada/vit-base-patch16-224-rtx4090-slim

This repository contains two variants:

- **Gated student** (with learned pruning gates) – requires custom code.
- **Slim student** (post-prune export) – loads with standard `transformers` code (LLM) or bundled code (ResNet).
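The card does not spell out how the export decides what to keep. As a hedged sketch (names and threshold are illustrative, not taken from this repo), a learned pruning gate can be a per-channel score that is thresholded at export time; only the surviving channels are copied into the slim model:

```python
def surviving_channels(gate_scores, threshold=0.5):
    """Indices of channels whose learned gate score survives pruning.

    gate_scores are assumed to be post-sigmoid values in [0, 1];
    the 0.5 threshold is illustrative, not taken from this repo.
    """
    return [i for i, g in enumerate(gate_scores) if g >= threshold]

# Example: 5 channels, 3 survive the export-time threshold.
print(surviving_channels([0.9, 0.1, 0.7, 0.05, 0.8]))  # [0, 2, 4]
```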

## Inference (LLM, slim)

Slim LLM variants load with stock `transformers`. Note that this particular repo's slim model is a ViT image classifier (see `pipeline_tag` above), so substitute the repo id of an LLM variant when using the snippet below:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("hawada/vit-base-patch16-224-rtx4090-slim")
mdl = AutoModelForCausalLM.from_pretrained(
    "hawada/vit-base-patch16-224-rtx4090-slim", torch_dtype="auto"
).eval()
x = tok("Hello", return_tensors="pt")
print(tok.decode(mdl.generate(**x, max_new_tokens=16)[0]))
```

## Notes

- The gated repo includes lightweight custom code (`adapters/…`, `core/…`) needed to attach and load the gates.
- The slim LLM is exported to a standard HF architecture for out-of-the-box loading.
- For ResNet, both repos include minimal custom code to define the module.
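For scale: this card lists 69.9M parameters for the slim model, versus roughly 86.6M for stock ViT-Base/16 (the base figure is the commonly cited approximation, not taken from this repo), so the export drops about a fifth of the weights:

```python
base_params = 86.6e6  # stock ViT-Base/16, commonly cited approximate count
slim_params = 69.9e6  # from this card's safetensors metadata

reduction = 1 - slim_params / base_params
print(f"{reduction:.1%} fewer parameters")  # 19.3% fewer parameters
```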

## Training metadata

```json
{
  "base_id": "google/vit-base-patch16-224",
  "variant": "slim-export",
  "repo_slim": "hawada/vit-base-patch16-224-rtx4090-slim"
}
```
Model size: 69.9M parameters (F32, safetensors).