|
|
--- |
|
|
language: |
|
|
- en |
|
|
license: apache-2.0 |
|
|
pipeline_tag: feature-extraction |
|
|
library_name: transformers |
|
|
tags: |
|
|
- pytorch |
|
|
- toy-model |
|
|
- tiny-mlp |
|
|
|
|
--- |
|
|
|
|
|
# TinyModel |
|
|
|
|
|
A small MLP (100 → 200 → 10) using ReLU and Softmax, designed as a simple example for publisher workflows. |
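
The exact layer definitions live in the `tiny_model` module imported in the usage example below; a minimal sketch consistent with the shapes and activations described above (not necessarily the published implementation) could look like this:

```python
import torch.nn as nn


class TinyModel(nn.Module):
    """Illustrative sketch of the 100 -> 200 -> 10 MLP described above."""

    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(100, 200),  # 100 input features -> 200 hidden units
            nn.ReLU(),            # hidden activation
            nn.Linear(200, 10),   # 200 hidden units -> 10 outputs
            nn.Softmax(dim=-1),   # normalize outputs into a probability-like vector
        )

    def forward(self, x):
        return self.net(x)
```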
|
|
|
|
|
## Intended Uses & Limitations |
|
|
|
|
|
This model is purely educational and not trained on real-world data. 🧪 It is not suitable for production tasks or any mission-critical usage. |
|
|
|
|
|
## Usage |
|
|
|
|
|
```python
import torch
from huggingface_hub import hf_hub_download

from tiny_model import TinyModel

# torch.load expects a local file, so fetch the checkpoint from the Hub first.
weights_path = hf_hub_download(
    repo_id="tester-123456789/tiny-model",
    filename="pytorch_model.bin",
)

model = TinyModel()
state = torch.load(weights_path, map_location="cpu", weights_only=True)
model.load_state_dict(state)
model.eval()

x = torch.randn(1, 100)  # one dummy input with 100 features
y = model(x)
print(y)
```
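
Because the final layer is a Softmax over 10 outputs, `y` has shape `(1, 10)` and each row sums to 1.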
|
|
|