---
tags:
- pytorch
- safetensors
license: mit
---
# dm_qwen4b_emulator
An MLP with two hidden layers (each `Linear` → `LayerNorm` → `ReLU`) and a linear output head.
## Config
- `input_dim`: 6
- `hidden_dim`: 256
- `output_dim`: 3
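
Given this config and the architecture in the usage snippet below (three `Linear` layers and two `LayerNorm` layers), the total parameter count can be worked out by hand as a quick sanity check against the checkpoint:

```python
input_dim, hidden_dim, output_dim = 6, 256, 3

# Each Linear contributes weights + biases; each LayerNorm contributes
# a scale (gamma) and shift (beta) vector of size hidden_dim.
params = (
    input_dim * hidden_dim + hidden_dim    # Linear(6, 256)
    + 2 * hidden_dim                       # LayerNorm(256)
    + hidden_dim * hidden_dim + hidden_dim # Linear(256, 256)
    + 2 * hidden_dim                       # LayerNorm(256)
    + hidden_dim * output_dim + output_dim # Linear(256, 3)
)
print(params)  # 69379
```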
## Usage
```python
import torch
import torch.nn as nn
from safetensors.torch import load_file
from huggingface_hub import hf_hub_download

class MLP(nn.Module):
    def __init__(self, input_dim, hidden_dim, output_dim):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(input_dim, hidden_dim),
            nn.LayerNorm(hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim),
            nn.LayerNorm(hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, output_dim),
        )

    def forward(self, x):
        return self.mlp(x)

# Download the checkpoint from the Hub and load the weights.
path = hf_hub_download("chewwt/dm_qwen4b_emulator", "model.safetensors")
model = MLP(input_dim=6, hidden_dim=256, output_dim=3)
model.load_state_dict(load_file(path))
model.eval()

# Inference: a batch of one 6-dim input produces a (1, 3) output.
with torch.no_grad():
    out = model(torch.randn(1, 6))
```