---
language:
- vi
- en
tags:
- viena
- causal-lm
- transformers
- pytorch
- chat
license: mit
library_name: transformers
pipeline_tag: text-generation
---

# Viena Tiny Demo (SFT)

This is a tiny, demo-only Viena checkpoint fine-tuned for instruction following.
It is **not** production quality. It is intended for smoke tests and workflow validation.

## Model description

- Architecture: decoder-only Transformer (`VienaModel`) with RMSNorm, rotary position embeddings (RoPE), SwiGLU feed-forward layers, and grouped-query attention (GQA).
- Parameters: ~10M (tiny config).
- Tokenizer: SentencePiece BPE (target vocab 2000; actual vocab may be smaller due to tiny data).
- Training: small offline synthetic dataset shipped with the repo.
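In GQA, several query heads share one key/value head, shrinking the KV cache. A minimal sketch of the sharing arithmetic, using hypothetical head counts (the actual tiny config may use different numbers):

```python
# Hypothetical head counts chosen for illustration only.
n_heads = 8        # query heads
n_kv_heads = 2     # shared key/value heads (GQA)
group = n_heads // n_kv_heads  # query heads served by each KV head

# Each KV head is logically repeated `group` times to line up with the queries.
kv_heads = ["kv0", "kv1"]
expanded = [h for h in kv_heads for _ in range(group)]
print(expanded)  # 8 entries: 4 copies of each KV head
```

With `n_kv_heads == n_heads` this degenerates to standard multi-head attention; with `n_kv_heads == 1` it is multi-query attention.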

## Training data

- Pretrain: `viena_data/examples/pretrain_offline.jsonl`
- SFT: `viena_data/examples/sft_offline_train.jsonl`
- Validation: `viena_data/examples/sft_offline_val.jsonl`

All datasets are synthetic and intended for offline tests.
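The files above are JSON Lines (one JSON object per line). A minimal sketch of reading a record, assuming a hypothetical `{"prompt": ..., "response": ...}` schema — the shipped files may use different field names:

```python
import json

# Hypothetical example line; check the actual files for the real schema.
line = '{"prompt": "Xin chao!", "response": "Chao ban!"}'
record = json.loads(line)
print(record["prompt"], "->", record["response"])
```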

## Training recipe (tiny)

- Config: `configs/viena_tiny.yaml`
- Pretrain: 50 steps
- SFT: 20 steps
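For orientation, a hypothetical sketch of how `configs/viena_tiny.yaml` might encode the numbers above — the field names are assumptions, not the repo's actual schema:

```yaml
# Hypothetical layout; see configs/viena_tiny.yaml in the repo for the real schema.
tokenizer:
  vocab_size: 2000   # target vocab from the model description
pretrain:
  max_steps: 50
sft:
  max_steps: 20
```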

## Usage

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

model_id = "vietrix/viena-tiny-demo"
tokenizer = AutoTokenizer.from_pretrained(model_id, use_fast=False)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16 if torch.cuda.is_available() else torch.float32,
    device_map="auto",
)

prompt = (
    "<|system|>\nYou are Viena.\n"
    "<|user|>\nXin chao!\n"
    "<|assistant|>\n"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7, top_p=0.9)
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```
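The chat prompt above can also be assembled programmatically. A minimal helper sketch — the `build_prompt` name is hypothetical, and the role tags are taken from the snippet above rather than from an official chat template:

```python
def build_prompt(messages):
    """Join (role, text) pairs into the <|role|> chat format shown above."""
    return "".join(f"<|{role}|>\n{text}\n" for role, text in messages)

# Trailing <|assistant|> tag cues the model to generate the reply.
prompt = build_prompt([
    ("system", "You are Viena."),
    ("user", "Xin chao!"),
]) + "<|assistant|>\n"
print(prompt)
```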

## Limitations

- Very small dataset and very few steps.
- Not suitable for real use or evaluation.
- Likely to hallucinate or be inconsistent.

## License

MIT (code + demo weights). See repository license for details.