newmindai/Mecellem-Qwen3-4B-TR
by NewMind AI
Pipeline: Text Generation
Library: Transformers (Safetensors)
Languages: Turkish, English
Tags: qwen3, turkish, legal, turkish-legal, mecellem, qwen, decoder-only, continual-pretraining, TRUBA, MN5, conversational, text-generation-inference
Papers: arXiv:2601.16018, arXiv:2409.00000
License: apache-2.0
3 contributors: zgrgr, nielsr (HF Staff)
History: 5 commits
Latest commit: b5a41d3 (verified), about 1 month ago - "Add library_name metadata and link to GitHub (#1)"
File                                              Size      Last commit
.gitattributes                                    1.76 kB   Initial model upload - clean repository (about 2 months ago)
4b_qwen_armo.png                                  109 kB    Initial model upload - clean repository (about 2 months ago)
README.md                                         8.04 kB   Add library_name metadata and link to GitHub (#1) (about 1 month ago)
added_tokens.json                                 707 B     Initial model upload - clean repository (about 2 months ago)
chat_template.jinja                               4.12 kB   Initial model upload - clean repository (about 2 months ago)
comparison_rewards_by_token_length-filtered.png   266 kB    Initial model upload - clean repository (about 2 months ago)
config.json                                       1.54 kB   Initial model upload - clean repository (about 2 months ago)
generation_config.json                            121 B     Initial model upload - clean repository (about 2 months ago)
merges.txt                                        1.67 MB   Initial model upload - clean repository (about 2 months ago)
model-00001-of-00002.safetensors                  4.97 GB   Initial model upload - clean repository (about 2 months ago)
model-00002-of-00002.safetensors                  3.86 GB   Initial model upload - clean repository (about 2 months ago)
model.safetensors.index.json                      32.9 kB   Initial model upload - clean repository (about 2 months ago)
qwen4b_dataset.png                                76.3 kB   Initial model upload - clean repository (about 2 months ago)
qwen4b_loss.png                                   71.9 kB   Initial model upload - clean repository (about 2 months ago)
special_tokens_map.json                           762 B     Initial model upload - clean repository (about 2 months ago)
tokenizer.json                                    11.4 MB   Initial model upload - clean repository (about 2 months ago)
tokenizer_config.json                             5.44 kB   Initial model upload - clean repository (about 2 months ago)
vocab.json                                        2.78 MB   Initial model upload - clean repository (about 2 months ago)
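Since the repo ships sharded safetensors weights, a tokenizer, and a chat_template.jinja, the model should load through the standard Transformers auto classes. The following is a minimal sketch, assuming vanilla Qwen3-style AutoModel support; the Turkish example question and all generation parameters are illustrative, not taken from the model card.

```python
# Hypothetical usage sketch for newmindai/Mecellem-Qwen3-4B-TR.
# Assumes the `transformers` library and enough GPU/CPU memory for a 4B model.

REPO_ID = "newmindai/Mecellem-Qwen3-4B-TR"


def build_messages(question: str) -> list[dict]:
    """Wrap a single user question in the chat-message format consumed by
    tokenizer.apply_chat_template (the repo includes chat_template.jinja)."""
    return [{"role": "user", "content": question}]


if __name__ == "__main__":
    # Imports kept inside the guard so the helpers above stay importable
    # without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(REPO_ID)
    model = AutoModelForCausalLM.from_pretrained(REPO_ID, device_map="auto")

    # Illustrative Turkish legal question: "How is a lease agreement terminated?"
    messages = build_messages("Kira sözleşmesi nasıl feshedilir?")
    input_ids = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)

    output = model.generate(input_ids, max_new_tokens=256)
    # Decode only the newly generated tokens, not the prompt.
    print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

The `device_map="auto"` argument lets Accelerate place the two safetensors shards across available devices; on a CPU-only machine it can be dropped.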