moelanoby/kok-base

Transformers · PyTorch · bucket-memory-model
Datasets: open-r1/OpenR1-Math-220k, FreedomIntelligence/medical-o1-reasoning-SFT
Languages: English, Arabic
License: apache-2.0
kok-base / config.json (revision ff161f4)
moelanoby, "Update config.json", commit b6b9093 (verified), about 1 year ago, 217 Bytes
{
  "model_type": "bucket-memory-model",
  "vocab_size": 30522,
  "d_model": 1024,
  "num_layers": 12,
  "num_buckets": 8,
  "min_bucket_size": 1,
  "max_bucket_size": 32,
  "max_seq_length": 1024,
  "dropout": 0.1
}
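Because "bucket-memory-model" is a custom model type that is not registered with transformers' AutoConfig, the file cannot be loaded through the usual AutoConfig.from_pretrained path without the repo's custom code. A minimal sketch of reading and sanity-checking the config with only the standard library (the JSON literal below is the file's verbatim contents; the consistency checks are assumptions about how the bucket fields relate, not documented constraints):

```python
import json

# Verbatim contents of config.json from moelanoby/kok-base.
CONFIG_JSON = """
{
  "model_type": "bucket-memory-model",
  "vocab_size": 30522,
  "d_model": 1024,
  "num_layers": 12,
  "num_buckets": 8,
  "min_bucket_size": 1,
  "max_bucket_size": 32,
  "max_seq_length": 1024,
  "dropout": 0.1
}
"""

cfg = json.loads(CONFIG_JSON)

# Assumed invariants: bucket sizes are ordered, and the buckets at their
# largest (8 * 32 = 256 slots) fit inside the 1024-token context window.
assert cfg["min_bucket_size"] <= cfg["max_bucket_size"]
assert cfg["num_buckets"] * cfg["max_bucket_size"] <= cfg["max_seq_length"]

print(cfg["model_type"], cfg["d_model"])  # → bucket-memory-model 1024
```

To instantiate the actual model, the repository's own modeling code (or `trust_remote_code=True`, if the repo ships it) would be needed on top of this.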