Viharikvs committed on
Commit b109e9a · verified · 1 Parent(s): 4837927

Model card updated after epoch 0

Files changed (1)
  1. README.md +23 -3
README.md CHANGED
@@ -1,3 +1,23 @@
- ---
- license: apache-2.0
- ---
+ ---
+ base_model: t5-small
+ tags: [hrm, act, wikitext]
+ metrics: [loss, perplexity]
+ ---
+ # wikicmbaV1
+
+ **wikicmbaV1** is an experimental text generation model based on the Hierarchical Recurrent Memory (HRM) architecture. It was trained from scratch on WikiText-103, a large-scale language modeling benchmark derived from high-quality Wikipedia articles.
+
+ The model uses the HRM structure: a "Specialist" module for low-level processing and a "Manager" module for high-level abstraction and planning. This architecture aims to handle long-range dependencies more effectively by summarizing information at different temporal scales.
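+
+ The following is a minimal, illustrative PyTorch sketch of this two-timescale idea. The module names ("Specialist", "Manager"), the sizes, and the Manager update period are assumptions for exposition, not this repository's actual implementation:
+
+ ```python
+ import torch
+ import torch.nn as nn
+
+ class ToyHRM(nn.Module):
+     """Illustrative two-timescale recurrent LM (not the trained model's code)."""
+     def __init__(self, vocab_size=32100, d_model=256, manager_period=4):
+         super().__init__()
+         self.embed = nn.Embedding(vocab_size, d_model)
+         self.specialist = nn.GRUCell(d_model, d_model)  # low level: steps on every token
+         self.manager = nn.GRUCell(d_model, d_model)     # high level: steps every `manager_period` tokens
+         self.period = manager_period
+         self.head = nn.Linear(2 * d_model, vocab_size)
+
+     def forward(self, input_ids):
+         B, T = input_ids.shape
+         d = self.embed.embedding_dim
+         h_s = input_ids.new_zeros(B, d, dtype=torch.float)  # Specialist state
+         h_m = input_ids.new_zeros(B, d, dtype=torch.float)  # Manager state
+         logits = []
+         for t in range(T):
+             # Specialist processes each token at the fine timescale.
+             h_s = self.specialist(self.embed(input_ids[:, t]), h_s)
+             # Manager periodically summarizes the Specialist at a coarser timescale.
+             if (t + 1) % self.period == 0:
+                 h_m = self.manager(h_s, h_m)
+             logits.append(self.head(torch.cat([h_s, h_m], dim=-1)))
+         return torch.stack(logits, dim=1)  # (B, T, vocab): next-token logits
+ ```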
+
+ ## Model Description
+
+ - **Architecture:** Hierarchical Recurrent Memory (HRM)
+ - **Training Data:** [WikiText-103](https://huggingface.co/datasets/wikitext)
+ - **Original Paper:** [Hierarchical Reasoning Model](https://arxiv.org/abs/2506.21734)
+ - **Tokenizer:** `t5-small` (slow T5 SentencePiece); see the loading sketch after this list
+ - **Vocab Size:** 32100
+ - **Objective:** Causal Language Modeling
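+
+ As a quick check of the tokenizer setup above, the slow (SentencePiece) T5 tokenizer can be loaded with `transformers`; the expected vocabulary size, including T5's 100 extra sentinel tokens, is 32100:
+
+ ```python
+ from transformers import AutoTokenizer
+
+ # use_fast=False selects the slow SentencePiece implementation named in this card.
+ tokenizer = AutoTokenizer.from_pretrained("t5-small", use_fast=False)
+ print(len(tokenizer))  # expected: 32100
+ ```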
+
+ ### Latest Performance (Epoch 0)
+ - **Validation Loss:** `4.7058`
+ - **Validation Perplexity:** `110.5838`
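+
+ Perplexity is the exponential of the mean cross-entropy loss, so the two figures are consistent: exp(4.7058) ≈ 110.58.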