Add paper link, GitHub link, and task category

#2
by nielsr HF Staff - opened
Files changed (1)
  1. README.md +46 -3
README.md CHANGED
@@ -1,3 +1,46 @@
- ---
- license: mit
- ---
+ ---
+ license: mit
+ task_categories:
+ - summarization
+ ---
+
+ # CoMeT: Collaborative Memory Transformer - SCROLLS Dataset
+
+ This repository contains the SCROLLS dataset pre-processed for use with the **Collaborative Memory Transformer (CoMeT)**, as presented in the paper [CoMeT: Collaborative Memory Transformer for Efficient Long Context Modeling](https://huggingface.co/papers/2602.01766).
+
+ ## Dataset Description
+
+ This dataset is a pre-processed version of the [SCROLLS benchmark](https://www.scrolls-benchmark.com/), a suite of tasks for evaluating long-context modeling. CoMeT is an architecture that enables LLMs to handle arbitrarily long sequences with constant memory usage and linear time complexity through a dual-memory system: a temporary FIFO memory and a global gated memory.
+
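+ The sketch below illustrates that dual-memory idea in PyTorch. It is a conceptual illustration written from the paper's high-level description only: the class name, buffer shapes, and the sigmoid gate are assumptions, not CoMeT's actual implementation.
+
+ ```python
+ import torch
+
+ class DualMemory:
+     """Fixed-size FIFO buffer of recent segments plus a gated global summary."""
+
+     def __init__(self, d_model: int, fifo_slots: int, global_slots: int):
+         self.fifo = []                   # temporary FIFO memory (recent segments)
+         self.fifo_slots = fifo_slots
+         self.global_mem = torch.zeros(global_slots, d_model)  # global gated memory
+
+     def update(self, segment_states: torch.Tensor) -> None:
+         # The newest segment enters the FIFO; total memory stays constant,
+         # so per-segment cost is flat and time grows linearly with length.
+         self.fifo.append(segment_states)
+         if len(self.fifo) > self.fifo_slots:
+             evicted = self.fifo.pop(0)
+             # Fold the evicted segment into the global memory through a gate
+             # (a simple sigmoid blend here; the paper's gate is learned).
+             summary = evicted.mean(dim=0, keepdim=True)
+             gate = torch.sigmoid((self.global_mem * summary).sum(-1, keepdim=True))
+             self.global_mem = gate * self.global_mem + (1 - gate) * summary
+
+     def context(self) -> torch.Tensor:
+         # The next segment attends over both memories.
+         return torch.cat([self.global_mem, *self.fifo], dim=0)
+ ```
+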
+ ## Resources
+
+ - **Paper:** [CoMeT: Collaborative Memory Transformer for Efficient Long Context Modeling](https://huggingface.co/papers/2602.01766)
+ - **GitHub Repository:** [LivingFutureLab/Comet](https://github.com/LivingFutureLab/Comet)
+
+ ## Usage
+
+ According to the official repository, the following scripts pre-process the data and train the model:
+
+ ```bash
+ # Pre-tokenize and pack the data
+ bash scripts/tokenize.sh
+
+ # Train
+ bash scripts/train.sh
+ ```
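+
+ The data can also be loaded directly with the `datasets` library. A minimal sketch; the repo ID below is a placeholder for this repository's actual ID:
+
+ ```python
+ from datasets import load_dataset
+
+ # Placeholder repo ID -- substitute the actual ID of this dataset repository.
+ ds = load_dataset("LivingFutureLab/comet-scrolls", split="train")
+
+ print(ds[0])  # inspect one long-context example
+ ```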
+
+ ## Citation
+
+ If you find this work useful, please cite:
+
+ ```bibtex
+ @misc{zhao2026cometcollaborativememorytransformer,
+   title={CoMeT: Collaborative Memory Transformer for Efficient Long Context Modeling},
+   author={Runsong Zhao and Shilei Liu and Jiwei Tang and Langming Liu and Haibin Chen and Weidong Zhang and Yujin Yuan and Tong Xiao and Jingbo Zhu and Wenbo Su and Bo Zheng},
+   year={2026},
+   eprint={2602.01766},
+   archivePrefix={arXiv},
+   primaryClass={cs.LG},
+   url={https://arxiv.org/abs/2602.01766},
+ }
+ ```