Add pipeline tag, library name and project links

#1
by nielsr (HF Staff) - opened
Files changed (1)
  1. README.md +9 -3
README.md CHANGED
@@ -1,15 +1,21 @@
  ---
- license: apache-2.0
  datasets:
  - common-pile/comma_v0.1_training_dataset
  language:
  - en
+ license: apache-2.0
+ library_name: transformers
+ pipeline_tag: text-generation
  ---
+
  # TinyComma 1.8B

  TinyComma 1.8B is a 1.8B-parameter, decoder-only base LM trained entirely on permissively licensed data from the [Common Pile](https://huggingface.co/collections/common-pile/common-pile-v01). Unlike the official Comma model series, TinyComma 1.8B uses the 128K-vocabulary [Llama3](https://huggingface.co/collections/meta-llama/llama-31) tokenizer to ensure compatibility with two-model decoding setups.
  We trained TinyComma 1.8B to support our research on inference-time copyright mitigation.
- Check out our paper, [Anchored Decoding: Provably Reducing Copyright Risk for Any Language Model](https://arxiv.org/abs/2602.07120), for more details!
+
+ - **Paper:** [Anchored Decoding: Provably Reducing Copyright Risk for Any Language Model](https://arxiv.org/abs/2602.07120)
+ - **Repository:** [jacqueline-he/anchored-decoding](https://github.com/jacqueline-he/anchored-decoding)
+ - **Project Page:** [Interactive Demo](https://tinyurl.com/anchored-decoding-demo)

  ## Benchmarking TinyComma 1.8B

@@ -101,4 +107,4 @@ and (2) a 13.5B-token cooldown stage on a weighted mixture of three high-quality
  journal={arXiv preprint},
  year={2026}
  }
- ```
+ ```
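
With `library_name: transformers` and `pipeline_tag: text-generation` set, the Hub can surface a standard text-generation loading snippet for this model. A minimal sketch of that usage, assuming the transformers library; the Hub repo id is a placeholder, since this diff doesn't show it:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "<org>/TinyComma-1.8B"  # placeholder: substitute this model's actual Hub repo id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

# TinyComma 1.8B is a base LM, so prompt it with plain text (no chat template).
inputs = tokenizer("The Common Pile is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```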
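Since the README highlights the Llama3 tokenizer as what makes TinyComma usable in two-model decoding setups, a quick sanity check of that alignment could look like the sketch below. Both model ids are illustrative assumptions: the TinyComma id is the same placeholder as above, and `meta-llama/Llama-3.1-8B` stands in for any Llama 3 checkpoint (gated on the Hub):

```python
from transformers import AutoTokenizer

tok_tiny = AutoTokenizer.from_pretrained("<org>/TinyComma-1.8B")      # placeholder id
tok_llama = AutoTokenizer.from_pretrained("meta-llama/Llama-3.1-8B")  # example Llama 3 checkpoint, gated

# Two-model decoding combines next-token distributions across models, which only
# works if both models index the same 128K-entry vocabulary in the same order.
assert tok_tiny.get_vocab() == tok_llama.get_vocab(), "vocabularies differ"
print("Tokenizers align:", len(tok_tiny.get_vocab()), "shared entries")
```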