Add pipeline tag and link to code

#1
by nielsr HF Staff - opened
Files changed (1)
  1. README.md +9 -8
README.md CHANGED
@@ -1,14 +1,15 @@
 ---
-language: en
-license: apache-2.0
-library_name: transformers
-tags:
-- tptt
-- peft
-- trust_remote_code
 base_model: allenai/OLMo-1B-hf
 datasets:
 - yahma/alpaca-cleaned
+language: en
+library_name: transformers
+license: apache-2.0
+pipeline_tag: text-generation
+tags:
+- tptt
+- peft
+- trust_remote_code
 ---
 
 # Titans-OLMo-1B-hf
@@ -17,6 +18,7 @@ Titanesque version of `allenai/OLMo-1B-hf` with parallel linearized attention (T
 
 The model was presented in the paper [TPTT](https://huggingface.co/papers/2506.17671).
 
+For code, see https://github.com/fabienfrfr/tptt
 
 ## Model Details
 
@@ -70,5 +72,4 @@ print(tokenizer.decode(outputs, skip_special_tokens=True))
 
 If you use TPTT in your academic work, please cite [Furfaro](https://huggingface.co/ffurfaro). For questions or support, please open an issue on the [GitHub repository](https://github.com/fabienfrfr/tptt) or contact the maintainer.
 
-
 ---
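As a quick sanity check on the metadata change, the new front matter can be parsed to confirm the added `pipeline_tag` key and the reordered fields survive round-tripping. This is a minimal sketch, assuming PyYAML is installed; the YAML below is copied verbatim from the diff.

```python
import yaml  # PyYAML; assumed available (not stdlib)

# New README front matter as it appears after this PR.
front_matter = """\
base_model: allenai/OLMo-1B-hf
datasets:
- yahma/alpaca-cleaned
language: en
library_name: transformers
license: apache-2.0
pipeline_tag: text-generation
tags:
- tptt
- peft
- trust_remote_code
"""

meta = yaml.safe_load(front_matter)

# The key this PR adds: lets the Hub route the card to the
# text-generation widget and Inference API.
print(meta["pipeline_tag"])
# The trust_remote_code tag signals that loading requires
# trust_remote_code=True in transformers.
print(meta["tags"])
```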