Tags: Safetensors · English · qwen2
nielsr (HF Staff) committed · verified
Commit ba8ea36 · 1 Parent(s): 9475a6a

Add pipeline tag, library name, and links to paper and code


Hi! I'm Niels, part of the community science team at Hugging Face.

I've updated the model card to include:
* **Metadata tags**: Added `pipeline_tag: text-generation` and `library_name: transformers` to help users find and use the model correctly via the Hub's automated features.
* **Paper and Code links**: Linked the model to its associated research paper and official GitHub repository for better accessibility.

These changes help make your model more discoverable and easier to use.
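The tags mentioned above live in the README's YAML front matter, which Hub tooling parses to drive features like the inference widget and the "Use in transformers" button. As a minimal, dependency-free sketch of how such flat `key: value` tags could be read programmatically (hand-rolled parsing for illustration only; this is not the Hub's actual implementation, which uses a full YAML parser):

```python
# Illustrative only: extract simple key/value metadata from a model card's
# YAML front matter. The README text mirrors this commit's changes.
README = """\
---
datasets:
- HuggingFaceTB/smollm-corpus
language:
- en
license: mit
pipeline_tag: text-generation
library_name: transformers
---

# Raw 500M Shared
"""

def front_matter(text):
    """Return the flat `key: value` pairs between the leading '---' fences.

    List items ("- ...") and keys with no inline value are skipped; a real
    parser would handle nested YAML structures as well.
    """
    block = text.split("---")[1]  # content between the first two fences
    meta = {}
    for line in block.strip().splitlines():
        if ":" in line and not line.startswith("-"):
            key, _, value = line.partition(":")
            if value.strip():
                meta[key.strip()] = value.strip()
    return meta

meta = front_matter(README)
print(meta["pipeline_tag"])   # text-generation
print(meta["library_name"])   # transformers
```

This is why the metadata placement matters: tools only see tags that sit inside the front-matter block, which is also why `license: mit` is kept inside the fences in the diff below rather than dropped.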

Files changed (1): README.md (+8 -5)
README.md CHANGED

@@ -1,14 +1,19 @@
 ---
-license: mit
 datasets:
 - HuggingFaceTB/smollm-corpus
 language:
 - en
+license: mit
+pipeline_tag: text-generation
+library_name: transformers
 ---
 
 # Raw 500M Shared
 
+This model is a baseline model (Raw 500M Shared) presented in the paper [Enhancing Linguistic Competence of Language Models through Pre-training with Language Learning Tasks](https://huggingface.co/papers/2601.03448).
 
+- **Code:** [Official GitHub Repository](https://github.com/gucci-j/l2t)
+- **Paper:** [Enhancing Linguistic Competence of Language Models through Pre-training with Language Learning Tasks](https://huggingface.co/papers/2601.03448)
 
 ## How to Get Started with the Model
 Use the code below to get started with the model.
@@ -25,7 +30,7 @@ tokenizer = AutoTokenizer.from_pretrained(
 
 
 ## Citation
-```
+```bibtex
 @article{yamaguchi2026enhancinglinguisticcompetencelanguage,
 title={Enhancing Linguistic Competence of Language Models through Pre-training with Language Learning Tasks},
 author={Atsuki Yamaguchi and Maggie Mi and Nikolaos Aletras},
@@ -37,6 +42,4 @@ tokenizer = AutoTokenizer.from_pretrained(
 journal={arXiv},
 volume={abs/2601.03448}
 }
-```
-
-
+```