Add pipeline tag and ensure correct license information

#1
by nielsr (HF Staff), opened
Files changed (1)
  1. README.md +5 -6
README.md CHANGED
@@ -1,8 +1,9 @@
 ---
-library_name: transformers
-license: mit
 base_model:
 - google/gemma-2-2b-it
+library_name: transformers
+license: mit
+pipeline_tag: text-ranking
 ---
 
 # R<sup>2</sup>ec: Towards Large Recommender Models with Reasoning
@@ -10,8 +11,6 @@ base_model:
 R<sup>2</sup>ec is a large recommender model with reasoning, generating both natural language rationales and ranked item predictions.
 The model is fine-tuned with reinforcement learning to enhance its reasoning capabilities for more effective recommendations.
 
-
-
 <p align="center">
 <a href="https://arxiv.org/pdf/2505.16994"><b>Paper Link</b>👁️</a>
 </p>
@@ -23,8 +22,8 @@ The model is fine-tuned with reinforcement learning to enhance its reasoning cap
 - **Paper:** https://arxiv.org/abs/2505.16994
 
 ## Licence
-This code repository is licensed under MIT License.
-The use of R<sup>2</sup>ec models is also subject to MIT License.
+This code repository is licensed under the MIT License.
+The use of R<sup>2</sup>ec models is also subject to the MIT License.
 R<sup>2</sup>ec series support commercial use and distillation.
 
 ## Citation
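
For reference, applying the metadata hunk above would leave the README's YAML front matter reading as follows (this is just the net result of the diff, reproduced for clarity):

```yaml
---
base_model:
- google/gemma-2-2b-it
library_name: transformers
license: mit
pipeline_tag: text-ranking
---
```

With `pipeline_tag: text-ranking` set, the Hub can surface the model under that task filter, and `library_name: transformers` tells the Hub which library's loading snippet to display on the model page.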