Add pipeline tag, library name, and GitHub link
#1
by nielsr (HF Staff), opened

Files changed (1): README.md (+101 −8)
@@ -1,14 +1,19 @@
  ---
- license: apache-2.0
- language:
- - en
  base_model:
  - codellama/CodeLlama-7b-hf
  ---

  # **TL-CodeLLaMA-2**

  TL-CodeLLaMA-2 is a model designed for tool use, built upon CodeLLaMA-7b. It is trained on 1,217 data samples using the *TL-Training* framework and demonstrates effective performance across a variety of tool use tasks. More information can be found in the paper "[TL-Training: A Task-Feature-Based Framework for Training Large Language Models in Tool Use](https://www.arxiv.org/abs/2412.15495)".

  # Model Use

  ## Requirements
@@ -26,7 +31,49 @@ The data needs to be organized in the following format:
  [
  {
  "role": "System",
- "content": "Function:\ndef random_advice():\n \"\"\"\n Returns a random advice slip as a slip object.\n \"\"\"\n\nFunction:\ndef advice_by_id(slip_id:str):\n \"\"\"\n If an advice slip is found with the corresponding {slip_id}, a slip object is returned.\n\n Args:\n slip_id (string): The unique ID of this advice slip.\n \"\"\"\n\nFunction:\ndef search_advice(query:str):\n \"\"\"\n If an advice slip is found, containing the corresponding search term in {query}, an array of slip objects is returned inside a search object.\n\n Args:\n query (string): The search query provided.\n \"\"\"\n\nFunction:\ndef ask_to_user(question:str):\n \"\"\"\n You can ask user for guidance when you think you need more information to handle the task, but you should use this tool as less as you can.\n\n Args:\n question (string): The question you want to ask to user.\n \"\"\"\n\nFunction:\ndef finish(answer:str):\n \"\"\"\n Finish the task and give your answer.\n\n Args:\n answer (string): Your answer for the task.\n \"\"\"\n\n"
  },
  {
  "role": "User",
@@ -48,7 +95,9 @@ The data needs to be organized in the following format:
  The chat template is:

  ```jinja
- {% for message in messages %}{{message['role'] + ': ' + message['content']}}{% if loop.last %}{% if add_generation_prompt %}{{ '\nAssistant:' }}{% else %}{{ '</s>'}}{% endif %}{% else %}{{ '\n' }}{% endif %}{% endfor %}
  ```

  ## Inference
@@ -61,7 +110,49 @@ model_path = "Junjie-Ye/TL-CodeLLaMA-2"
  data = [
  {
  "role": "System",
- "content": "Function:\ndef random_advice():\n \"\"\"\n Returns a random advice slip as a slip object.\n \"\"\"\n\nFunction:\ndef advice_by_id(slip_id:str):\n \"\"\"\n If an advice slip is found with the corresponding {slip_id}, a slip object is returned.\n\n Args:\n slip_id (string): The unique ID of this advice slip.\n \"\"\"\n\nFunction:\ndef search_advice(query:str):\n \"\"\"\n If an advice slip is found, containing the corresponding search term in {query}, an array of slip objects is returned inside a search object.\n\n Args:\n query (string): The search query provided.\n \"\"\"\n\nFunction:\ndef ask_to_user(question:str):\n \"\"\"\n You can ask user for guidance when you think you need more information to handle the task, but you should use this tool as less as you can.\n\n Args:\n question (string): The question you want to ask to user.\n \"\"\"\n\nFunction:\ndef finish(answer:str):\n \"\"\"\n Finish the task and give your answer.\n\n Args:\n answer (string): Your answer for the task.\n \"\"\"\n\n"
  },
  {
  "role": "User",
@@ -69,7 +160,9 @@ data = [
  }
  ]

- chat_template = "{% for message in messages %}{{message['role'] + ': ' + message['content']}}{% if loop.last %}{% if add_generation_prompt %}{{ '\nAssistant:' }}{% else %}{{ '</s>'}}{% endif %}{% else %}{{ '\n' }}{% endif %}{% endfor %}"

  model = AutoModelForCausalLM.from_pretrained(
  model_path,
@@ -119,4 +212,4 @@ If you find this model useful in your research, please cite:
  primaryClass={cs.CL},
  url={https://arxiv.org/abs/2412.15495},
  }
- ```
 
1
  ---
 
 
 
2
  base_model:
3
  - codellama/CodeLlama-7b-hf
4
+ language:
5
+ - en
6
+ license: apache-2.0
7
+ library_name: transformers
8
+ pipeline_tag: text-generation
9
  ---
10
+
11
  # **TL-CodeLLaMA-2**
12
 
13
  TL-CodeLLaMA-2 is a model designed for tool use, built upon CodeLLaMA-7b. It is trained on 1,217 data samples using the *TL-Training* framework and demonstrates effective performance across a variety of tool use tasks. More information can be found in the paper "[TL-Training: A Task-Feature-Based Framework for Training Large Language Models in Tool Use](https://www.arxiv.org/abs/2412.15495)".
14
 
15
+ Code: https://github.com/Junjie-Ye/TL-Training
16
+
17
  # Model Use
18
 
19
  ## Requirements
 
  [
  {
  "role": "System",
+ "content": "Function:
+ def random_advice():
+ \"\"\"
+ Returns a random advice slip as a slip object.
+ \"\"\"
+
+ Function:
+ def advice_by_id(slip_id:str):
+ \"\"\"
+ If an advice slip is found with the corresponding {slip_id}, a slip object is returned.
+
+ Args:
+ slip_id (string): The unique ID of this advice slip.
+ \"\"\"
+
+ Function:
+ def search_advice(query:str):
+ \"\"\"
+ If an advice slip is found, containing the corresponding search term in {query}, an array of slip objects is returned inside a search object.
+
+ Args:
+ query (string): The search query provided.
+ \"\"\"
+
+ Function:
+ def ask_to_user(question:str):
+ \"\"\"
+ You can ask user for guidance when you think you need more information to handle the task, but you should use this tool as less as you can.
+
+ Args:
+ question (string): The question you want to ask to user.
+ \"\"\"
+
+ Function:
+ def finish(answer:str):
+ \"\"\"
+ Finish the task and give your answer.
+
+ Args:
+ answer (string): Your answer for the task.
+ \"\"\"
+
+ "
  },
  {
  "role": "User",
 
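The System message packs every available tool into one string of `Function:` blocks. As an illustration only (not part of the README), such a string could be assembled from real Python functions; `build_system_content` is a hypothetical helper, and the exact spacing differs slightly from the hand-written prompt above:

```python
import inspect
import textwrap

def random_advice():
    """Returns a random advice slip as a slip object."""

def advice_by_id(slip_id: str):
    """If an advice slip is found with the corresponding {slip_id}, a slip object is returned.

    Args:
        slip_id (string): The unique ID of this advice slip.
    """

def build_system_content(functions):
    # One "Function:" block per tool: the def line followed by its
    # docstring, mirroring the System-message format shown above.
    blocks = []
    for fn in functions:
        doc = inspect.getdoc(fn) or ""
        body = textwrap.indent(f'"""\n{doc}\n"""', "    ")
        blocks.append(f"Function:\ndef {fn.__name__}{inspect.signature(fn)}:\n{body}\n")
    return "\n".join(blocks)

print(build_system_content([random_advice, advice_by_id]))
```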
  The chat template is:

  ```jinja
+ {% for message in messages %}{{message['role'] + ': ' + message['content']}}{% if loop.last %}{% if add_generation_prompt %}{{ '
+ Assistant:' }}{% else %}{{ '</s>'}}{% endif %}{% else %}{{ '
+ ' }}{% endif %}{% endfor %}
  ```
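To see what this template produces, here is a small sketch rendering it with `jinja2` (already a transformers dependency) on a made-up two-message conversation:

```python
from jinja2 import Template

# The README's chat template, written as a Python string.
chat_template = (
    "{% for message in messages %}"
    "{{message['role'] + ': ' + message['content']}}"
    "{% if loop.last %}{% if add_generation_prompt %}{{ '\nAssistant:' }}"
    "{% else %}{{ '</s>'}}{% endif %}"
    "{% else %}{{ '\n' }}{% endif %}{% endfor %}"
)

# Toy conversation, for illustration only.
messages = [
    {"role": "System", "content": "Function: ..."},
    {"role": "User", "content": "Give me some advice."},
]

prompt = Template(chat_template).render(messages=messages, add_generation_prompt=True)
print(prompt)
# System: Function: ...
# User: Give me some advice.
# Assistant:
```

With `add_generation_prompt=False` the rendering instead ends with `</s>`, which is how completed conversations are closed during training.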
  ## Inference
 
  data = [
  {
  "role": "System",
+ "content": "Function:
+ def random_advice():
+ \"\"\"
+ Returns a random advice slip as a slip object.
+ \"\"\"
+
+ Function:
+ def advice_by_id(slip_id:str):
+ \"\"\"
+ If an advice slip is found with the corresponding {slip_id}, a slip object is returned.
+
+ Args:
+ slip_id (string): The unique ID of this advice slip.
+ \"\"\"
+
+ Function:
+ def search_advice(query:str):
+ \"\"\"
+ If an advice slip is found, containing the corresponding search term in {query}, an array of slip objects is returned inside a search object.
+
+ Args:
+ query (string): The search query provided.
+ \"\"\"
+
+ Function:
+ def ask_to_user(question:str):
+ \"\"\"
+ You can ask user for guidance when you think you need more information to handle the task, but you should use this tool as less as you can.
+
+ Args:
+ question (string): The question you want to ask to user.
+ \"\"\"
+
+ Function:
+ def finish(answer:str):
+ \"\"\"
+ Finish the task and give your answer.
+
+ Args:
+ answer (string): Your answer for the task.
+ \"\"\"
+
+ "
  },
  {
  "role": "User",
 
  }
  ]

+ chat_template = "{% for message in messages %}{{message['role'] + ': ' + message['content']}}{% if loop.last %}{% if add_generation_prompt %}{{ '
+ Assistant:' }}{% else %}{{ '</s>'}}{% endif %}{% else %}{{ '
+ ' }}{% endif %}{% endfor %}"

  model = AutoModelForCausalLM.from_pretrained(
  model_path,
 
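The inference code above is truncated in this diff. For orientation, a hedged sketch of how the pieces would typically fit together with the standard transformers generation API; `generate_reply` is a hypothetical wrapper, `max_new_tokens=256` is an arbitrary choice, and the imports sit inside the function because calling it downloads the model weights:

```python
def generate_reply(data, chat_template, model_path="Junjie-Ye/TL-CodeLLaMA-2"):
    # Sketch only: running this requires the transformers library and the model weights.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_path)
    tokenizer.chat_template = chat_template  # install the custom template from the README
    model = AutoModelForCausalLM.from_pretrained(model_path, device_map="auto")

    # Render "Role: content" lines ending in "\nAssistant:" (see the template above).
    prompt = tokenizer.apply_chat_template(data, tokenize=False, add_generation_prompt=True)
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=256)
    # Strip the prompt tokens and return only the newly generated text.
    return tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)
```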
  primaryClass={cs.CL},
  url={https://arxiv.org/abs/2412.15495},
  }
+ ```