Cotum committed · verified
Commit 2e257cd · 1 parent: 48cb249

Update README.md


Updated the model card with more info and a better-suited example

Files changed (1)
  1. README.md +21 -6
README.md CHANGED
@@ -19,18 +19,33 @@ It has been trained using [TRL](https://github.com/huggingface/trl).
  ```python
  from transformers import pipeline
 
- question = "If you had a time machine, but could only go to the past or the future once and never return, which would you choose and why?"
+ prompt = """<bos><start_of_turn>human
+ You are a function calling AI model. You are provided with function signatures within <tools></tools> XML tags. You may call one or more functions to assist with the user query.
+ Don't make assumptions about what values to plug into functions. Here are the available tools:
+ <tools>
+ [{'type': 'function', 'function': {'name': 'convert_currency', 'description': 'Convert from one currency to another', 'parameters': {'type': 'object', 'properties': {'amount': {'type': 'number', 'description': 'The amount to convert'}, 'from_currency': {'type': 'string', 'description': 'The currency to convert from'}, 'to_currency': {'type': 'string', 'description': 'The currency to convert to'}}, 'required': ['amount', 'from_currency', 'to_currency']}}},
+ {'type': 'function', 'function': {'name': 'calculate_distance', 'description': 'Calculate the distance between two locations', 'parameters': {'type': 'object', 'properties': {'start_location': {'type': 'string', 'description': 'The starting location'}, 'end_location': {'type': 'string', 'description': 'The ending location'}}, 'required': ['start_location', 'end_location']}}},
+ {'type': 'function', 'function': {'name': 'send_email', 'description': 'Send an email to a customer', 'parameters': {'type': 'object', 'properties': {'customer': {'type': 'string', 'description': 'The customer to send the email to'}, 'subject': {'type': 'string', 'description': 'The subject of the email'}, 'body': {'type': 'string', 'description': 'The body of the email'}}, 'required': ['customer', 'subject', 'body']}}}
+ ]
+ </tools>
+ Use the following pydantic model JSON schema for each tool call you make:
+ {'title': 'FunctionCall', 'type': 'object', 'properties': {'arguments': {'title': 'Arguments', 'type': 'object'}, 'name': {'title': 'Name', 'type': 'string'}}, 'required': ['arguments', 'name']}
+ For each function call, return a JSON object with the function name and arguments within <tool_call></tool_call> XML tags, as follows:
+ <tool_call>
+ {tool_call}
+ </tool_call>
+ Also, before calling a function, take the time to plan the call to make. Put that thinking process between <think>{your thoughts}</think>.
+
+ Hi, I need you to tell John@doe.com that I received his package.<end_of_turn><eos>
+ <start_of_turn>model
+ <think>"""
  generator = pipeline("text-generation", model="Cotum/Qwen2.5-3B-Instruct-thinking-function_calling-V0", device="cuda")
- output = generator([{"role": "user", "content": question}], max_new_tokens=128, return_full_text=False)[0]
+ output = generator([{"role": "user", "content": prompt}], max_new_tokens=128, return_full_text=False)[0]
  print(output["generated_text"])
  ```
 
  ## Training procedure
 
-
-
-
- This model was trained with SFT.
+ This model was trained with SFT following Bonus Unit 1 of the Hugging Face Agents Course: https://huggingface.co/agents-course/notebooks/blob/main/bonus-unit1/bonus-unit1.ipynb
 
  ### Framework versions
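The committed prompt asks the model to emit its reasoning between `<think>` tags and each tool call as a JSON object inside `<tool_call>` tags. A minimal sketch of consuming such a generation is below; the `parse_response` helper and the sample output are illustrative assumptions, not part of the model card.

```python
# Hypothetical sketch: extract the <think> block and the <tool_call> JSON
# objects from a generation in the format the committed prompt requests.
import json
import re


def parse_response(text: str):
    """Split a generation into its reasoning text and its tool calls."""
    think = re.search(r"<think>(.*?)</think>", text, re.DOTALL)
    calls = []
    for raw in re.findall(r"<tool_call>(.*?)</tool_call>", text, re.DOTALL):
        call = json.loads(raw)
        # Enforce the FunctionCall schema from the prompt: both keys required.
        if not {"name", "arguments"} <= call.keys():
            raise ValueError(f"malformed tool call: {raw!r}")
        calls.append(call)
    return (think.group(1).strip() if think else None), calls


# Illustrative model output (an assumption, not a real generation).
sample = (
    "<think>The user wants John notified, so send_email fits.</think>\n"
    "<tool_call>\n"
    '{"name": "send_email", "arguments": {"customer": "John@doe.com", '
    '"subject": "Package received", "body": "I received your package."}}\n'
    "</tool_call>"
)

thought, calls = parse_response(sample)
print(thought)           # the planning text between <think> tags
print(calls[0]["name"])  # send_email
```

In a real loop, the returned `name` would be looked up in a registry of local functions and called with `arguments`; the result would then be fed back to the model as a tool response turn.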