gvij committed
Commit 4f68028 · verified · Parent(s): 66303a8

Update README.md

Updated model name in README

Files changed (1): README.md (+8 -7)
README.md CHANGED

````diff
@@ -8,12 +8,13 @@ tags:
 - lightweight
 - edge-deployment
 - neo-agent
+- neo
 base_model: HuggingFaceTB/SmolLM2-135M
 datasets:
 - glaiveai/glaive-function-calling-v2
 - NousResearch/hermes-function-calling-v1
 model-index:
-- name: SmolLM2-135M-Function-Calling-NEO
+- name: SmolLM2-135M-Function-Calling
   results:
   - task:
       type: function-calling
@@ -30,17 +31,17 @@ model-index:
       name: Function Name Accuracy (Internal Validation)
 ---
 
-# SmolLM2-135M-Function-Calling-NEO
+# SmolLM2-135M-Function-Calling
 
 ## Model Description
 
-SmolLM2-135M-Function-Calling-NEO is a fine-tuned version of HuggingFaceTB/SmolLM2-135M specifically optimized for function calling tasks. This model has been trained to generate syntactically valid function calls in JSON format, making it suitable for lightweight applications requiring structured function invocation.
+SmolLM2-135M-Function-Calling is a fine-tuned version of HuggingFaceTB/SmolLM2-135M specifically optimized for function calling tasks. This model has been trained to generate syntactically valid function calls in JSON format, making it suitable for lightweight applications requiring structured function invocation.
 
 **Key Achievement**: This model achieves **92.18% Structural Validity on BFCL** and **97.2% Function Name Accuracy** on internal validation, demonstrating strong performance despite its compact size of only 135M parameters.
 
 ## Attribution
 
-The dataset combination, training strategy, and execution were autonomously achieved by NEO.
+The dataset combination, training strategy, and execution were autonomously achieved by [NEO](https://heyneo.so/).
 
 ## Performance Metrics
 
@@ -66,7 +67,7 @@ This model is specifically designed for:
 import torch
 from transformers import AutoModelForCausalLM, AutoTokenizer
 
-model_name = "SmolLM2-135M-Function-Calling-NEO"
+model_name = "SmolLM2-135M-Function-Calling"
 device = "cuda" if torch.cuda.is_available() else "cpu"
 
 tokenizer = AutoTokenizer.from_pretrained(model_name)
@@ -143,8 +144,8 @@ Expected output:
 If you use this model, please cite:
 
 ```bibtex
-@misc{smollm2-function-calling-neo,
-  title={SmolLM2-135M-Function-Calling-NEO: Lightweight Function Calling Model},
+@misc{smollm2-function-calling,
+  title={SmolLM2-135M-Function-Calling: Lightweight Function Calling Model},
   author={NEO Agent},
   year={2024},
   publisher={HuggingFace},
````
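
The README text in this diff claims the model emits syntactically valid function calls in JSON, which is what BFCL's structural-validity metric measures. As a minimal illustrative sketch (not part of this commit — the `{"name": ..., "arguments": ...}` call shape and the `validate_function_call` helper are assumptions), a structural check on a model completion might look like:

```python
import json

def validate_function_call(completion: str, known_functions: set) -> bool:
    """Check that a model completion is a structurally valid JSON
    function call whose name matches a known function.

    `completion` is assumed to be the raw JSON object emitted by the
    model, e.g. '{"name": "get_weather", "arguments": {"city": "Paris"}}'.
    """
    try:
        call = json.loads(completion)
    except json.JSONDecodeError:
        return False  # structurally invalid: not parseable JSON
    # Require the two fields a JSON function call is assumed to carry.
    if not isinstance(call, dict) or "name" not in call or "arguments" not in call:
        return False
    return call["name"] in known_functions

# Example: a well-formed call vs. a truncated one.
ok = validate_function_call(
    '{"name": "get_weather", "arguments": {"city": "Paris"}}',
    {"get_weather"},
)
bad = validate_function_call('{"name": "get_weather", "arg', {"get_weather"})
print(ok, bad)  # True False
```

A check along these lines would cover both reported metrics at once: JSON parseability approximates structural validity, and the membership test on `call["name"]` approximates function-name accuracy against the declared tool set.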