Qybera committed · verified
Commit f6174a0 · Parent(s): 0620c36

Update README.md

Files changed (1): README.md +2 -4
README.md CHANGED

````diff
@@ -43,8 +43,6 @@ tags:
 - **Multimodal**: Handles text, vision, audio, code, action.
 - **Author**: N.E.N (Nthuku Elijah Nzeli) and SalesA Team.
 
----
-
 **Created by N.E.N (Nthuku Elijah Nzeli) and SalesA Team**
 
 - Model architecture: `SalesAModel`
@@ -56,7 +54,6 @@ tags:
 This repository contains the SalesA AI model, a modular, extensible, and efficient multimodal transformer with Mixture-of-Experts (MoE) layers.
 ```
 
----
 
 ## **Why This Naming Convention?**
 
@@ -64,7 +61,6 @@ tags:
 - **Discoverability:** Easy to search/filter on Hugging Face and other platforms.
 - **Professionalism:** Follows conventions used by top models (e.g., “Qwen2.5-7B-Minivoc-32k” [source](https://huggingface.co/kaitchup/Qwen2.5-7B-Minivoc-32k-v0.1a), “distilbert-base-uncased” [source](https://huggingface.co/distilbert-base-uncased)).
 
----
 
 ## **Summary Table**
 
@@ -153,6 +149,7 @@ The model is designed for extensibility, ethical deployment, and real-world appl
 - `confusion_matrix.png`, `class_distribution.png`, `per_class_metrics.png`: Diagnostic plots
 
 ## How to Use
+
 ```python
 from transformers import AutoTokenizer, AutoModelForSequenceClassification
 # Or use your custom loading code for SalesA AI
@@ -160,6 +157,7 @@ from transformers import AutoTokenizer, AutoModelForSequenceClassification
 
 ## Citation
 If you use this model, please cite:
+
 ```bibtex
 @article{Malo2014GoodDO,
 title={Good debt or bad debt: Detecting semantic orientations in economic texts},
````
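The README's "How to Use" snippet is truncated after the import line in this diff. As a minimal sketch of the `AutoModelForSequenceClassification` inference pattern it points at: the SalesA checkpoint's Hub path is not stated in this commit, so a tiny randomly initialised DistilBERT classifier stands in for the real weights here (an assumption for illustration, not the model's actual loading code).

```python
import torch
from transformers import DistilBertConfig, DistilBertForSequenceClassification

# Stand-in model: with published weights this would be
#   AutoModelForSequenceClassification.from_pretrained("<SalesA repo id>")
# but the repo id is not given in this commit, so we build a tiny
# randomly initialised 3-label classifier instead.
config = DistilBertConfig(
    vocab_size=100, dim=32, n_layers=1, n_heads=2, hidden_dim=64, num_labels=3
)
model = DistilBertForSequenceClassification(config)
model.eval()

# Pre-tokenised toy input (a real AutoTokenizer would produce input_ids).
input_ids = torch.tensor([[1, 5, 7, 2]])
with torch.no_grad():
    logits = model(input_ids=input_ids).logits

# One logit per label; argmax gives the predicted class index.
assert logits.shape == (1, 3)
predicted = logits.argmax(dim=-1).item()
```

With real fine-tuned weights, `model.config.id2label[predicted]` maps the index back to a human-readable class name.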