Text Classification
Transformers
Safetensors
English
emcoder
feature-extraction
emotion-recognition
bayesian-deep-learning
mc-dropout
uncertainty-quantification
multi-label-classification
custom_code
Eval Results (legacy)
Use yezdata/EmCoder with libraries, inference providers, notebooks, and local apps; follow the links below to get started.
- Libraries
- Transformers
How to use yezdata/EmCoder with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="yezdata/EmCoder", trust_remote_code=True)
```

```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("yezdata/EmCoder", trust_remote_code=True, dtype="auto")
```
- Notebooks
- Google Colab
- Kaggle
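The model's tags advertise MC-dropout uncertainty quantification for multi-label emotion recognition. As an illustration of the general technique only (not this model's actual remote code), dropout can be kept active at inference time and predictions averaged over several stochastic forward passes; the spread across passes serves as a per-label uncertainty estimate. A minimal NumPy sketch with a hypothetical one-layer multi-label head and random stand-in weights:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical tiny multi-label head: random stand-in weights,
# not EmCoder's actual parameters.
W = rng.normal(size=(8, 3))   # 8 features -> 3 emotion labels
b = np.zeros(3)
x = rng.normal(size=8)        # one pooled sentence embedding


def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))


def forward(x, p_drop=0.3):
    # MC dropout: sample a fresh Bernoulli mask on every call, even at
    # inference, rescaling by 1/(1-p) to preserve the expected activation.
    mask = rng.random(x.shape) < (1.0 - p_drop)
    x_dropped = np.where(mask, x / (1.0 - p_drop), 0.0)
    return sigmoid(x_dropped @ W + b)


# T stochastic forward passes -> predictive mean and spread per label
T = 100
samples = np.stack([forward(x) for _ in range(T)])
mean = samples.mean(axis=0)   # per-label probability estimate
std = samples.std(axis=0)     # per-label uncertainty
```

Labels whose standard deviation is large relative to their mean are the ones the (hypothetical) classifier is least sure about.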
Update README.md
README.md (changed):

```diff
@@ -11,6 +11,7 @@ tags:
 - multi-label-classification
 datasets:
 - go_emotions
+- Skylion007/openwebtext
 metrics:
 - precision
 - recall
@@ -28,7 +29,7 @@ model-index:
       metrics:
       - name: Macro F1
         type: f1
-        value: 0.
+        value: 0.44
       - name: Macro Precision
         type: precision
         value: 0.408
```
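For reference, a reconstruction of the affected README front-matter after this commit (indentation is approximate, and only the fields visible in the diff are shown; elided sections are marked with `# ...`):

```yaml
tags:
  # ...
  - multi-label-classification
datasets:
  - go_emotions
  - Skylion007/openwebtext
metrics:
  - precision
  - recall
model-index:
  # ...
      metrics:
        - name: Macro F1
          type: f1
          value: 0.44
        - name: Macro Precision
          type: precision
          value: 0.408
```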