yezdata committed on
Commit e1f8f65 · verified · 1 Parent(s): 8e0810e

Update README.md

Files changed (1):
  1. README.md +5 -3

README.md CHANGED
@@ -61,7 +61,9 @@ EmCoder achieves competitive F1-scores while being ~35% smaller than RoBERTa-bas
 
 
 ## How to use
-> Since `.safetensors` files only store model weights and not the class logic, you need to use the provided `emcoder.py` to enable **MC Dropout inference**.<br>EmCoder v1.0 requires the `roberta-base` tokenizer for correct token-to-embedding mapping.
+Since `.safetensors` files only store model weights and not the class logic, you need to use the provided `emcoder.py` to enable **MC Dropout inference**.<br>EmCoder v1.0 requires the `roberta-base` tokenizer for correct token-to-embedding mapping.
+
+
 ### 1. Setup & Tokenization
 ```python
 from transformers import AutoTokenizer
@@ -98,7 +100,7 @@ uncertainty = probs_all.std(dim=0) # Epistemic Uncertainty (Standard Deviation)
 
 
 ## Model Architecture
-![EmCoder Architecture](outputs/architecture.svg)
+![EmCoder Architecture](outputs/architecture.png)
 
 
 ## Optimization

Wait — keep heading level as in source:
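For context on the MC Dropout inference the README refers to: the idea is to keep dropout active at test time, run several stochastic forward passes, and read the standard deviation across passes as epistemic uncertainty (the `probs_all.std(dim=0)` line in the diff). Below is a minimal NumPy sketch of that idea under illustrative assumptions — the function name `mc_dropout_predict` and the toy weights `W`, `b` are hypothetical and are not part of `emcoder.py` or the EmCoder model.

```python
import numpy as np

def mc_dropout_predict(x, W, b, T=50, p_drop=0.1, rng=None):
    """Run T stochastic forward passes with dropout kept ON at inference.

    Returns (mean class probabilities, their std across passes); the std
    plays the role of the epistemic uncertainty in the README snippet.
    """
    rng = rng or np.random.default_rng(0)
    probs = []
    for _ in range(T):
        # Dropout stays active at inference time -- the core of MC Dropout.
        mask = rng.random(x.shape) > p_drop
        h = (x * mask) / (1.0 - p_drop)    # inverted-dropout scaling
        logits = h @ W + b
        e = np.exp(logits - logits.max())  # stable softmax
        probs.append(e / e.sum())
    probs_all = np.stack(probs)            # shape: (T, num_classes)
    return probs_all.mean(axis=0), probs_all.std(axis=0)

# Toy input: 4 features, 3 classes.
x = np.array([0.5, -1.2, 0.3, 0.8])
W = np.random.default_rng(1).normal(size=(4, 3))
b = np.zeros(3)
mean_probs, uncertainty = mc_dropout_predict(x, W, b)
```

In the actual model, the stochastic forward passes would go through the dropout layers defined in `emcoder.py` rather than a hand-rolled mask, but the mean/std aggregation is the same.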