StevenJingfeng committed on
Commit caeed1d · verified · 1 Parent(s): 2ce2824

Update README.md

Files changed (1)
  1. README.md +68 -1
README.md CHANGED
@@ -96,7 +96,7 @@ You can also refer to [BigQuery](https://console.cloud.google.com/bigquery?p=big
 </tr>
 </table>
 
-### Two-step training loss (normal training and monotonic training)
+### Monotonicity Two-step training loss (normal training and monotonic training)
 We utilized the NAM model due to its inherent transparency characteristic and the ability to isolate variables, facilitating the imposition of monotonicity constraints on specific features. The model is trained on data from two distinct periods, achieving weak pairwise monotonicity over the $\alpha$ feature. In the first step, standard training is conducted to enable the model to learn from the data. In the second step, we impose monotonic constraints.
 
 <table>
@@ -105,3 +105,70 @@ We utilized the NAM model due to its inherent transparency characteristic and th
 <td><a href="./results/training_loss_2_step.pdf">Two-step training loss</a></td>
 </tr>
 </table>
+
+### Sentiment (Combination of Off-chain and On-chain)
+
+<h2>Model Performance over Two Periods</h2>
+<table>
+<caption>Model Performance over Two Periods</caption>
+<thead>
+<tr>
+<th class="gray-bg"></th>
+<th class="gray-bg">+OC,+DS,+HS</th>
+<th class="gray-bg">+OC,+DS,-HS</th>
+<th class="gray-bg">+OC,-DS,+HS</th>
+<th class="gray-bg">+OC,-DS,-HS</th>
+</tr>
+</thead>
+<tbody>
+<tr>
+<td colspan="5" class="gray-bg"><strong>Period 1: 03/21/2023 - 04/01/2023 (ARB-airdrop)</strong></td>
+</tr>
+<tr>
+<td class="gray-bg">3 Timesteps</td>
+<td>0.10022</td>
+<td>0.10150</td>
+<td>0.10164</td>
+<td>0.10201</td>
+</tr>
+<tr>
+<td class="gray-bg">2 Timesteps</td>
+<td>0.10056</td>
+<td>0.10249</td>
+<td>0.10213</td>
+<td>0.10265</td>
+</tr>
+<tr>
+<td class="gray-bg">1 Timestep</td>
+<td>0.10169</td>
+<td>0.10190</td>
+<td>0.10204</td>
+<td>0.10290</td>
+</tr>
+<tr>
+<td colspan="5" class="gray-bg"><strong>Period 2: 06/01/2023 - 07/01/2023 (Normal)</strong></td>
+</tr>
+<tr>
+<td class="gray-bg">3 Timesteps</td>
+<td>0.13341</td>
+<td>0.15657</td>
+<td>0.16142</td>
+<td>0.16089</td>
+</tr>
+<tr>
+<td class="gray-bg">2 Timesteps</td>
+<td>0.13477</td>
+<td>0.15381</td>
+<td>0.15806</td>
+<td>0.16456</td>
+</tr>
+<tr>
+<td class="gray-bg">1 Timestep</td>
+<td>0.13593</td>
+<td>0.15321</td>
+<td>0.15459</td>
+<td>0.18428</td>
+</tr>
+</tbody>
+</table>
+<p>The notation "OC" refers to On-chain variables, while "HS" and "DS" denote Hourly Averaged Sentiment and Daily Averaged Sentiment, respectively. The '+' symbol indicates the inclusion of a variable in the model, whereas the '-' symbol denotes its exclusion. The numerical values represent the mean square error (MSE) of the model on the test dataset.</p>
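
The two-step scheme described in the diff (standard training first, then training under a monotonicity constraint on the $\alpha$ feature) can be sketched as follows. This is a minimal toy, not the repository's code: the real NAM uses a small neural network per feature, whereas here a single shape function is parameterised by values at interpolation knots so the sketch stays self-contained, and the constraint is imposed via a pairwise-violation penalty. All names and hyperparameters (knot count, `lam`, learning rate, iteration budget) are illustrative assumptions.

```python
import numpy as np

# Toy stand-in for two-step NAM training: step 1 is ordinary MSE fitting,
# step 2 re-trains with a weak pairwise monotonicity penalty.
# Hyperparameters below are illustrative, not taken from the repository.

rng = np.random.default_rng(0)
n = 200
x = np.sort(rng.uniform(0.0, 1.0, n))       # the "alpha" feature
y = 2.0 * x + 0.4 * rng.normal(size=n)      # monotone signal plus noise

knots = np.linspace(0.0, 1.0, 21)           # piecewise-linear shape function

def basis(x):
    """Linear-interpolation weights of each sample on the knot values."""
    B = np.zeros((len(x), len(knots)))
    idx = np.clip(np.searchsorted(knots, x) - 1, 0, len(knots) - 2)
    t = (x - knots[idx]) / (knots[idx + 1] - knots[idx])
    B[np.arange(len(x)), idx] = 1.0 - t
    B[np.arange(len(x)), idx + 1] = t
    return B

B = basis(x)

def violation(w):
    """Total weak-monotonicity violation over adjacent knot pairs."""
    return float(np.maximum(0.0, w[:-1] - w[1:]).sum())

# Step 1: standard training (plain least-squares fit, no constraint).
w, *_ = np.linalg.lstsq(B, y, rcond=None)
viol_before = violation(w)

# Step 2: continue training on  MSE + lam * sum_k relu(w_k - w_{k+1})^2,
# a soft surrogate for the monotonicity constraint.
lam, lr = 5.0, 0.02
for _ in range(2000):
    grad = 2.0 / n * B.T @ (B @ w - y)      # MSE gradient
    v = np.maximum(0.0, w[:-1] - w[1:])     # adjacent-pair violations
    grad[:-1] += lam * 2.0 * v              # penalty gradient (left knot)
    grad[1:] -= lam * 2.0 * v               # penalty gradient (right knot)
    w -= lr * grad

viol_after = violation(w)
mse_after = float(np.mean((B @ w - y) ** 2))
print(f"monotonicity violation: {viol_before:.4f} -> {viol_after:.4f}")
print(f"MSE after step 2: {mse_after:.4f}")
```

Step 2 warm-starts from the step-1 solution, mirroring the README's order: the model first learns the data freely, then the penalty pushes the learned shape function toward monotonicity in $\alpha$ at a small cost in raw MSE.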