Update README.md
We implemented this assignment mainly using Keras and Sklearn.

An example for the ‘Adults’ dataset:
An example for the ‘Bank-full’ dataset:

**Code Design:**
- MMDNF = mean minimum Euclidean distance for the not-fooled samples: 0.422
- Several samples that fooled the detector:
- Several samples that did not fool the detector:
- Plotting the PCA shows that the fooled samples are very similar to the real data, while the not-fooled samples are less similar.
- Out of 100 samples, 74 fooled the discriminator and 26 did not.
- A graph describing the loss of the generator and the discriminator:
- The generator loss decreased sharply, while the discriminator loss stayed roughly constant.
- Eventually the generator and the discriminator converged to a loss of about 0.6.
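The MMDNF metric above can be sketched roughly like this (an illustrative NumPy implementation, not the assignment's exact code; the toy `real`/`gen` arrays are made up):

```python
import numpy as np

def mean_min_euclidean_distance(samples, real_data):
    """For each generated sample, find the Euclidean distance to its
    nearest real record, then average those minimum distances."""
    diffs = samples[:, None, :] - real_data[None, :, :]  # (n_gen, n_real, d)
    dists = np.sqrt((diffs ** 2).sum(axis=-1))           # (n_gen, n_real)
    return dists.min(axis=1).mean()

# Toy example: two generated samples against three real records.
real = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 2.0]])
gen = np.array([[0.0, 0.0], [1.0, 2.0]])
print(mean_min_euclidean_distance(gen, real))  # (0 + 1) / 2 = 0.5
```

The same function applied only to the fooled subset would give the corresponding fooled-sample distance.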
For the ‘Bank-full’ dataset, the results of the model were:
- MMDNF = mean minimum Euclidean distance for the not-fooled samples: 0.305854238
- Several samples that fooled the detector:
- Several samples that did not fool the detector:
- Plotting the PCA shows that the fooled samples are very similar to the real data, while the not-fooled samples are less similar.
- Out of 100 samples, 32 fooled the discriminator and 68 did not.
- A graph describing the loss of the generator and the discriminator:
- The generator loss decreased sharply, while the discriminator loss stayed roughly constant.
- Eventually the generator and the discriminator converged to a loss of about 0.5.
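The fooled / not-fooled counts reported for both datasets come from thresholding the discriminator's real-vs-fake score; a minimal sketch (the 0.5 threshold and the toy scores are assumptions, and a plain array stands in for the Keras discriminator's output):

```python
import numpy as np

def count_fooled(disc_scores, threshold=0.5):
    """A generated sample 'fools' the discriminator when it is scored
    as real, i.e. its real-probability is at least the threshold."""
    fooled = disc_scores >= threshold
    return int(fooled.sum()), int((~fooled).sum())

# Hypothetical discriminator outputs for 10 generated samples.
scores = np.array([0.9, 0.2, 0.7, 0.55, 0.4, 0.8, 0.3, 0.6, 0.1, 0.95])
fooled, not_fooled = count_fooled(scores)
print(fooled, not_fooled)  # 6 fooled, 4 not fooled
```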
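The PCA comparison of generated versus real samples used for both datasets can be reproduced along these lines (a sketch with Sklearn, matching the libraries named at the top; the random arrays are placeholders for the real records and generated samples):

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Placeholder data: 200 real records and 50 generated samples, 5 features.
real = rng.normal(0.0, 1.0, size=(200, 5))
generated = rng.normal(0.5, 1.0, size=(50, 5))

# Fit PCA on the real data, then project both sets to 2 dimensions so
# the generated samples can be compared against the real distribution.
pca = PCA(n_components=2).fit(real)
real_2d = pca.transform(real)
gen_2d = pca.transform(generated)
print(real_2d.shape, gen_2d.shape)  # (200, 2) (50, 2)
```

Scatter-plotting the two projected clouds (e.g. with matplotlib) gives the kind of figure described above.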
## General Generator (Part 2)
In this part, the main goal was for the distribution of confidence probabilities to be uniform.

- Class distribution:
- Note that there is some imbalance here, which is nearly identical to the ratio between the mean confidence scores for each class.
- Probability distribution for class 0 and class 1, for the **test set**:
- Note that the two plots mirror each other, since for a binary classifier the class-1 probability is one minus the class-0 probability.
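The class-distribution check and the mirrored score histograms can be illustrated as follows (toy labels and scores, not the real datasets):

```python
import numpy as np

# Hypothetical labels and class-1 confidence scores for a binary task.
y = np.array([0, 0, 0, 1, 0, 1, 0, 0])
p_class1 = np.array([0.2, 0.1, 0.3, 0.8, 0.2, 0.7, 0.1, 0.3])

# Class distribution: counts per label.
counts = np.bincount(y)
print(counts)  # [6 2]

# The class-0 and class-1 score distributions mirror each other,
# because p(class 0) = 1 - p(class 1) for a binary classifier.
p_class0 = 1.0 - p_class1

# Imbalance ratio vs. ratio between the mean confidence score per class.
print(counts[0] / counts[1], p_class0.mean() / p_class1.mean())
```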
Class distribution:
- The data here is even more imbalanced. The confidence scores reflect this.
- Confidence score distribution for the test set:
**Generator Results:**
- Training loss:
- Confidence score distribution for each class:
- Note that they mirror each other.
- The results are far from uniform, but they are clearly skewed towards the original confidence scores.
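One way to quantify how far the generated confidence scores are from uniform (not necessarily what was done in the assignment) is a Kolmogorov-Smirnov test against the uniform distribution; the skewed beta-distributed scores here are a stand-in:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Stand-in confidence scores, skewed towards low values.
scores = rng.beta(2.0, 5.0, size=1000)

# KS test against the uniform distribution on [0, 1]:
# a large statistic / tiny p-value means the scores are far from uniform.
stat, p_value = stats.kstest(scores, "uniform")
print(stat, p_value)
```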
- Training loss:
- Confidence score distribution for each class:
- As before, they mirror each other.
- The distribution isn’t uniform, and is slightly skewed in the opposite direction of the distribution for the test set.
- Error rates for class 1:
- **The lowest error rates were achieved for probabilities of around 0.4.** The highest was for a probability of 0.
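The per-probability error rates above can be computed along these lines (a sketch, assuming arrays of true labels and predicted class-1 probabilities; the ten equal-width bins and the 0.5 decision threshold are assumptions):

```python
import numpy as np

def error_rate_per_bin(y_true, p_class1, n_bins=10):
    """Bin samples by predicted class-1 probability and return the
    fraction of misclassified samples (0.5 threshold) in each bin."""
    y_pred = (p_class1 >= 0.5).astype(int)
    errors = (y_pred != y_true).astype(float)
    bins = np.clip((p_class1 * n_bins).astype(int), 0, n_bins - 1)
    rates = np.full(n_bins, np.nan)  # NaN marks empty bins
    for b in range(n_bins):
        mask = bins == b
        if mask.any():
            rates[b] = errors[mask].mean()
    return rates

# Toy example with four samples.
y = np.array([0, 1, 1, 0])
p = np.array([0.05, 0.42, 0.9, 0.6])
print(error_rate_per_bin(y, p))
```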
## Discussion