Update README.md
README.md
@@ -55,7 +55,7 @@ Even with loops removed this still has some issues with cantor stairs, but the b
Alphamix and Fractalmix are hit-or-miss even with Cantor stairs, sometimes improving fidelity, sometimes reducing it.
-Lacking attention mechanisms I consider this a resounding success as an experiment, and yet it fell short of resnet18 and
+Lacking attention mechanisms I consider this a resounding success as an experiment, and yet it fell short of resnet18 and resnet34 standalones - meaning the head only converted the math into something else, and fell short of the cross-entropy goal.

That's okay though; I will refine the process, improve the system, and return with additional training runs for this version to push classification beyond the current 69% - which may be HIGHER than the vit-beatrix, but it is considerably shallower in geometric cohesion than the dual-stream transformer variation.