---
license: apache-2.0
---
|
|
|
|
|
# Reimposed documentation for improved router |
|
|
The router was restructured and is now closer to the intended final V1 format for message passing and imposition structure.
|
|
|
|
|
https://github.com/AbstractEyes/geofractal/blob/main/src/geofractal/router/GETTING_STARTED.md |
|
|
|
|
|
The README was also updated to focus on the intent and design of the router's current implementation.
|
|
|
|
|
This will only change significantly if new, unforeseen problems require restructured solutions.
|
|
|
|
|
|
|
|
# Update - Router prepared for experimental use in the geofractal repo. |
|
|
I have included a notebook with today's experiments. The DINO-ViT experiment is not efficient, but I will post the results when it finishes.
|
|
|
|
|
https://github.com/AbstractEyes/geofractal |
|
|
|
|
|
The purpose of the repo has become the router, as it pools the behavioral implications of every single one of my models into a single vessel.
|
|
|
|
|
After today's experiments we can assume the baseline structure is functional on Colab; the component list still needs considerable refinement, however.
|
|
|
|
|
!!! Not all components are tested yet for gradient updates and configuration accuracy. !!! |
|
|
|
|
|
However, the standard_head is debugged, so it will work. This is the same head used in the wormhole_router and geofractal_global_router concepts, with all gradients correctly updating where they are supposed to.
|
|
|
|
|
Currently the fingerprint association and the mailbox are both untested, but I'll get to them.
|
|
|
|
|
 |
|
|
|
|
|
|
|
|
|
|
|
# License update |
|
|
I have changed the license to Apache-2.0 for only this particular set of model pieces. |
|
|
This is to ensure that correct attribution is given for the effort spent building this structure.
|
|
|
|
|
The potential of this structure is well beyond the earlier variations, which will remain MIT. |
|
|
Additional extension pieces will remain MIT as they are developed. |
|
|
|
|
|
# What's proven |
|
|
Cooperative features produce higher accuracy.
|
|
|
|
|
|
|
|
https://github.com/AbstractEyes/geofractal/blob/main/src/geofractal/model/blocks/router/global_fractal_router.py |
|
|
* As shown by the David multi-scale experiments, the router also learns multi-scale, but in a different cooperative format.
|
|
  * The accuracy is higher than that of the individual experts; this is not placebo.
|
|
* This has been tested on 1.27M ImageNet-extracted feature sets from 5 CLIP models, learned simultaneously, with strong outcomes.
|
|
* These CLIP features NEVER see the labels. They self-attenuate using CrossEntropy with AdamW.
|
|
|
|
|
 |
|
|
|
|
|
* Streams learned together have very low accuracy alone; individually they are essentially useless.
|
|
  * This low accuracy compounds in the router, forming a structure that no single stream can encapsulate.
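The cooperative setup described above can be sketched in PyTorch. This is a minimal illustration, assuming hypothetical `StreamAdapter` and `CooperativeFusion` classes (not the repo's actual API): several frozen feature streams are projected to a shared width, fused by a learned softmax vote, and only the fused logits ever meet CrossEntropy and AdamW.

```python
import torch
import torch.nn as nn

class StreamAdapter(nn.Module):
    """Projects one frozen feature stream (e.g. one CLIP model's embeddings)
    into a shared width. The stream itself never sees the labels."""
    def __init__(self, in_dim: int, shared_dim: int):
        super().__init__()
        self.proj = nn.Sequential(nn.LayerNorm(in_dim), nn.Linear(in_dim, shared_dim))

    def forward(self, x):
        return self.proj(x)

class CooperativeFusion(nn.Module):
    """Fuses N streams with a learned per-stream vote, then classifies."""
    def __init__(self, in_dims, shared_dim, num_classes):
        super().__init__()
        self.adapters = nn.ModuleList(StreamAdapter(d, shared_dim) for d in in_dims)
        self.gate = nn.Linear(shared_dim, 1)           # per-stream vote weight
        self.head = nn.Linear(shared_dim, num_classes)

    def forward(self, streams):
        z = torch.stack([a(x) for a, x in zip(self.adapters, streams)], dim=1)  # (B, N, D)
        w = torch.softmax(self.gate(z), dim=1)          # (B, N, 1) cooperative vote
        fused = (w * z).sum(dim=1)                      # (B, D)
        return self.head(fused)

# One gradient step: labels touch only the fused logits, never the raw streams.
model = CooperativeFusion(in_dims=[512, 512, 768], shared_dim=256, num_classes=10)
opt = torch.optim.AdamW(model.parameters(), lr=1e-4)
streams = [torch.randn(4, d) for d in (512, 512, 768)]
labels = torch.randint(0, 10, (4,))
logits = model(streams)
loss = nn.functional.cross_entropy(logits, labels)
opt.zero_grad(); loss.backward(); opt.step()
```

The point of the sketch is the gradient path: individually weak streams are shaped only through the shared fusion, which is where the compounding described above would occur.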
|
|
|
|
|
# What's still missing |
|
|
How well can the router teach a student directly? |
|
|
|
|
|
Can a student absorb a hyper-entangled feature and utilize it for generalized learning? Is the feature useful? Is it topical? Is it deep? |
|
|
|
|
|
How much effort does it take to decouple a larger teacher from a learner router with a student attached? |
|
|
|
|
|
Will the mail system teach the student well enough to be independent? |
|
|
|
|
|
|
|
|
# The proofs are in: this is proven enough to expand into production capacity.
|
|
The experiments leave no doubt - this works. The MANY training runs I've performed led me to multiple hypotheses.
|
|
|
|
|
The majority of those hypotheses have been narrowed down or completely eliminated, boiling things down to this state. The tests show this isn't a wild goose chase; this is the answer.
|
|
|
|
|
This covers david, beans, geovit structures, ViT structures, diffusion structures, interpolation structures, wormhole attention - anything you need to cooperate, this can make cooperate together.
|
|
|
|
|
This is NOT a gimmick. This is NOT a trick - this is a legitimate architecture in its early formation stages, and there is no avoiding its potential.
|
|
|
|
|
|
|
|
# What is this? |
|
|
This repo is intended to be the home for potential extensions to the geofractal router concept.
|
|
|
|
|
Multiple global fractal router weights will be saved here. |
|
|
|
|
|
These are meant to be pretrained for specific numeric use-cases and finetuned for extension. |
|
|
|
|
|
These include message passing, prime valuation, geometric accuracy assessment, structural awareness, global wormhole fingerprints, and structural analysis utilities.
|
|
|
|
|
Each router is highly experimental and the structure may change. Consider this the natural extension of the wormhole router structure from geovit-david-beans. |
|
|
|
|
|
# What is a geofractal router? |
|
|
A router, in the standard sense, sits in a network topology that allows multiple devices to communicate rapidly through a unified structure.
|
|
|
|
|
A GEOFRACTAL router directly leverages PyTorch utilities in an attempt to provide a fully request-oriented mailbox-and-response structure for collectives to interface with.
|
|
|
|
|
The first variations will be feed-forward: mail comes in, and the fusion happens downstream. Extensions will generalize this once the system is solid.
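A minimal sketch of what such a feed-forward mailbox could look like. The `Message` and `Mailbox` names and the mean-pooled fusion are assumptions for illustration, not the repo's actual implementation: producers post tensors keyed by a fingerprint address, and a downstream step drains the box once per forward pass.

```python
from dataclasses import dataclass, field
import torch

@dataclass
class Message:
    fingerprint: int          # sender's address in the global registry (hypothetical)
    payload: torch.Tensor     # feature to be fused downstream

@dataclass
class Mailbox:
    inbox: list = field(default_factory=list)

    def post(self, msg: Message):
        """Producers append mail; no response path in the feed-forward variant."""
        self.inbox.append(msg)

    def drain(self):
        """Downstream fusion takes all pending mail and empties the box."""
        msgs, self.inbox = self.inbox, []
        return msgs

box = Mailbox()
box.post(Message(fingerprint=3, payload=torch.randn(4, 256)))
box.post(Message(fingerprint=7, payload=torch.randn(4, 256)))

# Feed-forward: mail came in, fusion happens downstream (mean pooling here).
fused = torch.stack([m.payload for m in box.drain()]).mean(dim=0)
```

The one-way `post`/`drain` flow mirrors the feed-forward constraint described above; a response structure would extend `Mailbox` rather than change this path.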
|
|
|
|
|
# What is the target goal? |
|
|
A fingerprint-centric collective coordination structure that can be rapidly learned, enhanced, predicted, and expanded upon.
|
|
|
|
|
# Why? |
|
|
The larger a network becomes, the slower it becomes at transferring information from A to B. This is a natural extension that mitigates the slowdown and provides reusable learning based on cantor fingerprinting.
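The document doesn't define the cantor fingerprinting scheme, so the following is purely an assumption: one plausible reading is the classic Cantor pairing function, which maps any pair of non-negative integers (say, a layer id and a component id) to a unique, reversible address.

```python
def cantor_pair(a: int, b: int) -> int:
    """Bijective map N x N -> N: a unique, invertible address for (a, b)."""
    return (a + b) * (a + b + 1) // 2 + b

def cantor_unpair(z: int) -> tuple:
    """Invert the pairing to recover (a, b) from an address."""
    w = int(((8 * z + 1) ** 0.5 - 1) // 2)  # triangular-root of z
    t = w * (w + 1) // 2
    b = z - t
    return w - b, b

# Hypothetical usage: component 5 of layer 3 gets one stable global address.
addr = cantor_pair(3, 5)
assert cantor_unpair(addr) == (3, 5)
```

Because the map is bijective, addresses never collide and the sender can always be recovered, which is the property a global anchor registry would need.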
|
|
|
|
|
Experiments show that when collectives begin with geofractal designs, they orient along those constraints with independent losses, objectives, and applied offsets.
|
|
|
|
|
# Hypothesis |
|
|
A centralized routing hub for all collective representations will allow a more cohesive delegation vote between all collectives. |
|
|
This will enable a more organized and coordinated fusion between many divergent structures with easily extensible progressions from the fusion route. |
|
|
|
|
|
This router process is currently unproven. The fusion is touchy as-is, but the most recent experiments show that a converged fusion is rock solid. |
|
|
They have trouble unlearning crystalline structures, which means, as earlier david experiments showed, they can rapidly converge to what MAY be incorrect relational behavior.
|
|
|
|
|
My hypothesis is that a more centralized weighting, with more potential routing options, will allow rapid expansion in a more organized fashion.
|
|
|
|
|
# Potential upsides |
|
|
Faster collectives, more rapid experiments, easier-to-use extensions, a global anchor registry (aka the cantor fingerprint address), and a few other benefits.
|
|
|
|
|
Attaching models that are entirely external to the structure becomes far easier than setting up entire hook/extraction systems.
|
|
|
|
|
# Potential downsides |
|
|
Added overhead from the learning mechanisms. |
|
|
|
|
|
# Citation |
|
|
|
|
|
Author: AbstractPhil + Claude Opus 4.5 |
|
|
|
|
|
|
|
|
License: Apache 2.0 |
|
|
|
|
|
```bibtex
@software{globalfractalrouter2025,
  author = {AbstractPhil},
  title  = {GlobalFractalRouter: Collective Intelligence through Geometric Routing},
  year   = {2025},
  url    = {https://github.com/AbstractPhil/geofractal}
}
```