---
title: MoireFormer 138M Chat
emoji: 🌊
colorFrom: blue
colorTo: indigo
sdk: gradio
app_file: app.py
pinned: false
license: mit
---

# MoireFormer (137.9M Proof-of-Concept)

This is a scaled-up proof-of-concept MoireFormer that replaces standard QKV dot-product attention with Moiré phase-interference wave mechanics to route information between tokens.

Instead of computing dot-product similarity scores, the model splits each token embedding into an amplitude component and a phase component, letting context emerge through constructive and destructive wave resonance. Trained this way, it successfully learns grammar, conversational formatting, and multilingual text.
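The amplitude/phase idea can be illustrated with a minimal single-head NumPy sketch. This is an assumption-laden toy, not the repository's actual code: the half/half embedding split, the `tanh` phase squashing, and using the raw embeddings as values are all illustrative choices.

```python
import numpy as np

def phase_interference_attention(x):
    """Toy sketch of phase-interference routing (NOT the real MoireFormer).

    x: (seq_len, dim) token embeddings, dim even. The embedding is split
    into an amplitude half and a phase half; the pairwise score
    sum_k amp_i[k] * cos(phi_i[k] - phi_j[k]) stands in for the QKV
    dot-product: aligned phases interfere constructively (high score),
    opposing phases destructively (low score).
    """
    seq_len, dim = x.shape
    half = dim // 2
    amp = np.abs(x[:, :half])           # amplitude component (non-negative)
    phi = np.pi * np.tanh(x[:, half:])  # phase component, squashed to (-pi, pi)

    # cos(a - b) = cos a cos b + sin a sin b, amplitude-weighted per channel.
    scores = (amp * np.cos(phi)) @ np.cos(phi).T \
           + (amp * np.sin(phi)) @ np.sin(phi).T

    # Causal mask + softmax, as in an ordinary decoder block.
    mask = np.tril(np.ones((seq_len, seq_len), dtype=bool))
    scores = np.where(mask, scores, -np.inf)
    scores -= scores.max(axis=-1, keepdims=True)
    weights = np.exp(scores) / np.exp(scores).sum(axis=-1, keepdims=True)
    return weights @ x                  # resonance-weighted mixture of tokens
```

Because of the causal mask, the first token can only attend to itself, so its output equals its input embedding, just as in standard masked self-attention.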

- **GitHub code:** https://github.com/anttiluode/MoireFormer
- **Theory:** https://github.com/anttiluode/Geometric-Neuron


## Model Details

- **Architecture:** MoireGPT (custom phase-attention transformer)
- **Parameters:** 137.9M
- **Structure:**
  - 12 layers
  - 12 heads
  - 768 embedding dimension

**Note:** This is a proof-of-substrate model, not a factual knowledge model. It demonstrates that the biological concept of phase-coupling can serve as a foundation for deep learning.
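As a quick sanity check on the structure listed above, a minimal config sketch (the class and field names are hypothetical, not the repository's actual code) confirms the per-head dimension divides evenly:

```python
from dataclasses import dataclass

@dataclass
class MoireGPTConfig:
    # Values from the model card above; the names here are illustrative.
    n_layer: int = 12
    n_head: int = 12
    n_embd: int = 768

    @property
    def head_dim(self) -> int:
        # Each head works on an equal slice of the embedding.
        return self.n_embd // self.n_head

cfg = MoireGPTConfig()
print(cfg.head_dim)  # 768 / 12 = 64 dimensions per head
```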


## How To Run Locally

This model cannot be loaded with the standard Hugging Face `AutoModel` classes, since it relies on a custom architecture.

### 1. Clone the repo

```bash
git clone https://github.com/anttiluode/MoireFormer.git
cd MoireFormer
```

### 2. Install dependencies

```bash
pip install torch transformers datasets
```

### 3. Download weights

Download the `moire_phase2_ep4.pt` file from this repository and place it in the `MoireFormer` folder.
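Because the architecture is custom, the `.pt` file is an ordinary PyTorch checkpoint rather than a Hugging Face model repo. A hedged sketch of inspecting one with `torch.load` (the tensor name below is a stand-in, not the real checkpoint's keys):

```python
import torch

# Create a tiny stand-in checkpoint so this snippet is self-contained;
# in practice, point ckpt_path at moire_phase2_ep4.pt instead.
ckpt_path = "example_ckpt.pt"
torch.save({"tok_emb.weight": torch.zeros(8, 4)}, ckpt_path)

# Load on CPU and count parameters -- a quick way to check you grabbed
# the right file before wiring it into the custom model class.
state = torch.load(ckpt_path, map_location="cpu")
n_params = sum(t.numel() for t in state.values())
print(f"{len(state)} tensors, {n_params:,} parameters")
```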

### 4. Run the chat interface

```bash
python moire_chat5.py --weights moire_phase2_ep4.pt --size xlarge
```