MetaOthello: Model Checkpoints & Probes

This repository contains pretrained model checkpoints and linear probes for *MetaOthello: A Controlled Study of Multiple World Models in Transformers*.

Paper: Pre-Print
Code: github.com/aviralchawla/metaothello

MetaOthello is a controlled framework for studying how transformers organize multiple, potentially conflicting "world models" within a shared representation space. We define Othello variants that share an 8×8 board and vocabulary but differ in update rules or tokenization, then train small GPTs on mixed-variant data and analyze their internal representations.
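As a rough illustration of the probing setup, the sketch below fits one linear probe per board square to map residual-stream activations onto per-square states. All dimensions, array names, and the random stand-in data are assumptions for illustration; in practice the activations would come from running one of the released checkpoints on game transcripts, and probe training details follow the paper, not this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: model width of a small GPT, 64 squares on
# the 8x8 board, 3 states per square (empty / black / white).
d_model, n_squares, n_states = 128, 64, 3
n_examples = 1000

# Stand-ins for residual-stream activations and board-state labels.
acts = rng.normal(size=(n_examples, d_model))
labels = rng.integers(0, n_states, size=(n_examples, n_squares))

# One linear probe per square: a least-squares fit of activations to
# one-hot board-state targets, all squares solved in a single lstsq call.
onehot = np.eye(n_states)[labels]                 # (1000, 64, 3)
targets = onehot.reshape(n_examples, -1)          # (1000, 192)
W, *_ = np.linalg.lstsq(acts, targets, rcond=None)

# Probe predictions: argmax over the 3 states for each square.
preds = (acts @ W).reshape(-1, n_squares, n_states).argmax(-1)
accuracy = (preds == labels).mean()
```

With real activations, per-square probe accuracy is the usual readout of how linearly decodable each variant's board state is from the shared representation space.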
