MTG-GloVe-Commander: Card Embeddings

MTG-GloVe-Commander is a 128-dimensional GloVe embedding model trained on Magic: The Gathering Commander decklists. The training corpus covers more than 5M co-occurring card pairs drawn from Commander deck construction across varying sets and metagames, yielding a vocabulary of more than 15,000 cards (see config.json).

Because the model was trained with a weighted least-squares objective and high-frequency subsampling, it captures linear substructure in the embedding space: you can perform algebraic operations on card concepts with moderately high accuracy. For example:
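For reference, the standard GloVe weighted least-squares objective over a co-occurrence matrix X (here, card-pair counts) is shown below; the model card does not state any deviations from this formulation, so take it as the assumed baseline. w_i and w̃_j are the center and context vectors that the usage snippet later averages, and f downweights very frequent pairs:

```latex
J = \sum_{i,j=1}^{V} f(X_{ij}) \left( w_i^\top \tilde{w}_j + b_i + \tilde{b}_j - \log X_{ij} \right)^2,
\qquad
f(x) = \begin{cases} (x / x_{\max})^{\alpha} & x < x_{\max} \\ 1 & \text{otherwise} \end{cases}
```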

Elvish Archdruid - Llanowar Elves + Gravecrawler:
  1. Undead Warchief (cos=0.6708)
  2. Diregraf Colossus (cos=0.6639)
  3. Noxious Ghoul (cos=0.6519)
  4. Relentless Dead (cos=0.6474)
  5. Graveborn Muse (cos=0.6454)
Wrath of God - Plains + Swamp:
  1. Damnation (cos=0.8086)
  2. Urborg, Tomb of Yawgmoth (cos=0.7406)
  3. Phyrexian Arena (cos=0.7333)
  4. Cabal Coffers (cos=0.7255)
  5. Diabolic Tutor (cos=0.7153)
Swords to Plowshares - Plains + Mountain:
  1. Blasphemous Act (cos=0.7990)
  2. Chaos Warp (cos=0.7897)
  3. Vandalblast (cos=0.7818)
  4. Deflecting Swat (cos=0.7404)
  5. Abrade (cos=0.7391)
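The analogy queries above reduce to vector arithmetic plus a cosine-similarity ranking over the embedding matrix. The sketch below uses a hypothetical toy vocabulary and random vectors in place of the trained weights, so the names and scores it prints are illustrative only:

```python
import torch
import torch.nn.functional as F

# Toy stand-ins for the real vocabulary and 128-dim embedding matrix.
vocab = ["Elvish Archdruid", "Llanowar Elves", "Gravecrawler",
         "Undead Warchief", "Diregraf Colossus", "Sol Ring"]
emb = F.normalize(torch.randn(len(vocab), 128), dim=-1)

def analogy(a, b, c, top_k=3):
    """Rank cards by cosine similarity to vec(a) - vec(b) + vec(c)."""
    idx = {name: i for i, name in enumerate(vocab)}
    query = emb[idx[a]] - emb[idx[b]] + emb[idx[c]]
    sims = F.cosine_similarity(query.unsqueeze(0), emb, dim=-1)
    sims[[idx[a], idx[b], idx[c]]] = -1.0  # exclude the query cards themselves
    scores, order = sims.topk(top_k)
    return [(vocab[i], round(s.item(), 4)) for s, i in zip(scores, order)]

print(analogy("Elvish Archdruid", "Llanowar Elves", "Gravecrawler"))
```

Masking the three query cards before ranking matters: the raw query vector is usually closest to its own inputs, which is uninformative.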

Interactive Demo

An interactive demo of the embedding space is available as a Hugging Face Space.

Usage

import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

# Load model and tokenizer
model_id = "nishtahir/mtg-glove-embedding-commander"
model = AutoModel.from_pretrained(model_id, trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)

def get_vector(card_name):
    card_id = tokenizer.convert_tokens_to_ids(card_name)
    if card_id == tokenizer.unk_token_id:
        raise ValueError(f"Card '{card_name}' not found in vocabulary.")
    
    with torch.no_grad():
        return model.get_combined_embeddings()[card_id]  # Uses (Center + Context) / 2

v_sol_ring = get_vector("Sol Ring")
print(f"Vector shape: {v_sol_ring.shape}") # torch.Size([128])
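With vectors in hand, comparing two cards is a plain cosine similarity. A minimal sketch, using random stand-in vectors (in practice, substitute the output of `get_vector` above; "Arcane Signet" is just an illustrative second card):

```python
import torch
import torch.nn.functional as F

# Stand-in 128-dim vectors; replace with real get_vector(...) outputs.
v_sol_ring = torch.randn(128)
v_arcane_signet = torch.randn(128)

sim = F.cosine_similarity(v_sol_ring, v_arcane_signet, dim=0)
print(f"cosine similarity: {sim.item():.4f}")
```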

Limitations

  • Commander bias: vectors are optimized for Commander. A card that is strong in Modern but weak in Commander (e.g., Ragavan, Nimble Pilferer) will have a vector reflecting its Commander usage rather than its raw power level.
  • No rules text: the model does not read card text. Vectors encode only co-occurrence in decklists, not card mechanics.
  • New sets: cards released after the training snapshot have no deck data and will not be in the vocabulary.
Model Details

  • Size: 4.87M parameters
  • Tensor type: F32 (Safetensors)