---
base_model:
  - Qwen/Qwen3-32B
license: apache-2.0
library_name: transformers
pipeline_tag: text-generation
---

# Goedel-Formalizer-V2-32B

This repository contains the **Goedel-Formalizer-V2-32B** model, a key component of the automated theorem proving system presented in the paper [Gödel's Poetry](https://arxiv.org/abs/2512.14252).

Gödel's Poetry employs specialized language models for Lean 4 proof generation, combined with recursive decomposition of difficult theorems. This formalizer model translates informal math problems into formal Lean 4 statements.

- Project page: https://KellyJDavis.github.io/goedels-poetry/
- Code: https://github.com/KellyJDavis/goedels-poetry
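To make the task concrete, here is an illustrative, hand-written example (not actual model output) of the kind of Lean 4 statement the formalizer is expected to produce, assuming a Mathlib environment:

```lean
import Mathlib

-- Illustrative target for the informal statement
-- "Prove that 3 cannot be written as the sum of two cubes."
theorem test_problem : ¬ ∃ a b : ℤ, a ^ 3 + b ^ 3 = 3 := by
  sorry
```

The formalizer produces the statement only; proving it (replacing `sorry`) is handled by the prover components of the system.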

## Model Description

Unlike previous open-source formalizers, this model thinks before generating the Lean 4 statement, which significantly increases translation accuracy. In our internal evaluation on a set of 300 informal statements from Omni-MATH, Kimina-formalizer-8B successfully translates 161/300 statements, while Goedel-Formalizer-V2-32B succeeds on 226.

| Model | Formalization Success Rate |
|---|---|
| Kimina-formalizer-8B | 161/300 |
| Goedel-Formalizer-V2-32B | 226/300 |
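For reference, the counts above correspond to the following percentages (simple arithmetic, not reported in the card):

```python
# Success rates implied by the evaluation table above (300 statements total).
results = {"Kimina-formalizer-8B": 161, "Goedel-Formalizer-V2-32B": 226}
total = 300
for name, passed in results.items():
    print(f"{name}: {passed}/{total} = {passed / total:.1%}")
# Kimina-formalizer-8B: 161/300 = 53.7%
# Goedel-Formalizer-V2-32B: 226/300 = 75.3%
```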

Here is example code for using this formalizer:

````python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch
import re
import time

torch.manual_seed(30)

model_id = "Goedel-LM/Goedel-Formalizer-V2-32B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, device_map="auto", torch_dtype=torch.bfloat16, trust_remote_code=True
)

problem_name = "test_problem"
informal_statement_content = "Prove that 3 cannot be written as the sum of two cubes."

# Construct the prompt for the model
user_prompt_content = (
    f"Please autoformalize the following natural language problem statement in Lean 4. "
    f"Use the following theorem name: {problem_name}\n"
    f"The natural language statement is: \n"
    f"{informal_statement_content}\n"
    f"Think before you provide the lean statement."
)

chat = [
    {"role": "user", "content": user_prompt_content},
]

inputs = tokenizer.apply_chat_template(
    chat, tokenize=True, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

start = time.time()
outputs = model.generate(
    inputs, max_new_tokens=16384, temperature=0.9, do_sample=True, top_k=20, top_p=0.95
)
model_output_text = tokenizer.batch_decode(outputs)[0]

def extract_code(text_input):
    """Extract the last Lean 4 code block from the model's output."""
    try:
        matches = re.findall(r'```lean4\n(.*?)\n```', text_input, re.DOTALL)
        return matches[-1].strip() if matches else "No Lean 4 code block found."
    except Exception:
        return "Error during code extraction."

extracted_code = extract_code(model_output_text)

print(f"Generation time: {time.time() - start:.1f}s")
print("output:\n", model_output_text)
print("lean4 statement:\n", extracted_code)
````
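The regex-based `extract_code` helper can be sanity-checked on a mock completion without loading the model. This is an illustrative check (the `sample` string is invented); the fence delimiter is built programmatically here only to avoid nesting literal backtick fences in this card:

```python
import re

FENCE = "`" * 3  # the ```-style delimiter the model uses around Lean 4 code

def extract_code(text_input):
    """Return the last lean4 code block in the model output, if any."""
    pattern = FENCE + r'lean4\n(.*?)\n' + FENCE
    matches = re.findall(pattern, text_input, re.DOTALL)
    return matches[-1].strip() if matches else "No Lean 4 code block found."

# Mock model output: some reasoning followed by a fenced lean4 block.
sample = (
    "Some chain-of-thought...\n"
    + FENCE + "lean4\n"
    + "theorem t : 1 + 1 = 2 := by rfl\n"
    + FENCE + "\nDone."
)
print(extract_code(sample))  # → theorem t : 1 + 1 = 2 := by rfl
```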

## Citation

If you use Gödel's Poetry in your research, please cite:

```bibtex
@misc{davis2025godelspoetry,
      title={G\"odel's Poetry},
      author={Kelly J. Davis},
      year={2025},
      eprint={2512.14252},
      archivePrefix={arXiv},
      primaryClass={cs.AI},
      url={https://arxiv.org/abs/2512.14252},
}
```