Papers
arxiv:2601.13075

METIS: Mentoring Engine for Thoughtful Inquiry & Solutions

Published on Jan 19 · Submitted by Paras Chopra on Jan 21
Abstract

AI mentor METIS outperforms GPT-5 and Claude Sonnet 4.5 in supporting undergraduate research writing across multiple stages, with higher student scores and improved document-grounded outputs, though challenges remain in tool routing and stage classification.

AI-generated summary

Many students lack access to expert research mentorship. We ask whether an AI mentor can move undergraduates from an idea to a paper. We build METIS, a tool-augmented, stage-aware assistant with literature search, curated guidelines, methodology checks, and memory. We evaluate METIS against GPT-5 and Claude Sonnet 4.5 across six writing stages using LLM-as-a-judge pairwise preferences, student-persona rubrics, short multi-turn tutoring, and evidence/compliance checks. On 90 single-turn prompts, LLM judges preferred METIS to Claude Sonnet 4.5 in 71% and to GPT-5 in 54%. Student scores (clarity/actionability/constraint-fit; 90 prompts x 3 judges) are higher across stages. In multi-turn sessions (five scenarios/agent), METIS yields slightly higher final quality than GPT-5. Gains concentrate in document-grounded stages (D-F), consistent with stage-aware routing and grounding. Failure modes include premature tool routing, shallow grounding, and occasional stage misclassification.
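The abstract reports win rates from LLM-as-a-judge pairwise preferences (e.g., METIS preferred over Claude Sonnet 4.5 on 71% of 90 prompts). The paper's judging prompt and tooling are not shown here; the sketch below only illustrates, under that assumption, how such a win rate is computed from per-prompt winner labels. The function name and the toy label counts are illustrative, not from the paper.

```python
from collections import Counter

def pairwise_win_rate(judgments, system="METIS"):
    """Fraction of pairwise comparisons in which `system` was preferred.

    `judgments` is a list of winner labels, one per prompt, as emitted by
    an LLM judge choosing between two anonymized candidate responses.
    """
    counts = Counter(judgments)
    total = sum(counts.values())
    return counts[system] / total if total else 0.0

# Toy example: 90 single-turn prompts, 64 judged in favor of METIS.
labels = ["METIS"] * 64 + ["Baseline"] * 26
print(f"{pairwise_win_rate(labels):.0%}")  # prints "71%"
```

In practice each comparison would also randomize response order to control for position bias, and ties would need an explicit label; both are omitted here for brevity.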

Community

Students have immense research potential, but there are not enough mentors for them. What if we could design an AI system to mentor them?

We introduce METIS (Mentoring Engine for Thoughtful Inquiry & Solutions), a stage-aware research mentor.

