---
license: apache-2.0
task_categories:
- table-question-answering
language:
- en
---

# CataTQA: A Benchmark for Tool-Augmented LLM Question Answering over Heterogeneous Catalysis Tables

Despite their success in general question answering, large language models (LLMs) struggle with hallucinations and inaccurate reasoning in scientific domains.
A major challenge stems from experimental data, which are often stored in external sources such as supplementary materials and domain-specific databases. These tables are large, heterogeneous, and semantically complex, making them difficult for LLMs to interpret.
While external tools show promise, current benchmarks fail to assess LLMs' ability to navigate such data, particularly locating relevant tables, retrieving key columns, interpreting experimental conditions, and invoking tools.
To address this gap, we introduce CataTQA, a new benchmark for catalytic materials. CataTQA features an automated dataset-construction framework and four auxiliary tools. We evaluate tool-augmented LLMs across five dimensions, namely table location, column retrieval, condition analysis, tool calling, and question answering, identifying their strengths and weaknesses.
Our work sets a new benchmark for evaluating LLMs in scientific fields and paves the way for future advances. All data and code are publicly available on GitHub.

## Dataset Field Description

- **question**: The natural-language table question.
- **refer_dataset**: The reference table from which the question and answer were generated.
- **column names**: The column names used to generate the question.
- **condition_column**: The column names whose values fill in the question's conditions.
- **answer_column**: The column name(s) containing the answer.
- **condition**: The conditions contained in the question.
- **answer**: The answer to the question.
- **tool**: The tool used to answer the question.
- **level**: The difficulty level of the question.
- **question description**: A description of the question type.
- **refer_template**: The question template.

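The fields fit together through the tool named in `tool`. As an illustration, here is a minimal sketch of what a `search_value`-style lookup could look like; the function body, the toy table, and the string-based comparison are assumptions for illustration, not the benchmark's actual implementation:

```python
# Hypothetical sketch of a "search_value" tool: given a table (a list of
# row dicts), the question's condition, and the answer column(s), return
# the cells of every row that satisfies the condition.
def search_value(table, condition, answer_columns):
    results = []
    for row in table:
        # A row matches when every condition column equals the given value
        # (compared as strings, since conditions are stored as text).
        if all(str(row.get(col)) == str(val) for col, val in condition.items()):
            results.append({col: row[col] for col in answer_columns})
    return results

# Toy stand-in for the reference table ("table67" in the example below).
table67 = [
    {"material_id": "2dm-6", "energy_per_atom": "-4.093124536666667"},
    {"material_id": "2dm-7", "energy_per_atom": "-3.21"},
]

print(search_value(table67, {"energy_per_atom": "-4.093124536666667"}, ["material_id"]))
# -> [{'material_id': '2dm-6'}]
```

Here `condition` corresponds to the record's **condition** field and `answer_columns` to its **answer_column** field.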
## Example

```json
{
  "question": "Identify the material ID linked to a total energy per atom of -4.093124536666667.",
  "refer_dataset": "table67",
  "column names": ["energy_per_atom", "material_id"],
  "condition_column": ["energy_per_atom"],
  "answer_column": ["material_id"],
  "condition": {"energy_per_atom": "-4.093124536666667"},
  "tool": "search_value",
  "answer": {"material_id": "2dm-6"},
  "level": "simple",
  "question description": "In a tabular data structure, locate the cells that meet the requirements.",
  "refer_template": "Identify the material ID linked to a total energy per atom of {}."
}
```