---
library_name: transformers
pipeline_tag: text-generation
tags:
- unsloth
---
# Model Card for LocAgent
This model is described in the paper [LocAgent: Graph-Guided LLM Agents for Code Localization](https://huggingface.co/papers/2503.09089). LocAgent uses a graph-based code representation to enable LLMs to perform accurate code localization, significantly improving accuracy compared to existing methods. Notably, the fine-tuned Qwen-2.5-Coder-Instruct-32B model achieves near state-of-the-art performance with a substantial cost reduction.
Code: https://github.com/gersteinlab/LocAgent
## How to Use
LocAgent involves two main steps: graph indexing and code localization.
**1. Graph Indexing (Optional but Recommended):** For efficient batch processing, pre-generate graph indexes for your codebase using `dependency_graph/batch_build_graph.py`. This script parses the codebase into a graph representation. See the GitHub README for detailed command-line arguments and setup instructions. Example:
```bash
python dependency_graph/batch_build_graph.py \
    --dataset 'czlll/Loc-Bench' \
    --split 'test' \
    --num_processes 50 \
    --download_repo
```
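To give an intuition for what the index contains, here is a toy, stdlib-only sketch of a code graph: entities (files, classes, functions) as nodes and labeled relations as edges, which an agent can traverse to localize code. The entity names and relation labels below are illustrative assumptions, not the repo's exact schema.

```python
# Toy code graph: (source, relation, target) triples over code entities.
# Entity names and relation labels are hypothetical, for illustration only.
edges = [
    ("repo/utils.py", "contains", "repo/utils.py::Parser"),
    ("repo/utils.py::Parser", "contains", "repo/utils.py::Parser.parse"),
    ("repo/main.py::run", "invokes", "repo/utils.py::Parser.parse"),
]

def neighbors(node, relation, reverse=False):
    """Follow edges of one relation type out of (or, with reverse=True, into) a node."""
    if reverse:
        return [src for src, rel, dst in edges if dst == node and rel == relation]
    return [dst for src, rel, dst in edges if src == node and rel == relation]

# Example traversal: which functions invoke Parser.parse?
print(neighbors("repo/utils.py::Parser.parse", "invokes", reverse=True))
# → ['repo/main.py::run']
```

Multi-hop traversals of this kind (file → class → function, or callee → caller) are what let an LLM agent narrow a whole repository down to a handful of candidate entities.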
**2. Code Localization:** Use `auto_search_main.py` to perform code localization. This script prompts an LLM to search the pre-generated graph indexes and locate the code entities relevant to an issue. See the GitHub README for detailed command-line arguments and environment variable setup; `$result_path` below is a shell variable pointing to your output directory. Example:
```bash
python auto_search_main.py \
    --dataset 'czlll/SWE-bench_Lite' \
    --split 'test' \
    --model 'azure/gpt-4o' \
    --localize \
    --merge \
    --output_folder $result_path/location \
    --eval_n_limit 300 \
    --num_processes 50 \
    --use_function_calling \
    --simple_desc
```
**3. Evaluation:** After localization, evaluate the results using `evaluation.eval_metric.evaluate_results`. An example Jupyter Notebook is provided in `evaluation/run_evaluation.ipynb`.
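As a rough illustration of the kind of metric such an evaluation computes, the sketch below scores file-level localization accuracy on toy data: the fraction of instances where every ground-truth file appears among the predicted locations. This is a hypothetical stand-in, not the actual signature or logic of `evaluate_results`; see the notebook for the real interface.

```python
# Hypothetical metric sketch (NOT the repo's evaluate_results):
# fraction of instances whose ground-truth files are all among the predictions.
def file_level_acc(predictions, ground_truth):
    hits = 0
    for inst_id, gold_files in ground_truth.items():
        pred_files = set(predictions.get(inst_id, []))
        if set(gold_files) <= pred_files:  # all gold files were localized
            hits += 1
    return hits / len(ground_truth)

# Toy example with made-up instance IDs and file paths.
preds = {"astropy-123": ["astropy/io/fits.py", "astropy/utils/misc.py"],
         "django-456": ["django/forms/models.py"]}
gold = {"astropy-123": ["astropy/io/fits.py"],
        "django-456": ["django/db/models/base.py"]}
print(file_level_acc(preds, gold))  # → 0.5 (astropy hit, django missed)
```

Localization is typically scored at several granularities (file, module/class, function), so the real evaluation reports more than a single number.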
## Citation
```bibtex
@article{chen2025locagent,
    title={LocAgent: Graph-Guided LLM Agents for Code Localization},
    author={Chen, Zhaoling and Tang, Xiangru and Deng, Gangda and Wu, Fang and Wu, Jialong and Jiang, Zhiwei and Prasanna, Viktor and Cohan, Arman and Wang, Xingyao},
    journal={arXiv preprint arXiv:2503.09089},
    year={2025}
}
```