Update README.md
README.md
````diff
@@ -2,10 +2,16 @@
 license: mit
 library_name: transformers
 base_model: Qwen/Qwen3-Reranker-0.6B
-pipeline_tag:
+pipeline_tag: token-classification
 tags:
 - code
 - context-pruning
+- agent
+datasets:
+- Raymone023/SWE-QA-Benchmark
+metrics:
+- f1
+- mse
 ---
 
 # SWE-Pruner: Self-Adaptive Context Pruning for Coding Agents
@@ -20,6 +26,11 @@ Inspired by how human programmers selectively skim code, SWE-Pruner enables agen
 
 Evaluations across benchmarks show that SWE-Pruner achieves 23-54% token reduction on agent tasks like SWE-Bench Verified and up to 14.84x compression on single-turn tasks like LongCodeQA with minimal performance impact.
 
+## Model Usage
+Because we have made significant modifications to the model, its dual-head architecture and the accompanying compression-head serving code are fairly complex.
+Therefore, we recommend that you use the version we have released on [GitHub](https://github.com/Ayanami1314/swe-pruner) instead of attempting to use the original model on your own.
+
+
 ## Citation
 If you find SWE-Pruner useful in your research, please cite:
 ```bibtex
````
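For illustration only, a minimal sketch of the path the card recommends: the excerpt above does not show the SWE-Pruner checkpoint's own Hub repo id, so the snippet below merely pre-fetches the declared base model (`Qwen/Qwen3-Reranker-0.6B`) with `huggingface_hub` and defers actual pruner setup to the GitHub repository linked in the card. This is an editorial sketch, not the authors' documented workflow.

```python
# Hedged sketch, not part of the commit above.
# Assumption: only the base model id (Qwen/Qwen3-Reranker-0.6B) is named in this
# excerpt, so we download that as an example of fetching weights locally.
from huggingface_hub import snapshot_download

# Download the base reranker checkpoint to the local Hugging Face cache.
local_dir = snapshot_download(repo_id="Qwen/Qwen3-Reranker-0.6B")
print("Base checkpoint downloaded to:", local_dir)

# Per the card's recommendation, the dual-head SWE-Pruner itself should be run
# via the released code rather than stock transformers:
#   git clone https://github.com/Ayanami1314/swe-pruner
# and follow that repository's README for installation and serving.
```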