---
license: mit
library_name: diff-interpretation-tuning
base_model:
  - Qwen/Qwen3-4B
base_model_relation: adapter
datasets:
  - diff-interpretation-tuning/finetuning-data
---

# Diff Interpretation Tuning: Weight Diffs and Adapters

This repository contains the weight diffs and DIT adapters used in the paper [Learning to Interpret Weight Differences in Language Models](https://arxiv.org/abs/2510.05092) (Goel et al., 2025). The paper introduces Diff Interpretation Tuning (DIT), a method that trains a LoRA adapter that can be applied to a model to get it to describe its own finetuning-induced modifications.

To play around with the weight diffs and DIT adapters from the paper, please check out our Google Colab demo notebook. This notebook shows how to load the weight diffs and adapters from this repo.
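For orientation, here is a minimal sketch of what loading an adapter can look like. It assumes the adapters are stored in standard PEFT/LoRA format and that this repo's id is `diff-interpretation-tuning/loras`; the subfolder path and prompt are illustrative, and the demo notebook remains the authoritative reference:

```python
# Minimal sketch; see the demo notebook for the authoritative loading code.
# Assumptions: adapters are in standard PEFT/LoRA format, the repo id is
# "diff-interpretation-tuning/loras", and the subfolder path is illustrative.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained("Qwen/Qwen3-4B")
tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen3-4B")

# Attach a DIT adapter from one of the <experiment>/<model> folders.
model = PeftModel.from_pretrained(
    base,
    "diff-interpretation-tuning/loras",
    subfolder="hidden-topic/qwen3-4b",  # illustrative path
)

# Ask the adapted model about its modifications (hypothetical prompt).
inputs = tokenizer("Describe how you were finetuned.", return_tensors="pt")
out = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```

Note that in the full pipeline a finetuning weight diff is applied to the base model before the DIT adapter is attached; the sketch above omits that step.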

The code used to train and evaluate our weight diffs and DIT adapters can be found at github.com/Aviously/diff-interpretation-tuning. Some of the large data files used for training can be found at hf.co/datasets/diff-interpretation-tuning/finetuning-data.

## Method overview

A diagrammatic overview of Diff Interpretation Tuning is shown below:

*[Figure: Diagram of Diff Interpretation Tuning]*
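In code terms, the first step of the diagram, producing a finetuned model from the base model, amounts to adding a weight diff onto the base parameters. A minimal sketch, assuming a diff stored as a plain state dict of per-parameter deltas (the actual on-disk format is described in the demo notebook):

```python
import torch

@torch.no_grad()
def apply_weight_diff(model, diff):
    """Add each delta tensor onto the matching base parameter, in place.

    `diff` is assumed to map parameter names to delta tensors; parameters
    not mentioned in the diff are left untouched.
    """
    params = dict(model.named_parameters())
    for name, delta in diff.items():
        params[name].add_(delta.to(device=params[name].device,
                                   dtype=params[name].dtype))
    return model
```

The DIT adapter is then applied on top of the diffed model, and the model is prompted to describe the modification the diff induced.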

## Repository structure

All weight diffs and DIT adapters in the repository live under a specific `<experiment>/<model>` folder (e.g. `hidden-topic/qwen3-4b`). Please consult the paper to understand what each experiment refers to.

Under each `<experiment>/<model>` folder there are three potential types of files. Please consult the demo notebook for details on how to load and use these files.
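As a sketch, an individual file can be fetched from one of these folders with the Hugging Face Hub client. Both the repo id and the filename below are assumptions (a standard PEFT layout); the demo notebook lists the actual filenames:

```python
from huggingface_hub import hf_hub_download

# Repo id and filename are illustrative assumptions; consult the demo
# notebook for the real file layout under <experiment>/<model>.
local_path = hf_hub_download(
    repo_id="diff-interpretation-tuning/loras",
    filename="hidden-topic/qwen3-4b/adapter_model.safetensors",
)
print(local_path)
```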

## Citing our work

You can cite our work using the following BibTeX:

```bibtex
@misc{goel2025learninginterpretweightdifferences,
      title={Learning to Interpret Weight Differences in Language Models},
      author={Avichal Goel and Yoon Kim and Nir Shavit and Tony T. Wang},
      year={2025},
      eprint={2510.05092},
      archivePrefix={arXiv},
      url={https://arxiv.org/abs/2510.05092},
}
```