ttw committed on
Commit 0c0e674 · verified · Parent(s): 135f561

Update README.md

Files changed (1): README.md (+2 −2)
README.md CHANGED
@@ -8,11 +8,11 @@ license: mit
 ---
 
 # Diff Interpretation Tuning
-[Paper](https://arxiv.org/abs/2510.05092) | [Code](https://github.com/Aviously/diff-interpretation-tuning) | [Demo Notebook](https://colab.research.google.com/drive/12YD_9GRT-y_hFOBqXzyI4eN_lJGKiXwN?usp=sharing#forceEdit=true&sandboxMode=true)
-
 This organization hosts the weight diffs, DIT adapters, and finetuning data used in the paper [Learning to Interpret Weight Differences in Language Models (Goel et al. 2025)](https://arxiv.org/abs/2510.05092).
 The paper introduces *Diff Interpretation Tuning*, a method that trains a LoRA adapter that can be applied to a model to get it to describe its own finetuning-induced modifications.
 
+[Paper](https://arxiv.org/abs/2510.05092) | [Code](https://github.com/Aviously/diff-interpretation-tuning) | [Demo Notebook](https://colab.research.google.com/drive/12YD_9GRT-y_hFOBqXzyI4eN_lJGKiXwN?usp=sharing#forceEdit=true&sandboxMode=true)
+
 <p align="center">
 <img src="https://cdn-uploads.huggingface.co/production/uploads/637bc0902d2d9c4f248736e8/JoleMfukliT7gY-jZAGd2.png" alt="Teaser image for Diff Interpretation Tuning" width="650"/>
 </p>
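For readers unfamiliar with the term, the "weight diffs" this organization hosts are, conceptually, per-parameter differences between a finetuned checkpoint and its base model. A minimal sketch of that idea in plain Python (the function name and list-based tensors are illustrative assumptions, not the repo's actual storage format, which would hold real tensor checkpoints):

```python
def compute_weight_diff(base_state, finetuned_state):
    """Return {param_name: finetuned - base} for every shared parameter.

    Conceptual sketch only: real checkpoints store tensors per layer;
    here each "tensor" is a flat list of floats.
    """
    return {
        name: [f - b for f, b in zip(finetuned_state[name], base_state[name])]
        for name in base_state
        if name in finetuned_state
    }


# Toy base and finetuned "checkpoints" with a single parameter tensor.
base = {"layer.weight": [0.10, 0.20, 0.30]}
finetuned = {"layer.weight": [0.15, 0.20, 0.25]}

# The diff captures exactly what finetuning changed, elementwise.
diff = compute_weight_diff(base, finetuned)
```

In the paper's setting, a diff like this (applied back onto the base model, together with the DIT adapter) is what the model is trained to describe in natural language.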