Instructions for using diff-interpretation-tuning/loras with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Diff Interpretation Tuning
How to use diff-interpretation-tuning/loras with Diff Interpretation Tuning:
```
# No code snippets available yet for this library.
# To use this model, check the repository files and the library's documentation.
# Want to help? PRs adding snippets are welcome at:
# https://github.com/huggingface/huggingface.js
```
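Since no library snippet is available yet, here is a minimal NumPy sketch of the core operation a LoRA adapter performs: adding a low-rank weight diff `(alpha / r) * B @ A` to a frozen base weight matrix. The shapes, rank, and scaling value below are illustrative assumptions, not taken from this repository; real DIT adapters target specific transformer layers.

```python
import numpy as np

# Hypothetical dimensions for illustration only.
d_out, d_in, r = 8, 6, 2   # output dim, input dim, LoRA rank
alpha = 4.0                # LoRA scaling hyperparameter

rng = np.random.default_rng(0)
W = rng.normal(size=(d_out, d_in))      # frozen base weight
A = rng.normal(size=(r, d_in)) * 0.01   # LoRA "down" projection
B = np.zeros((d_out, r))                # LoRA "up" projection (zero-initialized)

# The adapter's weight diff: delta_W = (alpha / r) * B @ A
delta_W = (alpha / r) * (B @ A)
W_merged = W + delta_W

# With B zero-initialized (the standard LoRA init), the diff is zero,
# so the merged weight initially matches the base weight exactly.
assert np.allclose(W_merged, W)
```

Merging the diff into the base weight like this is what allows a trained adapter to be applied (or removed) without changing the model architecture.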
- Notebooks
- Google Colab
- Kaggle
Adds link to demo notebook
README.md CHANGED:

```diff
@@ -1,3 +1,5 @@
 ---
 license: mit
 ---
+
+In order to play around with the weight diffs and DIT adapters, please check out the [Google Colab demo notebook](https://colab.research.google.com/drive/12YD_9GRT-y_hFOBqXzyI4eN_lJGKiXwN?usp=sharing).
```