<link rel="modulepreload" href="/docs/peft/main/en/_app/immutable/chunks/EditOnGithub.958a8a49.js"><!-- HEAD_svelte-u9bgzb_START --><meta name="hf:doc:metadata" content="{&quot;title&quot;:&quot;Adapter injection&quot;,&quot;local&quot;:&quot;adapter-injection&quot;,&quot;sections&quot;:[],&quot;depth&quot;:1}"><!-- HEAD_svelte-u9bgzb_END --> <p></p> <h1 class="relative group"><a id="adapter-injection" class="header-link block pr-1.5 text-lg no-hover:hidden with-hover:absolute with-hover:p-1.5 with-hover:opacity-0 with-hover:group-hover:opacity-100 with-hover:right-full" href="#adapter-injection"><span><svg class="" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" aria-hidden="true" role="img" width="1em" height="1em" preserveAspectRatio="xMidYMid meet" viewBox="0 0 256 256"><path d="M167.594 88.393a8.001 8.001 0 0 1 0 11.314l-67.882 67.882a8 8 0 1 1-11.314-11.315l67.882-67.881a8.003 8.003 0 0 1 11.314 0zm-28.287 84.86l-28.284 28.284a40 40 0 0 1-56.567-56.567l28.284-28.284a8 8 0 0 0-11.315-11.315l-28.284 28.284a56 56 0 0 0 79.196 79.197l28.285-28.285a8 8 0 1 0-11.315-11.314zM212.852 43.14a56.002 56.002 0 0 0-79.196 0l-28.284 28.284a8 8 0 1 0 11.314 11.314l28.284-28.284a40 40 0 0 1 56.568 56.567l-28.285 28.285a8 8 0 0 0 11.315 11.314l28.284-28.284a56.065 56.065 0 0 0 0-79.196z" fill="currentColor"></path></svg></span></a> <span>Adapter injection</span></h1> <p data-svelte-h="svelte-1sptwg7">With PEFT, you can inject trainable adapters into any <code>torch</code> module which allows you to use adapter methods without relying on the modeling classes in PEFT. Currently, PEFT supports injecting <a href="../conceptual_guides/adapter#low-rank-adaptation-lora">LoRA</a>, <a href="../conceptual_guides/adapter#adaptive-low-rank-adaptation-adalora">AdaLoRA</a>, and <a href="../conceptual_guides/ia3">IA3</a> into models because for these adapters, inplace modification of the model is sufficient for finetuning it.</p> <p data-svelte-h="svelte-mj4045">Check the table below to see when you should inject adapters.</p> <table data-svelte-h="svelte-j2tc6m"><thead><tr><th>Pros</th> <th>Cons</th></tr></thead> <tbody><tr><td>the model is modified inplace, keeping all the original attributes and methods</td> <td>manually write the <code>from_pretrained</code> and <code>save_pretrained</code> utility functions from Hugging Face to save and load adapters</td></tr> <tr><td>works for any <code>torch</code> module and modality</td> <td>doesn’t work with any of the utility methods provided by <code>PeftModel</code> such as disabling and merging adapters</td></tr></tbody></table> <p data-svelte-h="svelte-1g0ub5c">To perform the adapter injection, use the <a href="/docs/peft/main/en/package_reference/peft_model#peft.inject_adapter_in_model">inject_adapter_in_model()</a> method. This method takes 3 arguments, the PEFT config, the model, and an optional adapter name. 
You can also attach multiple adapters to the model if you call <a href="/docs/peft/main/en/package_reference/peft_model#peft.inject_adapter_in_model">inject_adapter_in_model()</a> multiple times with different adapter names.</p> <p data-svelte-h="svelte-tifeyo">For example, to inject LoRA adapters into the <code>linear</code> submodule of the <code>DummyModel</code> module:</p> <div class="code-block relative"><div class="absolute top-2.5 right-4"><button class="inline-flex items-center relative text-sm focus:text-green-500 cursor-pointer focus:outline-none transition duration-200 ease-in-out opacity-0 mx-0.5 text-gray-600 " title="code excerpt" type="button"><svg class="" xmlns="http://www.w3.org/2000/svg" aria-hidden="true" fill="currentColor" focusable="false" role="img" width="1em" height="1em" preserveAspectRatio="xMidYMid meet" viewBox="0 0 32 32"><path d="M28,10V28H10V10H28m0-2H10a2,2,0,0,0-2,2V28a2,2,0,0,0,2,2H28a2,2,0,0,0,2-2V10a2,2,0,0,0-2-2Z" transform="translate(0)"></path><path d="M4,18H2V4A2,2,0,0,1,4,2H18V4H4Z" transform="translate(0)"></path><rect fill="none" width="32" height="32"></rect></svg> <div class="absolute pointer-events-none transition-opacity bg-black text-white py-1 px-2 leading-tight rounded font-normal shadow left-1/2 top-full transform -translate-x-1/2 translate-y-2 opacity-0"><div class="absolute bottom-full left-1/2 transform -translate-x-1/2 w-0 h-0 border-black border-4 border-t-0" style="border-left-color: transparent; border-right-color: transparent; "></div> Copied</div></button></div> <pre class=""><!-- HTML_TAG_START --><span class="hljs-keyword">import</span> torch
<span class="hljs-keyword">from</span> peft <span class="hljs-keyword">import</span> inject_adapter_in_model, LoraConfig
<span class="hljs-keyword">class</span> <span class="hljs-title class_">DummyModel</span>(torch.nn.Module):
<span class="hljs-keyword">def</span> <span class="hljs-title function_">__init__</span>(<span class="hljs-params">self</span>):
<span class="hljs-built_in">super</span>().__init__()
self.embedding = torch.nn.Embedding(<span class="hljs-number">10</span>, <span class="hljs-number">10</span>)
self.linear = torch.nn.Linear(<span class="hljs-number">10</span>, <span class="hljs-number">10</span>)
self.lm_head = torch.nn.Linear(<span class="hljs-number">10</span>, <span class="hljs-number">10</span>)
<span class="hljs-keyword">def</span> <span class="hljs-title function_">forward</span>(<span class="hljs-params">self, input_ids</span>):
x = self.embedding(input_ids)
x = self.linear(x)
x = self.lm_head(x)
<span class="hljs-keyword">return</span> x
lora_config = LoraConfig(
lora_alpha=<span class="hljs-number">16</span>,
lora_dropout=<span class="hljs-number">0.1</span>,
r=<span class="hljs-number">64</span>,
bias=<span class="hljs-string">&quot;none&quot;</span>,
target_modules=[<span class="hljs-string">&quot;linear&quot;</span>],
)
model = DummyModel()
model = inject_adapter_in_model(lora_config, model)
dummy_inputs = torch.LongTensor([[<span class="hljs-number">0</span>, <span class="hljs-number">1</span>, <span class="hljs-number">2</span>, <span class="hljs-number">3</span>, <span class="hljs-number">4</span>, <span class="hljs-number">5</span>, <span class="hljs-number">6</span>, <span class="hljs-number">7</span>]])
dummy_outputs = model(dummy_inputs)<!-- HTML_TAG_END --></pre></div> <p data-svelte-h="svelte-3xa704">Print the model to see that the adapters have been correctly injected.</p> <div class="code-block relative"><div class="absolute top-2.5 right-4"><button class="inline-flex items-center relative text-sm focus:text-green-500 cursor-pointer focus:outline-none transition duration-200 ease-in-out opacity-0 mx-0.5 text-gray-600 " title="code excerpt" type="button"><svg class="" xmlns="http://www.w3.org/2000/svg" aria-hidden="true" fill="currentColor" focusable="false" role="img" width="1em" height="1em" preserveAspectRatio="xMidYMid meet" viewBox="0 0 32 32"><path d="M28,10V28H10V10H28m0-2H10a2,2,0,0,0-2,2V28a2,2,0,0,0,2,2H28a2,2,0,0,0,2-2V10a2,2,0,0,0-2-2Z" transform="translate(0)"></path><path d="M4,18H2V4A2,2,0,0,1,4,2H18V4H4Z" transform="translate(0)"></path><rect fill="none" width="32" height="32"></rect></svg> <div class="absolute pointer-events-none transition-opacity bg-black text-white py-1 px-2 leading-tight rounded font-normal shadow left-1/2 top-full transform -translate-x-1/2 translate-y-2 opacity-0"><div class="absolute bottom-full left-1/2 transform -translate-x-1/2 w-0 h-0 border-black border-4 border-t-0" style="border-left-color: transparent; border-right-color: transparent; "></div> Copied</div></button></div> <pre class=""><!-- HTML_TAG_START -->DummyModel(
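Because the adapter is injected in-place, only the adapter parameters should end up trainable. The snippet below is a quick sanity check, not part of the original example, assuming `inject_adapter_in_model()` freezes the base model weights and leaves only the LoRA weights with `requires_grad=True`:

```python
# Sanity check (assumption: injection freezes the base weights and only the
# LoRA parameters remain trainable).
trainable = [name for name, param in model.named_parameters() if param.requires_grad]
print(trainable)
# Expected to contain entries such as "linear.lora_A.default.weight" and
# "linear.lora_B.default.weight", but none of the base model weights.
```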
Print the model to see that the adapters have been correctly injected.

```
DummyModel(
  (embedding): Embedding(10, 10)
  (linear): Linear(
    in_features=10, out_features=10, bias=True
    (lora_dropout): ModuleDict(
      (default): Dropout(p=0.1, inplace=False)
    )
    (lora_A): ModuleDict(
      (default): Linear(in_features=10, out_features=64, bias=False)
    )
    (lora_B): ModuleDict(
      (default): Linear(in_features=64, out_features=10, bias=False)
    )
    (lora_embedding_A): ParameterDict()
    (lora_embedding_B): ParameterDict()
  )
  (lm_head): Linear(in_features=10, out_features=10, bias=True)
)
```
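As mentioned above, you can inject more than one adapter by calling [inject_adapter_in_model()](/docs/peft/main/en/package_reference/peft_model#peft.inject_adapter_in_model) again with a different adapter name. A minimal sketch, using an arbitrary second config and the illustrative adapter name `"other"`:

```python
# Inject a second LoRA adapter under a different name; "other" is just an
# example adapter name and the config values are arbitrary.
other_config = LoraConfig(
    lora_alpha=16,
    lora_dropout=0.1,
    r=8,
    bias="none",
    target_modules=["linear"],
)
model = inject_adapter_in_model(other_config, model, adapter_name="other")
# Printing the model again should now show both "default" and "other" entries
# inside the lora_A and lora_B ModuleDicts of the linear layer.
```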
To only save the adapter, use the [get_peft_model_state_dict()](/docs/peft/main/en/package_reference/peft_model#peft.get_peft_model_state_dict) function:

```python
from peft import get_peft_model_state_dict

peft_state_dict = get_peft_model_state_dict(model)
print(peft_state_dict)
```

Otherwise, `model.state_dict()` returns the full state dict of the model.