# BERTology

There is a growing field of study concerned with investigating the inner workings of large-scale transformers like BERT (which some call "BERTology"). Some good examples of this field are:

- BERT Rediscovers the Classical NLP Pipeline by Ian Tenney, Dipanjan Das, Ellie Pavlick: [https://arxiv.org/abs/1905.05950](https://arxiv.org/abs/1905.05950)
- Are Sixteen Heads Really Better than One? by Paul Michel, Omer Levy, Graham Neubig: [https://arxiv.org/abs/1905.10650](https://arxiv.org/abs/1905.10650)
- What Does BERT Look At? An Analysis of BERT's Attention by Kevin Clark, Urvashi Khandelwal, Omer Levy, Christopher D. Manning: [https://arxiv.org/abs/1906.04341](https://arxiv.org/abs/1906.04341)
- CAT-probing: A Metric-based Approach to Interpret How Pre-trained Models for Programming Language Attend Code Structure: [https://arxiv.org/abs/2210.04633](https://arxiv.org/abs/2210.04633)

To help this new field develop, we have included some additional features in the BERT/GPT/GPT-2 models to give access to their internal representations, mainly adapted from the great work of Paul Michel ([https://arxiv.org/abs/1905.10650](https://arxiv.org/abs/1905.10650)):

- accessing all the hidden states of BERT/GPT/GPT-2,
- accessing all the attention weights for each head of BERT/GPT/GPT-2,
- retrieving the output values and gradients of the heads, in order to compute a head importance score and prune heads as explained in [https://arxiv.org/abs/1905.10650](https://arxiv.org/abs/1905.10650).

To help you understand and use these features, we have added a specific example script, [run_bertology.py](https://github.com/huggingface/transformers/tree/main/examples/research_projects/bertology/run_bertology.py), which extracts information from and prunes a model pre-trained on GLUE.
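The first two features (hidden states and per-head attention weights) and head pruning can be sketched with a minimal example. To keep the snippet runnable without downloading a checkpoint, it builds a tiny randomly initialised BERT from a hand-picked small config; with a real checkpoint you would call `BertModel.from_pretrained(...)` with the same `output_hidden_states`/`output_attentions` flags instead:

```python
import torch
from transformers import BertConfig, BertModel

# Tiny config (sizes chosen arbitrarily for illustration) so the model
# initialises randomly and the example needs no network access.
config = BertConfig(
    hidden_size=64,
    num_hidden_layers=4,
    num_attention_heads=4,
    intermediate_size=128,
    output_hidden_states=True,   # return the hidden states of every layer
    output_attentions=True,      # return the attention weights of every head
)
model = BertModel(config)
model.eval()

input_ids = torch.randint(0, config.vocab_size, (1, 10))  # batch of 1, length 10
with torch.no_grad():
    outputs = model(input_ids)

# One hidden-state tensor per layer, plus one for the embedding output.
print(len(outputs.hidden_states))      # num_hidden_layers + 1
# One attention tensor per layer, shaped (batch, heads, seq_len, seq_len).
print(outputs.attentions[0].shape)

# Prune heads 0 and 1 of layer 0, and head 2 of layer 2; the pruned heads
# are recorded in the config so the pruning survives save/reload.
model.prune_heads({0: [0, 1], 2: [2]})
print(model.config.pruned_heads)
```

The example script linked above goes further: it uses the head output values and gradients to compute the importance score from the Michel et al. paper before deciding which heads to prune.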