Commit d37700c
Parent: 79034b5
Upload 127 files

This view is limited to 50 files because it contains too many changes.
- .gitattributes +3 -1
- app/scripts/latex-to-mdx/input/CONTRIBUTING.md +16 -0
- app/scripts/latex-to-mdx/input/LICENSE +402 -0
- app/scripts/latex-to-mdx/input/README.md +59 -47
- app/scripts/latex-to-mdx/input/figures/.DS_Store +0 -0
- app/scripts/latex-to-mdx/input/figures/ch3/ch3-hil-serl-architecture.png +3 -0
- app/scripts/latex-to-mdx/input/figures/ch4/ch4-diffusion-policy.png +2 -2
- app/scripts/latex-to-mdx/input/figures/ch4/ch4-diffusion-robot-actions.png +2 -2
- app/scripts/latex-to-mdx/input/figures/ch4/ch4-normalizing-flows.png +2 -2
- app/scripts/latex-to-mdx/input/figures/ch5/ch5-smolvla.png +2 -2
- app/scripts/latex-to-mdx/input/handles.tex +1 -1
- app/scripts/latex-to-mdx/input/hfstyle/defns.tex +0 -1
- app/scripts/latex-to-mdx/input/hfstyle/hf.cls +0 -1
- app/scripts/latex-to-mdx/input/logos/core.png +3 -0
- app/scripts/latex-to-mdx/input/logos/oxford_logo.png +3 -0
- app/scripts/latex-to-mdx/input/main.aux +681 -0
- app/scripts/latex-to-mdx/input/main.bbl +506 -1
- app/scripts/latex-to-mdx/input/main.bib +63 -250
- app/scripts/latex-to-mdx/input/main.blg +63 -0
- app/scripts/latex-to-mdx/input/main.fdb_latexmk +464 -0
- app/scripts/latex-to-mdx/input/main.fls +1008 -0
- app/scripts/latex-to-mdx/input/main.log +2070 -0
- app/scripts/latex-to-mdx/input/main.out +36 -0
- app/scripts/latex-to-mdx/input/main.tex +13 -9
- app/scripts/latex-to-mdx/input/main.toc +40 -0
- app/scripts/latex-to-mdx/input/presentation.aux +4 -0
- app/scripts/latex-to-mdx/input/presentation.log +895 -0
- app/scripts/latex-to-mdx/input/presentation.nav +5 -0
- app/scripts/latex-to-mdx/input/presentation.out +0 -0
- app/scripts/latex-to-mdx/input/presentation.snm +0 -0
- app/scripts/latex-to-mdx/input/presentation.toc +0 -0
- app/scripts/latex-to-mdx/input/sections/00_abstract.tex +1 -1
- app/scripts/latex-to-mdx/input/sections/01_introduction.tex +13 -7
- app/scripts/latex-to-mdx/input/sections/02_classic_robotics.tex +1 -1
- app/scripts/latex-to-mdx/input/sections/03_reinforcement_learning.tex +167 -123
- app/scripts/latex-to-mdx/input/sections/04_imitation_learning.tex +0 -0
- app/scripts/latex-to-mdx/input/sections/05_foundation_models.tex +110 -108
- app/scripts/latex-to-mdx/input/sections/07_conclusions.tex +6 -13
- app/scripts/latex-to-mdx/input/sections/A_foreword.tex +3 -3
- app/scripts/latex-to-mdx/input/slides/.DS_Store +0 -0
- app/scripts/latex-to-mdx/input/slides/_minted/A95BA625987D2B89E91E7BD2313DE693.highlight.minted +52 -0
- app/scripts/latex-to-mdx/input/slides/_minted/_2486923A98E77FD0740381D01ACD1782.index.minted +11 -0
- app/scripts/latex-to-mdx/input/slides/_minted/colorful.style.minted +100 -0
- app/scripts/latex-to-mdx/input/slides/_minted/default.style.minted +100 -0
- app/scripts/latex-to-mdx/input/slides/presentation.aux +18 -0
- app/scripts/latex-to-mdx/input/slides/presentation.fdb_latexmk +263 -0
- app/scripts/latex-to-mdx/input/slides/presentation.fls +495 -0
- app/scripts/latex-to-mdx/input/slides/presentation.log +1004 -0
- app/scripts/latex-to-mdx/input/slides/presentation.nav +7 -0
- app/scripts/latex-to-mdx/input/slides/presentation.out +0 -0
.gitattributes
CHANGED

@@ -11,4 +11,6 @@
  *.json filter=lfs diff=lfs merge=lfs -text
  # the package and package lock should not be tracked
  package.json -filter -diff -merge text
- package-lock.json -filter -diff -merge text
+ package-lock.json -filter -diff -merge text
+ app/scripts/latex-to-mdx/input/logos/ensps_logo.pdf filter=lfs diff=lfs merge=lfs -text
+ app/scripts/latex-to-mdx/input/main.pdf filter=lfs diff=lfs merge=lfs -text
+ app/scripts/latex-to-mdx/input/main.synctex.gz filter=lfs diff=lfs merge=lfs -text
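The attribute lines above follow Git's `.gitattributes` format: a path pattern followed by attribute settings, where `filter=lfs diff=lfs merge=lfs -text` routes matching files through Git LFS and `-filter -diff -merge text` explicitly un-sets that routing. As a rough illustration (a hypothetical helper, not part of this repository), one can parse which patterns end up LFS-tracked:

```python
def lfs_tracked_patterns(gitattributes_text: str) -> list[str]:
    """Return path patterns marked for Git LFS in a .gitattributes file.

    A line like `*.json filter=lfs diff=lfs merge=lfs -text` marks a
    pattern as LFS-tracked; a line like `package.json -filter -diff
    -merge text` un-sets those attributes for a specific path.
    """
    patterns = []
    for line in gitattributes_text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and comments
        pattern, *attrs = line.split()
        if "filter=lfs" in attrs:  # the attribute that enables the LFS filter
            patterns.append(pattern)
    return patterns


example = """\
*.json filter=lfs diff=lfs merge=lfs -text
# the package and package lock should not be tracked
package.json -filter -diff -merge text
app/scripts/latex-to-mdx/input/main.pdf filter=lfs diff=lfs merge=lfs -text
"""
print(lfs_tracked_patterns(example))
# → ['*.json', 'app/scripts/latex-to-mdx/input/main.pdf']
```

In practice such lines are usually generated by `git lfs track "<pattern>"` rather than written by hand; the sketch only reads them.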
app/scripts/latex-to-mdx/input/CONTRIBUTING.md
ADDED

@@ -0,0 +1,16 @@
+ # Contribution guidelines
+
+ This tutorial serves two purposes: to be a reference for anyone interested in the field of robot learning, and to provide practical, actionable knowledge via a mix of intuition-based explanation and code examples.
+
+ That said, the audience of this tutorial is mostly researchers.
+ For this reason, the writing style adopted should itself be somewhat academic. It is hard to draw a boundary around what academic writing is, but it definitely must lean toward the scientific side of things!
+
+ If you have ever written a paper, think of adhering to the same register. If you haven't: great! This is a good starting point, and you can leverage the community to learn more about how to write technical pieces effectively and proficiently.
+
+ In general, contributions should follow these steps:
+ 1. **Open an issue detailing the topic you wish to add**. In the issue description, it is very important that you (1) justify why the topic you want to add is relevant to the others already in the tutorial and (2) explain why, or to what extent, that topic is not already covered by the tutorial's contents.
+ 2. Then, in the same issue, **add a small structured summary** of the content you wish to adapt. Think of this as a way to convey right away what you want to add, and how you want to add it. This helps you and whoever looks at your issue get on the same page.
+ 3. **Ping @fracapuano to discuss your proposal**. We welcome contributions from all sorts of backgrounds, and it is a good idea to discuss your contribution before you start writing, so that it is as aligned as possible with the contents presented. Then, open a PR, and ping @fracapuano for review.
+
+ Let's make the best, highest-quality robot-learning resource via open-source contributions! 😊
app/scripts/latex-to-mdx/input/LICENSE
ADDED

@@ -0,0 +1,402 @@
+ Attribution-NonCommercial-NoDerivatives 4.0 International
+
+ =======================================================================
+
+ Creative Commons Corporation ("Creative Commons") is not a law firm and
+ does not provide legal services or legal advice. Distribution of
+ Creative Commons public licenses does not create a lawyer-client or
+ other relationship. Creative Commons makes its licenses and related
+ information available on an "as-is" basis. Creative Commons gives no
+ warranties regarding its licenses, any material licensed under their
+ terms and conditions, or any related information. Creative Commons
+ disclaims all liability for damages resulting from their use to the
+ fullest extent possible.
+
+ Using Creative Commons Public Licenses
+
+ Creative Commons public licenses provide a standard set of terms and
+ conditions that creators and other rights holders may use to share
+ original works of authorship and other material subject to copyright
+ and certain other rights specified in the public license below. The
+ following considerations are for informational purposes only, are not
+ exhaustive, and do not form part of our licenses.
+
+      Considerations for licensors: Our public licenses are
+      intended for use by those authorized to give the public
+      permission to use material in ways otherwise restricted by
+      copyright and certain other rights. Our licenses are
+      irrevocable. Licensors should read and understand the terms
+      and conditions of the license they choose before applying it.
+      Licensors should also secure all rights necessary before
+      applying our licenses so that the public can reuse the
+      material as expected. Licensors should clearly mark any
+      material not subject to the license. This includes other CC-
+      licensed material, or material used under an exception or
+      limitation to copyright. More considerations for licensors:
+      wiki.creativecommons.org/Considerations_for_licensors
+
+      Considerations for the public: By using one of our public
+      licenses, a licensor grants the public permission to use the
+      licensed material under specified terms and conditions. If
+      the licensor's permission is not necessary for any reason--for
+      example, because of any applicable exception or limitation to
+      copyright--then that use is not regulated by the license. Our
+      licenses grant only permissions under copyright and certain
+      other rights that a licensor has authority to grant. Use of
+      the licensed material may still be restricted for other
+      reasons, including because others have copyright or other
+      rights in the material. A licensor may make special requests,
+      such as asking that all changes be marked or described.
+      Although not required by our licenses, you are encouraged to
+      respect those requests where reasonable. More considerations
+      for the public:
+      wiki.creativecommons.org/Considerations_for_licensees
+
+ =======================================================================
+
+ Creative Commons Attribution-NonCommercial-NoDerivatives 4.0
+ International Public License
+
+ By exercising the Licensed Rights (defined below), You accept and agree
+ to be bound by the terms and conditions of this Creative Commons
+ Attribution-NonCommercial-NoDerivatives 4.0 International Public
+ License ("Public License"). To the extent this Public License may be
+ interpreted as a contract, You are granted the Licensed Rights in
+ consideration of Your acceptance of these terms and conditions, and the
+ Licensor grants You such rights in consideration of benefits the
+ Licensor receives from making the Licensed Material available under
+ these terms and conditions.
+
+
+ Section 1 -- Definitions.
+
+   a. Adapted Material means material subject to Copyright and Similar
+      Rights that is derived from or based upon the Licensed Material
+      and in which the Licensed Material is translated, altered,
+      arranged, transformed, or otherwise modified in a manner requiring
+      permission under the Copyright and Similar Rights held by the
+      Licensor. For purposes of this Public License, where the Licensed
+      Material is a musical work, performance, or sound recording,
+      Adapted Material is always produced where the Licensed Material is
+      synched in timed relation with a moving image.
+
+   b. Copyright and Similar Rights means copyright and/or similar rights
+      closely related to copyright including, without limitation,
+      performance, broadcast, sound recording, and Sui Generis Database
+      Rights, without regard to how the rights are labeled or
+      categorized. For purposes of this Public License, the rights
+      specified in Section 2(b)(1)-(2) are not Copyright and Similar
+      Rights.
+
+   c. Effective Technological Measures means those measures that, in the
+      absence of proper authority, may not be circumvented under laws
+      fulfilling obligations under Article 11 of the WIPO Copyright
+      Treaty adopted on December 20, 1996, and/or similar international
+      agreements.
+
+   d. Exceptions and Limitations means fair use, fair dealing, and/or
+      any other exception or limitation to Copyright and Similar Rights
+      that applies to Your use of the Licensed Material.
+
+   e. Licensed Material means the artistic or literary work, database,
+      or other material to which the Licensor applied this Public
+      License.
+
+   f. Licensed Rights means the rights granted to You subject to the
+      terms and conditions of this Public License, which are limited to
+      all Copyright and Similar Rights that apply to Your use of the
+      Licensed Material and that the Licensor has authority to license.
+
+   g. Licensor means the individual(s) or entity(ies) granting rights
+      under this Public License.
+
+   h. NonCommercial means not primarily intended for or directed towards
+      commercial advantage or monetary compensation. For purposes of
+      this Public License, the exchange of the Licensed Material for
+      other material subject to Copyright and Similar Rights by digital
+      file-sharing or similar means is NonCommercial provided there is
+      no payment of monetary compensation in connection with the
+      exchange.
+
+   i. Share means to provide material to the public by any means or
+      process that requires permission under the Licensed Rights, such
+      as reproduction, public display, public performance, distribution,
+      dissemination, communication, or importation, and to make material
+      available to the public including in ways that members of the
+      public may access the material from a place and at a time
+      individually chosen by them.
+
+   j. Sui Generis Database Rights means rights other than copyright
+      resulting from Directive 96/9/EC of the European Parliament and of
+      the Council of 11 March 1996 on the legal protection of databases,
+      as amended and/or succeeded, as well as other essentially
+      equivalent rights anywhere in the world.
+
+   k. You means the individual or entity exercising the Licensed Rights
+      under this Public License. Your has a corresponding meaning.
+
+
+ Section 2 -- Scope.
+
+   a. License grant.
+
+        1. Subject to the terms and conditions of this Public License,
+           the Licensor hereby grants You a worldwide, royalty-free,
+           non-sublicensable, non-exclusive, irrevocable license to
+           exercise the Licensed Rights in the Licensed Material to:
+
+             a. reproduce and Share the Licensed Material, in whole or
+                in part, for NonCommercial purposes only; and
+
+             b. produce and reproduce, but not Share, Adapted Material
+                for NonCommercial purposes only.
+
+        2. Exceptions and Limitations. For the avoidance of doubt, where
+           Exceptions and Limitations apply to Your use, this Public
+           License does not apply, and You do not need to comply with
+           its terms and conditions.
+
+        3. Term. The term of this Public License is specified in Section
+           6(a).
+
+        4. Media and formats; technical modifications allowed. The
+           Licensor authorizes You to exercise the Licensed Rights in
+           all media and formats whether now known or hereafter created,
+           and to make technical modifications necessary to do so. The
+           Licensor waives and/or agrees not to assert any right or
+           authority to forbid You from making technical modifications
+           necessary to exercise the Licensed Rights, including
+           technical modifications necessary to circumvent Effective
+           Technological Measures. For purposes of this Public License,
+           simply making modifications authorized by this Section 2(a)
+           (4) never produces Adapted Material.
+
+        5. Downstream recipients.
+
+             a. Offer from the Licensor -- Licensed Material. Every
+                recipient of the Licensed Material automatically
+                receives an offer from the Licensor to exercise the
+                Licensed Rights under the terms and conditions of this
+                Public License.
+
+             b. No downstream restrictions. You may not offer or impose
+                any additional or different terms or conditions on, or
+                apply any Effective Technological Measures to, the
+                Licensed Material if doing so restricts exercise of the
+                Licensed Rights by any recipient of the Licensed
+                Material.
+
+        6. No endorsement. Nothing in this Public License constitutes or
+           may be construed as permission to assert or imply that You
+           are, or that Your use of the Licensed Material is, connected
+           with, or sponsored, endorsed, or granted official status by,
+           the Licensor or others designated to receive attribution as
+           provided in Section 3(a)(1)(A)(i).
+
+   b. Other rights.
+
+        1. Moral rights, such as the right of integrity, are not
+           licensed under this Public License, nor are publicity,
+           privacy, and/or other similar personality rights; however, to
+           the extent possible, the Licensor waives and/or agrees not to
+           assert any such rights held by the Licensor to the limited
+           extent necessary to allow You to exercise the Licensed
+           Rights, but not otherwise.
+
+        2. Patent and trademark rights are not licensed under this
+           Public License.
+
+        3. To the extent possible, the Licensor waives any right to
+           collect royalties from You for the exercise of the Licensed
+           Rights, whether directly or through a collecting society
+           under any voluntary or waivable statutory or compulsory
+           licensing scheme. In all other cases the Licensor expressly
+           reserves any right to collect such royalties, including when
+           the Licensed Material is used other than for NonCommercial
+           purposes.
+
+
+ Section 3 -- License Conditions.
+
+ Your exercise of the Licensed Rights is expressly made subject to the
+ following conditions.
+
+   a. Attribution.
+
+        1. If You Share the Licensed Material, You must:
+
+             a. retain the following if it is supplied by the Licensor
+                with the Licensed Material:
+
+                  i. identification of the creator(s) of the Licensed
+                     Material and any others designated to receive
+                     attribution, in any reasonable manner requested by
+                     the Licensor (including by pseudonym if
+                     designated);
+
+                 ii. a copyright notice;
+
+                iii. a notice that refers to this Public License;
+
+                 iv. a notice that refers to the disclaimer of
+                     warranties;
+
+                  v. a URI or hyperlink to the Licensed Material to the
+                     extent reasonably practicable;
+
+             b. indicate if You modified the Licensed Material and
+                retain an indication of any previous modifications; and
+
+             c. indicate the Licensed Material is licensed under this
+                Public License, and include the text of, or the URI or
+                hyperlink to, this Public License.
+
+           For the avoidance of doubt, You do not have permission under
+           this Public License to Share Adapted Material.
+
+        2. You may satisfy the conditions in Section 3(a)(1) in any
+           reasonable manner based on the medium, means, and context in
+           which You Share the Licensed Material. For example, it may be
+           reasonable to satisfy the conditions by providing a URI or
+           hyperlink to a resource that includes the required
+           information.
+
+        3. If requested by the Licensor, You must remove any of the
+           information required by Section 3(a)(1)(A) to the extent
+           reasonably practicable.
+
+
+ Section 4 -- Sui Generis Database Rights.
+
+ Where the Licensed Rights include Sui Generis Database Rights that
+ apply to Your use of the Licensed Material:
+
+   a. for the avoidance of doubt, Section 2(a)(1) grants You the right
+      to extract, reuse, reproduce, and Share all or a substantial
+      portion of the contents of the database for NonCommercial purposes
+      only and provided You do not Share Adapted Material;
+
+   b. if You include all or a substantial portion of the database
+      contents in a database in which You have Sui Generis Database
+      Rights, then the database in which You have Sui Generis Database
+      Rights (but not its individual contents) is Adapted Material; and
+
+   c. You must comply with the conditions in Section 3(a) if You Share
+      all or a substantial portion of the contents of the database.
+
+ For the avoidance of doubt, this Section 4 supplements and does not
+ replace Your obligations under this Public License where the Licensed
+ Rights include other Copyright and Similar Rights.
+
+
+ Section 5 -- Disclaimer of Warranties and Limitation of Liability.
+
+   a. UNLESS OTHERWISE SEPARATELY UNDERTAKEN BY THE LICENSOR, TO THE
+      EXTENT POSSIBLE, THE LICENSOR OFFERS THE LICENSED MATERIAL AS-IS
+      AND AS-AVAILABLE, AND MAKES NO REPRESENTATIONS OR WARRANTIES OF
+      ANY KIND CONCERNING THE LICENSED MATERIAL, WHETHER EXPRESS,
+      IMPLIED, STATUTORY, OR OTHER. THIS INCLUDES, WITHOUT LIMITATION,
+      WARRANTIES OF TITLE, MERCHANTABILITY, FITNESS FOR A PARTICULAR
+      PURPOSE, NON-INFRINGEMENT, ABSENCE OF LATENT OR OTHER DEFECTS,
+      ACCURACY, OR THE PRESENCE OR ABSENCE OF ERRORS, WHETHER OR NOT
+      KNOWN OR DISCOVERABLE. WHERE DISCLAIMERS OF WARRANTIES ARE NOT
+      ALLOWED IN FULL OR IN PART, THIS DISCLAIMER MAY NOT APPLY TO YOU.
+
+   b. TO THE EXTENT POSSIBLE, IN NO EVENT WILL THE LICENSOR BE LIABLE
+      TO YOU ON ANY LEGAL THEORY (INCLUDING, WITHOUT LIMITATION,
+      NEGLIGENCE) OR OTHERWISE FOR ANY DIRECT, SPECIAL, INDIRECT,
+      INCIDENTAL, CONSEQUENTIAL, PUNITIVE, EXEMPLARY, OR OTHER LOSSES,
+      COSTS, EXPENSES, OR DAMAGES ARISING OUT OF THIS PUBLIC LICENSE OR
+      USE OF THE LICENSED MATERIAL, EVEN IF THE LICENSOR HAS BEEN
+      ADVISED OF THE POSSIBILITY OF SUCH LOSSES, COSTS, EXPENSES, OR
+      DAMAGES. WHERE A LIMITATION OF LIABILITY IS NOT ALLOWED IN FULL OR
+      IN PART, THIS LIMITATION MAY NOT APPLY TO YOU.
+
+   c. The disclaimer of warranties and limitation of liability provided
+      above shall be interpreted in a manner that, to the extent
+      possible, most closely approximates an absolute disclaimer and
+      waiver of all liability.
+
+
+ Section 6 -- Term and Termination.
+
+   a. This Public License applies for the term of the Copyright and
+      Similar Rights licensed here. However, if You fail to comply with
+      this Public License, then Your rights under this Public License
+      terminate automatically.
+
+   b. Where Your right to use the Licensed Material has terminated under
+      Section 6(a), it reinstates:
+
+        1. automatically as of the date the violation is cured, provided
+           it is cured within 30 days of Your discovery of the
+           violation; or
+
+        2. upon express reinstatement by the Licensor.
+
+      For the avoidance of doubt, this Section 6(b) does not affect any
+      right the Licensor may have to seek remedies for Your violations
+      of this Public License.
+
+   c. For the avoidance of doubt, the Licensor may also offer the
+      Licensed Material under separate terms or conditions or stop
+      distributing the Licensed Material at any time; however, doing so
+      will not terminate this Public License.
+
+   d. Sections 1, 5, 6, 7, and 8 survive termination of this Public
+      License.
+
+
+ Section 7 -- Other Terms and Conditions.
+
+   a. The Licensor shall not be bound by any additional or different
+      terms or conditions communicated by You unless expressly agreed.
+
+   b. Any arrangements, understandings, or agreements regarding the
+      Licensed Material not stated herein are separate from and
+      independent of the terms and conditions of this Public License.
+
+
+ Section 8 -- Interpretation.
+
+   a. For the avoidance of doubt, this Public License does not, and
+      shall not be interpreted to, reduce, limit, restrict, or impose
+      conditions on any use of the Licensed Material that could lawfully
+      be made without permission under this Public License.
+
+   b. To the extent possible, if any provision of this Public License is
+      deemed unenforceable, it shall be automatically reformed to the
+      minimum extent necessary to make it enforceable. If the provision
+      cannot be reformed, it shall be severed from this Public License
+      without affecting the enforceability of the remaining terms and
+      conditions.
+
+   c. No term or condition of this Public License will be waived and no
+      failure to comply consented to unless expressly agreed to by the
+      Licensor.
+
+   d. Nothing in this Public License constitutes or may be interpreted
+      as a limitation upon, or waiver of, any privileges and immunities
+      that apply to the Licensor or You, including from the legal
+      processes of any jurisdiction or authority.
+
+ =======================================================================
+
+ Creative Commons is not a party to its public
+ licenses. Notwithstanding, Creative Commons may elect to apply one of
+ its public licenses to material it publishes and in those instances
+ will be considered the “Licensor.” The text of the Creative Commons
+ public licenses is dedicated to the public domain under the CC0 Public
+ Domain Dedication. Except for the limited purpose of indicating that
+ material is shared under a Creative Commons public license or as
+ otherwise permitted by the Creative Commons policies published at
+ creativecommons.org/policies, Creative Commons does not authorize the
+ use of the trademark "Creative Commons" or any other trademark or logo
+ of Creative Commons without its prior written consent including,
+ without limitation, in connection with any unauthorized modifications
+ to any of its public licenses or any other arrangements,
+ understandings, or agreements concerning use of licensed material. For
+ the avoidance of doubt, this paragraph does not form part of the
+ public licenses.
+
+ Creative Commons may be contacted at creativecommons.org.
app/scripts/latex-to-mdx/input/README.md
CHANGED
@@ -1,64 +1,76 @@
 # Robot Learning: A Tutorial
 
-This tutorial solves this: a unified entry point to the field of robot learning, presenting the conceptual underpinnings of popular approaches in the field, as well as presenting practical examples of how to use SOTA algorithms in `lerobot`, an open-source library for full-stack robotics.
-
-## 1. Introduction
-- [x] 1.1 Motivation
-- [x] 1.2 Structure of the Report
-
-- [x]
-- [x]
+This repository contains the source code for the "Robot Learning: A Tutorial" report. This tutorial covers many of the most pressing aspects in modern robot learning, and provides practice examples using `lerobot`, the robot-learning library developed by Hugging Face.
+
+You’re more than welcome to contribute to the next edition of the tutorial!
+Simply open an issue, tag @fracapuano, and start a discussion about the scope and content you’d like to add. Check out CONTRIBUTING.md for more details 😊
+All merged pull requests will receive public acknowledgment in the main body of the tutorial.
+Items marked with an empty `[ ]` in the following Table of Contents are open for community contribution!
+
+## Table of Contents
+
+### 1. Introduction
+- [x] 1.1 `lerobot` Dataset
+- [x] 1.1.1 The dataset class design
+- [x] 1.2 Code Example: Batching a (Streaming) Dataset
+- [x] 1.3 Code Example: Collecting Data
+
+### 2. Classical Robotics
+- [x] 2.1 Explicit and Implicit Models
+- [x] 2.2 Different Types of Motion
+- [x] 2.3 Example: Planar Manipulation
 - [x] 2.3.1 Adding Feedback Loops
 - [x] 2.4 Limitations of Dynamics-based Robotics
 
-- [
-- [
-- [ ] 3.2.2 Code Example: HIL-SERL in lerobot
-- [ ] 3.3 Limitations of RL in Real-World Robotics: Simulators and Reward Design
-- [ ] 3.4 Behavioral Cloning (BC) for Robotics
-- [ ] 4.1.1 Leveraging Real-World Demonstrations
-- [ ] 4.1.2 Reward-Free Training and Betting on Data
-
-- [
-- [
-- [
-- [
-- [
-
-- [
-- [
-- [ ] 6.1 VLAs
+### 3. Robot (Reinforcement) Learning
+- [x] 3.1 A (Concise) Introduction to RL
+- [x] 3.2 Real-world RL for Robotics
+- [x] 3.3 Code Example: Real-world RL
+- [x] 3.4 Limitations of RL in Real-World Robotics: Simulators and Reward Design
+
+### 4. Robot (Imitation) Learning
+- [x] 4.1 A (Concise) Introduction to Generative Models
+- [x] 4.1.1 Variational Auto-Encoders
+- [x] 4.1.2 Diffusion Models
+- [x] 4.1.3 Flow Matching
+- [x] 4.2 Action Chunking with Transformers
+- [x] 4.2.1 Code Example: Training and Using ACT in Practice
+- [x] 4.3 Diffusion Policy
+- [x] 4.3.1 Code Example: Training and Using Diffusion Policies in Practice
+- [x] 4.4 Optimized Inference
+- [x] 4.4.1 Code Example: Using Async Inference
+
+### 5. Generalist Robot Policies
+- [x] 5.1 Preliminaries: Models and Data
+- [x] 5.2 Modern VLAs
+- [x] 5.2.1 VLMs for VLAs
+- [x] 5.3 PI0
+- [ ] 5.3.1 Code Example: Using PI0
+- [x] 5.4 SmolVLA
+- [ ] 5.4.1 Code Example: Using SmolVLA
+- [ ] 5.5 GR00T (1/2)
+- [ ] 5.5.1 Code Example: Using GR00T
+- [ ] 5.6 PI05
+- [ ] 5.6.1 Code Example: Using PI05
+- [ ] Large-scale datasets
+- [ ] Open-X
+- [ ] DROID
+- [ ] BEHAVIOR
+
+### 6. Some Emerging Directions in Robot Learning
+- [ ] 6.1 Post training VLAs
 - [ ] 6.1.1 From Imitation to Refinement
 - [ ] 6.1.2 EXPO
 
-If time permits (vs current TOC):
-
-- [ ] 3.3.1 TD-MPC
-- [ ] 3.3.2 Code Example: Use TD-MPC in lerobot
-- [ ] 3.5 Popular benchmarks in Robot Learning
-
-- 4.
-- [ ] 4.3.1 Model Architecture and Training Objectives
-- [ ] 4.3.2 Code Example: Use VQ-BeT in lerobot
-
-- [ ] 6.1.1 In the architecture: V-JEPA and V-JEPA2
-- [ ] 6.1.2 In the simulation: GENIE
+- [ ] 6.2 World Models for robotics
+- [ ] 6.2.1 Cosmos
+- [ ] 6.2.2 World Models (1X)
+- [ ] 6.2.3 Sima and Genie 1
+
+### 7. Conclusions
+- [x] 7.1 Conclusions
+
+## License
+
+The written content of this book is licensed under the [Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License](http://creativecommons.org/licenses/by-nc-sa/4.0/).
+
+All source code examples in the `snippets/` directory are licensed under the [MIT License](https://opensource.org/licenses/MIT).

app/scripts/latex-to-mdx/input/figures/.DS_Store
ADDED
Binary file (10.2 kB)

app/scripts/latex-to-mdx/input/figures/ch3/ch3-hil-serl-architecture.png
ADDED (Git LFS)

app/scripts/latex-to-mdx/input/figures/ch4/ch4-diffusion-policy.png
CHANGED (Git LFS)

app/scripts/latex-to-mdx/input/figures/ch4/ch4-diffusion-robot-actions.png
CHANGED (Git LFS)

app/scripts/latex-to-mdx/input/figures/ch4/ch4-normalizing-flows.png
CHANGED (Git LFS)

app/scripts/latex-to-mdx/input/figures/ch5/ch5-smolvla.png
CHANGED (Git LFS)

app/scripts/latex-to-mdx/input/handles.tex
CHANGED
@@ -27,7 +27,7 @@
 \newcommand{\qfunction}{\(Q\)-function}
 \newcommand{\qopt}{\( Q^* \)}
 
-\newcommand{\supp}[1]{\
+\newcommand{\supp}[1]{\operatorname{supp}({#1})}
 \newcommand{\DKL}{\text{D}_{\text{KL}}}
 
 \newcommand{\actionchunk}{\mathbf{A}}

app/scripts/latex-to-mdx/input/hfstyle/defns.tex
CHANGED
@@ -2,7 +2,6 @@
 % A useful set of commands
 \usepackage{mathtools}
 \usepackage{dsfont}
-\usepackage[dvipsnames]{xcolor}
 \usepackage[colorinlistoftodos]{todonotes}
 \usepackage{booktabs}
 \usepackage{xfrac}

app/scripts/latex-to-mdx/input/hfstyle/hf.cls
CHANGED
@@ -40,7 +40,6 @@
 
 % Colorful stuff %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
 
-\RequirePackageWithOptions{xcolor}
 \RequirePackage[most]{tcolorbox}
 \definecolor{ai2accent}{HTML}{407579}
 % \definecolor{ai2accent}{HTML}{ff0000}

app/scripts/latex-to-mdx/input/logos/core.png
ADDED (Git LFS)

app/scripts/latex-to-mdx/input/logos/oxford_logo.png
ADDED (Git LFS)

app/scripts/latex-to-mdx/input/main.aux
ADDED
@@ -0,0 +1,681 @@
| 1 |
+
\relax
|
| 2 |
+
\providecommand \babel@aux [2]{\global \let \babel@toc \@gobbletwo }
|
| 3 |
+
\@nameuse{bbl@beforestart}
|
| 4 |
+
\catcode `"\active
|
| 5 |
+
\nicematrix@redefine@check@rerun
|
| 6 |
+
\providecommand\hyper@newdestlabel[2]{}
|
| 7 |
+
\providecommand\HyField@AuxAddToFields[1]{}
|
| 8 |
+
\providecommand\HyField@AuxAddToCoFields[2]{}
|
| 9 |
+
\providecommand \oddpage@label [2]{}
|
| 10 |
+
\babel@aux{english}{}
|
| 11 |
+
\citation{sicilianoSpringerHandbookRobotics2016}
|
| 12 |
+
\citation{tedrakeRoboticManipulationPerception,tedrakeUnderactuatedRoboticsAlgorithms}
|
| 13 |
+
\citation{shalev-shwartzUnderstandingMachineLearning2014}
|
| 14 |
+
\citation{prince2023understanding}
|
| 15 |
+
\citation{suttonReinforcementLearningIntroduction2018}
|
| 16 |
+
\citation{nakkiranStepbyStepDiffusionElementary2024}
|
| 17 |
+
\citation{lipmanFlowMatchingGuide2024}
|
| 18 |
+
\@writefile{lof}{\contentsline {figure}{\numberline {1}{\ignorespaces \texttt {lerobot}~is the open-source library for end-to-end robotics developed by Hugging Face. The library is vertically integrated on the entire robotics stack, supporting low-level control of real-world robot devices, advanced data and inference optimizations, as well as SOTA robot learning methods with simple implementations in pure Pytorch.}}{3}{figure.caption.1}\protected@file@percent }
|
| 19 |
+
\providecommand*\caption@xref[2]{\@setref\relax\@undefined{#1}}
|
| 20 |
+
\newlabel{fig:figure1}{{1}{3}{\lerobot ~is the open-source library for end-to-end robotics developed by Hugging Face. The library is vertically integrated on the entire robotics stack, supporting low-level control of real-world robot devices, advanced data and inference optimizations, as well as SOTA robot learning methods with simple implementations in pure Pytorch}{figure.caption.1}{}}
|
| 21 |
+
\newlabel{fig:figure1@cref}{{[figure][1][]1}{[1][3][]3}{}{}{}}
|
| 22 |
+
\@writefile{toc}{\contentsline {section}{\numberline {1}Introduction}{3}{section.1}\protected@file@percent }
|
| 23 |
+
\@writefile{toc}{\contentsline {subsection}{\numberline {1.1}\texttt {LeRobotDataset}}{4}{subsection.1.1}\protected@file@percent }
|
| 24 |
+
\@writefile{toc}{\contentsline {subsubsection}{\numberline {1.1.1}The dataset class design}{4}{subsubsection.1.1.1}\protected@file@percent }
|
| 25 |
+
\@writefile{toc}{\contentsline {subsection}{\numberline {1.2}Code Example: Batching a (Streaming) Dataset}{5}{subsection.1.2}\protected@file@percent }
|
| 26 |
+
\newlabel{ex:dataset-batching}{{1}{5}{Code Example: Batching a (Streaming) Dataset}{tcb@cnt@pbox.1}{}}
|
| 27 |
+
\newlabel{ex:dataset-batching@cref}{{[tcb@cnt@pbox][1][]1}{[1][5][]5}{}{}{}}
|
| 28 |
+
\@writefile{lol}{\contentsline {lstlisting}{snippets/ch1/01\textunderscore datasets.py}{5}{lstlisting.-1}\protected@file@percent }
|
| 29 |
+
\@writefile{toc}{\contentsline {subsection}{\numberline {1.3}Code Example: Collecting Data}{6}{subsection.1.3}\protected@file@percent }
|
| 30 |
+
\newlabel{paragraph:collecting-data}{{1.3}{6}{Code Example: Collecting Data}{subsection.1.3}{}}
|
| 31 |
+
\newlabel{paragraph:collecting-data@cref}{{[subsection][3][1]1.3}{[1][6][]6}{}{}{}}
|
| 32 |
+
\newlabel{ex:record-dataset}{{2}{6}{Code Example: Collecting Data}{tcb@cnt@pbox.2}{}}
|
| 33 |
+
\newlabel{ex:record-dataset@cref}{{[tcb@cnt@pbox][2][]2}{[1][6][]6}{}{}{}}
|
| 34 |
+
\@writefile{lol}{\contentsline {lstlisting}{snippets/ch1/02\textunderscore record\textunderscore data.py}{6}{lstlisting.-2}\protected@file@percent }
|
| 35 |
+
\citation{bekrisStateRobotMotion2024}
|
| 36 |
+
\citation{connellRobotLearning1993}
|
| 37 |
+
\citation{agrawalComputationalSensorimotorLearning,bekrisStateRobotMotion2024}
|
| 38 |
+
\citation{hansenTemporalDifferenceLearning2022}
|
| 39 |
+
\citation{suttonReinforcementLearningIntroduction2018}
|
| 40 |
+
\citation{mccormacSemanticFusionDense3D2016}
|
| 41 |
+
\citation{bekrisStateRobotMotion2024}
|
| 42 |
+
\@writefile{lof}{\contentsline {figure}{\numberline {2}{\ignorespaces Overview of methods to generate motion (clearly non-exhausitve, see~\citet {bekrisStateRobotMotion2024}). The different methods can be grouped based on whether they explicitly (\emph {dynamics-based}) or implicitly (\emph {learning-based}) model robot-environment interactions.}}{9}{figure.caption.2}\protected@file@percent }
|
| 43 |
+
\newlabel{fig:generating-motion-atlas}{{2}{9}{Overview of methods to generate motion (clearly non-exhausitve, see~\citet {bekrisStateRobotMotion2024}). The different methods can be grouped based on whether they explicitly (\emph {dynamics-based}) or implicitly (\emph {learning-based}) model robot-environment interactions}{figure.caption.2}{}}
|
| 44 |
+
\newlabel{fig:generating-motion-atlas@cref}{{[figure][2][]2}{[1][9][]9}{}{}{}}
|
| 45 |
+
\@writefile{toc}{\contentsline {section}{\numberline {2}Classical Robotics}{9}{section.2}\protected@file@percent }
|
| 46 |
+
\newlabel{sec:classical}{{2}{9}{Classical Robotics}{section.2}{}}
|
| 47 |
+
\newlabel{sec:classical@cref}{{[section][2][]2}{[1][9][]9}{}{}{}}
|
| 48 |
+
\@writefile{toc}{\contentsline {subsection}{\numberline {2.1}Explicit and Implicit Models}{9}{subsection.2.1}\protected@file@percent }
|
| 49 |
+
\citation{tangPerceptionNavigationAutonomous2023}
|
| 50 |
+
\citation{koberReinforcementLearningRobotics}
|
| 51 |
+
\citation{griffinWalkingStabilizationUsing2017,jiDribbleBotDynamicLegged2023,leeLearningQuadrupedalLocomotion2020,margolisRapidLocomotionReinforcement2022}
|
| 52 |
+
\citation{zhangWoCoCoLearningWholeBody2024,bjorckGR00TN1Open2025}
|
| 53 |
+
\citation{fujitaDevelopmentRobotsNuclear2020,alizadehComprehensiveSurveySpace2024}
|
| 54 |
+
\citation{sannemanStateIndustrialRobotics2020}
|
| 55 |
+
\citation{koberReinforcementLearningRobotics}
|
| 56 |
+
\citation{zhaoLearningFineGrainedBimanual2023}
|
| 57 |
+
\citation{aldacoALOHA2Enhanced}
|
| 58 |
+
\citation{knightStandardOpenSO100}
|
| 59 |
+
\@writefile{lof}{\contentsline {figure}{\numberline {3}{\ignorespaces Different kinds of motions are achieved with potentially very different robotic platforms. From left to right, top to bottom: ViperX, SO-100, Boston Dynamics' Spot, Open-Duck, 1X's NEO, Boston Dynamics' Atlas. This is an example list of robotic platforms and is (very) far from being exhaustive.}}{10}{figure.caption.3}\protected@file@percent }
|
| 60 |
+
\newlabel{fig:robotics-platforms-atlas}{{3}{10}{Different kinds of motions are achieved with potentially very different robotic platforms. From left to right, top to bottom: ViperX, SO-100, Boston Dynamics' Spot, Open-Duck, 1X's NEO, Boston Dynamics' Atlas. This is an example list of robotic platforms and is (very) far from being exhaustive}{figure.caption.3}{}}
|
| 61 |
+
\newlabel{fig:robotics-platforms-atlas@cref}{{[figure][3][]3}{[1][10][]10}{}{}{}}
|
| 62 |
+
\@writefile{toc}{\contentsline {subsection}{\numberline {2.2}Different Types of Motion}{10}{subsection.2.2}\protected@file@percent }
|
| 63 |
+
\@writefile{toc}{\contentsline {subsection}{\numberline {2.3}Example: Planar Manipulation}{10}{subsection.2.3}\protected@file@percent }
|
| 64 |
+
\citation{sicilianoSpringerHandbookRobotics2016,lynchModernRoboticsMechanics2017,tedrakeRoboticManipulationPerception,tedrakeUnderactuatedRoboticsAlgorithms}
|
| 65 |
+
\citation{lynchModernRoboticsMechanics2017}
|
| 66 |
+
\citation{tedrakeRoboticManipulationPerception}
|
| 67 |
+
\@writefile{lof}{\contentsline {figure}{\numberline {4}{\ignorespaces Cheaper, more accessible robots are starting to rival traditional platforms like the Panda arm platforms in adoption in resource-constrained scenarios. The SO-100, in particular, has a cost in the 100s of Euros, and can be entirely 3D-printed in hours, while the industrially-manufactured Panda arm costs tens of thousands of Euros and is not openly available.}}{11}{figure.caption.4}\protected@file@percent }
|
| 68 |
+
\newlabel{fig:robotic-platforms-costs}{{4}{11}{Cheaper, more accessible robots are starting to rival traditional platforms like the Panda arm platforms in adoption in resource-constrained scenarios. The SO-100, in particular, has a cost in the 100s of Euros, and can be entirely 3D-printed in hours, while the industrially-manufactured Panda arm costs tens of thousands of Euros and is not openly available}{figure.caption.4}{}}
|
| 69 |
+
\newlabel{fig:robotic-platforms-costs@cref}{{[figure][4][]4}{[1][11][]11}{}{}{}}
|
| 70 |
+
\@writefile{lof}{\contentsline {figure}{\numberline {5}{\ignorespaces The SO-100 arm is a 6-dof manipulator arm. Preventing some of its joints (shoulder pane, wrist flex and wrist roll) from actuating, it can be represented as a traditional 2-dof planar manipulator (the gripper joint in the end-effector is not considered towards the count of the degrees of freedom used to produce motion).}}{11}{figure.caption.5}\protected@file@percent }
|
| 71 |
+
\newlabel{fig:make-so100-planar-manipulator}{{5}{11}{The SO-100 arm is a 6-dof manipulator arm. Preventing some of its joints (shoulder pane, wrist flex and wrist roll) from actuating, it can be represented as a traditional 2-dof planar manipulator (the gripper joint in the end-effector is not considered towards the count of the degrees of freedom used to produce motion)}{figure.caption.5}{}}
|
| 72 |
+
\newlabel{fig:make-so100-planar-manipulator@cref}{{[figure][5][]5}{[1][11][]11}{}{}{}}
|
| 73 |
+
\newlabel{fig:planar-manipulation-simple}{{6a}{12}{Free to move}{figure.caption.6}{}}
|
| 74 |
+
\newlabel{fig:planar-manipulation-simple@cref}{{[subfigure][1][6]6a}{[1][12][]12}{}{}{}}
|
| 75 |
+
\newlabel{sub@fig:planar-manipulation-simple}{{a}{12}{Free to move}{figure.caption.6}{}}
|
| 76 |
+
\newlabel{sub@fig:planar-manipulation-simple@cref}{{[subfigure][1][6]6a}{[1][12][]12}{}{}{}}
|
| 77 |
+
\newlabel{fig:planar-manipulator-floor}{{6b}{12}{Constrained by the surface}{figure.caption.6}{}}
|
| 78 |
+
\newlabel{fig:planar-manipulator-floor@cref}{{[subfigure][2][6]6b}{[1][12][]12}{}{}{}}
|
| 79 |
+
\newlabel{sub@fig:planar-manipulator-floor}{{b}{12}{Constrained by the surface}{figure.caption.6}{}}
|
| 80 |
+
\newlabel{sub@fig:planar-manipulator-floor@cref}{{[subfigure][2][6]6b}{[1][12][]12}{}{}{}}
|
| 81 |
+
\newlabel{fig:planar-manipulator-floor-shelf}{{6c}{12}{Constrained by surface and (fixed) obstacle}{figure.caption.6}{}}
|
| 82 |
+
\newlabel{fig:planar-manipulator-floor-shelf@cref}{{[subfigure][3][6]6c}{[1][12][]12}{}{}{}}
|
| 83 |
+
\newlabel{sub@fig:planar-manipulator-floor-shelf}{{c}{12}{Constrained by surface and (fixed) obstacle}{figure.caption.6}{}}
|
| 84 |
+
\newlabel{sub@fig:planar-manipulator-floor-shelf@cref}{{[subfigure][3][6]6c}{[1][12][]12}{}{}{}}
|
| 85 |
+
\@writefile{lof}{\contentsline {figure}{\numberline {6}{\ignorespaces Planar, 2-dof schematic representation of the SO-100 manipulator under diverse deployment settings. From left to right: completely free of moving; constrained by the presence of the surface; constrained by the surface and presence of obstacles. Circular arrows around each joint indicate the maximal rotation feasible at that joint.}}{12}{figure.caption.6}\protected@file@percent }
|
| 86 |
+
\newlabel{eq:ik_problem}{{1}{12}{Example: Planar Manipulation}{equation.1}{}}
|
| 87 |
+
\newlabel{eq:ik_problem@cref}{{[equation][1][]1}{[1][12][]12}{}{}{}}
|
| 88 |
+
\citation{sicilianoSpringerHandbookRobotics2016}
|
| 89 |
+
\citation{tedrakeRoboticManipulationPerception}
|
| 90 |
+
\citation{sicilianoSpringerHandbookRobotics2016}
|
| 91 |
+
\citation{lynchModernRoboticsMechanics2017}
|
| 92 |
+
\citation{tedrakeRoboticManipulationPerception}
|
| 93 |
+
\newlabel{eq:reg_ik_velocity}{{2}{13}{Example: Planar Manipulation}{equation.2}{}}
|
| 94 |
+
\newlabel{eq:reg_ik_velocity@cref}{{[equation][2][]2}{[1][13][]13}{}{}{}}
|
| 95 |
+
\@writefile{toc}{\contentsline {subsubsection}{\numberline {2.3.1}Adding Feedback Loops}{13}{subsubsection.2.3.1}\protected@file@percent }
|
| 96 |
+
\@writefile{lof}{\contentsline {figure}{\numberline {7}{\ignorespaces Planar manipulator robot in the presence of a moving obstacle.}}{13}{figure.caption.7}\protected@file@percent }
|
| 97 |
+
\newlabel{fig:planar-manipulator-box-velocity}{{7}{13}{Planar manipulator robot in the presence of a moving obstacle}{figure.caption.7}{}}
|
| 98 |
+
\newlabel{fig:planar-manipulator-box-velocity@cref}{{[figure][7][]7}{[1][13][]13}{}{}{}}
|
| 99 |
+
\@writefile{toc}{\contentsline {subsection}{\numberline {2.4}Limitations of Dynamics-based Robotics}{13}{subsection.2.4}\protected@file@percent }
|
| 100 |
+
\citation{antonovaReinforcementLearningPivoting2017}
|
| 101 |
+
\@writefile{lof}{\contentsline {figure}{\numberline {8}{\ignorespaces Dynamics-based approaches to robotics suffer from several limitations: (1) orchestrating multiple components poses integration challenges; (2) the need to develop custom processing pipelines for the sensing modalities and tasks considered hinders scalability; (3) simplified analytical models of physical phenomena (here friction at the gripper; credits to~\citet {antonovaReinforcementLearningPivoting2017}) limit real-world performance. Lastly, (4) dynamics-based methods overlook trends in the availability and growth of robotics data.}}{14}{figure.caption.8}\protected@file@percent }
|
| 102 |
+
\newlabel{fig:classical-limitations}{{8}{14}{Dynamics-based approaches to robotics suffer from several limitations: (1) orchestrating multiple components poses integration challenges; (2) the need to develop custom processing pipelines for the sensing modalities and tasks considered hinders scalability; (3) simplified analytical models of physical phenomena (here friction at the gripper; credits to~\citet {antonovaReinforcementLearningPivoting2017}) limit real-world performance. Lastly, (4) dynamics-based methods overlook trends in the availability and growth of robotics data}{figure.caption.8}{}}
|
| 103 |
+
\newlabel{fig:classical-limitations@cref}{{[figure][8][]8}{[1][14][]14}{}{}{}}
|
| 104 |
+
\citation{oneillOpenXEmbodimentRobotic2025,khazatskyDROIDLargeScaleInTheWild2025}
|
| 105 |
+
\citation{alayracFlamingoVisualLanguage2022}
|
| 106 |
+
\citation{brownLanguageModelsAre2020}
|
| 107 |
+
\@writefile{lof}{\contentsline {figure}{\numberline {9}{\ignorespaces Learning-based robotics streamlines perception-to-action by learning a (1) unified high-level controller capable to take (2) high-dimensional, unstructured sensorimotor information. Learning (3) does not require a dynamics model and instead focuses on interaction data, and (4) empirically correlates with the scale of the data used. }}{16}{figure.caption.9}\protected@file@percent }
|
| 108 |
+
\newlabel{fig:robot-learning-upsides}{{9}{16}{Learning-based robotics streamlines perception-to-action by learning a (1) unified high-level controller capable to take (2) high-dimensional, unstructured sensorimotor information. Learning (3) does not require a dynamics model and instead focuses on interaction data, and (4) empirically correlates with the scale of the data used}{figure.caption.9}{}}
|
| 109 |
+
\newlabel{fig:robot-learning-upsides@cref}{{[figure][9][]9}{[1][16][]16}{}{}{}}
|
| 110 |
+
\@writefile{toc}{\contentsline {section}{\numberline {3}Robot (Reinforcement) Learning}{16}{section.3}\protected@file@percent }
|
| 111 |
+
\newlabel{sec:learning-rl}{{3}{16}{Robot (Reinforcement) Learning}{section.3}{}}
|
| 112 |
+
\newlabel{sec:learning-rl@cref}{{[section][3][]3}{[1][16][]16}{}{}{}}
|
| 113 |
+
\citation{zhaoLearningFineGrainedBimanual2023,chiDiffusionPolicyVisuomotor2024,leeBehaviorGenerationLatent2024,black$p_0$VisionLanguageActionFlow2024,shukorSmolVLAVisionLanguageActionModel2025,luoPreciseDexterousRobotic2024,hansenTemporalDifferenceLearning2022}
|
| 114 |
+
\citation{black$p_0$VisionLanguageActionFlow2024,shukorSmolVLAVisionLanguageActionModel2025}
|
| 115 |
+
\citation{zhaoLearningFineGrainedBimanual2023}
|
| 116 |
+
\citation{chiDiffusionPolicyVisuomotor2024}
|
| 117 |
+
\citation{leeBehaviorGenerationLatent2024}
|
| 118 |
+
\citation{black$p_0$VisionLanguageActionFlow2024}
|
| 119 |
+
\citation{shukorSmolVLAVisionLanguageActionModel2025}
|
| 120 |
+
\citation{luoPreciseDexterousRobotic2024}
|
| 121 |
+
\citation{hansenTemporalDifferenceLearning2022}
|
| 122 |
+
\citation{koberReinforcementLearningRobotics}
|
| 123 |
+
\citation{suttonReinforcementLearningIntroduction2018}
|
| 124 |
+
\citation{koberReinforcementLearningRobotics}
|
| 125 |
+
\citation{suttonReinforcementLearningIntroduction2018}
|
| 126 |
+
\citation{bellmanMarkovianDecisionProcess1957}
|
| 127 |
+
\@writefile{lof}{\contentsline {figure}{\numberline {11}{\ignorespaces Examples of two different robotics tasks performed using RL. In the manipulation task (A) an agent learns to reach for a yellow plastic block in its environment, and to put it inside of a box. In the locomotion task (B) an agent learns to move its center of mass sideways without falling.}}{17}{figure.caption.11}\protected@file@percent }
|
| 128 |
+
\newlabel{fig:robotics-with-rl-examples}{{11}{17}{Examples of two different robotics tasks performed using RL. In the manipulation task (A) an agent learns to reach for a yellow plastic block in its environment, and to put it inside of a box. In the locomotion task (B) an agent learns to move its center of mass sideways without falling}{figure.caption.11}{}}
|
| 129 |
+
\newlabel{fig:robotics-with-rl-examples@cref}{{[figure][11][]11}{[1][17][]17}{}{}{}}
|
| 130 |
+
\@writefile{lof}{\contentsline {figure}{\numberline {10}{\ignorespaces Overview of the robot learning methods implemented in \texttt {lerobot}. All algorithms are implemented in Pytorch. References:~\citet {zhaoLearningFineGrainedBimanual2023,chiDiffusionPolicyVisuomotor2024,leeBehaviorGenerationLatent2024,black$p_0$VisionLanguageActionFlow2024,shukorSmolVLAVisionLanguageActionModel2025,luoPreciseDexterousRobotic2024,hansenTemporalDifferenceLearning2022} (top-to-bottom, left-to-right).}}{17}{figure.caption.10}\protected@file@percent }
|
| 131 |
+
\newlabel{fig:robot-learning-atlas}{{10}{17}{Overview of the robot learning methods implemented in \lerobot . All algorithms are implemented in Pytorch. References:~\citet {zhaoLearningFineGrainedBimanual2023,chiDiffusionPolicyVisuomotor2024,leeBehaviorGenerationLatent2024,black$p_0$VisionLanguageActionFlow2024,shukorSmolVLAVisionLanguageActionModel2025,luoPreciseDexterousRobotic2024,hansenTemporalDifferenceLearning2022} (top-to-bottom, left-to-right)}{figure.caption.10}{}}
|
| 132 |
+
\newlabel{fig:robot-learning-atlas@cref}{{[figure][10][]10}{[1][17][]17}{}{}{}}
|
| 133 |
+
\@writefile{toc}{\contentsline {subsection}{\numberline {3.1}A (Concise) Introduction to RL}{17}{subsection.3.1}\protected@file@percent }
|
| 134 |
+
\@writefile{lof}{\contentsline {figure}{\numberline {12}{\ignorespaces Agent-Environment interaction diagram (image credits to~\citet {suttonReinforcementLearningIntroduction2018}).}}{18}{figure.caption.12}\protected@file@percent }
|
| 135 |
+
\newlabel{fig:rl-most-famous-pic}{{12}{18}{Agent-Environment interaction diagram (image credits to~\citet {suttonReinforcementLearningIntroduction2018})}{figure.caption.12}{}}
|
| 136 |
+
\newlabel{fig:rl-most-famous-pic@cref}{{[figure][12][]12}{[1][17][]18}{}{}{}}
|
| 137 |
+
\newlabel{eq:trajectory_definition}{{3}{18}{A (Concise) Introduction to RL}{equation.3}{}}
|
| 138 |
+
\newlabel{eq:trajectory_definition@cref}{{[equation][3][]3}{[1][18][]18}{}{}{}}
|
| 139 |
+
\newlabel{eq:dynamics_markovian}{{4}{18}{A (Concise) Introduction to RL}{equation.4}{}}
|
| 140 |
+
\newlabel{eq:dynamics_markovian@cref}{{[equation][4][]4}{[1][18][]18}{}{}{}}
|
| 141 |
+
\newlabel{eq:policy_markovian}{{5}{18}{A (Concise) Introduction to RL}{equation.5}{}}
|
| 142 |
+
\newlabel{eq:policy_markovian@cref}{{[equation][5][]5}{[1][18][]18}{}{}{}}
|
| 143 |
+
\newlabel{eq:traj_prob}{{6}{18}{A (Concise) Introduction to RL}{equation.6}{}}
|
| 144 |
+
\newlabel{eq:traj_prob@cref}{{[equation][6][]6}{[1][18][]18}{}{}{}}
|
| 145 |
+
\citation{SpinningUp2018}
|
| 146 |
+
\citation{schulmanTrustRegionPolicy2017}
|
| 147 |
+
\citation{schulmanProximalPolicyOptimization2017}
|
| 148 |
+
\citation{haarnojaSoftActorCriticOffPolicy2018}
|
| 149 |
+
\citation{akkayaSolvingRubiksCube2019}
|
| 150 |
+
\citation{leeLearningQuadrupedalLocomotion2020}
|
| 151 |
+
\citation{koberReinforcementLearningRobotics,tangDeepReinforcementLearning2025}
|
| 152 |
+
\@writefile{lof}{\contentsline {figure}{\numberline {13}{\ignorespaces Popular RL algorithms. See~\citet {SpinningUp2018} for a complete list of citations.}}{19}{figure.caption.13}\protected@file@percent }
\newlabel{fig:rl-algos-atlas}{{13}{19}{Popular RL algorithms. See~\citet {SpinningUp2018} for a complete list of citations}{figure.caption.13}{}}
\newlabel{fig:rl-algos-atlas@cref}{{[figure][13][]13}{[1][19][]19}{}{}{}}
\newlabel{eq:RL-j-function}{{7}{19}{A (Concise) Introduction to RL}{equation.7}{}}
\newlabel{eq:RL-j-function@cref}{{[equation][7][]7}{[1][19][]19}{}{}{}}
\newlabel{eq:traj-probabilities-for-policies}{{8}{19}{A (Concise) Introduction to RL}{equation.8}{}}
\newlabel{eq:traj-probabilities-for-policies@cref}{{[equation][8][]8}{[1][19][]19}{}{}{}}
\newlabel{eq:q-as-v}{{9}{19}{A (Concise) Introduction to RL}{equation.9}{}}
\newlabel{eq:q-as-v@cref}{{[equation][9][]9}{[1][19][]19}{}{}{}}
\newlabel{eq:v-as-q}{{10}{19}{A (Concise) Introduction to RL}{equation.10}{}}
\newlabel{eq:v-as-q@cref}{{[equation][10][]10}{[1][19][]19}{}{}{}}
\citation{haarnojaSoftActorCriticOffPolicy2018}
\citation{tobinDomainRandomizationTransferring2017}
\citation{tobinDomainRandomizationTransferring2017}
\citation{akkayaSolvingRubiksCube2019,antonovaReinforcementLearningPivoting2017,jiDribbleBotDynamicLegged2023}
\citation{tobinDomainRandomizationTransferring2017,akkayaSolvingRubiksCube2019,jiDribbleBotDynamicLegged2023,tiboniDomainRandomizationEntropy2024}
\@writefile{lof}{\contentsline {figure}{\numberline {14}{\ignorespaces Simulated (left) vs. real-world (right) OpenDuck. Discrepancies in the simulation dynamics (\emph {reality gap}) pose risks to policy transfer.}}{20}{figure.caption.14}\protected@file@percent }
\newlabel{fig:synthetic-vs-real-duck}{{14}{20}{Simulated (left) vs. real-world (right) OpenDuck. Discrepancies in the simulation dynamics (\emph {reality gap}) pose risks to policy transfer}{figure.caption.14}{}}
\newlabel{fig:synthetic-vs-real-duck@cref}{{[figure][14][]14}{[1][20][]20}{}{}{}}
\@writefile{toc}{\contentsline {subsection}{\numberline {3.2}Real-world RL for Robotics}{20}{subsection.3.2}\protected@file@percent }
\citation{margolisRapidLocomotionReinforcement2022}
\citation{akkayaSolvingRubiksCube2019}
\citation{tiboniDomainRandomizationEntropy2024}
\citation{tiboniDomainRandomizationEntropy2024}
\citation{chebotarClosingSimtorealLoop2019}
\citation{tiboniDROPOSimtoRealTransfer2023}
\citation{haarnojaSoftActorCriticOffPolicy2018}
\citation{schulmanProximalPolicyOptimization2017}
\citation{ballEfficientOnlineReinforcement2023}
\citation{luoSERLSoftwareSuite2025}
\citation{luoPreciseDexterousRobotic2024}
\@writefile{lof}{\contentsline {figure}{\numberline {15}{\ignorespaces The same locomotion task can be carried out in different (simulated) domains (exemplified by the difference in terrains) at training time, resulting in increased robustness over diverse environment dynamics.}}{21}{figure.caption.15}\protected@file@percent }
\newlabel{fig:ducks-on-terrains}{{15}{21}{The same locomotion task can be carried out in different (simulated) domains (exemplified by the difference in terrains) at training time, resulting in increased robustness over diverse environment dynamics}{figure.caption.15}{}}
\newlabel{fig:ducks-on-terrains@cref}{{[figure][15][]15}{[1][20][]21}{}{}{}}
\citation{mnihPlayingAtariDeep2013}
\citation{mnihPlayingAtariDeep2013}
\citation{pmlr-v32-silver14}
\citation{pmlr-v32-silver14}
\citation{lillicrapContinuousControlDeep2019a}
\citation{haarnojaSoftActorCriticOffPolicy2018}
\citation{haarnojaReinforcementLearningDeep2017b}
\@writefile{toc}{\contentsline {paragraph}{Sample-efficient RL}{22}{figure.caption.15}\protected@file@percent }
\newlabel{eq:dqn-loss}{{11}{22}{Sample-efficient RL}{equation.11}{}}
\newlabel{eq:dqn-loss@cref}{{[equation][11][]11}{[1][22][]22}{}{}{}}
\newlabel{eq:TD-target}{{12}{22}{Sample-efficient RL}{equation.12}{}}
\newlabel{eq:TD-target@cref}{{[equation][12][]12}{[1][22][]22}{}{}{}}
\newlabel{eq:deterministic-pg}{{13}{22}{Sample-efficient RL}{equation.13}{}}
\newlabel{eq:deterministic-pg@cref}{{[equation][13][]13}{[1][22][]22}{}{}{}}
\newlabel{eq:TD-target-ddpg}{{14}{22}{Sample-efficient RL}{equation.14}{}}
\newlabel{eq:TD-target-ddpg@cref}{{[equation][14][]14}{[1][22][]22}{}{}{}}
\citation{haarnojaReinforcementLearningDeep2017b}
\citation{ballEfficientOnlineReinforcement2023}
\citation{luoSERLSoftwareSuite2025}
\citation{luoSERLSoftwareSuite2025}
\citation{luoSERLSoftwareSuite2025}
\citation{luoSERLSoftwareSuite2025}
\citation{luoSERLSoftwareSuite2025}
\citation{luoPreciseDexterousRobotic2024}
\newlabel{eq:J-soft}{{15}{23}{Sample-efficient RL}{equation.15}{}}
\newlabel{eq:J-soft@cref}{{[equation][15][]15}{[1][23][]23}{}{}{}}
\newlabel{eq:soft-td-target}{{16}{23}{Sample-efficient RL}{equation.16}{}}
\newlabel{eq:soft-td-target@cref}{{[equation][16][]16}{[1][23][]23}{}{}{}}
\newlabel{eq:sac-policy-update}{{17}{23}{Sample-efficient RL}{equation.17}{}}
\newlabel{eq:sac-policy-update@cref}{{[equation][17][]17}{[1][23][]23}{}{}{}}
\@writefile{toc}{\contentsline {paragraph}{Sample-efficient, data-driven RL}{23}{equation.17}\protected@file@percent }
\@writefile{toc}{\contentsline {paragraph}{Sample-efficient, data-driven, real-world RL}{23}{equation.17}\protected@file@percent }
\citation{luoPreciseDexterousRobotic2024}
\citation{luoPreciseDexterousRobotic2024}
\citation{luoPreciseDexterousRobotic2024}
\citation{ballEfficientOnlineReinforcement2023}
\@writefile{lof}{\contentsline {figure}{\numberline {16}{\ignorespaces (A) HIL-SERL allows for real-world training of high-performance RL agents by building on top of advancements presented by SAC, RLPD and SERL. (B) Example of human intervention during a HIL-SERL training process on a real-world SO-100.}}{24}{figure.caption.16}\protected@file@percent }
\newlabel{fig:hil-serl-blocks}{{16}{24}{(A) HIL-SERL allows for real-world training of high-performance RL agents by building on top of advancements presented by SAC, RLPD and SERL. (B) Example of human intervention during a HIL-SERL training process on a real-world SO-100}{figure.caption.16}{}}
\newlabel{fig:hil-serl-blocks@cref}{{[figure][16][]16}{[1][23][]24}{}{}{}}
\@writefile{toc}{\contentsline {subsubsection}{\numberline {3.2.1}Code Example: Real-world RL}{24}{subsubsection.3.2.1}\protected@file@percent }
\@writefile{lof}{\contentsline {figure}{\numberline {17}{\ignorespaces HIL-SERL is a SOTA RL algorithm for training control policies directly in the real world. Its implementation in \texttt {lerobot}~relies on a decoupled actor-learner architecture, communicating across processes (and possibly networks) with queues used to share (1) transitions \( (s_t, a_t, r_t, s_{t+1})\) and (2) parameters \( \theta \).}}{25}{figure.caption.17}\protected@file@percent }
\newlabel{fig:ch3-hil-serl-architecture}{{17}{25}{HIL-SERL is a SOTA RL algorithm for training control policies directly in the real world. Its implementation in \lerobot ~relies on a decoupled actor-learner architecture, communicating across processes (and possibly networks) with queues used to share (1) transitions \( \sars \) and (2) parameters \( \theta \)}{figure.caption.17}{}}
\newlabel{fig:ch3-hil-serl-architecture@cref}{{[figure][17][]17}{[1][24][]25}{}{}{}}
\newlabel{ex:train_reward_classifier}{{3}{25}{Code Example: Real-world RL}{tcb@cnt@pbox.3}{}}
\newlabel{ex:train_reward_classifier@cref}{{[tcb@cnt@pbox][3][]3}{[1][25][]25}{}{}{}}
\@writefile{lol}{\contentsline {lstlisting}{snippets/ch3/01\textunderscore reward\textunderscore classifier.py}{25}{lstlisting.-3}\protected@file@percent }
\newlabel{ex:hil_serl_defining_actor}{{4}{26}{Code Example: Real-world RL}{tcb@cnt@pbox.4}{}}
\newlabel{ex:hil_serl_defining_actor@cref}{{[tcb@cnt@pbox][4][]4}{[1][26][]26}{}{}{}}
\@writefile{lol}{\contentsline {lstlisting}{snippets/ch3/02\textunderscore actor.py}{26}{lstlisting.-4}\protected@file@percent }
\newlabel{ex:hil_serl_defining_learner}{{5}{28}{Code Example: Real-world RL}{tcb@cnt@pbox.5}{}}
\newlabel{ex:hil_serl_defining_learner@cref}{{[tcb@cnt@pbox][5][]5}{[1][28][]28}{}{}{}}
\@writefile{lol}{\contentsline {lstlisting}{snippets/ch3/03\textunderscore learner.py}{28}{lstlisting.-5}\protected@file@percent }
\newlabel{ex:hil_serl_full}{{6}{30}{Code Example: Real-world RL}{tcb@cnt@pbox.6}{}}
\newlabel{ex:hil_serl_full@cref}{{[tcb@cnt@pbox][6][]6}{[1][30][]30}{}{}{}}
\@writefile{lol}{\contentsline {lstlisting}{snippets/ch3/04\textunderscore hil\textunderscore serl.py}{30}{lstlisting.-6}\protected@file@percent }
\citation{degraveMagneticControlTokamak2022}
\citation{bellemareAutonomousNavigationStratospheric2020}
\@writefile{toc}{\contentsline {subsubsection}{\numberline {3.2.2}Limitations of RL in Real-World Robotics: Simulators and Reward Design}{32}{subsubsection.3.2.2}\protected@file@percent }
\citation{pomerleauALVINNAutonomousLand1988}
\@writefile{lof}{\contentsline {figure}{\numberline {18}{\ignorespaces (A) Average (with standard deviation) evolution of the actuation levels over the first 5 recorded episodes in \url {lerobot/svla_so101_pickplace}. Proprioceptive states prove invaluable in determining the robot's state during an episode. (B) Camera frames are also recorded alongside measurements on the robot's state, capturing information about the robot's interaction with its environment.}}{33}{figure.caption.18}\protected@file@percent }
\newlabel{fig:ch4-bc-trajectories}{{18}{33}{(A) Average (with standard deviation) evolution of the actuation levels over the first 5 recorded episodes in \url {lerobot/svla_so101_pickplace}. Proprioceptive states prove invaluable in determining the robot's state during an episode. (B) Camera frames are also recorded alongside measurements on the robot's state, capturing information about the robot's interaction with its environment}{figure.caption.18}{}}
\newlabel{fig:ch4-bc-trajectories@cref}{{[figure][18][]18}{[1][33][]33}{}{}{}}
\@writefile{toc}{\contentsline {section}{\numberline {4}Robot (Imitation) Learning}{33}{section.4}\protected@file@percent }
\newlabel{sec:learning-imitation}{{4}{33}{Robot (Imitation) Learning}{section.4}{}}
\newlabel{sec:learning-imitation@cref}{{[section][4][]4}{[1][33][]33}{}{}{}}
\citation{shalev-shwartzUnderstandingMachineLearning2014}
\citation{rossReductionImitationLearning2011}
\citation{heessEmergenceLocomotionBehaviours2017}
\citation{rossReductionImitationLearning2011}
\citation{florenceImplicitBehavioralCloning2022,keGraspingChopsticksCombating2020}
\citation{florenceImplicitBehavioralCloning2022}
\citation{florenceImplicitBehavioralCloning2022}
\citation{florenceImplicitBehavioralCloning2022}
\@writefile{lof}{\contentsline {figure}{\numberline {19}{\ignorespaces Sample observations and action pairs over the course of a given trajectory recorded in \url {lerobot/svla_so101_pickplace}. Observations, comprising both proprioceptive and visual information, are recorded alongside the configuration of a second, leader robot controlled by a human expert, providing complete information for regressing actions given observations.}}{34}{figure.caption.19}\protected@file@percent }
\newlabel{fig:ch4-observation-action-mapping}{{19}{34}{Sample observations and action pairs over the course of a given trajectory recorded in \url {lerobot/svla_so101_pickplace}. Observations, comprising both proprioceptive and visual information, are recorded alongside the configuration of a second, leader robot controlled by a human expert, providing complete information for regressing actions given observations}{figure.caption.19}{}}
\newlabel{fig:ch4-observation-action-mapping@cref}{{[figure][19][]19}{[1][33][]34}{}{}{}}
\newlabel{eq:loss-minimization-SL}{{18}{34}{Robot (Imitation) Learning}{equation.18}{}}
\newlabel{eq:loss-minimization-SL@cref}{{[equation][18][]18}{[1][33][]34}{}{}{}}
\citation{prince2023understanding}
\@writefile{lof}{\contentsline {figure}{\numberline {20}{\ignorespaces Point-wise policies suffer from limitations due to (A) covariate shifts and (B) poor approximation of multimodal demonstrations. (A) Small errors may drive the policy out of distribution, entering a vicious circle ultimately resulting in failure. (B) Both modes of reaching for a target object in the scene---either left or right-first---are equally good and thus equally likely to be present in a dataset of human demonstrations, ultimately resulting in multimodal demonstrations.}}{35}{figure.caption.20}\protected@file@percent }
\newlabel{fig:ch4-issues-with-bc}{{20}{35}{Point-wise policies suffer from limitations due to (A) covariate shifts and (B) poor approximation of multimodal demonstrations. (A) Small errors may drive the policy out of distribution, entering a vicious circle ultimately resulting in failure. (B) Both modes of reaching for a target object in the scene---either left or right-first---are equally good and thus equally likely to be present in a dataset of human demonstrations, ultimately resulting in multimodal demonstrations}{figure.caption.20}{}}
\newlabel{fig:ch4-issues-with-bc@cref}{{[figure][20][]20}{[1][34][]35}{}{}{}}
\@writefile{lof}{\contentsline {figure}{\numberline {21}{\ignorespaces Intuitively, the latent variable in a single-latent model may contain information regarding the task being performed, which directly results in the likelihood of the same observation-action pair being different for two different tasks. When (A) picking a block, the likelihood of a wide gripper opening should be higher than that of a narrower one, while it should be the opposite when (B) pushing the block.}}{35}{figure.caption.21}\protected@file@percent }
\newlabel{fig:ch4-task-effect-on-pairs}{{21}{35}{Intuitively, the latent variable in a single-latent model may contain information regarding the task being performed, which directly results in the likelihood of the same observation-action pair being different for two different tasks. When (A) picking a block, the likelihood of a wide gripper opening should be higher than that of a narrower one, while it should be the opposite when (B) pushing the block}{figure.caption.21}{}}
\newlabel{fig:ch4-task-effect-on-pairs@cref}{{[figure][21][]21}{[1][35][]35}{}{}{}}
\@writefile{toc}{\contentsline {subsection}{\numberline {4.1}A (Concise) Introduction to Generative Models}{35}{subsection.4.1}\protected@file@percent }
\@writefile{toc}{\contentsline {subsubsection}{\numberline {4.1.1}Variational Auto-Encoders}{35}{subsubsection.4.1.1}\protected@file@percent }
\newlabel{eq:BC-latent-variable}{{19}{35}{Variational Auto-Encoders}{equation.19}{}}
\newlabel{eq:BC-latent-variable@cref}{{[equation][19][]19}{[1][35][]35}{}{}{}}
\citation{kingma2013auto}
\@writefile{lof}{\contentsline {figure}{\numberline {22}{\ignorespaces (A) The latent variable model in a robotics application regulates influence between observed \( (o,a) \) variables and an unobservable latent variable. (B) VAEs approximate exact latent variable models by means of variational inference.}}{36}{figure.caption.22}\protected@file@percent }
\newlabel{fig:ch4-latent-variable-model}{{22}{36}{(A) The latent variable model in a robotics application regulates influence between observed \( (o,a) \) variables and an unobservable latent variable. (B) VAEs approximate exact latent variable models by means of variational inference}{figure.caption.22}{}}
\newlabel{fig:ch4-latent-variable-model@cref}{{[figure][22][]22}{[1][36][]36}{}{}{}}
\newlabel{eq:evidence-definition-1}{{20}{36}{Variational Auto-Encoders}{equation.20}{}}
\newlabel{eq:evidence-definition-1@cref}{{[equation][20][]20}{[1][36][]36}{}{}{}}
\newlabel{eq:evidence-definition-2}{{21}{36}{Variational Auto-Encoders}{equation.21}{}}
\newlabel{eq:evidence-definition-2@cref}{{[equation][21][]21}{[1][36][]36}{}{}{}}
\newlabel{eq:evidence-definition-3}{{22}{36}{Variational Auto-Encoders}{equation.22}{}}
\newlabel{eq:evidence-definition-3@cref}{{[equation][22][]22}{[1][36][]36}{}{}{}}
\newlabel{eq:evidence-definition}{{23}{36}{Variational Auto-Encoders}{equation.23}{}}
\newlabel{eq:evidence-definition@cref}{{[equation][23][]23}{[1][36][]36}{}{}{}}
\citation{kingma2013auto}
\citation{hoDenoisingDiffusionProbabilistic2020}
\newlabel{eq:ELBO-intractable}{{25}{37}{Variational Auto-Encoders}{equation.25}{}}
\newlabel{eq:ELBO-intractable@cref}{{[equation][25][]25}{[1][36][]37}{}{}{}}
\newlabel{eq:ELBO}{{26}{37}{Variational Auto-Encoders}{equation.26}{}}
\newlabel{eq:ELBO@cref}{{[equation][26][]26}{[1][37][]37}{}{}{}}
\newlabel{eq:VAE-min-neg-ELBO}{{27}{37}{Variational Auto-Encoders}{equation.27}{}}
\newlabel{eq:VAE-min-neg-ELBO@cref}{{[equation][27][]27}{[1][37][]37}{}{}{}}
\newlabel{eq:VAE-Lrec}{{28}{37}{Variational Auto-Encoders}{equation.28}{}}
\newlabel{eq:VAE-Lrec@cref}{{[equation][28][]28}{[1][37][]37}{}{}{}}
\newlabel{eq:VAE-Lreg}{{29}{37}{Variational Auto-Encoders}{equation.29}{}}
\newlabel{eq:VAE-Lreg@cref}{{[equation][29][]29}{[1][37][]37}{}{}{}}
\@writefile{toc}{\contentsline {subsubsection}{\numberline {4.1.2}Diffusion Models}{37}{subsubsection.4.1.2}\protected@file@percent }
\@writefile{lof}{\contentsline {figure}{\numberline {23}{\ignorespaces HMLV models posit that the data generation process is influenced by a stack of Markov-dependent latent variables, with samples from the posterior distribution being progressively higher up in the hierarchy.}}{38}{figure.caption.23}\protected@file@percent }
\newlabel{fig:ch4-many-latents}{{23}{38}{HMLV models posit that the data generation process is influenced by a stack of Markov-dependent latent variables, with samples from the posterior distribution being progressively higher up in the hierarchy}{figure.caption.23}{}}
\newlabel{fig:ch4-many-latents@cref}{{[figure][23][]23}{[1][38][]38}{}{}{}}
\newlabel{eq:BC-multi-latent-model-1}{{30}{38}{Diffusion Models}{equation.30}{}}
\newlabel{eq:BC-multi-latent-model-1@cref}{{[equation][30][]30}{[1][37][]38}{}{}{}}
\newlabel{eq:BC-multi-latent-model-2}{{31}{38}{Diffusion Models}{equation.31}{}}
\newlabel{eq:BC-multi-latent-model-2@cref}{{[equation][31][]31}{[1][37][]38}{}{}{}}
\citation{hoDenoisingDiffusionProbabilistic2020}
\citation{hoDenoisingDiffusionProbabilistic2020}
\citation{sohnLearningStructuredOutput2015}
\citation{permenterInterpretingImprovingDiffusion2024}
\citation{luoUnderstandingDiffusionModels2022}
\newlabel{eq:diffusion-1}{{33}{39}{Diffusion Models}{equation.33}{}}
\newlabel{eq:diffusion-1@cref}{{[equation][33][]33}{[1][38][]39}{}{}{}}
\newlabel{eq:diffusion-jensen}{{35}{39}{Diffusion Models}{equation.35}{}}
\newlabel{eq:diffusion-jensen@cref}{{[equation][35][]35}{[1][38][]39}{}{}{}}
\newlabel{eq:diffusion-2}{{36}{39}{Diffusion Models}{equation.36}{}}
\newlabel{eq:diffusion-2@cref}{{[equation][36][]36}{[1][38][]39}{}{}{}}
\newlabel{eq:diffusion-3}{{37}{39}{Diffusion Models}{equation.37}{}}
\newlabel{eq:diffusion-3@cref}{{[equation][37][]37}{[1][38][]39}{}{}{}}
\newlabel{eq:diffusion-4}{{38}{39}{Diffusion Models}{equation.38}{}}
\newlabel{eq:diffusion-4@cref}{{[equation][38][]38}{[1][38][]39}{}{}{}}
\newlabel{eq:diffusion-5}{{39}{39}{Diffusion Models}{equation.39}{}}
\newlabel{eq:diffusion-5@cref}{{[equation][39][]39}{[1][38][]39}{}{}{}}
\newlabel{eq:diffusion-6}{{40}{39}{Diffusion Models}{equation.40}{}}
\newlabel{eq:diffusion-6@cref}{{[equation][40][]40}{[1][38][]39}{}{}{}}
\newlabel{eq:diffusion-expectation-indices}{{41}{39}{Diffusion Models}{equation.41}{}}
\newlabel{eq:diffusion-expectation-indices@cref}{{[equation][41][]41}{[1][38][]39}{}{}{}}
\newlabel{eq:diffusion-likelihood}{{42}{39}{Diffusion Models}{equation.42}{}}
\newlabel{eq:diffusion-likelihood@cref}{{[equation][42][]42}{[1][38][]39}{}{}{}}
\newlabel{eq:diffusion-likelihood-gradient}{{43}{39}{Diffusion Models}{equation.43}{}}
\newlabel{eq:diffusion-likelihood-gradient@cref}{{[equation][43][]43}{[1][39][]39}{}{}{}}
\@writefile{lof}{\contentsline {figure}{\numberline {24}{\ignorespaces DMs iteratively corrupt samples (left) from an unknown distribution into a quasi-standard Gaussian (center), learning the displacement field (right) that permits reconstructing samples from the unknown target distribution by iteratively denoising samples of a tractable, easy-to-sample distribution.}}{40}{figure.caption.24}\protected@file@percent }
\newlabel{fig:diffusion-robot-actions}{{24}{40}{DMs iteratively corrupt samples (left) from an unknown distribution into a quasi-standard Gaussian (center), learning the displacement field (right) that permits reconstructing samples from the unknown target distribution by iteratively denoising samples of a tractable, easy-to-sample distribution}{figure.caption.24}{}}
\newlabel{fig:diffusion-robot-actions@cref}{{[figure][24][]24}{[1][39][]40}{}{}{}}
\citation{hoDenoisingDiffusionProbabilistic2020}
\citation{permenterInterpretingImprovingDiffusion2024}
\citation{lipmanFlowMatchingGenerative2023}
\@writefile{lof}{\contentsline {figure}{\numberline {25}{\ignorespaces A joint action-observation distribution, in the simplified case where the observation is the elbow-flex actuation in a SO-100, and the action is the recorded position for the same joint from the teleoperator arm. As the recorded motion is teleoperated, the points distribute along the diagonal.}}{41}{figure.caption.25}\protected@file@percent }
\newlabel{fig:ch4-action-vs-observation-distribution}{{25}{41}{A joint action-observation distribution, in the simplified case where the observation is the elbow-flex actuation in a SO-100, and the action is the recorded position for the same joint from the teleoperator arm. As the recorded motion is teleoperated, the points distribute along the diagonal}{figure.caption.25}{}}
\newlabel{fig:ch4-action-vs-observation-distribution@cref}{{[figure][25][]25}{[1][41][]41}{}{}{}}
\newlabel{eq:diffusion-simplified-loss}{{44}{41}{Diffusion Models}{equation.44}{}}
\newlabel{eq:diffusion-simplified-loss@cref}{{[equation][44][]44}{[1][41][]41}{}{}{}}
\newlabel{eq:diffusion-denoising-definition}{{45}{41}{Diffusion Models}{equation.45}{}}
\newlabel{eq:diffusion-denoising-definition@cref}{{[equation][45][]45}{[1][41][]41}{}{}{}}
\@writefile{toc}{\contentsline {subsubsection}{\numberline {4.1.3}Flow Matching}{41}{subsubsection.4.1.3}\protected@file@percent }
\newlabel{sec:ch4-flow-matching}{{4.1.3}{41}{Flow Matching}{subsubsection.4.1.3}{}}
\newlabel{sec:ch4-flow-matching@cref}{{[subsubsection][3][4,1]4.1.3}{[1][41][]41}{}{}{}}
\citation{esserScalingRectifiedFlow2024}
\citation{polyakMovieGenCast2025}
\citation{black$p_0$VisionLanguageActionFlow2024}
\citation{lipmanFlowMatchingGenerative2023}
\citation{lipmanFlowMatchingGenerative2023}
\citation{lipmanFlowMatchingGenerative2023}
\@writefile{lof}{\contentsline {figure}{\numberline {26}{\ignorespaces Probability distributions can be modified differently by applying different vector fields, inducing different flows of mass across the same support (top versus bottom, using two different time-invariant 2D-fields \( u_1(x,y) = (x,0) \) and \( u_2(x,y) = (x/\sqrt {2}, y/\sqrt {2}) \)). Notice time flows \emph {continuously} in \( [0,1] \). FM models learn to approximate a target vector field, thereby producing arbitrary (goal) transformations of an easy-to-sample initial distribution.}}{42}{figure.caption.26}\protected@file@percent }
\newlabel{fig:ch4-normalizing-flows}{{26}{42}{Probability distributions can be modified differently by applying different vector fields, inducing different flows of mass across the same support (top versus bottom, using two different time-invariant 2D-fields \( u_1(x,y) = (x,0) \) and \( u_2(x,y) = (x/\sqrt {2}, y/\sqrt {2}) \)). Notice time flows \emph {continuously} in \( [0,1] \). FM models learn to approximate a target vector field, thereby producing arbitrary (goal) transformations of an easy-to-sample initial distribution}{figure.caption.26}{}}
\newlabel{fig:ch4-normalizing-flows@cref}{{[figure][26][]26}{[1][42][]42}{}{}{}}
\newlabel{eq:fm-diffusion-vector-field}{{48}{42}{Flow Matching}{equation.48}{}}
\newlabel{eq:fm-diffusion-vector-field@cref}{{[equation][48][]48}{[1][42][]42}{}{}{}}
\citation{zhaoLearningFineGrainedBimanual2023}
\citation{zhaoLearningFineGrainedBimanual2023}
\citation{zhaoLearningFineGrainedBimanual2023}
\citation{zhaoLearningFineGrainedBimanual2023}
\citation{sohnLearningStructuredOutput2015}
\citation{florenceImplicitBehavioralCloning2022,jannerPlanningDiffusionFlexible2022}
\citation{zhaoLearningFineGrainedBimanual2023}
\citation{zhaoLearningFineGrainedBimanual2023}
\citation{sohnLearningStructuredOutput2015}
\@writefile{lof}{\contentsline {figure}{\numberline {27}{\ignorespaces Compared to diffusion, flow matching distorts the distribution along a less random pattern, resulting in a clearer interpolation between source and target distribution. The visualization shows an example comparison between these two methods on the joint distribution of robot observations and actions over \( T=50 \) steps.}}{43}{figure.caption.27}\protected@file@percent }
\newlabel{fig:ch4-diffusion-paths-versus-fm}{{27}{43}{Compared to diffusion, flow matching distorts the distribution along a less random pattern, resulting in a clearer interpolation between source and target distribution. The visualization shows an example comparison between these two methods on the joint distribution of robot observations and actions over \( T=50 \) steps}{figure.caption.27}{}}
\newlabel{fig:ch4-diffusion-paths-versus-fm@cref}{{[figure][27][]27}{[1][43][]43}{}{}{}}
\newlabel{eq:flow-matching-objective}{{49}{43}{Flow Matching}{equation.49}{}}
\newlabel{eq:flow-matching-objective@cref}{{[equation][49][]49}{[1][43][]43}{}{}{}}
\@writefile{toc}{\contentsline {subsection}{\numberline {4.2}Action Chunking with Transformers}{43}{subsection.4.2}\protected@file@percent }
\citation{higgins2017beta}
\citation{zhaoLearningFineGrainedBimanual2023}
\citation{zhaoLearningFineGrainedBimanual2023}
\citation{florenceImplicitBehavioralCloning2022}
\citation{zhaoLearningFineGrainedBimanual2023}
\citation{zhaoLearningFineGrainedBimanual2023}
\citation{zhaoLearningFineGrainedBimanual2023}
\newlabel{eq:c-ELBO}{{50}{44}{Action Chunking with Transformers}{equation.50}{}}
\newlabel{eq:c-ELBO@cref}{{[equation][50][]50}{[1][44][]44}{}{}{}}
\@writefile{lof}{\contentsline {figure}{\numberline {28}{\ignorespaces The CVAE encoder used in ACT. Input action chunks are first embedded and aggregated with positional embeddings, before being processed alongside embedded proprioceptive information and a learned \texttt {[CLS]} token used to aggregate input-level information and predict the style variable \( z \). The encoder is exclusively used to \emph {train} the decoder, and it is entirely disregarded at inference time.}}{45}{figure.caption.28}\protected@file@percent }
\newlabel{fig:ch4-act-encoder}{{28}{45}{The CVAE encoder used in ACT. Input action chunks are first embedded and aggregated with positional embeddings, before being processed alongside embedded proprioceptive information and a learned \texttt {[CLS]} token used to aggregate input-level information and predict the style variable \( z \). The encoder is exclusively used to \emph {train} the decoder, and it is entirely disregarded at inference time}{figure.caption.28}{}}
\newlabel{fig:ch4-act-encoder@cref}{{[figure][28][]28}{[1][44][]45}{}{}{}}
\@writefile{lof}{\contentsline {figure}{\numberline {29}{\ignorespaces The CVAE decoder used in ACT, comprising a full encoder-decoder Transformer architecture. Camera observations from all \( n \) camera views are first embedded using pre-trained visual encoders, and then aggregated with the corresponding positional embeddings. Then, the proprioceptive information and the style variable \( z \), retrieved from the CVAE encoder, are fed to the encoder-decoder Transformer for inference. The encoder shares the matrices \( K,V \) with the decoder, which is trained to decode fixed position embeddings into action chunks.}}{45}{figure.caption.29}\protected@file@percent }
\newlabel{fig:ch4-act-decoder}{{29}{45}{The CVAE decoder used in ACT, comprising a full encoder-decoder Transformer architecture. Camera observations from all \( n \) camera views are first embedded using pre-trained visual encoders, and then aggregated with the corresponding positional embeddings. Then, the proprioceptive information and the style variable \( z \), retrieved from the CVAE encoder, are fed to the encoder-decoder Transformer for inference. The encoder shares the matrices \( K,V \) with the decoder, which is trained to decode fixed position embeddings into action chunks}{figure.caption.29}{}}
\newlabel{fig:ch4-act-decoder@cref}{{[figure][29][]29}{[1][44][]45}{}{}{}}
\@writefile{lof}{\contentsline {figure}{\numberline {30}{\ignorespaces Action Chunking with Transformers (ACT), as in~\citet {zhaoLearningFineGrainedBimanual2023}. ACT introduces an action chunking paradigm to cope with high-dimensional multi-modal demonstration data, and a transformer-based CVAE architecture.}}{46}{figure.caption.30}\protected@file@percent }
\newlabel{fig:ch4-act}{{30}{46}{Action Chunking with Transformers (ACT), as in~\citet {zhaoLearningFineGrainedBimanual2023}. ACT introduces an action chunking paradigm to cope with high-dimensional multi-modal demonstration data, and a transformer-based CVAE architecture}{figure.caption.30}{}}
\newlabel{fig:ch4-act@cref}{{[figure][30][]30}{[1][44][]46}{}{}{}}
\@writefile{toc}{\contentsline {subsubsection}{\numberline {4.2.1}Code Example: Training and Using ACT in Practice}{46}{subsubsection.4.2.1}\protected@file@percent }
|
| 393 |
+
\newlabel{ex:act_training}{{7}{46}{Code Example: Training and Using ACT in Practice}{tcb@cnt@pbox.7}{}}
|
| 394 |
+
\newlabel{ex:act_training@cref}{{[tcb@cnt@pbox][7][]7}{[1][44][]46}{}{}{}}
|
| 395 |
+
\@writefile{lol}{\contentsline {lstlisting}{snippets/ch4/01\textunderscore training\textunderscore act.py}{46}{lstlisting.-7}\protected@file@percent }
|
| 396 |
+
\citation{hoDenoisingDiffusionProbabilistic2020}
\citation{polyakMovieGenCast2025}
\citation{chiDiffusionPolicyVisuomotor2024}
\citation{zhaoLearningFineGrainedBimanual2023}
\citation{chiDiffusionPolicyVisuomotor2024}
\newlabel{ex:act_using}{{8}{48}{Code Example: Training and Using ACT in Practice}{tcb@cnt@pbox.8}{}}
\newlabel{ex:act_using@cref}{{[tcb@cnt@pbox][8][]8}{[1][47][]48}{}{}{}}
\@writefile{lol}{\contentsline {lstlisting}{snippets/ch4/02\textunderscore using\textunderscore act.py}{48}{lstlisting.-8}\protected@file@percent }
\@writefile{toc}{\contentsline {subsection}{\numberline {4.3}Diffusion Policy}{48}{subsection.4.3}\protected@file@percent }
\citation{chiDiffusionPolicyVisuomotor2024}
\citation{chiDiffusionPolicyVisuomotor2024}
\citation{chiDiffusionPolicyVisuomotor2024}
\citation{ronnebergerUNetConvolutionalNetworks2015}
\citation{chiDiffusionPolicyVisuomotor2024}
\citation{chiDiffusionPolicyVisuomotor2024}
\citation{chiDiffusionPolicyVisuomotor2024}
\citation{songDenoisingDiffusionImplicit2022}
\citation{hoDenoisingDiffusionProbabilistic2020}
\citation{chiDiffusionPolicyVisuomotor2024}
\citation{chiDiffusionPolicyVisuomotor2024}
\citation{chiDiffusionPolicyVisuomotor2024}
\citation{tancikFourierFeaturesLet2020}
\@writefile{lof}{\contentsline {figure}{\numberline {31}{\ignorespaces The Diffusion Policy architecture, as in~\citet {chiDiffusionPolicyVisuomotor2024}. A stack of \( H_o \) previous observations is used as external conditioning to denoise a group of \( H_a \) actions. Conditioning is performed at every layer of a U-Net block. Diffusion Policy can produce fully-formed action chunks with as few as \(T=10\) denoising steps.}}{49}{figure.caption.31}\protected@file@percent }
\newlabel{fig:diffusion-policy-architecture}{{31}{49}{The Diffusion Policy architecture, as in~\citet {chiDiffusionPolicyVisuomotor2024}. A stack of \( H_o \) previous observations is used as external conditioning to denoise a group of \( H_a \) actions. Conditioning is performed at every layer of a U-Net block. Diffusion Policy can produce fully-formed action chunks with as few as \(T=10\) denoising steps}{figure.caption.31}{}}
\newlabel{fig:diffusion-policy-architecture@cref}{{[figure][31][]31}{[1][49][]49}{}{}{}}
\newlabel{eq:diffusion-policy-objective}{{51}{49}{Diffusion Policy}{equation.51}{}}
\newlabel{eq:diffusion-policy-objective@cref}{{[equation][51][]51}{[1][49][]49}{}{}{}}
\@writefile{toc}{\contentsline {subsubsection}{\numberline {4.3.1}Code Example: Training and Using Diffusion Policies in Practice}{50}{subsubsection.4.3.1}\protected@file@percent }
\newlabel{ex:diffusion_training}{{9}{50}{Code Example: Training and Using Diffusion Policies in Practice}{tcb@cnt@pbox.9}{}}
\newlabel{ex:diffusion_training@cref}{{[tcb@cnt@pbox][9][]9}{[1][50][]50}{}{}{}}
\@writefile{lol}{\contentsline {lstlisting}{snippets/ch4/03\textunderscore training\textunderscore diffusion.py}{50}{lstlisting.-9}\protected@file@percent }
\newlabel{ex:diffusion_using}{{10}{51}{Code Example: Training and Using Diffusion Policies in Practice}{tcb@cnt@pbox.10}{}}
\newlabel{ex:diffusion_using@cref}{{[tcb@cnt@pbox][10][]10}{[1][51][]51}{}{}{}}
\@writefile{lol}{\contentsline {lstlisting}{snippets/ch4/04\textunderscore using\textunderscore diffusion.py}{51}{lstlisting.-10}\protected@file@percent }
\citation{zhaoLearningFineGrainedBimanual2023,chiDiffusionPolicyVisuomotor2024}
\citation{zhaoLearningFineGrainedBimanual2023}
\@writefile{toc}{\contentsline {subsection}{\numberline {4.4}Optimized Inference}{52}{subsection.4.4}\protected@file@percent }
\newlabel{sec:ch4-async-inference}{{4.4}{52}{Optimized Inference}{subsection.4.4}{}}
\newlabel{sec:ch4-async-inference@cref}{{[subsection][4][4]4.4}{[1][52][]52}{}{}{}}
\@writefile{lof}{\contentsline {figure}{\numberline {32}{\ignorespaces \textbf {Asynchronous inference}. Illustration of the asynchronous inference stack. Note that the policy can be run on a remote server, possibly with GPUs.}}{53}{figure.caption.32}\protected@file@percent }
\newlabel{fig:ch4-async-inference}{{32}{53}{\textbf {Asynchronous inference}. Illustration of the asynchronous inference stack. Note that the policy can be run on a remote server, possibly with GPUs}{figure.caption.32}{}}
\newlabel{fig:ch4-async-inference@cref}{{[figure][32][]32}{[1][53][]53}{}{}{}}
\@writefile{loa}{\contentsline {algorithm}{\numberline {1}{\ignorespaces Asynchronous inference control-loop}}{53}{algorithm.1}\protected@file@percent }
\newlabel{alg:async-inference}{{1}{53}{Asynchronous inference control-loop}{algorithm.1}{}}
\newlabel{alg:async-inference@cref}{{[algorithm][1][]1}{[1][53][]53}{}{}{}}
\citation{zhaoLearningFineGrainedBimanual2023}
\@writefile{lof}{\contentsline {figure}{\numberline {33}{\ignorespaces Action queue size evolution at runtime for various levels of \( g\) when (A) not filtering out observations based on joint-space similarity and (B) filtering out near-duplicate observations, measuring their similarity in joint-space.}}{54}{figure.caption.33}\protected@file@percent }
\newlabel{fig:ch4-queues}{{33}{54}{Action queue size evolution at runtime for various levels of \( g\) when (A) not filtering out observations based on joint-space similarity and (B) filtering out near-duplicate observations, measuring their similarity in joint-space}{figure.caption.33}{}}
\newlabel{fig:ch4-queues@cref}{{[figure][33][]33}{[1][54][]54}{}{}{}}
\@writefile{toc}{\contentsline {subsubsection}{\numberline {4.4.1}Code Example: Using Async Inference}{55}{subsubsection.4.4.1}\protected@file@percent }
\newlabel{ex:spinning-up-server}{{11}{55}{Code Example: Using Async Inference}{tcb@cnt@pbox.11}{}}
\newlabel{ex:spinning-up-server@cref}{{[tcb@cnt@pbox][11][]11}{[1][55][]55}{}{}{}}
\@writefile{lol}{\contentsline {lstlisting}{snippets/ch4/05\textunderscore policy\textunderscore server.py}{55}{lstlisting.-11}\protected@file@percent }
\newlabel{ex:latching-a-robot-client}{{12}{55}{Code Example: Using Async Inference}{tcb@cnt@pbox.12}{}}
\newlabel{ex:latching-a-robot-client@cref}{{[tcb@cnt@pbox][12][]12}{[1][55][]55}{}{}{}}
\@writefile{lol}{\contentsline {lstlisting}{snippets/ch4/06\textunderscore robot\textunderscore client.py}{55}{lstlisting.-12}\protected@file@percent }
\citation{oquabDINOv2LearningRobust2024}
\citation{devlinBERTPretrainingDeep2019}
\citation{oneillOpenXEmbodimentRobotic2025,khazatskyDROIDLargeScaleInTheWild2025}
\citation{raffelExploringLimitsTransfer2023,ImageNet_VSS09}
\@writefile{lof}{\contentsline {figure}{\numberline {34}{\ignorespaces Fields within ML such as Computer Vision and NLP converged on the development of foundation models, trained on a variety of large-scale datasets and capable of performing multiple downstream tasks (top). Conversely, robotics suffered from limited standardization in terms of the architectures used, and siloed, task-specific datasets, incurring a high degree of fragmentation which traditionally hindered the development of generalist models for robotics in favour of task-specific models (bottom).}}{57}{figure.caption.34}\protected@file@percent }
\newlabel{fig:ch5-ml-vs-robotics-foundation}{{34}{57}{Fields within ML such as Computer Vision and NLP converged on the development of foundation models, trained on a variety of large-scale datasets and capable of performing multiple downstream tasks (top). Conversely, robotics suffered from limited standardization in terms of the architectures used, and siloed, task-specific datasets, incurring a high degree of fragmentation which traditionally hindered the development of generalist models for robotics in favour of task-specific models (bottom)}{figure.caption.34}{}}
\newlabel{fig:ch5-ml-vs-robotics-foundation@cref}{{[figure][34][]34}{[1][57][]57}{}{}{}}
\@writefile{toc}{\contentsline {section}{\numberline {5}Generalist Robot Policies}{57}{section.5}\protected@file@percent }
\newlabel{sec:learning-foundation}{{5}{57}{Generalist Robot Policies}{section.5}{}}
\newlabel{sec:learning-foundation@cref}{{[section][5][]5}{[1][57][]57}{}{}{}}
\citation{jangBCZZeroShotTask2022}
\citation{brohanRT1RoboticsTransformer2023}
\citation{brohanRT2VisionLanguageActionModels2023}
\citation{oneillOpenXEmbodimentRobotic2025}
\citation{khazatskyDROIDLargeScaleInTheWild2025}
\citation{kimOpenVLAOpenSourceVisionLanguageAction2024}
\citation{black$p_0$VisionLanguageActionFlow2024}
\citation{shukorSmolVLAVisionLanguageActionModel2025}
\citation{brohanRT1RoboticsTransformer2023}
\citation{jangBCZZeroShotTask2022}
\citation{reedGeneralistAgent2022}
\citation{brohanRT1RoboticsTransformer2023}
\citation{brohanRT2VisionLanguageActionModels2023}
\citation{brohanRT2VisionLanguageActionModels2023}
\citation{brohanRT2VisionLanguageActionModels2023}
\citation{chenPaLIXScalingMultilingual2023}
\citation{driessPaLMEEmbodiedMultimodal2023}
\citation{brohanRT2VisionLanguageActionModels2023}
\@writefile{lof}{\contentsline {figure}{\numberline {35}{\ignorespaces Early efforts in the development of generalist models for robotics include BC-Z~\citep {jangBCZZeroShotTask2022}, RT-1~\citep {brohanRT1RoboticsTransformer2023}, and RT-2~\citep {brohanRT2VisionLanguageActionModels2023}: large-scale models trained on thousands of demonstrations. The open release of the Open-X~\citep {oneillOpenXEmbodimentRobotic2025} and DROID~\citep {khazatskyDROIDLargeScaleInTheWild2025} datasets fostered the development of open-source models: OpenVLA~\citep {kimOpenVLAOpenSourceVisionLanguageAction2024}, \( \pi _0 \)~\citep {black$p_0$VisionLanguageActionFlow2024} and SmolVLA~\citep {shukorSmolVLAVisionLanguageActionModel2025}.}}{58}{figure.caption.35}\protected@file@percent }
\newlabel{fig:ch5-generalist-policies-timeline}{{35}{58}{Early efforts in the development of generalist models for robotics include BC-Z~\citep {jangBCZZeroShotTask2022}, RT-1~\citep {brohanRT1RoboticsTransformer2023}, and RT-2~\citep {brohanRT2VisionLanguageActionModels2023}: large-scale models trained on thousands of demonstrations. The open release of the Open-X~\citep {oneillOpenXEmbodimentRobotic2025} and DROID~\citep {khazatskyDROIDLargeScaleInTheWild2025} datasets fostered the development of open-source models: OpenVLA~\citep {kimOpenVLAOpenSourceVisionLanguageAction2024}, \pizero ~\citep {black$p_0$VisionLanguageActionFlow2024} and SmolVLA~\citep {shukorSmolVLAVisionLanguageActionModel2025}}{figure.caption.35}{}}
\newlabel{fig:ch5-generalist-policies-timeline@cref}{{[figure][35][]35}{[1][58][]58}{}{}{}}
\@writefile{toc}{\contentsline {subsection}{\numberline {5.1}Preliminaries: Models and Data}{58}{subsection.5.1}\protected@file@percent }
\citation{oneillOpenXEmbodimentRobotic2025}
\citation{oneillOpenXEmbodimentRobotic2025}
\citation{khazatskyDROIDLargeScaleInTheWild2025}
\citation{kimOpenVLAOpenSourceVisionLanguageAction2024}
\citation{kimOpenVLAOpenSourceVisionLanguageAction2024}
\citation{touvronLlama2Open2023}
\@writefile{lof}{\contentsline {figure}{\numberline {36}{\ignorespaces Robot learning is undergoing a paradigmatic shift: centralized data collections (A, left) are increasingly large, often comprising millions of demonstrations, while (A, right) decentralized data collection efforts are becoming an alternative for large scale data collection. (B) Generalist models are also becoming smaller and easier to run on limited hardware.}}{59}{figure.caption.36}\protected@file@percent }
\newlabel{fig:ch5-trends}{{36}{59}{Robot learning is undergoing a paradigmatic shift: centralized data collections (A, left) are increasingly large, often comprising millions of demonstrations, while (A, right) decentralized data collection efforts are becoming an alternative for large scale data collection. (B) Generalist models are also becoming smaller and easier to run on limited hardware}{figure.caption.36}{}}
\newlabel{fig:ch5-trends@cref}{{[figure][36][]36}{[1][59][]59}{}{}{}}
\citation{black$p_0$VisionLanguageActionFlow2024}
\citation{black$p_0$VisionLanguageActionFlow2024}
\citation{shukorSmolVLAVisionLanguageActionModel2025}
\citation{zhaoLearningFineGrainedBimanual2023}
\citation{fedusReviewSparseExpert2022}
\citation{black$p_0$VisionLanguageActionFlow2024}
\citation{alayracFlamingoVisualLanguage2022,laurenconWhatMattersWhen2024,linVILAPretrainingVisual2024}
\citation{radfordLearningTransferableVisual2021,zhaiSigmoidLossLanguage2023,finiMultimodalAutoregressivePretraining2024}
\citation{grattafioriLlama3Herd2024,jiangMistral7B2023}
\citation{LAION-COCO,kakaobrain2022coyo700m}
\citation{OBELICS,MMC4}
\citation{LLaVA-1.5,tong2024cambrian,laurenconWhatMattersWhen2024}
\citation{LLaVA-1.5,InstructBLIP,bai2025qwen25vl,zhu2024minigpt,tong2024cambrian}
\citation{marafiotiSmolVLMRedefiningSmall2025,moondream,minicmpv2024}
\citation{shukor2023epalm,vallaeys2024improveddepalm,MAPL,FROMAGe,tsimpoukelli2021multimodalfrozen,BLIP-2}
\citation{wang2025internvideo2,liu2024kangaroo,zhang2025videollama,kong2024audioflam}
\citation{black$p_0$VisionLanguageActionFlow2024}
\citation{teamGemma2Improving2024}
\@writefile{toc}{\contentsline {subsection}{\numberline {5.2}VLAs}{60}{subsection.5.2}\protected@file@percent }
\@writefile{toc}{\contentsline {subsubsection}{\numberline {5.2.1}VLMs for VLAs}{60}{subsubsection.5.2.1}\protected@file@percent }
\citation{black$p_0$VisionLanguageActionFlow2024}
\@writefile{lof}{\contentsline {figure}{\numberline {37}{\ignorespaces The \( \pi _0 \)~architecture, as in~\citet {black$p_0$VisionLanguageActionFlow2024}. Vision and language tokens are routed to a VLM backbone which is prevented from attending to robot proprioceptive states and action tokens, which are instead routed to a smaller subset of weights within the architecture referred to as "action expert". The architecture is trained with Flow Matching on 10M+ trajectories from a mixture of closed and openly available datasets.}}{61}{figure.caption.37}\protected@file@percent }
\newlabel{fig:ch5-pi0}{{37}{61}{The \pizero ~architecture, as in~\citet {black$p_0$VisionLanguageActionFlow2024}. Vision and language tokens are routed to a VLM backbone which is prevented from attending to robot proprioceptive states and action tokens, which are instead routed to a smaller subset of weights within the architecture referred to as "action expert". The architecture is trained with Flow Matching on 10M+ trajectories from a mixture of closed and openly available datasets}{figure.caption.37}{}}
\newlabel{fig:ch5-pi0@cref}{{[figure][37][]37}{[1][61][]61}{}{}{}}
\@writefile{toc}{\contentsline {subsection}{\numberline {5.3}\( \pi _0 \)}{61}{subsection.5.3}\protected@file@percent }
\citation{black$p_0$VisionLanguageActionFlow2024}
\citation{driessKnowledgeInsulatingVisionLanguageAction2025}
\citation{lipmanFlowMatchingGenerative2023}
\citation{lipmanFlowMatchingGenerative2023}
\citation{black$p_0$VisionLanguageActionFlow2024}
\citation{esserScalingRectifiedFlow2024}
\citation{black$p_0$VisionLanguageActionFlow2024}
\citation{black$p_0$VisionLanguageActionFlow2024}
\citation{black$p_0$VisionLanguageActionFlow2024}
\citation{black$p_0$VisionLanguageActionFlow2024}
\citation{black$p_0$VisionLanguageActionFlow2024}
\newlabel{eq:pi0-loss}{{52}{62}{\( \pi _0 \)}{equation.52}{}}
\newlabel{eq:pi0-loss@cref}{{[equation][52][]52}{[1][62][]62}{}{}{}}
\@writefile{lof}{\contentsline {figure}{\numberline {38}{\ignorespaces Unlike more traditional flow-matching algorithms, \( \pi _0 \)~uses a modified distribution from which to sample the timestep \( \tau \) during training and inference, favouring earlier timesteps corresponding to noisier chunks.}}{62}{figure.caption.38}\protected@file@percent }
\newlabel{fig:ch5-pi0-sampling-timesteps}{{38}{62}{Unlike more traditional flow-matching algorithms, \pizero ~uses a modified distribution from which to sample the timestep \( \tau \) during training and inference, favouring earlier timesteps corresponding to noisier chunks}{figure.caption.38}{}}
\newlabel{fig:ch5-pi0-sampling-timesteps@cref}{{[figure][38][]38}{[1][62][]62}{}{}{}}
\citation{black$p_0$VisionLanguageActionFlow2024}
\@writefile{toc}{\contentsline {subsubsection}{\numberline {5.3.1}Code Example: Using \( \pi _0 \)}{63}{subsubsection.5.3.1}\protected@file@percent }
\newlabel{ex:using-pizero}{{13}{63}{Code Example: Using \pizero }{tcb@cnt@pbox.13}{}}
\newlabel{ex:using-pizero@cref}{{[tcb@cnt@pbox][13][]13}{[1][63][]63}{}{}{}}
\@writefile{lol}{\contentsline {lstlisting}{snippets/ch5/01\textunderscore using\textunderscore pi0.py}{63}{lstlisting.-13}\protected@file@percent }
\citation{shukorSmolVLAVisionLanguageActionModel2025}
\citation{shukorSmolVLAVisionLanguageActionModel2025}
\citation{black$p_0$VisionLanguageActionFlow2024}
\citation{marafiotiSmolVLMRedefiningSmall2025}
\citation{zhaiSigmoidLossLanguage2023}
\citation{allalSmolLM2WhenSmol2025}
\citation{shukorSmolVLAVisionLanguageActionModel2025}
\@writefile{lof}{\contentsline {figure}{\numberline {39}{\ignorespaces The SmolVLA architecture, as in~\citet {shukorSmolVLAVisionLanguageActionModel2025}. SmolVLA is a compact MoE model trained with flow matching to denoise action chunks. Vision and language tokens are fed to a VLM backbone, and share information with the proprioceptive and action tokens via the attention mechanism. The action expert interleaves SA and CA layers for further conditioning on the visual features from the VLM backbone. SmolVLA skips computations and reduces the visual tokens, resulting in 7x less memory usage than \( \pi _0 \)~(450M parameters vs. \( \pi _0 \)'s 3.3B).}}{64}{figure.caption.39}\protected@file@percent }
\newlabel{fig:ch5-smolvla}{{39}{64}{The SmolVLA architecture, as in~\citet {shukorSmolVLAVisionLanguageActionModel2025}. SmolVLA is a compact MoE model trained with flow matching to denoise action chunks. Vision and language tokens are fed to a VLM backbone, and share information with the proprioceptive and action tokens via the attention mechanism. The action expert interleaves SA and CA layers for further conditioning on the visual features from the VLM backbone. SmolVLA skips computations and reduces the visual tokens, resulting in 7x less memory usage than \pizero ~(450M parameters vs. \pizero 's 3.3B)}{figure.caption.39}{}}
\newlabel{fig:ch5-smolvla@cref}{{[figure][39][]39}{[1][64][]64}{}{}{}}
\@writefile{toc}{\contentsline {subsection}{\numberline {5.4}SmolVLA}{64}{subsection.5.4}\protected@file@percent }
\citation{shukorSmolVLAVisionLanguageActionModel2025}
\@writefile{toc}{\contentsline {subsubsection}{\numberline {5.4.1}Code Example: Using SmolVLA}{65}{subsubsection.5.4.1}\protected@file@percent }
\newlabel{ex:using-smolvla}{{14}{65}{Code Example: Using SmolVLA}{tcb@cnt@pbox.14}{}}
\newlabel{ex:using-smolvla@cref}{{[tcb@cnt@pbox][14][]14}{[1][65][]65}{}{}{}}
\@writefile{lol}{\contentsline {lstlisting}{snippets/ch5/02\textunderscore using\textunderscore smolvla.py}{65}{lstlisting.-14}\protected@file@percent }
\bibstyle{hfstyle/plainnat}
\bibdata{main}
\bibcite{SpinningUp2018}{{1}{2018}{{Achiam}}{{}}}
\bibcite{agrawalComputationalSensorimotorLearning}{{2}{}{{Agrawal}}{{}}}
\bibcite{akkayaSolvingRubiksCube2019}{{3}{2019}{{Akkaya et~al.}}{{Akkaya, Andrychowicz, Chociej, Litwin, McGrew, Petron, Paino, Plappert, Powell, Ribas, Schneider, Tezak, Tworek, Welinder, Weng, Yuan, Zaremba, and Zhang}}}
\bibcite{alayracFlamingoVisualLanguage2022}{{4}{2022}{{Alayrac et~al.}}{{Alayrac, Donahue, Luc, Miech, Barr, Hasson, Lenc, Mensch, Millican, Reynolds, Ring, Rutherford, Cabi, Han, Gong, Samangooei, Monteiro, Menick, Borgeaud, Brock, Nematzadeh, Sharifzadeh, Binkowski, Barreira, Vinyals, Zisserman, and Simonyan}}}
\bibcite{aldacoALOHA2Enhanced}{{5}{}{{Aldaco et~al.}}{{Aldaco, Armstrong, Baruch, Bingham, Chan, Dwibedi, Finn, Florence, Goodrich, Gramlich, Herzog, Hoech, Nguyen, Storz, Tabanpour, Tompson, Wahid, Wahrburg, Xu, Yaroshenko, and Zhao}}}
\bibcite{alizadehComprehensiveSurveySpace2024}{{6}{2024}{{Alizadeh and Zhu}}{{}}}
\bibcite{allalSmolLM2WhenSmol2025}{{7}{2025}{{Allal et~al.}}{{Allal, Lozhkov, Bakouch, Bl{\'a}zquez, Penedo, Tunstall, Marafioti, Kydl{\'i}{\v c}ek, Lajar{\'i}n, Srivastav, Lochner, Fahlgren, Nguyen, Fourrier, Burtenshaw, Larcher, Zhao, Zakka, Morlon, Raffel, von Werra, and Wolf}}}
\bibcite{antonovaReinforcementLearningPivoting2017}{{8}{2017}{{Antonova et~al.}}{{Antonova, Cruciani, Smith, and Kragic}}}
\@writefile{toc}{\contentsline {section}{\numberline {6}Conclusions}{67}{section.6}\protected@file@percent }
\newlabel{sec:conclusions}{{6}{67}{Conclusions}{section.6}{}}
\newlabel{sec:conclusions@cref}{{[section][6][]6}{[1][67][]67}{}{}{}}
\bibcite{bai2025qwen25vl}{{9}{2025}{{Bai et~al.}}{{Bai, Chen, Liu, Wang, Ge, Song, Dang, Wang, Wang, Tang, Zhong, Zhu, Yang, Li, Wan, Wang, Ding, Fu, Xu, Ye, Zhang, Xie, Cheng, Zhang, Yang, Xu, and Lin}}}
\bibcite{ballEfficientOnlineReinforcement2023}{{10}{2023}{{Ball et~al.}}{{Ball, Smith, Kostrikov, and Levine}}}
\bibcite{bekrisStateRobotMotion2024}{{11}{2024}{{Bekris et~al.}}{{Bekris, Doerr, Meng, and Tangirala}}}
\bibcite{bellemareAutonomousNavigationStratospheric2020}{{12}{2020}{{Bellemare et~al.}}{{Bellemare, Candido, Castro, Gong, Machado, Moitra, Ponda, and Wang}}}
\bibcite{bellmanMarkovianDecisionProcess1957}{{13}{1957}{{Bellman}}{{}}}
\bibcite{bjorckGR00TN1Open2025}{{14}{2025}{{Bjorck et~al.}}{{Bjorck, Casta{\~n}eda, Cherniadev, Da, Ding, Fan, Fang, Fox, Hu, Huang, Jang, Jiang, Kautz, Kundalia, Lao, Li, Lin, Lin, Liu, Llontop, Magne, Mandlekar, Narayan, Nasiriany, Reed, Tan, Wang, Wang, Wang, Wang, Xiang, Xie, Xu, Xu, Ye, Yu, Zhang, Zhang, Zhao, Zheng, and Zhu}}}
\bibcite{black$p_0$VisionLanguageActionFlow2024}{{15}{2024}{{Black et~al.}}{{Black, Brown, Driess, Esmail, Equi, Finn, Fusai, Groom, Hausman, Ichter, Jakubczak, Jones, Ke, Levine, {Li-Bell}, Mothukuri, Nair, Pertsch, Shi, Tanner, Vuong, Walling, Wang, and Zhilinsky}}}
\bibcite{brohanRT2VisionLanguageActionModels2023}{{16}{2023{a}}{{Brohan et~al.}}{{Brohan, Brown, Carbajal, Chebotar, Chen, Choromanski, Ding, Driess, Dubey, Finn, Florence, Fu, Arenas, Gopalakrishnan, Han, Hausman, Herzog, Hsu, Ichter, Irpan, Joshi, Julian, Kalashnikov, Kuang, Leal, Lee, Lee, Levine, Lu, Michalewski, Mordatch, Pertsch, Rao, Reymann, Ryoo, Salazar, Sanketi, Sermanet, Singh, Singh, Soricut, Tran, Vanhoucke, Vuong, Wahid, Welker, Wohlhart, Wu, Xia, Xiao, Xu, Xu, Yu, and Zitkovich}}}
\bibcite{brohanRT1RoboticsTransformer2023}{{17}{2023{b}}{{Brohan et~al.}}{{Brohan, Brown, Carbajal, Chebotar, Dabis, Finn, Gopalakrishnan, Hausman, Herzog, Hsu, Ibarz, Ichter, Irpan, Jackson, Jesmonth, Joshi, Julian, Kalashnikov, Kuang, Leal, Lee, Levine, Lu, Malla, Manjunath, Mordatch, Nachum, Parada, Peralta, Perez, Pertsch, Quiambao, Rao, Ryoo, Salazar, Sanketi, Sayed, Singh, Sontakke, Stone, Tan, Tran, Vanhoucke, Vega, Vuong, Xia, Xiao, Xu, Xu, Yu, and Zitkovich}}}
\bibcite{brownLanguageModelsAre2020}{{18}{2020}{{Brown et~al.}}{{Brown, Mann, Ryder, Subbiah, Kaplan, Dhariwal, Neelakantan, Shyam, Sastry, Askell, Agarwal, {Herbert-Voss}, Krueger, Henighan, Child, Ramesh, Ziegler, Wu, Winter, Hesse, Chen, Sigler, Litwin, Gray, Chess, Clark, Berner, McCandlish, Radford, Sutskever, and Amodei}}}
\bibcite{kakaobrain2022coyo700m}{{19}{2022}{{Byeon et~al.}}{{Byeon, Park, Kim, Lee, Baek, and Kim}}}
\bibcite{chebotarClosingSimtorealLoop2019}{{20}{2019}{{Chebotar et~al.}}{{Chebotar, Handa, Makoviychuk, Macklin, Issac, Ratliff, and Fox}}}
\bibcite{chenPaLIXScalingMultilingual2023}{{21}{2023}{{Chen et~al.}}{{Chen, Djolonga, Padlewski, Mustafa, Changpinyo, Wu, Ruiz, Goodman, Wang, Tay, Shakeri, Dehghani, Salz, Lucic, Tschannen, Nagrani, Hu, Joshi, Pang, Montgomery, Pietrzyk, Ritter, Piergiovanni, Minderer, Pavetic, Waters, Li, Alabdulmohsin, Beyer, Amelot, Lee, Steiner, Li, Keysers, Arnab, Xu, Rong, Kolesnikov, Seyedhosseini, Angelova, Zhai, Houlsby, and Soricut}}}
\bibcite{chiDiffusionPolicyVisuomotor2024}{{22}{2024}{{Chi et~al.}}{{Chi, Xu, Feng, Cousineau, Du, Burchfiel, Tedrake, and Song}}}
\bibcite{connellRobotLearning1993}{{23}{1993}{{Connell and Mahadevan}}{{}}}
\bibcite{InstructBLIP}{{24}{2023}{{Dai et~al.}}{{Dai, Li, Li, Tiong, Zhao, Wang, Li, Fung, and Hoi}}}
\bibcite{degraveMagneticControlTokamak2022}{{25}{2022}{{Degrave et~al.}}{{Degrave, Felici, Buchli, Neunert, Tracey, Carpanese, Ewalds, Hafner, Abdolmaleki, {de las Casas}, Donner, Fritz, Galperti, Huber, Keeling, Tsimpoukelli, Kay, Merle, Moret, Noury, Pesamosca, Pfau, Sauter, Sommariva, Coda, Duval, Fasoli, Kohli, Kavukcuoglu, Hassabis, and Riedmiller}}}
\bibcite{ImageNet_VSS09}{{26}{2009}{{Deng et~al.}}{{Deng, Li, Do, Su, and {Fei-Fei}}}}
\bibcite{devlinBERTPretrainingDeep2019}{{27}{2019}{{Devlin et~al.}}{{Devlin, Chang, Lee, and Toutanova}}}
\bibcite{driessPaLMEEmbodiedMultimodal2023}{{28}{2023}{{Driess et~al.}}{{Driess, Xia, Sajjadi, Lynch, Chowdhery, Ichter, Wahid, Tompson, Vuong, Yu, Huang, Chebotar, Sermanet, Duckworth, Levine, Vanhoucke, Hausman, Toussaint, Greff, Zeng, Mordatch, and Florence}}}
\bibcite{driessKnowledgeInsulatingVisionLanguageAction2025}{{29}{2025}{{Driess et~al.}}{{Driess, Springenberg, Ichter, Yu, {Li-Bell}, Pertsch, Ren, Walke, Vuong, Shi, and Levine}}}
\bibcite{esserScalingRectifiedFlow2024}{{30}{2024}{{Esser et~al.}}{{Esser, Kulal, Blattmann, Entezari, M{\"u}ller, Saini, Levi, Lorenz, Sauer, Boesel, Podell, Dockhorn, English, Lacey, Goodwin, Marek, and Rombach}}}
\bibcite{fedusReviewSparseExpert2022}{{31}{2022}{{Fedus et~al.}}{{Fedus, Dean, and Zoph}}}
\bibcite{finiMultimodalAutoregressivePretraining2024}{{32}{2024}{{Fini et~al.}}{{Fini, Shukor, Li, Dufter, Klein, Haldimann, Aitharaju, da~Costa, B{\'e}thune, Gan, Toshev, Eichner, Nabi, Yang, Susskind, and {El-Nouby}}}}
\bibcite{florenceImplicitBehavioralCloning2022}{{33}{2022}{{Florence et~al.}}{{Florence, Lynch, Zeng, Ramirez, Wahid, Downs, Wong, Lee, Mordatch, and Tompson}}}
\bibcite{fujitaDevelopmentRobotsNuclear2020}{{34}{2020}{{Fujita et~al.}}{{Fujita, Soda, Murata, and Tsuhari}}}
\bibcite{grattafioriLlama3Herd2024}{{35}{2024}{{Grattafiori et~al.}}{{Grattafiori, Dubey, Jauhri, Pandey, Kadian, {Al-Dahle}, Letman, Mathur, Schelten, Vaughan, Yang, Fan, Goyal, Hartshorn, Yang, Mitra, Sravankumar, Korenev, Hinsvark, Rao, Zhang, Rodriguez, Gregerson, Spataru, Roziere, Biron, Tang, Chern, Caucheteux, Nayak, Bi, Marra, McConnell, Keller, Touret, Wu, Wong, Ferrer, Nikolaidis, Allonsius, Song, Pintz, Livshits, Wyatt, Esiobu, Choudhary, Mahajan, {Garcia-Olano}, Perino, Hupkes, Lakomkin, AlBadawy, Lobanova, Dinan, Smith, Radenovic, Guzm{\'a}n, Zhang, Synnaeve, Lee, Anderson, Thattai, Nail, Mialon, Pang, Cucurell, Nguyen, Korevaar, Xu, Touvron, Zarov, Ibarra, Kloumann, Misra, Evtimov, Zhang, Copet, Lee, Geffert, Vranes, Park, Mahadeokar, Shah, van~der Linde, Billock, Hong, Lee, Fu, Chi, Huang, Liu, Wang, Yu, Bitton, Spisak, Park, Rocca, Johnstun, Saxe, Jia, Alwala, Prasad, Upasani, Plawiak, Li, Heafield, Stone, {El-Arini}, Iyer, Malik, Chiu, Bhalla, Lakhotia, {Rantala-Yeary}, van~der Maaten, Chen, Tan, Jenkins, Martin, Madaan, Malo, Blecher, Landzaat, de~Oliveira, Muzzi, Pasupuleti, Singh, Paluri, Kardas, Tsimpoukelli, Oldham, Rita, Pavlova, Kambadur, Lewis, Si, Singh, Hassan, Goyal, Torabi, Bashlykov, Bogoychev, Chatterji, Zhang, Duchenne, {\c C}elebi, Alrassy, Zhang, Li, Vasic, Weng, Bhargava, Dubal, Krishnan, Koura, Xu, He, Dong, Srinivasan, Ganapathy, Calderer, Cabral, Stojnic, Raileanu, Maheswari, Girdhar, Patel, Sauvestre, Polidoro, Sumbaly, Taylor, Silva, Hou, Wang, Hosseini, Chennabasappa, Singh, Bell, Kim, Edunov, Nie, Narang, Raparthy, Shen, Wan, Bhosale, Zhang, Vandenhende, Batra, Whitman, Sootla, Collot, Gururangan, Borodinsky, Herman, Fowler, Sheasha, Georgiou, Scialom, Speckbacher, Mihaylov, Xiao, Karn, Goswami, Gupta, Ramanathan, Kerkez, Gonguet, Do, Vogeti, Albiero, Petrovic, Chu, Xiong, Fu, Meers, Martinet, Wang, Wang, Tan, Xia, Xie, Jia, Wang, Goldschlag, Gaur, Babaei, Wen, Song, Zhang, Li, Mao, Coudert, Yan, Chen, Papakipos, Singh, 
Srivastava, Jain, Kelsey, Shajnfeld, Gangidi, Victoria, Goldstand, Menon, Sharma, Boesenberg, Baevski, Feinstein, Kallet, Sangani, Teo, Yunus, Lupu, Alvarado, Caples, Gu, Ho, Poulton, Ryan, Ramchandani, Dong, Franco, Goyal, Saraf, Chowdhury, Gabriel, Bharambe, Eisenman, Yazdan, James, Maurer, Leonhardi, Huang, Loyd, Paola, Paranjape, Liu, Wu, Ni, Hancock, Wasti, Spence, Stojkovic, Gamido, Montalvo, Parker, Burton, Mejia, Liu, Wang, Kim, Zhou, Hu, Chu, Cai, Tindal, Feichtenhofer, Gao, Civin, Beaty, Kreymer, Li, Adkins, Xu, Testuggine, David, Parikh, Liskovich, Foss, Wang, Le, Holland, Dowling, Jamil, Montgomery, Presani, Hahn, Wood, Le, Brinkman, Arcaute, Dunbar, Smothers, Sun, Kreuk, Tian, Kokkinos, Ozgenel, Caggioni, Kanayet, Seide, Florez, Schwarz, Badeer, Swee, Halpern, Herman, Sizov, Guangyi, Zhang, Lakshminarayanan, Inan, Shojanazeri, Zou, Wang, Zha, Habeeb, Rudolph, Suk, Aspegren, Goldman, Zhan, Damlaj, Molybog, Tufanov, Leontiadis, Veliche, Gat, Weissman, Geboski, Kohli, Lam, Asher, Gaya, Marcus, Tang, Chan, Zhen, Reizenstein, Teboul, Zhong, Jin, Yang, Cummings, Carvill, Shepard, McPhie, Torres, Ginsburg, Wang, Wu, U, Saxena, Khandelwal, Zand, Matosich, Veeraraghavan, Michelena, Li, Jagadeesh, Huang, Chawla, Huang, Chen, Garg, A, Silva, Bell, Zhang, Guo, Yu, Moshkovich, Wehrstedt, Khabsa, Avalani, Bhatt, Mankus, Hasson, Lennie, Reso, Groshev, Naumov, Lathi, Keneally, Liu, Seltzer, Valko, Restrepo, Patel, Vyatskov, Samvelyan, Clark, Macey, Wang, Hermoso, Metanat, Rastegari, Bansal, Santhanam, Parks, White, Bawa, Singhal, Egebo, Usunier, Mehta, Laptev, Dong, Cheng, Chernoguz, Hart, Salpekar, Kalinli, Kent, Parekh, Saab, Balaji, Rittner, Bontrager, Roux, Dollar, Zvyagina, Ratanchandani, Yuvraj, Liang, Alao, Rodriguez, Ayub, Murthy, Nayani, Mitra, Parthasarathy, Li, Hogan, Battey, Wang, Howes, Rinott, Mehta, Siby, Bondu, Datta, Chugh, Hunt, Dhillon, Sidorov, Pan, Mahajan, Verma, Yamamoto, Ramaswamy, Lindsay, Lindsay, Feng, Lin, Zha, Patil, Shankar, Zhang, Zhang, 
Wang, Agarwal, Sajuyigbe, Chintala, Max, Chen, Kehoe, Satterfield, Govindaprasad, Gupta, Deng, Cho, Virk, Subramanian, Choudhury, Goldman, Remez, Glaser, Best, Koehler, Robinson, Li, Zhang, Matthews, Chou, Shaked, Vontimitta, Ajayi, Montanez, Mohan, Kumar, Mangla, Ionescu, Poenaru, Mihailescu, Ivanov, Li, Wang, Jiang, Bouaziz, Constable, Tang, Wu, Wang, Wu, Gao, Kleinman, Chen, Hu, Jia, Qi, Li, Zhang, Zhang, Adi, Nam, Yu, Wang, Zhao, Hao, Qian, Li, He, Rait, DeVito, Rosnbrick, Wen, Yang, Zhao, and Ma}}}
\bibcite{griffinWalkingStabilizationUsing2017}{{36}{2017}{{Griffin et~al.}}{{Griffin, Wiedebach, Bertrand, Leonessa, and Pratt}}}
\bibcite{haarnojaReinforcementLearningDeep2017b}{{37}{2017}{{Haarnoja et~al.}}{{Haarnoja, Tang, Abbeel, and Levine}}}
\bibcite{haarnojaSoftActorCriticOffPolicy2018}{{38}{2018}{{Haarnoja et~al.}}{{Haarnoja, Zhou, Abbeel, and Levine}}}
\bibcite{hansenTemporalDifferenceLearning2022}{{39}{2022}{{Hansen et~al.}}{{Hansen, Wang, and Su}}}
\bibcite{heessEmergenceLocomotionBehaviours2017}{{40}{2017}{{Heess et~al.}}{{Heess, TB, Sriram, Lemmon, Merel, Wayne, Tassa, Erez, Wang, Eslami, Riedmiller, and Silver}}}
\bibcite{higgins2017beta}{{41}{2017}{{Higgins et~al.}}{{Higgins, Matthey, Pal, Burgess, Glorot, Botvinick, Mohamed, and Lerchner}}}
\bibcite{hoDenoisingDiffusionProbabilistic2020}{{42}{2020}{{Ho et~al.}}{{Ho, Jain, and Abbeel}}}
\bibcite{jangBCZZeroShotTask2022}{{43}{2022}{{Jang et~al.}}{{Jang, Irpan, Khansari, Kappler, Ebert, Lynch, Levine, and Finn}}}
\bibcite{jannerPlanningDiffusionFlexible2022}{{44}{2022}{{Janner et~al.}}{{Janner, Du, Tenenbaum, and Levine}}}
\bibcite{jiDribbleBotDynamicLegged2023}{{45}{2023}{{Ji et~al.}}{{Ji, Margolis, and Agrawal}}}
\bibcite{jiangMistral7B2023}{{46}{2023}{{Jiang et~al.}}{{Jiang, Sablayrolles, Mensch, Bamford, Chaplot, de~las Casas, Bressand, Lengyel, Lample, Saulnier, Lavaud, Lachaux, Stock, Scao, Lavril, Wang, Lacroix, and Sayed}}}
\bibcite{keGraspingChopsticksCombating2020}{{47}{2020}{{Ke et~al.}}{{Ke, Wang, Bhattacharjee, Boots, and Srinivasa}}}
\bibcite{khazatskyDROIDLargeScaleInTheWild2025}{{48}{2025}{{Khazatsky et~al.}}{{Khazatsky, Pertsch, Nair, Balakrishna, Dasari, Karamcheti, Nasiriany, Srirama, Chen, Ellis, Fagan, Hejna, Itkina, Lepert, Ma, Miller, Wu, Belkhale, Dass, Ha, Jain, Lee, Lee, Memmel, Park, Radosavovic, Wang, Zhan, Black, Chi, Hatch, Lin, Lu, Mercat, Rehman, Sanketi, Sharma, Simpson, Vuong, Walke, Wulfe, Xiao, Yang, Yavary, Zhao, Agia, Baijal, Castro, Chen, Chen, Chung, Drake, Foster, Gao, Guizilini, Herrera, Heo, Hsu, Hu, Irshad, Jackson, Le, Li, Lin, Lin, Ma, Maddukuri, Mirchandani, Morton, Nguyen, O'Neill, Scalise, Seale, Son, Tian, Tran, Wang, Wu, Xie, Yang, Yin, Zhang, Bastani, Berseth, Bohg, Goldberg, Gupta, Gupta, Jayaraman, Lim, Malik, {Mart{\'i}n-Mart{\'i}n}, Ramamoorthy, Sadigh, Song, Wu, Yip, Zhu, Kollar, Levine, and Finn}}}
\bibcite{kimOpenVLAOpenSourceVisionLanguageAction2024}{{49}{2024}{{Kim et~al.}}{{Kim, Pertsch, Karamcheti, Xiao, Balakrishna, Nair, Rafailov, Foster, Lam, Sanketi, Vuong, Kollar, Burchfiel, Tedrake, Sadigh, Levine, Liang, and Finn}}}
\bibcite{kingma2013auto}{{50}{2013}{{Kingma and Welling}}{{}}}
\bibcite{knightStandardOpenSO100}{{51}{}{{Knight et~al.}}{{Knight, Kooijmans, Wolf, Alibert, Aractingi, Aubakirova, Zouitine, Martino, Palma, Pascal, and Cadene}}}
\bibcite{koberReinforcementLearningRobotics}{{52}{}{{Kober et~al.}}{{Kober, Bagnell, and Peters}}}
\bibcite{FROMAGe}{{53}{2023}{{Koh et~al.}}{{Koh, Salakhutdinov, and Fried}}}
\bibcite{kong2024audioflam}{{54}{2024}{{Kong et~al.}}{{Kong, Goel, Badlani, Ping, Valle, and Catanzaro}}}
\bibcite{moondream}{{55}{2024}{{Korrapati}}{{}}}
\bibcite{OBELICS}{{56}{2023}{{Lauren{\c c}on et~al.}}{{Lauren{\c c}on, Saulnier, Tronchon, Bekman, Singh, Lozhkov, Wang, Karamcheti, Rush, Kiela, Cord, and Sanh}}}
\bibcite{laurenconWhatMattersWhen2024}{{57}{2024}{{Lauren{\c c}on et~al.}}{{Lauren{\c c}on, Tronchon, Cord, and Sanh}}}
\bibcite{leeLearningQuadrupedalLocomotion2020}{{58}{2020}{{Lee et~al.}}{{Lee, Hwangbo, Wellhausen, Koltun, and Hutter}}}
\bibcite{leeBehaviorGenerationLatent2024}{{59}{2024}{{Lee et~al.}}{{Lee, Wang, Etukuru, Kim, Shafiullah, and Pinto}}}
\bibcite{BLIP-2}{{60}{2023}{{Li et~al.}}{{Li, Li, Savarese, and Hoi}}}
\bibcite{lillicrapContinuousControlDeep2019a}{{61}{2019}{{Lillicrap et~al.}}{{Lillicrap, Hunt, Pritzel, Heess, Erez, Tassa, Silver, and Wierstra}}}
\bibcite{linVILAPretrainingVisual2024}{{62}{2024}{{Lin et~al.}}{{Lin, Yin, Ping, Lu, Molchanov, Tao, Mao, Kautz, Shoeybi, and Han}}}
\bibcite{lipmanFlowMatchingGenerative2023}{{63}{2023}{{Lipman et~al.}}{{Lipman, Chen, {Ben-Hamu}, Nickel, and Le}}}
\bibcite{lipmanFlowMatchingGuide2024}{{64}{2024}{{Lipman et~al.}}{{Lipman, Havasi, Holderrieth, Shaul, Le, Karrer, Chen, {Lopez-Paz}, {Ben-Hamu}, and Gat}}}
\bibcite{LLaVA-1.5}{{65}{2023}{{Liu et~al.}}{{Liu, Li, Li, and Lee}}}
\bibcite{liu2024kangaroo}{{66}{2024}{{Liu et~al.}}{{Liu, Wang, Ma, Wu, Ma, Wei, Jiao, Wu, and Hu}}}
\bibcite{luoUnderstandingDiffusionModels2022}{{67}{2022}{{Luo}}{{}}}
\bibcite{luoPreciseDexterousRobotic2024}{{68}{2024}{{Luo et~al.}}{{Luo, Xu, Wu, and Levine}}}
\bibcite{luoSERLSoftwareSuite2025}{{69}{2025}{{Luo et~al.}}{{Luo, Hu, Xu, Tan, Berg, Sharma, Schaal, Finn, Gupta, and Levine}}}
\bibcite{lynchModernRoboticsMechanics2017}{{70}{2017}{{Lynch and Park}}{{}}}
\bibcite{MAPL}{{71}{2023}{{Ma{\~n}as et~al.}}{{Ma{\~n}as, Rodriguez~Lopez, Ahmadi, Nematzadeh, Goyal, and Agrawal}}}
\bibcite{marafiotiSmolVLMRedefiningSmall2025}{{72}{2025}{{Marafioti et~al.}}{{Marafioti, Zohar, Farr{\'e}, Noyan, Bakouch, Cuenca, Zakka, Allal, Lozhkov, Tazi, Srivastav, Lochner, Larcher, Morlon, Tunstall, von Werra, and Wolf}}}
\bibcite{margolisRapidLocomotionReinforcement2022}{{73}{2022}{{Margolis et~al.}}{{Margolis, Yang, Paigwar, Chen, and Agrawal}}}
\bibcite{mccormacSemanticFusionDense3D2016}{{74}{2016}{{McCormac et~al.}}{{McCormac, Handa, Davison, and Leutenegger}}}
\bibcite{mnihPlayingAtariDeep2013}{{75}{2013}{{Mnih et~al.}}{{Mnih, Kavukcuoglu, Silver, Graves, Antonoglou, Wierstra, and Riedmiller}}}
\bibcite{nakkiranStepbyStepDiffusionElementary2024}{{76}{2024}{{Nakkiran et~al.}}{{Nakkiran, Bradley, Zhou, and Advani}}}
\bibcite{oneillOpenXEmbodimentRobotic2025}{{77}{2025}{{O'Neill et~al.}}{{O'Neill, Rehman, Gupta, Maddukuri, Gupta, Padalkar, Lee, Pooley, Gupta, Mandlekar, Jain, Tung, Bewley, Herzog, Irpan, Khazatsky, Rai, Gupta, Wang, Kolobov, Singh, Garg, Kembhavi, Xie, Brohan, Raffin, Sharma, Yavary, Jain, Balakrishna, Wahid, {Burgess-Limerick}, Kim, Sch{\"o}lkopf, Wulfe, Ichter, Lu, Xu, Le, Finn, Wang, Xu, Chi, Huang, Chan, Agia, Pan, Fu, Devin, Xu, Morton, Driess, Chen, Pathak, Shah, B{\"u}chler, Jayaraman, Kalashnikov, Sadigh, Johns, Foster, Liu, Ceola, Xia, Zhao, Frujeri, Stulp, Zhou, Sukhatme, Salhotra, Yan, Feng, Schiavi, Berseth, Kahn, Yang, Wang, Su, Fang, Shi, Bao, Amor, Christensen, Furuta, Bharadhwaj, Walke, Fang, Ha, Mordatch, Radosavovic, Leal, Liang, {Abou-Chakra}, Kim, Drake, Peters, Schneider, Hsu, Vakil, Bohg, Bingham, Wu, Gao, Hu, Wu, Wu, Sun, Luo, Gu, Tan, Oh, Wu, Lu, Yang, Malik, Silv{\'e}rio, Hejna, Booher, Tompson, Yang, Salvador, Lim, Han, Wang, Rao, Pertsch, Hausman, Go, Gopalakrishnan, Goldberg, Byrne, Oslund, Kawaharazuka, Black, Lin, Zhang, Ehsani, Lekkala, Ellis, Rana, Srinivasan, Fang, Singh, Zeng, Hatch, Hsu, Itti, Chen, Pinto, {Fei-Fei}, Tan, Fan, Ott, Lee, Weihs, Chen, Lepert, Memmel, Tomizuka, Itkina, Castro, Spero, Du, Ahn, Yip, Zhang, Ding, Heo, Srirama, Sharma, Kim, Irshad, Kanazawa, Hansen, Heess, Joshi, Suenderhauf, Liu, Palo, Shafiullah, Mees, Kroemer, Bastani, Sanketi, Miller, Yin, Wohlhart, Xu, Fagan, Mitrano, Sermanet, Abbeel, Sundaresan, Chen, Vuong, Rafailov, Tian, Doshi, {Mart{\'i}n-Mart{\'i}n}, Baijal, Scalise, Hendrix, Lin, Qian, Zhang, Mendonca, Shah, Hoque, Julian, Bustamante, Kirmani, Levine, Lin, Moore, Bahl, Dass, Sonawani, Tulsiani, Song, Xu, Haldar, Karamcheti, Adebola, Guist, Nasiriany, Schaal, Welker, Tian, Ramamoorthy, Dasari, Belkhale, Park, Nair, Mirchandani, Osa, Gupta, Harada, Matsushima, Xiao, Kollar, Yu, Ding, Davchev, Zhao, Armstrong, Darrell, Chung, Jain, Kumar, Vanhoucke, Guizilini, Zhan, Zhou, Burgard, Chen, 
Chen, Wang, Zhu, Geng, Liu, Liangwei, Li, Pang, Lu, Ma, Kim, Chebotar, Zhou, Zhu, Wu, Xu, Wang, Bisk, Dou, Cho, Lee, Cui, Cao, Wu, Tang, Zhu, Zhang, Jiang, Li, Li, Iwasawa, Matsuo, Ma, Xu, Cui, Zhang, Fu, and Lin}}}
\bibcite{oquabDINOv2LearningRobust2024}{{78}{2024}{{Oquab et~al.}}{{Oquab, Darcet, Moutakanni, Vo, Szafraniec, Khalidov, Fernandez, Haziza, Massa, {El-Nouby}, Assran, Ballas, Galuba, Howes, Huang, Li, Misra, Rabbat, Sharma, Synnaeve, Xu, Jegou, Mairal, Labatut, Joulin, and Bojanowski}}}
\bibcite{permenterInterpretingImprovingDiffusion2024}{{79}{2024}{{Permenter and Yuan}}{{}}}
\bibcite{polyakMovieGenCast2025}{{80}{2025}{{Polyak et~al.}}{{Polyak, Zohar, Brown, Tjandra, Sinha, Lee, Vyas, Shi, Ma, Chuang, Yan, Choudhary, Wang, Sethi, Pang, Ma, Misra, Hou, Wang, Jagadeesh, Li, Zhang, Singh, Williamson, Le, Yu, Singh, Zhang, Vajda, Duval, Girdhar, Sumbaly, Rambhatla, Tsai, Azadi, Datta, Chen, Bell, Ramaswamy, Sheynin, Bhattacharya, Motwani, Xu, Li, Hou, Hsu, Yin, Dai, Taigman, Luo, Liu, Wu, Zhao, Kirstain, He, He, Pumarola, Thabet, Sanakoyeu, Mallya, Guo, Araya, Kerr, Wood, Liu, Peng, Vengertsev, Schonfeld, Blanchard, {Juefei-Xu}, Nord, Liang, Hoffman, Kohler, Fire, Sivakumar, Chen, Yu, Gao, Georgopoulos, Moritz, Sampson, Li, Parmeggiani, Fine, Fowler, Petrovic, and Du}}}
\bibcite{pomerleauALVINNAutonomousLand1988}{{81}{1988}{{Pomerleau}}{{}}}
\bibcite{prince2023understanding}{{82}{2023}{{Prince}}{{}}}
\bibcite{radfordLearningTransferableVisual2021}{{83}{2021}{{Radford et~al.}}{{Radford, Kim, Hallacy, Ramesh, Goh, Agarwal, Sastry, Askell, Mishkin, Clark, Krueger, and Sutskever}}}
\bibcite{raffelExploringLimitsTransfer2023}{{84}{2023}{{Raffel et~al.}}{{Raffel, Shazeer, Roberts, Lee, Narang, Matena, Zhou, Li, and Liu}}}
\bibcite{reedGeneralistAgent2022}{{85}{2022}{{Reed et~al.}}{{Reed, Zolna, Parisotto, Colmenarejo, Novikov, {Barth-Maron}, Gimenez, Sulsky, Kay, Springenberg, Eccles, Bruce, Razavi, Edwards, Heess, Chen, Hadsell, Vinyals, Bordbar, and de~Freitas}}}
\bibcite{ronnebergerUNetConvolutionalNetworks2015}{{86}{2015}{{Ronneberger et~al.}}{{Ronneberger, Fischer, and Brox}}}
\bibcite{rossReductionImitationLearning2011}{{87}{2011}{{Ross et~al.}}{{Ross, Gordon, and Bagnell}}}
\bibcite{sannemanStateIndustrialRobotics2020}{{88}{2020}{{Sanneman et~al.}}{{Sanneman, Fourie, and Shah}}}
\bibcite{LAION-COCO}{{89}{2022}{{Schuhmann et~al.}}{{Schuhmann, K{\"o}pf, Vencu, Coombes, and Beaumont}}}
\bibcite{schulmanTrustRegionPolicy2017}{{90}{2017{a}}{{Schulman et~al.}}{{Schulman, Levine, Moritz, Jordan, and Abbeel}}}
\bibcite{schulmanProximalPolicyOptimization2017}{{91}{2017{b}}{{Schulman et~al.}}{{Schulman, Wolski, Dhariwal, Radford, and Klimov}}}
\bibcite{shalev-shwartzUnderstandingMachineLearning2014}{{92}{2014}{{{Shalev-Shwartz} and {Ben-David}}}{{}}}
\bibcite{shukor2023epalm}{{93}{2023}{{Shukor et~al.}}{{Shukor, Dancette, and Cord}}}
\bibcite{shukorSmolVLAVisionLanguageActionModel2025}{{94}{2025}{{Shukor et~al.}}{{Shukor, Aubakirova, Capuano, Kooijmans, Palma, Zouitine, Aractingi, Pascal, Russi, Marafioti, Alibert, Cord, Wolf, and Cadene}}}
\bibcite{sicilianoSpringerHandbookRobotics2016}{{95}{2016}{{Siciliano and Khatib}}{{}}}
\bibcite{pmlr-v32-silver14}{{96}{2014}{{Silver et~al.}}{{Silver, Lever, Heess, Degris, Wierstra, and Riedmiller}}}
\bibcite{sohnLearningStructuredOutput2015}{{97}{2015}{{Sohn et~al.}}{{Sohn, Lee, and Yan}}}
\bibcite{songDenoisingDiffusionImplicit2022}{{98}{2022}{{Song et~al.}}{{Song, Meng, and Ermon}}}
\bibcite{suttonReinforcementLearningIntroduction2018}{{99}{2018}{{Sutton and Barto}}{{}}}
\bibcite{tancikFourierFeaturesLet2020}{{100}{2020}{{Tancik et~al.}}{{Tancik, Srinivasan, Mildenhall, {Fridovich-Keil}, Raghavan, Singhal, Ramamoorthi, Barron, and Ng}}}
\bibcite{tangDeepReinforcementLearning2025}{{101}{2025}{{Tang et~al.}}{{Tang, Abbatematteo, Hu, Chandra, {Mart{\'i}n-Mart{\'i}n}, and Stone}}}
\bibcite{tangPerceptionNavigationAutonomous2023}{{102}{2023}{{Tang et~al.}}{{Tang, Zhao, Wang, Zhang, Sun, Zheng, Du, Qian, and Kurths}}}
\bibcite{teamGemma2Improving2024}{{103}{2024}{{Team et~al.}}{{Team, Riviere, Pathak, Sessa, Hardin, Bhupatiraju, Hussenot, Mesnard, Shahriari, Ram{\'e}, Ferret, Liu, Tafti, Friesen, Casbon, Ramos, Kumar, Lan, Jerome, Tsitsulin, Vieillard, Stanczyk, Girgin, Momchev, Hoffman, Thakoor, Grill, Neyshabur, Bachem, Walton, Severyn, Parrish, Ahmad, Hutchison, Abdagic, Carl, Shen, Brock, Coenen, Laforge, Paterson, Bastian, Piot, Wu, Royal, Chen, Kumar, Perry, Welty, {Choquette-Choo}, Sinopalnikov, Weinberger, Vijaykumar, Rogozi{\'n}ska, Herbison, Bandy, Wang, Noland, Moreira, Senter, Eltyshev, Visin, Rasskin, Wei, Cameron, Martins, Hashemi, {Klimczak-Pluci{\'n}ska}, Batra, Dhand, Nardini, Mein, Zhou, Svensson, Stanway, Chan, Zhou, Carrasqueira, Iljazi, Becker, Fernandez, van Amersfoort, Gordon, Lipschultz, Newlan, Ji, Mohamed, Badola, Black, Millican, McDonell, Nguyen, Sodhia, Greene, Sjoesund, Usui, Sifre, Heuermann, Lago, McNealus, Soares, Kilpatrick, Dixon, Martins, Reid, Singh, Iverson, G{\"o}rner, Velloso, Wirth, Davidow, Miller, Rahtz, Watson, Risdal, Kazemi, Moynihan, Zhang, Kahng, Park, Rahman, Khatwani, Dao, Bardoliwalla, Devanathan, Dumai, Chauhan, Wahltinez, Botarda, Barnes, Barham, Michel, Jin, Georgiev, Culliton, Kuppala, Comanescu, Merhej, Jana, Rokni, Agarwal, Mullins, Saadat, Carthy, Perrin, Arnold, Krause, Dai, Garg, Sheth, Ronstrom, Chan, Jordan, Yu, Eccles, Hennigan, Kocisky, Doshi, Jain, Yadav, Meshram, Dharmadhikari, Barkley, Wei, Ye, Han, Kwon, Xu, Shen, Gong, Wei, Cotruta, Kirk, Rao, Giang, Peran, Warkentin, Collins, Barral, Ghahramani, Hadsell, Sculley, Banks, Dragan, Petrov, Vinyals, Dean, Hassabis, Kavukcuoglu, Farabet, Buchatskaya, Borgeaud, Fiedel, Joulin, Kenealy, Dadashi, and Andreev}}}
\bibcite{tedrakeRoboticManipulationPerception}{{104}{a}{{Tedrake}}{{}}}
\bibcite{tedrakeUnderactuatedRoboticsAlgorithms}{{105}{b}{{Tedrake}}{{}}}
\bibcite{tiboniDROPOSimtoRealTransfer2023}{{106}{2023}{{Tiboni et~al.}}{{Tiboni, Arndt, and Kyrki}}}
\bibcite{tiboniDomainRandomizationEntropy2024}{{107}{2024}{{Tiboni et~al.}}{{Tiboni, Klink, Peters, Tommasi, D'Eramo, and Chalvatzaki}}}
\bibcite{tobinDomainRandomizationTransferring2017}{{108}{2017}{{Tobin et~al.}}{{Tobin, Fong, Ray, Schneider, Zaremba, and Abbeel}}}
\bibcite{tong2024cambrian}{{109}{2024}{{Tong et~al.}}{{Tong, Brown, Wu, Woo, IYER, Akula, Yang, Yang, Middepogu, Wang, et~al.}}}
\bibcite{touvronLlama2Open2023}{{110}{2023}{{Touvron et~al.}}{{Touvron, Martin, Stone, Albert, Almahairi, Babaei, Bashlykov, Batra, Bhargava, Bhosale, Bikel, Blecher, Ferrer, Chen, Cucurull, Esiobu, Fernandes, Fu, Fu, Fuller, Gao, Goswami, Goyal, Hartshorn, Hosseini, Hou, Inan, Kardas, Kerkez, Khabsa, Kloumann, Korenev, Koura, Lachaux, Lavril, Lee, Liskovich, Lu, Mao, Martinet, Mihaylov, Mishra, Molybog, Nie, Poulton, Reizenstein, Rungta, Saladi, Schelten, Silva, Smith, Subramanian, Tan, Tang, Taylor, Williams, Kuan, Xu, Yan, Zarov, Zhang, Fan, Kambadur, Narang, Rodriguez, Stojnic, Edunov, and Scialom}}}
\bibcite{tsimpoukelli2021multimodalfrozen}{{111}{2021}{{Tsimpoukelli et~al.}}{{Tsimpoukelli, Menick, Cabi, Eslami, Vinyals, and Hill}}}
\bibcite{vallaeys2024improveddepalm}{{112}{2024}{{Vallaeys et~al.}}{{Vallaeys, Shukor, Cord, and Verbeek}}}
\bibcite{wang2025internvideo2}{{113}{2025}{{Wang et~al.}}{{Wang, Li, Yan, He, Yu, Zeng, Wang, Ma, Huang, Gao, et~al.}}}
\bibcite{minicmpv2024}{{114}{2024}{{Yao et~al.}}{{Yao, Yu, Zhang, Wang, Cui, Zhu, Cai, Li, Zhao, He, Chen, Zhou, Zou, Zhang, Hu, Zheng, Zhou, Cai, Han, Zeng, Li, Liu, and Sun}}}
\bibcite{zhaiSigmoidLossLanguage2023}{{115}{2023}{{Zhai et~al.}}{{Zhai, Mustafa, Kolesnikov, and Beyer}}}
\bibcite{zhang2025videollama}{{116}{2025}{{Zhang et~al.}}{{Zhang, Li, Cheng, Hu, Yuan, Chen, Leng, Jiang, Zhang, Li, et~al.}}}
\bibcite{zhangWoCoCoLearningWholeBody2024}{{117}{2024}{{Zhang et~al.}}{{Zhang, Xiao, He, and Shi}}}
\bibcite{zhaoLearningFineGrainedBimanual2023}{{118}{2023}{{Zhao et~al.}}{{Zhao, Kumar, Levine, and Finn}}}
\bibcite{zhu2024minigpt}{{119}{2024}{{Zhu et~al.}}{{Zhu, Chen, Shen, Li, and Elhoseiny}}}
\bibcite{MMC4}{{120}{2023}{{Zhu et~al.}}{{Zhu, Hessel, Awadalla, Gadre, Dodge, Fang, Yu, Schmidt, Wang, and Choi}}}
\xdef \mintedoldcachechecksum{\detokenize{\minted@cachechecksum }}
\gdef \@abspage@last{76}
app/scripts/latex-to-mdx/input/main.bbl
CHANGED
@@ -1,23 +1,416 @@
\begin{thebibliography}{
\providecommand{\natexlab}[1]{#1}
\providecommand{\url}[1]{\texttt{#1}}
\expandafter\ifx\csname urlstyle\endcsname\relax
\providecommand{\doi}[1]{doi: #1}\else
\providecommand{\doi}{doi: \begingroup \urlstyle{rm}\Url}\fi
\bibitem[Lipman et~al.(2024)Lipman, Havasi, Holderrieth, Shaul, Le, Karrer, Chen, {Lopez-Paz}, {Ben-Hamu}, and Gat]{lipmanFlowMatchingGuide2024}
Yaron Lipman, Marton Havasi, Peter Holderrieth, Neta Shaul, Matt Le, Brian Karrer, Ricky T.~Q. Chen, David {Lopez-Paz}, Heli {Ben-Hamu}, and Itai Gat.
\newblock Flow {{Matching Guide}} and {{Code}}, December 2024.
\bibitem[Nakkiran et~al.(2024)Nakkiran, Bradley, Zhou, and Advani]{nakkiranStepbyStepDiffusionElementary2024}
Preetum Nakkiran, Arwen Bradley, Hattie Zhou, and Madhu Advani.
\newblock Step-by-{{Step Diffusion}}: {{An Elementary Tutorial}}, June 2024.
\bibitem[Prince(2023)]{prince2023understanding}
Simon~J.D. Prince.
\newblock \emph{Understanding Deep Learning}.
\newblock The MIT Press, 2023.
\bibitem[{Shalev-Shwartz} and {Ben-David}(2014)]{shalev-shwartzUnderstandingMachineLearning2014}
Shai {Shalev-Shwartz} and Shai {Ben-David}.
\newblock \emph{Understanding {{Machine Learning}}: {{From Theory}} to {{Algorithms}}}.
@@ -25,6 +418,15 @@ Shai {Shalev-Shwartz} and Shai {Ben-David}.
\newblock ISBN 978-1-107-05713-5 978-1-107-29801-9.
\newblock \doi{10.1017/CBO9781107298019}.
\bibitem[Siciliano and Khatib(2016)]{sicilianoSpringerHandbookRobotics2016}
Bruno Siciliano and Oussama Khatib, editors.
\newblock \emph{Springer {{Handbook}} of {{Robotics}}}.
@@ -32,12 +434,48 @@ Bruno Siciliano and Oussama Khatib, editors.
\newblock ISBN 978-3-319-32550-7 978-3-319-32552-1.
\newblock \doi{10.1007/978-3-319-32552-1}.
\bibitem[Sutton and Barto(2018)]{suttonReinforcementLearningIntroduction2018}
Richard~S. Sutton and Andrew~G. Barto.
\newblock \emph{Reinforcement Learning: An Introduction}.
\newblock Adaptive Computation and Machine Learning Series. The MIT Press, Cambridge, Massachusetts, second edition edition, 2018.
\newblock ISBN 978-0-262-03924-6.
\bibitem[Tedrake({\natexlab{a}})]{tedrakeRoboticManipulationPerception}
Russ Tedrake.
\newblock Robotic {{Manipulation}}. {{Perception}}, {{Planning}} and {{Control}}., {\natexlab{a}}.
@@ -46,4 +484,71 @@ Russ Tedrake.
Russ Tedrake.
\newblock Underactuated {{Robotics}}. {{Algorithms}} for {{Walking}}, {{Running}}, {{Swimming}}, {{Flying}}, and {{Manipulation}}, {\natexlab{b}}.
\end{thebibliography}
\begin{thebibliography}{120}
\providecommand{\natexlab}[1]{#1}
\providecommand{\url}[1]{\texttt{#1}}
\expandafter\ifx\csname urlstyle\endcsname\relax
\providecommand{\doi}[1]{doi: #1}\else
\providecommand{\doi}{doi: \begingroup \urlstyle{rm}\Url}\fi
\bibitem[Achiam(2018)]{SpinningUp2018}
|
| 9 |
+
Joshua Achiam.
|
| 10 |
+
\newblock Spinning up in deep reinforcement learning.
|
| 11 |
+
\newblock 2018.
|
| 12 |
+
|
| 13 |
+
\bibitem[Agrawal()]{agrawalComputationalSensorimotorLearning}
|
| 14 |
+
Pulkit Agrawal.
|
| 15 |
+
\newblock Computational {{Sensorimotor Learning}}.
|
| 16 |
+
|
| 17 |
+
\bibitem[Akkaya et~al.(2019)Akkaya, Andrychowicz, Chociej, Litwin, McGrew, Petron, Paino, Plappert, Powell, Ribas, Schneider, Tezak, Tworek, Welinder, Weng, Yuan, Zaremba, and Zhang]{akkayaSolvingRubiksCube2019}
|
| 18 |
+
Ilge Akkaya, Marcin Andrychowicz, Maciek Chociej, Mateusz Litwin, Bob McGrew, Arthur Petron, Alex Paino, Matthias Plappert, Glenn Powell, Raphael Ribas, Jonas Schneider, Nikolas Tezak, Jerry Tworek, Peter Welinder, Lilian Weng, Qiming Yuan, Wojciech Zaremba, and Lei Zhang.
|
| 19 |
+
\newblock Solving {{Rubik}}'s {{Cube}} with a {{Robot Hand}}, October 2019.
|
| 20 |
+
|
| 21 |
+
\bibitem[Alayrac et~al.(2022)Alayrac, Donahue, Luc, Miech, Barr, Hasson, Lenc, Mensch, Millican, Reynolds, Ring, Rutherford, Cabi, Han, Gong, Samangooei, Monteiro, Menick, Borgeaud, Brock, Nematzadeh, Sharifzadeh, Binkowski, Barreira, Vinyals, Zisserman, and Simonyan]{alayracFlamingoVisualLanguage2022}
|
| 22 |
+
Jean-Baptiste Alayrac, Jeff Donahue, Pauline Luc, Antoine Miech, Iain Barr, Yana Hasson, Karel Lenc, Arthur Mensch, Katie Millican, Malcolm Reynolds, Roman Ring, Eliza Rutherford, Serkan Cabi, Tengda Han, Zhitao Gong, Sina Samangooei, Marianne Monteiro, Jacob Menick, Sebastian Borgeaud, Andrew Brock, Aida Nematzadeh, Sahand Sharifzadeh, Mikolaj Binkowski, Ricardo Barreira, Oriol Vinyals, Andrew Zisserman, and Karen Simonyan.
|
| 23 |
+
\newblock Flamingo: A {{Visual Language Model}} for {{Few-Shot Learning}}, November 2022.
|
| 24 |
+
|
| 25 |
+
\bibitem[Aldaco et~al.()Aldaco, Armstrong, Baruch, Bingham, Chan, Dwibedi, Finn, Florence, Goodrich, Gramlich, Herzog, Hoech, Nguyen, Storz, Tabanpour, Tompson, Wahid, Wahrburg, Xu, Yaroshenko, and Zhao]{aldacoALOHA2Enhanced}
|
| 26 |
+
Jorge Aldaco, Travis Armstrong, Robert Baruch, Jeff Bingham, Sanky Chan, Debidatta Dwibedi, Chelsea Finn, Pete Florence, Spencer Goodrich, Wayne Gramlich, Alexander Herzog, Jonathan Hoech, Thinh Nguyen, Ian Storz, Baruch Tabanpour, Jonathan Tompson, Ayzaan Wahid, Ted Wahrburg, Sichun Xu, Sergey Yaroshenko, and Tony~Z Zhao.
|
| 27 |
+
\newblock {{ALOHA}} 2: {{An Enhanced Low-Cost Hardware}} for {{Bimanual Teleoperation}}.
|
| 28 |
+
|
| 29 |
+
\bibitem[Alizadeh and Zhu(2024)]{alizadehComprehensiveSurveySpace2024}
|
| 30 |
+
Mohammad Alizadeh and Zheng~H. Zhu.
|
| 31 |
+
\newblock A comprehensive survey of space robotic manipulators for on-orbit servicing.
|
| 32 |
+
\newblock \emph{Frontiers in Robotics and AI}, 11, October 2024.
|
| 33 |
+
\newblock ISSN 2296-9144.
|
| 34 |
+
\newblock \doi{10.3389/frobt.2024.1470950}.
|
| 35 |
+
|
| 36 |
+
\bibitem[Allal et~al.(2025)Allal, Lozhkov, Bakouch, Bl{\'a}zquez, Penedo, Tunstall, Marafioti, Kydl{\'i}{\v c}ek, Lajar{\'i}n, Srivastav, Lochner, Fahlgren, Nguyen, Fourrier, Burtenshaw, Larcher, Zhao, Zakka, Morlon, Raffel, von Werra, and Wolf]{allalSmolLM2WhenSmol2025}
|
| 37 |
+
Loubna~Ben Allal, Anton Lozhkov, Elie Bakouch, Gabriel~Mart{\'i}n Bl{\'a}zquez, Guilherme Penedo, Lewis Tunstall, Andr{\'e}s Marafioti, Hynek Kydl{\'i}{\v c}ek, Agust{\'i}n~Piqueres Lajar{\'i}n, Vaibhav Srivastav, Joshua Lochner, Caleb Fahlgren, Xuan-Son Nguyen, Cl{\'e}mentine Fourrier, Ben Burtenshaw, Hugo Larcher, Haojun Zhao, Cyril Zakka, Mathieu Morlon, Colin Raffel, Leandro von Werra, and Thomas Wolf.
|
| 38 |
+
\newblock {{SmolLM2}}: {{When Smol Goes Big}} -- {{Data-Centric Training}} of a {{Small Language Model}}, February 2025.
|
| 39 |
+
|
| 40 |
+
\bibitem[Antonova et~al.(2017)Antonova, Cruciani, Smith, and Kragic]{antonovaReinforcementLearningPivoting2017}
|
| 41 |
+
Rika Antonova, Silvia Cruciani, Christian Smith, and Danica Kragic.
|
| 42 |
+
\newblock Reinforcement {{Learning}} for {{Pivoting Task}}, March 2017.
|
| 43 |
+
|
| 44 |
+
\bibitem[Bai et~al.(2025)Bai, Chen, Liu, Wang, Ge, Song, Dang, Wang, Wang, Tang, Zhong, Zhu, Yang, Li, Wan, Wang, Ding, Fu, Xu, Ye, Zhang, Xie, Cheng, Zhang, Yang, Xu, and Lin]{bai2025qwen25vl}
|
| 45 |
+
Shuai Bai, Keqin Chen, Xuejing Liu, Jialin Wang, Wenbin Ge, Sibo Song, Kai Dang, Peng Wang, Shijie Wang, Jun Tang, Humen Zhong, Yuanzhi Zhu, Mingkun Yang, Zhaohai Li, Jianqiang Wan, Pengfei Wang, Wei Ding, Zheren Fu, Yiheng Xu, Jiabo Ye, Xi~Zhang, Tianbao Xie, Zesen Cheng, Hang Zhang, Zhibo Yang, Haiyang Xu, and Junyang Lin.
|
| 46 |
+
\newblock Qwen2.5-{{VL}} technical report, 2025.
|
| 47 |
+
|
| 48 |
+
\bibitem[Ball et~al.(2023)Ball, Smith, Kostrikov, and Levine]{ballEfficientOnlineReinforcement2023}
|
| 49 |
+
Philip~J. Ball, Laura Smith, Ilya Kostrikov, and Sergey Levine.
|
| 50 |
+
\newblock Efficient {{Online Reinforcement Learning}} with {{Offline Data}}, May 2023.
|
| 51 |
+
|
| 52 |
+
\bibitem[Bekris et~al.(2024)Bekris, Doerr, Meng, and Tangirala]{bekrisStateRobotMotion2024}
|
| 53 |
+
Kostas~E. Bekris, Joe Doerr, Patrick Meng, and Sumanth Tangirala.
|
| 54 |
+
\newblock The {{State}} of {{Robot Motion Generation}}, October 2024.
|
| 55 |
+
|
| 56 |
+
\bibitem[Bellemare et~al.(2020)Bellemare, Candido, Castro, Gong, Machado, Moitra, Ponda, and Wang]{bellemareAutonomousNavigationStratospheric2020}
|
| 57 |
+
Marc~G. Bellemare, Salvatore Candido, Pablo~Samuel Castro, Jun Gong, Marlos~C. Machado, Subhodeep Moitra, Sameera~S. Ponda, and Ziyu Wang.
|
| 58 |
+
\newblock Autonomous navigation of stratospheric balloons using reinforcement learning.
|
| 59 |
+
\newblock \emph{Nature}, 588\penalty0 (7836):\penalty0 77--82, December 2020.
|
| 60 |
+
\newblock ISSN 1476-4687.
|
| 61 |
+
\newblock \doi{10.1038/s41586-020-2939-8}.
|
| 62 |
+
|
| 63 |
+
\bibitem[Bellman(1957)]{bellmanMarkovianDecisionProcess1957}
|
| 64 |
+
Richard Bellman.
|
| 65 |
+
\newblock A {{Markovian Decision Process}}.
|
| 66 |
+
\newblock \emph{Journal of Mathematics and Mechanics}, 6\penalty0 (5):\penalty0 679--684, 1957.
|
| 67 |
+
\newblock ISSN 0095-9057.
|
| 68 |
+
|
| 69 |
+
\bibitem[Bjorck et~al.(2025)Bjorck, Casta{\~n}eda, Cherniadev, Da, Ding, Fan, Fang, Fox, Hu, Huang, Jang, Jiang, Kautz, Kundalia, Lao, Li, Lin, Lin, Liu, Llontop, Magne, Mandlekar, Narayan, Nasiriany, Reed, Tan, Wang, Wang, Wang, Wang, Xiang, Xie, Xu, Xu, Ye, Yu, Zhang, Zhang, Zhao, Zheng, and Zhu]{bjorckGR00TN1Open2025}
|
| 70 |
+
Johan Bjorck, Fernando Casta{\~n}eda, Nikita Cherniadev, Xingye Da, Runyu Ding, Linxi~"Jim" Fan, Yu~Fang, Dieter Fox, Fengyuan Hu, Spencer Huang, Joel Jang, Zhenyu Jiang, Jan Kautz, Kaushil Kundalia, Lawrence Lao, Zhiqi Li, Zongyu Lin, Kevin Lin, Guilin Liu, Edith Llontop, Loic Magne, Ajay Mandlekar, Avnish Narayan, Soroush Nasiriany, Scott Reed, You~Liang Tan, Guanzhi Wang, Zu~Wang, Jing Wang, Qi~Wang, Jiannan Xiang, Yuqi Xie, Yinzhen Xu, Zhenjia Xu, Seonghyeon Ye, Zhiding Yu, Ao~Zhang, Hao Zhang, Yizhou Zhao, Ruijie Zheng, and Yuke Zhu.
|
| 71 |
+
\newblock {{GR00T N1}}: {{An Open Foundation Model}} for {{Generalist Humanoid Robots}}, March 2025.
|
| 72 |
+
|
| 73 |
+
\bibitem[Black et~al.(2024)Black, Brown, Driess, Esmail, Equi, Finn, Fusai, Groom, Hausman, Ichter, Jakubczak, Jones, Ke, Levine, {Li-Bell}, Mothukuri, Nair, Pertsch, Shi, Tanner, Vuong, Walling, Wang, and Zhilinsky]{black$p_0$VisionLanguageActionFlow2024}
|
| 74 |
+
Kevin Black, Noah Brown, Danny Driess, Adnan Esmail, Michael Equi, Chelsea Finn, Niccolo Fusai, Lachy Groom, Karol Hausman, Brian Ichter, Szymon Jakubczak, Tim Jones, Liyiming Ke, Sergey Levine, Adrian {Li-Bell}, Mohith Mothukuri, Suraj Nair, Karl Pertsch, Lucy~Xiaoyang Shi, James Tanner, Quan Vuong, Anna Walling, Haohuan Wang, and Ury Zhilinsky.
|
| 75 |
+
\newblock \${$\pi\_$}0\$: {{A Vision-Language-Action Flow Model}} for {{General Robot Control}}, October 2024.
|
| 76 |
+
|
| 77 |
+
\bibitem[Brohan et~al.(2023{\natexlab{a}})Brohan, Brown, Carbajal, Chebotar, Chen, Choromanski, Ding, Driess, Dubey, Finn, Florence, Fu, Arenas, Gopalakrishnan, Han, Hausman, Herzog, Hsu, Ichter, Irpan, Joshi, Julian, Kalashnikov, Kuang, Leal, Lee, Lee, Levine, Lu, Michalewski, Mordatch, Pertsch, Rao, Reymann, Ryoo, Salazar, Sanketi, Sermanet, Singh, Singh, Soricut, Tran, Vanhoucke, Vuong, Wahid, Welker, Wohlhart, Wu, Xia, Xiao, Xu, Xu, Yu, and Zitkovich]{brohanRT2VisionLanguageActionModels2023}
Anthony Brohan, Noah Brown, Justice Carbajal, Yevgen Chebotar, Xi~Chen, Krzysztof Choromanski, Tianli Ding, Danny Driess, Avinava Dubey, Chelsea Finn, Pete Florence, Chuyuan Fu, Montse~Gonzalez Arenas, Keerthana Gopalakrishnan, Kehang Han, Karol Hausman, Alexander Herzog, Jasmine Hsu, Brian Ichter, Alex Irpan, Nikhil Joshi, Ryan Julian, Dmitry Kalashnikov, Yuheng Kuang, Isabel Leal, Lisa Lee, Tsang-Wei~Edward Lee, Sergey Levine, Yao Lu, Henryk Michalewski, Igor Mordatch, Karl Pertsch, Kanishka Rao, Krista Reymann, Michael Ryoo, Grecia Salazar, Pannag Sanketi, Pierre Sermanet, Jaspiar Singh, Anikait Singh, Radu Soricut, Huong Tran, Vincent Vanhoucke, Quan Vuong, Ayzaan Wahid, Stefan Welker, Paul Wohlhart, Jialin Wu, Fei Xia, Ted Xiao, Peng Xu, Sichun Xu, Tianhe Yu, and Brianna Zitkovich.
\newblock {{RT-2}}: {{Vision-Language-Action Models Transfer Web Knowledge}} to {{Robotic Control}}, July 2023{\natexlab{a}}.
\bibitem[Brohan et~al.(2023{\natexlab{b}})Brohan, Brown, Carbajal, Chebotar, Dabis, Finn, Gopalakrishnan, Hausman, Herzog, Hsu, Ibarz, Ichter, Irpan, Jackson, Jesmonth, Joshi, Julian, Kalashnikov, Kuang, Leal, Lee, Levine, Lu, Malla, Manjunath, Mordatch, Nachum, Parada, Peralta, Perez, Pertsch, Quiambao, Rao, Ryoo, Salazar, Sanketi, Sayed, Singh, Sontakke, Stone, Tan, Tran, Vanhoucke, Vega, Vuong, Xia, Xiao, Xu, Xu, Yu, and Zitkovich]{brohanRT1RoboticsTransformer2023}
Anthony Brohan, Noah Brown, Justice Carbajal, Yevgen Chebotar, Joseph Dabis, Chelsea Finn, Keerthana Gopalakrishnan, Karol Hausman, Alex Herzog, Jasmine Hsu, Julian Ibarz, Brian Ichter, Alex Irpan, Tomas Jackson, Sally Jesmonth, Nikhil~J. Joshi, Ryan Julian, Dmitry Kalashnikov, Yuheng Kuang, Isabel Leal, Kuang-Huei Lee, Sergey Levine, Yao Lu, Utsav Malla, Deeksha Manjunath, Igor Mordatch, Ofir Nachum, Carolina Parada, Jodilyn Peralta, Emily Perez, Karl Pertsch, Jornell Quiambao, Kanishka Rao, Michael Ryoo, Grecia Salazar, Pannag Sanketi, Kevin Sayed, Jaspiar Singh, Sumedh Sontakke, Austin Stone, Clayton Tan, Huong Tran, Vincent Vanhoucke, Steve Vega, Quan Vuong, Fei Xia, Ted Xiao, Peng Xu, Sichun Xu, Tianhe Yu, and Brianna Zitkovich.
\newblock {{RT-1}}: {{Robotics Transformer}} for {{Real-World Control}} at {{Scale}}, August 2023{\natexlab{b}}.
\bibitem[Brown et~al.(2020)Brown, Mann, Ryder, Subbiah, Kaplan, Dhariwal, Neelakantan, Shyam, Sastry, Askell, Agarwal, {Herbert-Voss}, Krueger, Henighan, Child, Ramesh, Ziegler, Wu, Winter, Hesse, Chen, Sigler, Litwin, Gray, Chess, Clark, Berner, McCandlish, Radford, Sutskever, and Amodei]{brownLanguageModelsAre2020}
Tom~B. Brown, Benjamin Mann, Nick Ryder, Melanie Subbiah, Jared Kaplan, Prafulla Dhariwal, Arvind Neelakantan, Pranav Shyam, Girish Sastry, Amanda Askell, Sandhini Agarwal, Ariel {Herbert-Voss}, Gretchen Krueger, Tom Henighan, Rewon Child, Aditya Ramesh, Daniel~M. Ziegler, Jeffrey Wu, Clemens Winter, Christopher Hesse, Mark Chen, Eric Sigler, Mateusz Litwin, Scott Gray, Benjamin Chess, Jack Clark, Christopher Berner, Sam McCandlish, Alec Radford, Ilya Sutskever, and Dario Amodei.
\newblock Language {{Models}} are {{Few-Shot Learners}}, July 2020.
\bibitem[Byeon et~al.(2022)Byeon, Park, Kim, Lee, Baek, and Kim]{kakaobrain2022coyo700m}
Minwoo Byeon, Beomhee Park, Haecheon Kim, Sungjun Lee, Woonhyuk Baek, and Saehoon Kim.
\newblock {{COYO-700M}}: {{Image-text}} pair dataset, 2022.
\bibitem[Chebotar et~al.(2019)Chebotar, Handa, Makoviychuk, Macklin, Issac, Ratliff, and Fox]{chebotarClosingSimtorealLoop2019}
Yevgen Chebotar, Ankur Handa, Viktor Makoviychuk, Miles Macklin, Jan Issac, Nathan Ratliff, and Dieter Fox.
\newblock Closing the sim-to-real loop: {{Adapting}} simulation randomization with real world experience.
\newblock In \emph{2019 {{International Conference}} on {{Robotics}} and {{Automation}} ({{ICRA}})}, pages 8973--8979. IEEE, 2019.
\bibitem[Chen et~al.(2023)Chen, Djolonga, Padlewski, Mustafa, Changpinyo, Wu, Ruiz, Goodman, Wang, Tay, Shakeri, Dehghani, Salz, Lucic, Tschannen, Nagrani, Hu, Joshi, Pang, Montgomery, Pietrzyk, Ritter, Piergiovanni, Minderer, Pavetic, Waters, Li, Alabdulmohsin, Beyer, Amelot, Lee, Steiner, Li, Keysers, Arnab, Xu, Rong, Kolesnikov, Seyedhosseini, Angelova, Zhai, Houlsby, and Soricut]{chenPaLIXScalingMultilingual2023}
Xi~Chen, Josip Djolonga, Piotr Padlewski, Basil Mustafa, Soravit Changpinyo, Jialin Wu, Carlos~Riquelme Ruiz, Sebastian Goodman, Xiao Wang, Yi~Tay, Siamak Shakeri, Mostafa Dehghani, Daniel Salz, Mario Lucic, Michael Tschannen, Arsha Nagrani, Hexiang Hu, Mandar Joshi, Bo~Pang, Ceslee Montgomery, Paulina Pietrzyk, Marvin Ritter, A.~J. Piergiovanni, Matthias Minderer, Filip Pavetic, Austin Waters, Gang Li, Ibrahim Alabdulmohsin, Lucas Beyer, Julien Amelot, Kenton Lee, Andreas~Peter Steiner, Yang Li, Daniel Keysers, Anurag Arnab, Yuanzhong Xu, Keran Rong, Alexander Kolesnikov, Mojtaba Seyedhosseini, Anelia Angelova, Xiaohua Zhai, Neil Houlsby, and Radu Soricut.
\newblock {{PaLI-X}}: {{On Scaling}} up a {{Multilingual Vision}} and {{Language Model}}, May 2023.
\bibitem[Chi et~al.(2024)Chi, Xu, Feng, Cousineau, Du, Burchfiel, Tedrake, and Song]{chiDiffusionPolicyVisuomotor2024}
Cheng Chi, Zhenjia Xu, Siyuan Feng, Eric Cousineau, Yilun Du, Benjamin Burchfiel, Russ Tedrake, and Shuran Song.
\newblock Diffusion {{Policy}}: {{Visuomotor Policy Learning}} via {{Action Diffusion}}, March 2024.
\bibitem[Connell and Mahadevan(1993)]{connellRobotLearning1993}
Jonathan~H. Connell and Sridhar Mahadevan, editors.
\newblock \emph{Robot {{Learning}}}.
\newblock Springer US, Boston, MA, 1993.
\newblock ISBN 978-1-4613-6396-5 978-1-4615-3184-5.
\newblock \doi{10.1007/978-1-4615-3184-5}.
\bibitem[Dai et~al.(2023)Dai, Li, Li, Tiong, Zhao, Wang, Li, Fung, and Hoi]{InstructBLIP}
Wenliang Dai, Junnan Li, Dongxu Li, Anthony Tiong, Junqi Zhao, Weisheng Wang, Boyang Li, Pascale Fung, and Steven Hoi.
\newblock {{InstructBLIP}}: {{Towards}} general-purpose vision-language models with instruction tuning.
\newblock In \emph{Thirty-Seventh Conference on Neural Information Processing Systems}, 2023.
\bibitem[Degrave et~al.(2022)Degrave, Felici, Buchli, Neunert, Tracey, Carpanese, Ewalds, Hafner, Abdolmaleki, {de las Casas}, Donner, Fritz, Galperti, Huber, Keeling, Tsimpoukelli, Kay, Merle, Moret, Noury, Pesamosca, Pfau, Sauter, Sommariva, Coda, Duval, Fasoli, Kohli, Kavukcuoglu, Hassabis, and Riedmiller]{degraveMagneticControlTokamak2022}
Jonas Degrave, Federico Felici, Jonas Buchli, Michael Neunert, Brendan Tracey, Francesco Carpanese, Timo Ewalds, Roland Hafner, Abbas Abdolmaleki, Diego {de las Casas}, Craig Donner, Leslie Fritz, Cristian Galperti, Andrea Huber, James Keeling, Maria Tsimpoukelli, Jackie Kay, Antoine Merle, Jean-Marc Moret, Seb Noury, Federico Pesamosca, David Pfau, Olivier Sauter, Cristian Sommariva, Stefano Coda, Basil Duval, Ambrogio Fasoli, Pushmeet Kohli, Koray Kavukcuoglu, Demis Hassabis, and Martin Riedmiller.
\newblock Magnetic control of tokamak plasmas through deep reinforcement learning.
\newblock \emph{Nature}, 602\penalty0 (7897):\penalty0 414--419, February 2022.
\newblock ISSN 1476-4687.
\newblock \doi{10.1038/s41586-021-04301-9}.
\bibitem[Deng et~al.(2009)Deng, Li, Do, Su, and {Fei-Fei}]{ImageNet_VSS09}
J.~Deng, K.~Li, M.~Do, H.~Su, and L.~{Fei-Fei}.
\newblock Construction and analysis of a large scale image ontology.
\newblock Vision Sciences Society, 2009.
\bibitem[Devlin et~al.(2019)Devlin, Chang, Lee, and Toutanova]{devlinBERTPretrainingDeep2019}
Jacob Devlin, Ming-Wei Chang, Kenton Lee, and Kristina Toutanova.
\newblock {{BERT}}: {{Pre-training}} of {{Deep Bidirectional Transformers}} for {{Language Understanding}}, May 2019.
\bibitem[Driess et~al.(2023)Driess, Xia, Sajjadi, Lynch, Chowdhery, Ichter, Wahid, Tompson, Vuong, Yu, Huang, Chebotar, Sermanet, Duckworth, Levine, Vanhoucke, Hausman, Toussaint, Greff, Zeng, Mordatch, and Florence]{driessPaLMEEmbodiedMultimodal2023}
Danny Driess, Fei Xia, Mehdi S.~M. Sajjadi, Corey Lynch, Aakanksha Chowdhery, Brian Ichter, Ayzaan Wahid, Jonathan Tompson, Quan Vuong, Tianhe Yu, Wenlong Huang, Yevgen Chebotar, Pierre Sermanet, Daniel Duckworth, Sergey Levine, Vincent Vanhoucke, Karol Hausman, Marc Toussaint, Klaus Greff, Andy Zeng, Igor Mordatch, and Pete Florence.
\newblock {{PaLM-E}}: {{An Embodied Multimodal Language Model}}, March 2023.
\bibitem[Driess et~al.(2025)Driess, Springenberg, Ichter, Yu, {Li-Bell}, Pertsch, Ren, Walke, Vuong, Shi, and Levine]{driessKnowledgeInsulatingVisionLanguageAction2025}
Danny Driess, Jost~Tobias Springenberg, Brian Ichter, Lili Yu, Adrian {Li-Bell}, Karl Pertsch, Allen~Z. Ren, Homer Walke, Quan Vuong, Lucy~Xiaoyang Shi, and Sergey Levine.
\newblock Knowledge {{Insulating Vision-Language-Action Models}}: {{Train Fast}}, {{Run Fast}}, {{Generalize Better}}, May 2025.
\bibitem[Esser et~al.(2024)Esser, Kulal, Blattmann, Entezari, M{\"u}ller, Saini, Levi, Lorenz, Sauer, Boesel, Podell, Dockhorn, English, Lacey, Goodwin, Marek, and Rombach]{esserScalingRectifiedFlow2024}
Patrick Esser, Sumith Kulal, Andreas Blattmann, Rahim Entezari, Jonas M{\"u}ller, Harry Saini, Yam Levi, Dominik Lorenz, Axel Sauer, Frederic Boesel, Dustin Podell, Tim Dockhorn, Zion English, Kyle Lacey, Alex Goodwin, Yannik Marek, and Robin Rombach.
\newblock Scaling {{Rectified Flow Transformers}} for {{High-Resolution Image Synthesis}}, March 2024.
\bibitem[Fedus et~al.(2022)Fedus, Dean, and Zoph]{fedusReviewSparseExpert2022}
William Fedus, Jeff Dean, and Barret Zoph.
\newblock A {{Review}} of {{Sparse Expert Models}} in {{Deep Learning}}, September 2022.
\bibitem[Fini et~al.(2024)Fini, Shukor, Li, Dufter, Klein, Haldimann, Aitharaju, da~Costa, B{\'e}thune, Gan, Toshev, Eichner, Nabi, Yang, Susskind, and {El-Nouby}]{finiMultimodalAutoregressivePretraining2024}
Enrico Fini, Mustafa Shukor, Xiujun Li, Philipp Dufter, Michal Klein, David Haldimann, Sai Aitharaju, Victor Guilherme~Turrisi da~Costa, Louis B{\'e}thune, Zhe Gan, Alexander~T. Toshev, Marcin Eichner, Moin Nabi, Yinfei Yang, Joshua~M. Susskind, and Alaaeldin {El-Nouby}.
\newblock Multimodal {{Autoregressive Pre-training}} of {{Large Vision Encoders}}, November 2024.
\bibitem[Florence et~al.(2022)Florence, Lynch, Zeng, Ramirez, Wahid, Downs, Wong, Lee, Mordatch, and Tompson]{florenceImplicitBehavioralCloning2022}
Pete Florence, Corey Lynch, Andy Zeng, Oscar~A. Ramirez, Ayzaan Wahid, Laura Downs, Adrian Wong, Johnny Lee, Igor Mordatch, and Jonathan Tompson.
\newblock Implicit {{Behavioral Cloning}}.
\newblock In \emph{Proceedings of the 5th {{Conference}} on {{Robot Learning}}}, pages 158--168. PMLR, January 2022.
\bibitem[Fujita et~al.(2020)Fujita, Soda, Murata, and Tsuhari]{fujitaDevelopmentRobotsNuclear2020}
Jun Fujita, Daisuke Soda, Chotaro Murata, and Hiroyuki Tsuhari.
\newblock Development of {{Robots}} for {{Nuclear Power Plants}} and {{Their Application}} to {{New Fields}}.
\newblock 57\penalty0 (4), 2020.
\bibitem[Grattafiori et~al.(2024)Grattafiori, Dubey, Jauhri, Pandey, Kadian, {Al-Dahle}, Letman, Mathur, Schelten, Vaughan, Yang, Fan, Goyal, Hartshorn, Yang, Mitra, Sravankumar, Korenev, Hinsvark, Rao, Zhang, Rodriguez, Gregerson, Spataru, Roziere, Biron, Tang, Chern, Caucheteux, Nayak, Bi, Marra, McConnell, Keller, Touret, Wu, Wong, Ferrer, Nikolaidis, Allonsius, Song, Pintz, Livshits, Wyatt, Esiobu, Choudhary, Mahajan, {Garcia-Olano}, Perino, Hupkes, Lakomkin, AlBadawy, Lobanova, Dinan, Smith, Radenovic, Guzm{\'a}n, Zhang, Synnaeve, Lee, Anderson, Thattai, Nail, Mialon, Pang, Cucurell, Nguyen, Korevaar, Xu, Touvron, Zarov, Ibarra, Kloumann, Misra, Evtimov, Zhang, Copet, Lee, Geffert, Vranes, Park, Mahadeokar, Shah, van~der Linde, Billock, Hong, Lee, Fu, Chi, Huang, Liu, Wang, Yu, Bitton, Spisak, Park, Rocca, Johnstun, Saxe, Jia, Alwala, Prasad, Upasani, Plawiak, Li, Heafield, Stone, {El-Arini}, Iyer, Malik, Chiu, Bhalla, Lakhotia, {Rantala-Yeary}, van~der Maaten, Chen, Tan, Jenkins, Martin, Madaan, Malo, Blecher, Landzaat, de~Oliveira, Muzzi, Pasupuleti, Singh, Paluri, Kardas, Tsimpoukelli, Oldham, Rita, Pavlova, Kambadur, Lewis, Si, Singh, Hassan, Goyal, Torabi, Bashlykov, Bogoychev, Chatterji, Zhang, Duchenne, {\c C}elebi, Alrassy, Zhang, Li, Vasic, Weng, Bhargava, Dubal, Krishnan, Koura, Xu, He, Dong, Srinivasan, Ganapathy, Calderer, Cabral, Stojnic, Raileanu, Maheswari, Girdhar, Patel, Sauvestre, Polidoro, Sumbaly, Taylor, Silva, Hou, Wang, Hosseini, Chennabasappa, Singh, Bell, Kim, Edunov, Nie, Narang, Raparthy, Shen, Wan, Bhosale, Zhang, Vandenhende, Batra, Whitman, Sootla, Collot, Gururangan, Borodinsky, Herman, Fowler, Sheasha, Georgiou, Scialom, Speckbacher, Mihaylov, Xiao, Karn, Goswami, Gupta, Ramanathan, Kerkez, Gonguet, Do, Vogeti, Albiero, Petrovic, Chu, Xiong, Fu, Meers, Martinet, Wang, Wang, Tan, Xia, Xie, Jia, Wang, Goldschlag, Gaur, Babaei, Wen, Song, Zhang, Li, Mao, Coudert, Yan, Chen, Papakipos, Singh, Srivastava, Jain, Kelsey, Shajnfeld, 
Gangidi, Victoria, Goldstand, Menon, Sharma, Boesenberg, Baevski, Feinstein, Kallet, Sangani, Teo, Yunus, Lupu, Alvarado, Caples, Gu, Ho, Poulton, Ryan, Ramchandani, Dong, Franco, Goyal, Saraf, Chowdhury, Gabriel, Bharambe, Eisenman, Yazdan, James, Maurer, Leonhardi, Huang, Loyd, Paola, Paranjape, Liu, Wu, Ni, Hancock, Wasti, Spence, Stojkovic, Gamido, Montalvo, Parker, Burton, Mejia, Liu, Wang, Kim, Zhou, Hu, Chu, Cai, Tindal, Feichtenhofer, Gao, Civin, Beaty, Kreymer, Li, Adkins, Xu, Testuggine, David, Parikh, Liskovich, Foss, Wang, Le, Holland, Dowling, Jamil, Montgomery, Presani, Hahn, Wood, Le, Brinkman, Arcaute, Dunbar, Smothers, Sun, Kreuk, Tian, Kokkinos, Ozgenel, Caggioni, Kanayet, Seide, Florez, Schwarz, Badeer, Swee, Halpern, Herman, Sizov, Guangyi, Zhang, Lakshminarayanan, Inan, Shojanazeri, Zou, Wang, Zha, Habeeb, Rudolph, Suk, Aspegren, Goldman, Zhan, Damlaj, Molybog, Tufanov, Leontiadis, Veliche, Gat, Weissman, Geboski, Kohli, Lam, Asher, Gaya, Marcus, Tang, Chan, Zhen, Reizenstein, Teboul, Zhong, Jin, Yang, Cummings, Carvill, Shepard, McPhie, Torres, Ginsburg, Wang, Wu, U, Saxena, Khandelwal, Zand, Matosich, Veeraraghavan, Michelena, Li, Jagadeesh, Huang, Chawla, Huang, Chen, Garg, A, Silva, Bell, Zhang, Guo, Yu, Moshkovich, Wehrstedt, Khabsa, Avalani, Bhatt, Mankus, Hasson, Lennie, Reso, Groshev, Naumov, Lathi, Keneally, Liu, Seltzer, Valko, Restrepo, Patel, Vyatskov, Samvelyan, Clark, Macey, Wang, Hermoso, Metanat, Rastegari, Bansal, Santhanam, Parks, White, Bawa, Singhal, Egebo, Usunier, Mehta, Laptev, Dong, Cheng, Chernoguz, Hart, Salpekar, Kalinli, Kent, Parekh, Saab, Balaji, Rittner, Bontrager, Roux, Dollar, Zvyagina, Ratanchandani, Yuvraj, Liang, Alao, Rodriguez, Ayub, Murthy, Nayani, Mitra, Parthasarathy, Li, Hogan, Battey, Wang, Howes, Rinott, Mehta, Siby, Bondu, Datta, Chugh, Hunt, Dhillon, Sidorov, Pan, Mahajan, Verma, Yamamoto, Ramaswamy, Lindsay, Lindsay, Feng, Lin, Zha, Patil, Shankar, Zhang, Zhang, Wang, Agarwal, Sajuyigbe, Chintala, 
Max, Chen, Kehoe, Satterfield, Govindaprasad, Gupta, Deng, Cho, Virk, Subramanian, Choudhury, Goldman, Remez, Glaser, Best, Koehler, Robinson, Li, Zhang, Matthews, Chou, Shaked, Vontimitta, Ajayi, Montanez, Mohan, Kumar, Mangla, Ionescu, Poenaru, Mihailescu, Ivanov, Li, Wang, Jiang, Bouaziz, Constable, Tang, Wu, Wang, Wu, Gao, Kleinman, Chen, Hu, Jia, Qi, Li, Zhang, Zhang, Adi, Nam, Yu, Wang, Zhao, Hao, Qian, Li, He, Rait, DeVito, Rosnbrick, Wen, Yang, Zhao, and Ma]{grattafioriLlama3Herd2024}
Aaron Grattafiori, Abhimanyu Dubey, Abhinav Jauhri, Abhinav Pandey, Abhishek Kadian, Ahmad {Al-Dahle}, Aiesha Letman, Akhil Mathur, Alan Schelten, Alex Vaughan, Amy Yang, Angela Fan, Anirudh Goyal, Anthony Hartshorn, Aobo Yang, Archi Mitra, Archie Sravankumar, Artem Korenev, Arthur Hinsvark, Arun Rao, Aston Zhang, Aurelien Rodriguez, Austen Gregerson, Ava Spataru, Baptiste Roziere, Bethany Biron, Binh Tang, Bobbie Chern, Charlotte Caucheteux, Chaya Nayak, Chloe Bi, Chris Marra, Chris McConnell, Christian Keller, Christophe Touret, Chunyang Wu, Corinne Wong, Cristian~Canton Ferrer, Cyrus Nikolaidis, Damien Allonsius, Daniel Song, Danielle Pintz, Danny Livshits, Danny Wyatt, David Esiobu, Dhruv Choudhary, Dhruv Mahajan, Diego {Garcia-Olano}, Diego Perino, Dieuwke Hupkes, Egor Lakomkin, Ehab AlBadawy, Elina Lobanova, Emily Dinan, Eric~Michael Smith, Filip Radenovic, Francisco Guzm{\'a}n, Frank Zhang, Gabriel Synnaeve, Gabrielle Lee, Georgia~Lewis Anderson, Govind Thattai, Graeme Nail, Gregoire Mialon, Guan Pang, Guillem Cucurell, Hailey Nguyen, Hannah Korevaar, Hu~Xu, Hugo Touvron, Iliyan Zarov, Imanol~Arrieta Ibarra, Isabel Kloumann, Ishan Misra, Ivan Evtimov, Jack Zhang, Jade Copet, Jaewon Lee, Jan Geffert, Jana Vranes, Jason Park, Jay Mahadeokar, Jeet Shah, Jelmer van~der Linde, Jennifer Billock, Jenny Hong, Jenya Lee, Jeremy Fu, Jianfeng Chi, Jianyu Huang, Jiawen Liu, Jie Wang, Jiecao Yu, Joanna Bitton, Joe Spisak, Jongsoo Park, Joseph Rocca, Joshua Johnstun, Joshua Saxe, Junteng Jia, Kalyan~Vasuden Alwala, Karthik Prasad, Kartikeya Upasani, Kate Plawiak, Ke~Li, Kenneth Heafield, Kevin Stone, Khalid {El-Arini}, Krithika Iyer, Kshitiz Malik, Kuenley Chiu, Kunal Bhalla, Kushal Lakhotia, Lauren {Rantala-Yeary}, Laurens van~der Maaten, Lawrence Chen, Liang Tan, Liz Jenkins, Louis Martin, Lovish Madaan, Lubo Malo, Lukas Blecher, Lukas Landzaat, Luke de~Oliveira, Madeline Muzzi, Mahesh Pasupuleti, Mannat Singh, Manohar Paluri, Marcin Kardas, Maria Tsimpoukelli, Mathew 
Oldham, Mathieu Rita, Maya Pavlova, Melanie Kambadur, Mike Lewis, Min Si, Mitesh~Kumar Singh, Mona Hassan, Naman Goyal, Narjes Torabi, Nikolay Bashlykov, Nikolay Bogoychev, Niladri Chatterji, Ning Zhang, Olivier Duchenne, Onur {\c C}elebi, Patrick Alrassy, Pengchuan Zhang, Pengwei Li, Petar Vasic, Peter Weng, Prajjwal Bhargava, Pratik Dubal, Praveen Krishnan, Punit~Singh Koura, Puxin Xu, Qing He, Qingxiao Dong, Ragavan Srinivasan, Raj Ganapathy, Ramon Calderer, Ricardo~Silveira Cabral, Robert Stojnic, Roberta Raileanu, Rohan Maheswari, Rohit Girdhar, Rohit Patel, Romain Sauvestre, Ronnie Polidoro, Roshan Sumbaly, Ross Taylor, Ruan Silva, Rui Hou, Rui Wang, Saghar Hosseini, Sahana Chennabasappa, Sanjay Singh, Sean Bell, Seohyun~Sonia Kim, Sergey Edunov, Shaoliang Nie, Sharan Narang, Sharath Raparthy, Sheng Shen, Shengye Wan, Shruti Bhosale, Shun Zhang, Simon Vandenhende, Soumya Batra, Spencer Whitman, Sten Sootla, Stephane Collot, Suchin Gururangan, Sydney Borodinsky, Tamar Herman, Tara Fowler, Tarek Sheasha, Thomas Georgiou, Thomas Scialom, Tobias Speckbacher, Todor Mihaylov, Tong Xiao, Ujjwal Karn, Vedanuj Goswami, Vibhor Gupta, Vignesh Ramanathan, Viktor Kerkez, Vincent Gonguet, Virginie Do, Vish Vogeti, V{\'i}tor Albiero, Vladan Petrovic, Weiwei Chu, Wenhan Xiong, Wenyin Fu, Whitney Meers, Xavier Martinet, Xiaodong Wang, Xiaofang Wang, Xiaoqing~Ellen Tan, Xide Xia, Xinfeng Xie, Xuchao Jia, Xuewei Wang, Yaelle Goldschlag, Yashesh Gaur, Yasmine Babaei, Yi~Wen, Yiwen Song, Yuchen Zhang, Yue Li, Yuning Mao, Zacharie~Delpierre Coudert, Zheng Yan, Zhengxing Chen, Zoe Papakipos, Aaditya Singh, Aayushi Srivastava, Abha Jain, Adam Kelsey, Adam Shajnfeld, Adithya Gangidi, Adolfo Victoria, Ahuva Goldstand, Ajay Menon, Ajay Sharma, Alex Boesenberg, Alexei Baevski, Allie Feinstein, Amanda Kallet, Amit Sangani, Amos Teo, Anam Yunus, Andrei Lupu, Andres Alvarado, Andrew Caples, Andrew Gu, Andrew Ho, Andrew Poulton, Andrew Ryan, Ankit Ramchandani, Annie Dong, Annie Franco, Anuj 
Goyal, Aparajita Saraf, Arkabandhu Chowdhury, Ashley Gabriel, Ashwin Bharambe, Assaf Eisenman, Azadeh Yazdan, Beau James, Ben Maurer, Benjamin Leonhardi, Bernie Huang, Beth Loyd, Beto~De Paola, Bhargavi Paranjape, Bing Liu, Bo~Wu, Boyu Ni, Braden Hancock, Bram Wasti, Brandon Spence, Brani Stojkovic, Brian Gamido, Britt Montalvo, Carl Parker, Carly Burton, Catalina Mejia, Ce~Liu, Changhan Wang, Changkyu Kim, Chao Zhou, Chester Hu, Ching-Hsiang Chu, Chris Cai, Chris Tindal, Christoph Feichtenhofer, Cynthia Gao, Damon Civin, Dana Beaty, Daniel Kreymer, Daniel Li, David Adkins, David Xu, Davide Testuggine, Delia David, Devi Parikh, Diana Liskovich, Didem Foss, Dingkang Wang, Duc Le, Dustin Holland, Edward Dowling, Eissa Jamil, Elaine Montgomery, Eleonora Presani, Emily Hahn, Emily Wood, Eric-Tuan Le, Erik Brinkman, Esteban Arcaute, Evan Dunbar, Evan Smothers, Fei Sun, Felix Kreuk, Feng Tian, Filippos Kokkinos, Firat Ozgenel, Francesco Caggioni, Frank Kanayet, Frank Seide, Gabriela~Medina Florez, Gabriella Schwarz, Gada Badeer, Georgia Swee, Gil Halpern, Grant Herman, Grigory Sizov, Guangyi, Zhang, Guna Lakshminarayanan, Hakan Inan, Hamid Shojanazeri, Han Zou, Hannah Wang, Hanwen Zha, Haroun Habeeb, Harrison Rudolph, Helen Suk, Henry Aspegren, Hunter Goldman, Hongyuan Zhan, Ibrahim Damlaj, Igor Molybog, Igor Tufanov, Ilias Leontiadis, Irina-Elena Veliche, Itai Gat, Jake Weissman, James Geboski, James Kohli, Janice Lam, Japhet Asher, Jean-Baptiste Gaya, Jeff Marcus, Jeff Tang, Jennifer Chan, Jenny Zhen, Jeremy Reizenstein, Jeremy Teboul, Jessica Zhong, Jian Jin, Jingyi Yang, Joe Cummings, Jon Carvill, Jon Shepard, Jonathan McPhie, Jonathan Torres, Josh Ginsburg, Junjie Wang, Kai Wu, Kam~Hou U, Karan Saxena, Kartikay Khandelwal, Katayoun Zand, Kathy Matosich, Kaushik Veeraraghavan, Kelly Michelena, Keqian Li, Kiran Jagadeesh, Kun Huang, Kunal Chawla, Kyle Huang, Lailin Chen, Lakshya Garg, Lavender A, Leandro Silva, Lee Bell, Lei Zhang, Liangpeng Guo, Licheng Yu, Liron 
Moshkovich, Luca Wehrstedt, Madian Khabsa, Manav Avalani, Manish Bhatt, Martynas Mankus, Matan Hasson, Matthew Lennie, Matthias Reso, Maxim Groshev, Maxim Naumov, Maya Lathi, Meghan Keneally, Miao Liu, Michael~L. Seltzer, Michal Valko, Michelle Restrepo, Mihir Patel, Mik Vyatskov, Mikayel Samvelyan, Mike Clark, Mike Macey, Mike Wang, Miquel~Jubert Hermoso, Mo~Metanat, Mohammad Rastegari, Munish Bansal, Nandhini Santhanam, Natascha Parks, Natasha White, Navyata Bawa, Nayan Singhal, Nick Egebo, Nicolas Usunier, Nikhil Mehta, Nikolay~Pavlovich Laptev, Ning Dong, Norman Cheng, Oleg Chernoguz, Olivia Hart, Omkar Salpekar, Ozlem Kalinli, Parkin Kent, Parth Parekh, Paul Saab, Pavan Balaji, Pedro Rittner, Philip Bontrager, Pierre Roux, Piotr Dollar, Polina Zvyagina, Prashant Ratanchandani, Pritish Yuvraj, Qian Liang, Rachad Alao, Rachel Rodriguez, Rafi Ayub, Raghotham Murthy, Raghu Nayani, Rahul Mitra, Rangaprabhu Parthasarathy, Raymond Li, Rebekkah Hogan, Robin Battey, Rocky Wang, Russ Howes, Ruty Rinott, Sachin Mehta, Sachin Siby, Sai~Jayesh Bondu, Samyak Datta, Sara Chugh, Sara Hunt, Sargun Dhillon, Sasha Sidorov, Satadru Pan, Saurabh Mahajan, Saurabh Verma, Seiji Yamamoto, Sharadh Ramaswamy, Shaun Lindsay, Shaun Lindsay, Sheng Feng, Shenghao Lin, Shengxin~Cindy Zha, Shishir Patil, Shiva Shankar, Shuqiang Zhang, Shuqiang Zhang, Sinong Wang, Sneha Agarwal, Soji Sajuyigbe, Soumith Chintala, Stephanie Max, Stephen Chen, Steve Kehoe, Steve Satterfield, Sudarshan Govindaprasad, Sumit Gupta, Summer Deng, Sungmin Cho, Sunny Virk, Suraj Subramanian, Sy~Choudhury, Sydney Goldman, Tal Remez, Tamar Glaser, Tamara Best, Thilo Koehler, Thomas Robinson, Tianhe Li, Tianjun Zhang, Tim Matthews, Timothy Chou, Tzook Shaked, Varun Vontimitta, Victoria Ajayi, Victoria Montanez, Vijai Mohan, Vinay~Satish Kumar, Vishal Mangla, Vlad Ionescu, Vlad Poenaru, Vlad~Tiberiu Mihailescu, Vladimir Ivanov, Wei Li, Wenchen Wang, Wenwen Jiang, Wes Bouaziz, Will Constable, Xiaocheng Tang, Xiaojian Wu, 
Xiaolan Wang, Xilun Wu, Xinbo Gao, Yaniv Kleinman, Yanjun Chen, Ye~Hu, Ye~Jia, Ye~Qi, Yenda Li, Yilin Zhang, Ying Zhang, Yossi Adi, Youngjin Nam, Yu, Wang, Yu~Zhao, Yuchen Hao, Yundi Qian, Yunlu Li, Yuzi He, Zach Rait, Zachary DeVito, Zef Rosnbrick, Zhaoduo Wen, Zhenyu Yang, Zhiwei Zhao, and Zhiyu Ma.
\newblock The {{Llama}} 3 {{Herd}} of {{Models}}, November 2024.
\bibitem[Griffin et~al.(2017)Griffin, Wiedebach, Bertrand, Leonessa, and Pratt]{griffinWalkingStabilizationUsing2017}
Robert~J. Griffin, Georg Wiedebach, Sylvain Bertrand, Alexander Leonessa, and Jerry Pratt.
\newblock Walking {{Stabilization Using Step Timing}} and {{Location Adjustment}} on the {{Humanoid Robot}}, {{Atlas}}.
\newblock In \emph{2017 {{IEEE}}/{{RSJ International Conference}} on {{Intelligent Robots}} and {{Systems}} ({{IROS}})}, pages 667--673, September 2017.
\newblock \doi{10.1109/IROS.2017.8202223}.
\bibitem[Haarnoja et~al.(2017)Haarnoja, Tang, Abbeel, and Levine]{haarnojaReinforcementLearningDeep2017b}
Tuomas Haarnoja, Haoran Tang, Pieter Abbeel, and Sergey Levine.
\newblock Reinforcement {{Learning}} with {{Deep Energy-Based Policies}}.
\newblock In \emph{Proceedings of the 34th {{International Conference}} on {{Machine Learning}}}, pages 1352--1361. PMLR, July 2017.
\bibitem[Haarnoja et~al.(2018)Haarnoja, Zhou, Abbeel, and Levine]{haarnojaSoftActorCriticOffPolicy2018}
Tuomas Haarnoja, Aurick Zhou, Pieter Abbeel, and Sergey Levine.
\newblock Soft {{Actor-Critic}}: {{Off-Policy Maximum Entropy Deep Reinforcement Learning}} with a {{Stochastic Actor}}, August 2018.
\bibitem[Hansen et~al.(2022)Hansen, Wang, and Su]{hansenTemporalDifferenceLearning2022}
Nicklas Hansen, Xiaolong Wang, and Hao Su.
\newblock Temporal {{Difference Learning}} for {{Model Predictive Control}}, July 2022.
\bibitem[Heess et~al.(2017)Heess, TB, Sriram, Lemmon, Merel, Wayne, Tassa, Erez, Wang, Eslami, Riedmiller, and Silver]{heessEmergenceLocomotionBehaviours2017}
Nicolas Heess, Dhruva TB, Srinivasan Sriram, Jay Lemmon, Josh Merel, Greg Wayne, Yuval Tassa, Tom Erez, Ziyu Wang, S.~M.~Ali Eslami, Martin Riedmiller, and David Silver.
\newblock Emergence of {{Locomotion Behaviours}} in {{Rich Environments}}, July 2017.
\bibitem[Higgins et~al.(2017)Higgins, Matthey, Pal, Burgess, Glorot, Botvinick, Mohamed, and Lerchner]{higgins2017beta}
Irina Higgins, Loic Matthey, Arka Pal, Christopher Burgess, Xavier Glorot, Matthew Botvinick, Shakir Mohamed, and Alexander Lerchner.
\newblock Beta-{{VAE}}: {{Learning}} basic visual concepts with a constrained variational framework.
\newblock In \emph{International Conference on Learning Representations}, 2017.
\bibitem[Ho et~al.(2020)Ho, Jain, and Abbeel]{hoDenoisingDiffusionProbabilistic2020}
Jonathan Ho, Ajay Jain, and Pieter Abbeel.
\newblock Denoising {{Diffusion Probabilistic Models}}, December 2020.
\bibitem[Jang et~al.(2022)Jang, Irpan, Khansari, Kappler, Ebert, Lynch, Levine, and Finn]{jangBCZZeroShotTask2022}
Eric Jang, Alex Irpan, Mohi Khansari, Daniel Kappler, Frederik Ebert, Corey Lynch, Sergey Levine, and Chelsea Finn.
\newblock {{BC-Z}}: {{Zero-Shot Task Generalization}} with {{Robotic Imitation Learning}}, February 2022.
\bibitem[Janner et~al.(2022)Janner, Du, Tenenbaum, and Levine]{jannerPlanningDiffusionFlexible2022}
Michael Janner, Yilun Du, Joshua~B. Tenenbaum, and Sergey Levine.
\newblock Planning with {{Diffusion}} for {{Flexible Behavior Synthesis}}, December 2022.
\bibitem[Ji et~al.(2023)Ji, Margolis, and Agrawal]{jiDribbleBotDynamicLegged2023}
Yandong Ji, Gabriel~B. Margolis, and Pulkit Agrawal.
\newblock {{DribbleBot}}: {{Dynamic Legged Manipulation}} in the {{Wild}}, April 2023.
\bibitem[Jiang et~al.(2023)Jiang, Sablayrolles, Mensch, Bamford, Chaplot, de~las Casas, Bressand, Lengyel, Lample, Saulnier, Lavaud, Lachaux, Stock, Scao, Lavril, Wang, Lacroix, and Sayed]{jiangMistral7B2023}
Albert~Q. Jiang, Alexandre Sablayrolles, Arthur Mensch, Chris Bamford, Devendra~Singh Chaplot, Diego de~las Casas, Florian Bressand, Gianna Lengyel, Guillaume Lample, Lucile Saulnier, L{\'e}lio~Renard Lavaud, Marie-Anne Lachaux, Pierre Stock, Teven~Le Scao, Thibaut Lavril, Thomas Wang, Timoth{\'e}e Lacroix, and William~El Sayed.
\newblock Mistral {{7B}}, October 2023.
\bibitem[Ke et~al.(2020)Ke, Wang, Bhattacharjee, Boots, and Srinivasa]{keGraspingChopsticksCombating2020}
Liyiming Ke, Jingqiang Wang, Tapomayukh Bhattacharjee, Byron Boots, and Siddhartha Srinivasa.
\newblock Grasping with {{Chopsticks}}: {{Combating Covariate Shift}} in {{Model-free Imitation Learning}} for {{Fine Manipulation}}, November 2020.
\bibitem[Khazatsky et~al.(2025)Khazatsky, Pertsch, Nair, Balakrishna, Dasari, Karamcheti, Nasiriany, Srirama, Chen, Ellis, Fagan, Hejna, Itkina, Lepert, Ma, Miller, Wu, Belkhale, Dass, Ha, Jain, Lee, Lee, Memmel, Park, Radosavovic, Wang, Zhan, Black, Chi, Hatch, Lin, Lu, Mercat, Rehman, Sanketi, Sharma, Simpson, Vuong, Walke, Wulfe, Xiao, Yang, Yavary, Zhao, Agia, Baijal, Castro, Chen, Chen, Chung, Drake, Foster, Gao, Guizilini, Herrera, Heo, Hsu, Hu, Irshad, Jackson, Le, Li, Lin, Lin, Ma, Maddukuri, Mirchandani, Morton, Nguyen, O'Neill, Scalise, Seale, Son, Tian, Tran, Wang, Wu, Xie, Yang, Yin, Zhang, Bastani, Berseth, Bohg, Goldberg, Gupta, Gupta, Jayaraman, Lim, Malik, {Mart{\'i}n-Mart{\'i}n}, Ramamoorthy, Sadigh, Song, Wu, Yip, Zhu, Kollar, Levine, and Finn]{khazatskyDROIDLargeScaleInTheWild2025}
Alexander Khazatsky, Karl Pertsch, Suraj Nair, Ashwin Balakrishna, Sudeep Dasari, Siddharth Karamcheti, Soroush Nasiriany, Mohan~Kumar Srirama, Lawrence~Yunliang Chen, Kirsty Ellis, Peter~David Fagan, Joey Hejna, Masha Itkina, Marion Lepert, Yecheng~Jason Ma, Patrick~Tree Miller, Jimmy Wu, Suneel Belkhale, Shivin Dass, Huy Ha, Arhan Jain, Abraham Lee, Youngwoon Lee, Marius Memmel, Sungjae Park, Ilija Radosavovic, Kaiyuan Wang, Albert Zhan, Kevin Black, Cheng Chi, Kyle~Beltran Hatch, Shan Lin, Jingpei Lu, Jean Mercat, Abdul Rehman, Pannag~R. Sanketi, Archit Sharma, Cody Simpson, Quan Vuong, Homer~Rich Walke, Blake Wulfe, Ted Xiao, Jonathan~Heewon Yang, Arefeh Yavary, Tony~Z. Zhao, Christopher Agia, Rohan Baijal, Mateo~Guaman Castro, Daphne Chen, Qiuyu Chen, Trinity Chung, Jaimyn Drake, Ethan~Paul Foster, Jensen Gao, Vitor Guizilini, David~Antonio Herrera, Minho Heo, Kyle Hsu, Jiaheng Hu, Muhammad~Zubair Irshad, Donovon Jackson, Charlotte Le, Yunshuang Li, Kevin Lin, Roy Lin, Zehan Ma, Abhiram Maddukuri, Suvir Mirchandani, Daniel Morton, Tony Nguyen, Abigail O'Neill, Rosario Scalise, Derick Seale, Victor Son, Stephen Tian, Emi Tran, Andrew~E. Wang, Yilin Wu, Annie Xie, Jingyun Yang, Patrick Yin, Yunchu Zhang, Osbert Bastani, Glen Berseth, Jeannette Bohg, Ken Goldberg, Abhinav Gupta, Abhishek Gupta, Dinesh Jayaraman, Joseph~J. Lim, Jitendra Malik, Roberto {Mart{\'i}n-Mart{\'i}n}, Subramanian Ramamoorthy, Dorsa Sadigh, Shuran Song, Jiajun Wu, Michael~C. Yip, Yuke Zhu, Thomas Kollar, Sergey Levine, and Chelsea Finn.
\newblock {{DROID}}: {{A Large-Scale In-The-Wild Robot Manipulation Dataset}}, April 2025.
\bibitem[Kim et~al.(2024)Kim, Pertsch, Karamcheti, Xiao, Balakrishna, Nair, Rafailov, Foster, Lam, Sanketi, Vuong, Kollar, Burchfiel, Tedrake, Sadigh, Levine, Liang, and Finn]{kimOpenVLAOpenSourceVisionLanguageAction2024}
Moo~Jin Kim, Karl Pertsch, Siddharth Karamcheti, Ted Xiao, Ashwin Balakrishna, Suraj Nair, Rafael Rafailov, Ethan Foster, Grace Lam, Pannag Sanketi, Quan Vuong, Thomas Kollar, Benjamin Burchfiel, Russ Tedrake, Dorsa Sadigh, Sergey Levine, Percy Liang, and Chelsea Finn.
\newblock {{OpenVLA}}: {{An Open-Source Vision-Language-Action Model}}, September 2024.

\bibitem[Kingma and Welling(2013)]{kingma2013auto}
Diederik~P Kingma and Max Welling.
\newblock Auto-encoding variational {{Bayes}}.
\newblock \emph{arXiv preprint arXiv:1312.6114}, 2013.
\bibitem[Knight et~al.()Knight, Kooijmans, Wolf, Alibert, Aractingi, Aubakirova, Zouitine, Martino, Palma, Pascal, and Cadene]{knightStandardOpenSO100}
Rob Knight, Pepijn Kooijmans, Thomas Wolf, Simon Alibert, Michel Aractingi, Dana Aubakirova, Adil Zouitine, Russi Martino, Steven Palma, Caroline Pascal, and Remi Cadene.
\newblock Standard {{Open SO-100}} \& {{SO-101 Arms}}.

\bibitem[Kober et~al.()Kober, Bagnell, and Peters]{koberReinforcementLearningRobotics}
Jens Kober, J~Andrew Bagnell, and Jan Peters.
\newblock Reinforcement {{Learning}} in {{Robotics}}: {{A Survey}}.

\bibitem[Koh et~al.(2023)Koh, Salakhutdinov, and Fried]{FROMAGe}
Jing~Yu Koh, Ruslan Salakhutdinov, and Daniel Fried.
\newblock Grounding language models to images for multimodal inputs and outputs, 2023.

\bibitem[Kong et~al.(2024)Kong, Goel, Badlani, Ping, Valle, and Catanzaro]{kong2024audioflam}
Zhifeng Kong, Arushi Goel, Rohan Badlani, Wei Ping, Rafael Valle, and Bryan Catanzaro.
\newblock Audio flamingo: A novel audio language model with few-shot learning and dialogue abilities.
\newblock In \emph{International Conference on Machine Learning}, pages 25125--25148. PMLR, 2024.
\bibitem[Korrapati(2024)]{moondream}
Vik Korrapati.
\newblock Moondream.
\newblock Online, 2024.

\bibitem[Lauren{\c c}on et~al.(2023)Lauren{\c c}on, Saulnier, Tronchon, Bekman, Singh, Lozhkov, Wang, Karamcheti, Rush, Kiela, Cord, and Sanh]{OBELICS}
Hugo Lauren{\c c}on, Lucile Saulnier, Leo Tronchon, Stas Bekman, Amanpreet Singh, Anton Lozhkov, Thomas Wang, Siddharth Karamcheti, Alexander~M Rush, Douwe Kiela, Matthieu Cord, and Victor Sanh.
\newblock {{OBELICS}}: {{An}} open web-scale filtered dataset of interleaved image-text documents.
\newblock In \emph{Thirty-Seventh Conference on Neural Information Processing Systems Datasets and Benchmarks Track}, 2023.

\bibitem[Lauren{\c c}on et~al.(2024)Lauren{\c c}on, Tronchon, Cord, and Sanh]{laurenconWhatMattersWhen2024}
Hugo Lauren{\c c}on, L{\'e}o Tronchon, Matthieu Cord, and Victor Sanh.
\newblock What matters when building vision-language models?, May 2024.

\bibitem[Lee et~al.(2020)Lee, Hwangbo, Wellhausen, Koltun, and Hutter]{leeLearningQuadrupedalLocomotion2020}
Joonho Lee, Jemin Hwangbo, Lorenz Wellhausen, Vladlen Koltun, and Marco Hutter.
\newblock Learning {{Quadrupedal Locomotion}} over {{Challenging Terrain}}.
\newblock \emph{Science Robotics}, 5\penalty0 (47):\penalty0 eabc5986, October 2020.
\newblock ISSN 2470-9476.
\newblock \doi{10.1126/scirobotics.abc5986}.
\bibitem[Lee et~al.(2024)Lee, Wang, Etukuru, Kim, Shafiullah, and Pinto]{leeBehaviorGenerationLatent2024}
Seungjae Lee, Yibin Wang, Haritheja Etukuru, H.~Jin Kim, Nur Muhammad~Mahi Shafiullah, and Lerrel Pinto.
\newblock Behavior {{Generation}} with {{Latent Actions}}, June 2024.

\bibitem[Li et~al.(2023)Li, Li, Savarese, and Hoi]{BLIP-2}
Junnan Li, Dongxu Li, Silvio Savarese, and Steven Hoi.
\newblock {{BLIP-2}}: Bootstrapping language-image pre-training with frozen image encoders and large language models.
\newblock In \emph{Proceedings of the 40th International Conference on Machine Learning}, {{ICML}}'23, Honolulu, Hawaii, USA, 2023. JMLR.org.

\bibitem[Lillicrap et~al.(2019)Lillicrap, Hunt, Pritzel, Heess, Erez, Tassa, Silver, and Wierstra]{lillicrapContinuousControlDeep2019a}
Timothy~P. Lillicrap, Jonathan~J. Hunt, Alexander Pritzel, Nicolas Heess, Tom Erez, Yuval Tassa, David Silver, and Daan Wierstra.
\newblock Continuous control with deep reinforcement learning, July 2019.

\bibitem[Lin et~al.(2024)Lin, Yin, Ping, Lu, Molchanov, Tao, Mao, Kautz, Shoeybi, and Han]{linVILAPretrainingVisual2024}
Ji~Lin, Hongxu Yin, Wei Ping, Yao Lu, Pavlo Molchanov, Andrew Tao, Huizi Mao, Jan Kautz, Mohammad Shoeybi, and Song Han.
\newblock {{VILA}}: {{On Pre-training}} for {{Visual Language Models}}, May 2024.

\bibitem[Lipman et~al.(2023)Lipman, Chen, {Ben-Hamu}, Nickel, and Le]{lipmanFlowMatchingGenerative2023}
Yaron Lipman, Ricky T.~Q. Chen, Heli {Ben-Hamu}, Maximilian Nickel, and Matt Le.
\newblock Flow {{Matching}} for {{Generative Modeling}}, February 2023.
\bibitem[Lipman et~al.(2024)Lipman, Havasi, Holderrieth, Shaul, Le, Karrer, Chen, {Lopez-Paz}, {Ben-Hamu}, and Gat]{lipmanFlowMatchingGuide2024}
Yaron Lipman, Marton Havasi, Peter Holderrieth, Neta Shaul, Matt Le, Brian Karrer, Ricky T.~Q. Chen, David {Lopez-Paz}, Heli {Ben-Hamu}, and Itai Gat.
\newblock Flow {{Matching Guide}} and {{Code}}, December 2024.
\bibitem[Liu et~al.(2023)Liu, Li, Li, and Lee]{LLaVA-1.5}
Haotian Liu, Chunyuan Li, Yuheng Li, and Yong~Jae Lee.
\newblock Improved baselines with visual instruction tuning.
\newblock In \emph{{{NeurIPS}} 2023 Workshop on Instruction Tuning and Instruction Following}, 2023.

\bibitem[Liu et~al.(2024)Liu, Wang, Ma, Wu, Ma, Wei, Jiao, Wu, and Hu]{liu2024kangaroo}
Jiajun Liu, Yibing Wang, Hanghang Ma, Xiaoping Wu, Xiaoqi Ma, Xiaoming Wei, Jianbin Jiao, Enhua Wu, and Jie Hu.
\newblock Kangaroo: {{A}} powerful video-language model supporting long-context video input.
\newblock \emph{arXiv preprint arXiv:2408.15542}, 2024.

\bibitem[Luo(2022)]{luoUnderstandingDiffusionModels2022}
Calvin Luo.
\newblock Understanding {{Diffusion Models}}: {{A Unified Perspective}}, August 2022.

\bibitem[Luo et~al.(2024)Luo, Xu, Wu, and Levine]{luoPreciseDexterousRobotic2024}
Jianlan Luo, Charles Xu, Jeffrey Wu, and Sergey Levine.
\newblock Precise and {{Dexterous Robotic Manipulation}} via {{Human-in-the-Loop Reinforcement Learning}}, October 2024.

\bibitem[Luo et~al.(2025)Luo, Hu, Xu, Tan, Berg, Sharma, Schaal, Finn, Gupta, and Levine]{luoSERLSoftwareSuite2025}
Jianlan Luo, Zheyuan Hu, Charles Xu, You~Liang Tan, Jacob Berg, Archit Sharma, Stefan Schaal, Chelsea Finn, Abhishek Gupta, and Sergey Levine.
\newblock {{SERL}}: {{A Software Suite}} for {{Sample-Efficient Robotic Reinforcement Learning}}, March 2025.

\bibitem[Lynch and Park(2017)]{lynchModernRoboticsMechanics2017}
Kevin~M. Lynch and Frank~C. Park.
\newblock \emph{Modern {{Robotics}}: {{Mechanics}}, {{Planning}}, and {{Control}}}.
\newblock Cambridge University Press, 1 edition, May 2017.
\newblock ISBN 978-1-316-66123-9 978-1-107-15630-2 978-1-316-60984-2.
\newblock \doi{10.1017/9781316661239}.
\bibitem[Ma{\~n}as et~al.(2023)Ma{\~n}as, Rodriguez~Lopez, Ahmadi, Nematzadeh, Goyal, and Agrawal]{MAPL}
Oscar Ma{\~n}as, Pau Rodriguez~Lopez, Saba Ahmadi, Aida Nematzadeh, Yash Goyal, and Aishwarya Agrawal.
\newblock {{MAPL}}: {{Parameter-efficient}} adaptation of unimodal pre-trained models for vision-language few-shot prompting.
\newblock In Andreas Vlachos and Isabelle Augenstein, editors, \emph{Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics}, pages 2523--2548, Dubrovnik, Croatia, May 2023. Association for Computational Linguistics.
\newblock \doi{10.18653/v1/2023.eacl-main.185}.

\bibitem[Marafioti et~al.(2025)Marafioti, Zohar, Farr{\'e}, Noyan, Bakouch, Cuenca, Zakka, Allal, Lozhkov, Tazi, Srivastav, Lochner, Larcher, Morlon, Tunstall, von Werra, and Wolf]{marafiotiSmolVLMRedefiningSmall2025}
Andr{\'e}s Marafioti, Orr Zohar, Miquel Farr{\'e}, Merve Noyan, Elie Bakouch, Pedro Cuenca, Cyril Zakka, Loubna~Ben Allal, Anton Lozhkov, Nouamane Tazi, Vaibhav Srivastav, Joshua Lochner, Hugo Larcher, Mathieu Morlon, Lewis Tunstall, Leandro von Werra, and Thomas Wolf.
\newblock {{SmolVLM}}: {{Redefining}} small and efficient multimodal models, April 2025.

\bibitem[Margolis et~al.(2022)Margolis, Yang, Paigwar, Chen, and Agrawal]{margolisRapidLocomotionReinforcement2022}
Gabriel~B. Margolis, Ge~Yang, Kartik Paigwar, Tao Chen, and Pulkit Agrawal.
\newblock Rapid {{Locomotion}} via {{Reinforcement Learning}}, May 2022.

\bibitem[McCormac et~al.(2016)McCormac, Handa, Davison, and Leutenegger]{mccormacSemanticFusionDense3D2016}
John McCormac, Ankur Handa, Andrew Davison, and Stefan Leutenegger.
\newblock {{SemanticFusion}}: {{Dense 3D Semantic Mapping}} with {{Convolutional Neural Networks}}, September 2016.

\bibitem[Mnih et~al.(2013)Mnih, Kavukcuoglu, Silver, Graves, Antonoglou, Wierstra, and Riedmiller]{mnihPlayingAtariDeep2013}
Volodymyr Mnih, Koray Kavukcuoglu, David Silver, Alex Graves, Ioannis Antonoglou, Daan Wierstra, and Martin Riedmiller.
\newblock Playing {{Atari}} with {{Deep Reinforcement Learning}}, December 2013.
\bibitem[Nakkiran et~al.(2024)Nakkiran, Bradley, Zhou, and Advani]{nakkiranStepbyStepDiffusionElementary2024}
Preetum Nakkiran, Arwen Bradley, Hattie Zhou, and Madhu Advani.
\newblock Step-by-{{Step Diffusion}}: {{An Elementary Tutorial}}, June 2024.
\bibitem[O'Neill et~al.(2025)O'Neill, Rehman, Gupta, Maddukuri, Gupta, Padalkar, Lee, Pooley, Gupta, Mandlekar, Jain, Tung, Bewley, Herzog, Irpan, Khazatsky, Rai, Gupta, Wang, Kolobov, Singh, Garg, Kembhavi, Xie, Brohan, Raffin, Sharma, Yavary, Jain, Balakrishna, Wahid, {Burgess-Limerick}, Kim, Sch{\"o}lkopf, Wulfe, Ichter, Lu, Xu, Le, Finn, Wang, Xu, Chi, Huang, Chan, Agia, Pan, Fu, Devin, Xu, Morton, Driess, Chen, Pathak, Shah, B{\"u}chler, Jayaraman, Kalashnikov, Sadigh, Johns, Foster, Liu, Ceola, Xia, Zhao, Frujeri, Stulp, Zhou, Sukhatme, Salhotra, Yan, Feng, Schiavi, Berseth, Kahn, Yang, Wang, Su, Fang, Shi, Bao, Amor, Christensen, Furuta, Bharadhwaj, Walke, Fang, Ha, Mordatch, Radosavovic, Leal, Liang, {Abou-Chakra}, Kim, Drake, Peters, Schneider, Hsu, Vakil, Bohg, Bingham, Wu, Gao, Hu, Wu, Wu, Sun, Luo, Gu, Tan, Oh, Wu, Lu, Yang, Malik, Silv{\'e}rio, Hejna, Booher, Tompson, Yang, Salvador, Lim, Han, Wang, Rao, Pertsch, Hausman, Go, Gopalakrishnan, Goldberg, Byrne, Oslund, Kawaharazuka, Black, Lin, Zhang, Ehsani, Lekkala, Ellis, Rana, Srinivasan, Fang, Singh, Zeng, Hatch, Hsu, Itti, Chen, Pinto, {Fei-Fei}, Tan, Fan, Ott, Lee, Weihs, Chen, Lepert, Memmel, Tomizuka, Itkina, Castro, Spero, Du, Ahn, Yip, Zhang, Ding, Heo, Srirama, Sharma, Kim, Irshad, Kanazawa, Hansen, Heess, Joshi, Suenderhauf, Liu, Palo, Shafiullah, Mees, Kroemer, Bastani, Sanketi, Miller, Yin, Wohlhart, Xu, Fagan, Mitrano, Sermanet, Abbeel, Sundaresan, Chen, Vuong, Rafailov, Tian, Doshi, {Mart{\'i}n-Mart{\'i}n}, Baijal, Scalise, Hendrix, Lin, Qian, Zhang, Mendonca, Shah, Hoque, Julian, Bustamante, Kirmani, Levine, Lin, Moore, Bahl, Dass, Sonawani, Tulsiani, Song, Xu, Haldar, Karamcheti, Adebola, Guist, Nasiriany, Schaal, Welker, Tian, Ramamoorthy, Dasari, Belkhale, Park, Nair, Mirchandani, Osa, Gupta, Harada, Matsushima, Xiao, Kollar, Yu, Ding, Davchev, Zhao, Armstrong, Darrell, Chung, Jain, Kumar, Vanhoucke, Guizilini, Zhan, Zhou, Burgard, Chen, Chen, Wang, Zhu, Geng, Liu, Liangwei, Li, Pang, 
Lu, Ma, Kim, Chebotar, Zhou, Zhu, Wu, Xu, Wang, Bisk, Dou, Cho, Lee, Cui, Cao, Wu, Tang, Zhu, Zhang, Jiang, Li, Li, Iwasawa, Matsuo, Ma, Xu, Cui, Zhang, Fu, and Lin]{oneillOpenXEmbodimentRobotic2025}
Abby O'Neill, Abdul Rehman, Abhinav Gupta, Abhiram Maddukuri, Abhishek Gupta, Abhishek Padalkar, Abraham Lee, Acorn Pooley, Agrim Gupta, Ajay Mandlekar, Ajinkya Jain, Albert Tung, Alex Bewley, Alex Herzog, Alex Irpan, Alexander Khazatsky, Anant Rai, Anchit Gupta, Andrew Wang, Andrey Kolobov, Anikait Singh, Animesh Garg, Aniruddha Kembhavi, Annie Xie, Anthony Brohan, Antonin Raffin, Archit Sharma, Arefeh Yavary, Arhan Jain, Ashwin Balakrishna, Ayzaan Wahid, Ben {Burgess-Limerick}, Beomjoon Kim, Bernhard Sch{\"o}lkopf, Blake Wulfe, Brian Ichter, Cewu Lu, Charles Xu, Charlotte Le, Chelsea Finn, Chen Wang, Chenfeng Xu, Cheng Chi, Chenguang Huang, Christine Chan, Christopher Agia, Chuer Pan, Chuyuan Fu, Coline Devin, Danfei Xu, Daniel Morton, Danny Driess, Daphne Chen, Deepak Pathak, Dhruv Shah, Dieter B{\"u}chler, Dinesh Jayaraman, Dmitry Kalashnikov, Dorsa Sadigh, Edward Johns, Ethan Foster, Fangchen Liu, Federico Ceola, Fei Xia, Feiyu Zhao, Felipe~Vieira Frujeri, Freek Stulp, Gaoyue Zhou, Gaurav~S. Sukhatme, Gautam Salhotra, Ge~Yan, Gilbert Feng, Giulio Schiavi, Glen Berseth, Gregory Kahn, Guangwen Yang, Guanzhi Wang, Hao Su, Hao-Shu Fang, Haochen Shi, Henghui Bao, Heni~Ben Amor, Henrik~I. Christensen, Hiroki Furuta, Homanga Bharadhwaj, Homer Walke, Hongjie Fang, Huy Ha, Igor Mordatch, Ilija Radosavovic, Isabel Leal, Jacky Liang, Jad {Abou-Chakra}, Jaehyung Kim, Jaimyn Drake, Jan Peters, Jan Schneider, Jasmine Hsu, Jay Vakil, Jeannette Bohg, Jeffrey Bingham, Jeffrey Wu, Jensen Gao, Jiaheng Hu, Jiajun Wu, Jialin Wu, Jiankai Sun, Jianlan Luo, Jiayuan Gu, Jie Tan, Jihoon Oh, Jimmy Wu, Jingpei Lu, Jingyun Yang, Jitendra Malik, Jo{\~a}o Silv{\'e}rio, Joey Hejna, Jonathan Booher, Jonathan Tompson, Jonathan Yang, Jordi Salvador, Joseph~J. 
Lim, Junhyek Han, Kaiyuan Wang, Kanishka Rao, Karl Pertsch, Karol Hausman, Keegan Go, Keerthana Gopalakrishnan, Ken Goldberg, Kendra Byrne, Kenneth Oslund, Kento Kawaharazuka, Kevin Black, Kevin Lin, Kevin Zhang, Kiana Ehsani, Kiran Lekkala, Kirsty Ellis, Krishan Rana, Krishnan Srinivasan, Kuan Fang, Kunal~Pratap Singh, Kuo-Hao Zeng, Kyle Hatch, Kyle Hsu, Laurent Itti, Lawrence~Yunliang Chen, Lerrel Pinto, Li~{Fei-Fei}, Liam Tan, Linxi~"Jim" Fan, Lionel Ott, Lisa Lee, Luca Weihs, Magnum Chen, Marion Lepert, Marius Memmel, Masayoshi Tomizuka, Masha Itkina, Mateo~Guaman Castro, Max Spero, Maximilian Du, Michael Ahn, Michael~C. Yip, Mingtong Zhang, Mingyu Ding, Minho Heo, Mohan~Kumar Srirama, Mohit Sharma, Moo~Jin Kim, Muhammad~Zubair Irshad, Naoaki Kanazawa, Nicklas Hansen, Nicolas Heess, Nikhil~J. Joshi, Niko Suenderhauf, Ning Liu, Norman~Di Palo, Nur Muhammad~Mahi Shafiullah, Oier Mees, Oliver Kroemer, Osbert Bastani, Pannag~R. Sanketi, Patrick~"Tree" Miller, Patrick Yin, Paul Wohlhart, Peng Xu, Peter~David Fagan, Peter Mitrano, Pierre Sermanet, Pieter Abbeel, Priya Sundaresan, Qiuyu Chen, Quan Vuong, Rafael Rafailov, Ran Tian, Ria Doshi, Roberto {Mart{\'i}n-Mart{\'i}n}, Rohan Baijal, Rosario Scalise, Rose Hendrix, Roy Lin, Runjia Qian, Ruohan Zhang, Russell Mendonca, Rutav Shah, Ryan Hoque, Ryan Julian, Samuel Bustamante, Sean Kirmani, Sergey Levine, Shan Lin, Sherry Moore, Shikhar Bahl, Shivin Dass, Shubham Sonawani, Shubham Tulsiani, Shuran Song, Sichun Xu, Siddhant Haldar, Siddharth Karamcheti, Simeon Adebola, Simon Guist, Soroush Nasiriany, Stefan Schaal, Stefan Welker, Stephen Tian, Subramanian Ramamoorthy, Sudeep Dasari, Suneel Belkhale, Sungjae Park, Suraj Nair, Suvir Mirchandani, Takayuki Osa, Tanmay Gupta, Tatsuya Harada, Tatsuya Matsushima, Ted Xiao, Thomas Kollar, Tianhe Yu, Tianli Ding, Todor Davchev, Tony~Z. 
Zhao, Travis Armstrong, Trevor Darrell, Trinity Chung, Vidhi Jain, Vikash Kumar, Vincent Vanhoucke, Vitor Guizilini, Wei Zhan, Wenxuan Zhou, Wolfram Burgard, Xi~Chen, Xiangyu Chen, Xiaolong Wang, Xinghao Zhu, Xinyang Geng, Xiyuan Liu, Xu~Liangwei, Xuanlin Li, Yansong Pang, Yao Lu, Yecheng~Jason Ma, Yejin Kim, Yevgen Chebotar, Yifan Zhou, Yifeng Zhu, Yilin Wu, Ying Xu, Yixuan Wang, Yonatan Bisk, Yongqiang Dou, Yoonyoung Cho, Youngwoon Lee, Yuchen Cui, Yue Cao, Yueh-Hua Wu, Yujin Tang, Yuke Zhu, Yunchu Zhang, Yunfan Jiang, Yunshuang Li, Yunzhu Li, Yusuke Iwasawa, Yutaka Matsuo, Zehan Ma, Zhuo Xu, Zichen~Jeff Cui, Zichen Zhang, Zipeng Fu, and Zipeng Lin.
\newblock Open {{X-Embodiment}}: {{Robotic Learning Datasets}} and {{RT-X Models}}, May 2025.
\bibitem[Oquab et~al.(2024)Oquab, Darcet, Moutakanni, Vo, Szafraniec, Khalidov, Fernandez, Haziza, Massa, {El-Nouby}, Assran, Ballas, Galuba, Howes, Huang, Li, Misra, Rabbat, Sharma, Synnaeve, Xu, Jegou, Mairal, Labatut, Joulin, and Bojanowski]{oquabDINOv2LearningRobust2024}
Maxime Oquab, Timoth{\'e}e Darcet, Th{\'e}o Moutakanni, Huy Vo, Marc Szafraniec, Vasil Khalidov, Pierre Fernandez, Daniel Haziza, Francisco Massa, Alaaeldin {El-Nouby}, Mahmoud Assran, Nicolas Ballas, Wojciech Galuba, Russell Howes, Po-Yao Huang, Shang-Wen Li, Ishan Misra, Michael Rabbat, Vasu Sharma, Gabriel Synnaeve, Hu~Xu, Herv{\'e} Jegou, Julien Mairal, Patrick Labatut, Armand Joulin, and Piotr Bojanowski.
\newblock {{DINOv2}}: {{Learning Robust Visual Features}} without {{Supervision}}, February 2024.

\bibitem[Permenter and Yuan(2024)]{permenterInterpretingImprovingDiffusion2024}
Frank Permenter and Chenyang Yuan.
\newblock Interpreting and {{Improving Diffusion Models}} from an {{Optimization Perspective}}, June 2024.
\bibitem[Polyak et~al.(2025)Polyak, Zohar, Brown, Tjandra, Sinha, Lee, Vyas, Shi, Ma, Chuang, Yan, Choudhary, Wang, Sethi, Pang, Ma, Misra, Hou, Wang, Jagadeesh, Li, Zhang, Singh, Williamson, Le, Yu, Singh, Zhang, Vajda, Duval, Girdhar, Sumbaly, Rambhatla, Tsai, Azadi, Datta, Chen, Bell, Ramaswamy, Sheynin, Bhattacharya, Motwani, Xu, Li, Hou, Hsu, Yin, Dai, Taigman, Luo, Liu, Wu, Zhao, Kirstain, He, He, Pumarola, Thabet, Sanakoyeu, Mallya, Guo, Araya, Kerr, Wood, Liu, Peng, Vengertsev, Schonfeld, Blanchard, {Juefei-Xu}, Nord, Liang, Hoffman, Kohler, Fire, Sivakumar, Chen, Yu, Gao, Georgopoulos, Moritz, Sampson, Li, Parmeggiani, Fine, Fowler, Petrovic, and Du]{polyakMovieGenCast2025}
Adam Polyak, Amit Zohar, Andrew Brown, Andros Tjandra, Animesh Sinha, Ann Lee, Apoorv Vyas, Bowen Shi, Chih-Yao Ma, Ching-Yao Chuang, David Yan, Dhruv Choudhary, Dingkang Wang, Geet Sethi, Guan Pang, Haoyu Ma, Ishan Misra, Ji~Hou, Jialiang Wang, Kiran Jagadeesh, Kunpeng Li, Luxin Zhang, Mannat Singh, Mary Williamson, Matt Le, Matthew Yu, Mitesh~Kumar Singh, Peizhao Zhang, Peter Vajda, Quentin Duval, Rohit Girdhar, Roshan Sumbaly, Sai~Saketh Rambhatla, Sam Tsai, Samaneh Azadi, Samyak Datta, Sanyuan Chen, Sean Bell, Sharadh Ramaswamy, Shelly Sheynin, Siddharth Bhattacharya, Simran Motwani, Tao Xu, Tianhe Li, Tingbo Hou, Wei-Ning Hsu, Xi~Yin, Xiaoliang Dai, Yaniv Taigman, Yaqiao Luo, Yen-Cheng Liu, Yi-Chiao Wu, Yue Zhao, Yuval Kirstain, Zecheng He, Zijian He, Albert Pumarola, Ali Thabet, Artsiom Sanakoyeu, Arun Mallya, Baishan Guo, Boris Araya, Breena Kerr, Carleigh Wood, Ce~Liu, Cen Peng, Dimitry Vengertsev, Edgar Schonfeld, Elliot Blanchard, Felix {Juefei-Xu}, Fraylie Nord, Jeff Liang, John Hoffman, Jonas Kohler, Kaolin Fire, Karthik Sivakumar, Lawrence Chen, Licheng Yu, Luya Gao, Markos Georgopoulos, Rashel Moritz, Sara~K. Sampson, Shikai Li, Simone Parmeggiani, Steve Fine, Tara Fowler, Vladan Petrovic, and Yuming Du.
\newblock Movie {{Gen}}: {{A Cast}} of {{Media Foundation Models}}, February 2025.
\bibitem[Pomerleau(1988)]{pomerleauALVINNAutonomousLand1988}
Dean~A. Pomerleau.
\newblock {{ALVINN}}: {{An Autonomous Land Vehicle}} in a {{Neural Network}}.
\newblock In \emph{Advances in {{Neural Information Processing Systems}}}, volume~1. Morgan-Kaufmann, 1988.

\bibitem[Prince(2023)]{prince2023understanding}
Simon~J.D. Prince.
\newblock \emph{Understanding Deep Learning}.
\newblock The MIT Press, 2023.
\bibitem[Radford et~al.(2021)Radford, Kim, Hallacy, Ramesh, Goh, Agarwal, Sastry, Askell, Mishkin, Clark, Krueger, and Sutskever]{radfordLearningTransferableVisual2021}
Alec Radford, Jong~Wook Kim, Chris Hallacy, Aditya Ramesh, Gabriel Goh, Sandhini Agarwal, Girish Sastry, Amanda Askell, Pamela Mishkin, Jack Clark, Gretchen Krueger, and Ilya Sutskever.
\newblock Learning {{Transferable Visual Models From Natural Language Supervision}}, February 2021.

\bibitem[Raffel et~al.(2023)Raffel, Shazeer, Roberts, Lee, Narang, Matena, Zhou, Li, and Liu]{raffelExploringLimitsTransfer2023}
Colin Raffel, Noam Shazeer, Adam Roberts, Katherine Lee, Sharan Narang, Michael Matena, Yanqi Zhou, Wei Li, and Peter~J. Liu.
\newblock Exploring the {{Limits}} of {{Transfer Learning}} with a {{Unified Text-to-Text Transformer}}, September 2023.

\bibitem[Reed et~al.(2022)Reed, Zolna, Parisotto, Colmenarejo, Novikov, {Barth-Maron}, Gimenez, Sulsky, Kay, Springenberg, Eccles, Bruce, Razavi, Edwards, Heess, Chen, Hadsell, Vinyals, Bordbar, and de~Freitas]{reedGeneralistAgent2022}
Scott Reed, Konrad Zolna, Emilio Parisotto, Sergio~Gomez Colmenarejo, Alexander Novikov, Gabriel {Barth-Maron}, Mai Gimenez, Yury Sulsky, Jackie Kay, Jost~Tobias Springenberg, Tom Eccles, Jake Bruce, Ali Razavi, Ashley Edwards, Nicolas Heess, Yutian Chen, Raia Hadsell, Oriol Vinyals, Mahyar Bordbar, and Nando de~Freitas.
\newblock A {{Generalist Agent}}, November 2022.

\bibitem[Ronneberger et~al.(2015)Ronneberger, Fischer, and Brox]{ronnebergerUNetConvolutionalNetworks2015}
Olaf Ronneberger, Philipp Fischer, and Thomas Brox.
\newblock U-{{Net}}: {{Convolutional Networks}} for {{Biomedical Image Segmentation}}, May 2015.

\bibitem[Ross et~al.(2011)Ross, Gordon, and Bagnell]{rossReductionImitationLearning2011}
Stephane Ross, Geoffrey~J. Gordon, and J.~Andrew Bagnell.
\newblock A {{Reduction}} of {{Imitation Learning}} and {{Structured Prediction}} to {{No-Regret Online Learning}}, March 2011.
\bibitem[Sanneman et~al.(2020)Sanneman, Fourie, and Shah]{sannemanStateIndustrialRobotics2020}
Lindsay Sanneman, Christopher Fourie, and Julie~A. Shah.
\newblock The {{State}} of {{Industrial Robotics}}: {{Emerging Technologies}}, {{Challenges}}, and {{Key Research Directions}}, October 2020.

\bibitem[Schuhmann et~al.(2022)Schuhmann, K{\"o}pf, Vencu, Coombes, and Beaumont]{LAION-COCO}
C~Schuhmann, A~K{\"o}pf, R~Vencu, T~Coombes, and R~Beaumont.
\newblock {{LAION COCO}}: 600{{M}} synthetic captions from {{LAION2B-EN}}.
\newblock \emph{URL https://laion.ai/blog/laion-coco}, 2022.

\bibitem[Schulman et~al.(2017{\natexlab{a}})Schulman, Levine, Moritz, Jordan, and Abbeel]{schulmanTrustRegionPolicy2017}
John Schulman, Sergey Levine, Philipp Moritz, Michael~I. Jordan, and Pieter Abbeel.
\newblock Trust {{Region Policy Optimization}}, April 2017{\natexlab{a}}.

\bibitem[Schulman et~al.(2017{\natexlab{b}})Schulman, Wolski, Dhariwal, Radford, and Klimov]{schulmanProximalPolicyOptimization2017}
John Schulman, Filip Wolski, Prafulla Dhariwal, Alec Radford, and Oleg Klimov.
\newblock Proximal {{Policy Optimization Algorithms}}, August 2017{\natexlab{b}}.
\bibitem[{Shalev-Shwartz} and {Ben-David}(2014)]{shalev-shwartzUnderstandingMachineLearning2014}
Shai {Shalev-Shwartz} and Shai {Ben-David}.
\newblock \emph{Understanding {{Machine Learning}}: {{From Theory}} to {{Algorithms}}}.
\newblock Cambridge University Press, 2014.
\newblock ISBN 978-1-107-05713-5 978-1-107-29801-9.
\newblock \doi{10.1017/CBO9781107298019}.
\bibitem[Shukor et~al.(2023)Shukor, Dancette, and Cord]{shukor2023epalm}
Mustafa Shukor, Corentin Dancette, and Matthieu Cord.
\newblock {{eP-ALM}}: {{Efficient}} perceptual augmentation of language models.
\newblock In \emph{Proceedings of the {{IEEE}}/{{CVF}} International Conference on Computer Vision}, pages 22056--22069, 2023.

\bibitem[Shukor et~al.(2025)Shukor, Aubakirova, Capuano, Kooijmans, Palma, Zouitine, Aractingi, Pascal, Russi, Marafioti, Alibert, Cord, Wolf, and Cadene]{shukorSmolVLAVisionLanguageActionModel2025}
Mustafa Shukor, Dana Aubakirova, Francesco Capuano, Pepijn Kooijmans, Steven Palma, Adil Zouitine, Michel Aractingi, Caroline Pascal, Martino Russi, Andres Marafioti, Simon Alibert, Matthieu Cord, Thomas Wolf, and Remi Cadene.
\newblock {{SmolVLA}}: {{A Vision-Language-Action Model}} for {{Affordable}} and {{Efficient Robotics}}, June 2025.
\bibitem[Siciliano and Khatib(2016)]{sicilianoSpringerHandbookRobotics2016}
Bruno Siciliano and Oussama Khatib, editors.
\newblock \emph{Springer {{Handbook}} of {{Robotics}}}.
\newblock Springer, 2016.
\newblock ISBN 978-3-319-32550-7 978-3-319-32552-1.
\newblock \doi{10.1007/978-3-319-32552-1}.
\bibitem[Silver et~al.(2014)Silver, Lever, Heess, Degris, Wierstra, and Riedmiller]{pmlr-v32-silver14}
David Silver, Guy Lever, Nicolas Heess, Thomas Degris, Daan Wierstra, and Martin Riedmiller.
\newblock Deterministic policy gradient algorithms.
\newblock In Eric~P. Xing and Tony Jebara, editors, \emph{Proceedings of the 31st International Conference on Machine Learning}, volume~32 of \emph{Proceedings of Machine Learning Research}, pages 387--395, Beijing, China, June 2014. PMLR.

\bibitem[Sohn et~al.(2015)Sohn, Lee, and Yan]{sohnLearningStructuredOutput2015}
Kihyuk Sohn, Honglak Lee, and Xinchen Yan.
\newblock Learning {{Structured Output Representation}} using {{Deep Conditional Generative Models}}.
\newblock In \emph{Advances in {{Neural Information Processing Systems}}}, volume~28. Curran Associates, Inc., 2015.

\bibitem[Song et~al.(2022)Song, Meng, and Ermon]{songDenoisingDiffusionImplicit2022}
Jiaming Song, Chenlin Meng, and Stefano Ermon.
\newblock Denoising {{Diffusion Implicit Models}}, October 2022.
\bibitem[Sutton and Barto(2018)]{suttonReinforcementLearningIntroduction2018}
Richard~S. Sutton and Andrew~G. Barto.
\newblock \emph{Reinforcement Learning: An Introduction}.
\newblock Adaptive Computation and Machine Learning Series. The MIT Press, Cambridge, Massachusetts, second edition, 2018.
\newblock ISBN 978-0-262-03924-6.
\bibitem[Tancik et~al.(2020)Tancik, Srinivasan, Mildenhall, {Fridovich-Keil}, Raghavan, Singhal, Ramamoorthi, Barron, and Ng]{tancikFourierFeaturesLet2020}
Matthew Tancik, Pratul~P. Srinivasan, Ben Mildenhall, Sara {Fridovich-Keil}, Nithin Raghavan, Utkarsh Singhal, Ravi Ramamoorthi, Jonathan~T. Barron, and Ren Ng.
\newblock Fourier {{Features Let Networks Learn High Frequency Functions}} in {{Low Dimensional Domains}}, June 2020.

\bibitem[Tang et~al.(2025)Tang, Abbatematteo, Hu, Chandra, {Mart{\'i}n-Mart{\'i}n}, and Stone]{tangDeepReinforcementLearning2025}
Chen Tang, Ben Abbatematteo, Jiaheng Hu, Rohan Chandra, Roberto {Mart{\'i}n-Mart{\'i}n}, and Peter Stone.
\newblock Deep {{Reinforcement Learning}} for {{Robotics}}: {{A Survey}} of {{Real-World Successes}}.
\newblock \emph{Annual Review of Control, Robotics, and Autonomous Systems}, 8:\penalty0 153--188, May 2025.
\newblock ISSN 2573-5144.
\newblock \doi{10.1146/annurev-control-030323-022510}.

\bibitem[Tang et~al.(2023)Tang, Zhao, Wang, Zhang, Sun, Zheng, Du, Qian, and Kurths]{tangPerceptionNavigationAutonomous2023}
Yang Tang, Chaoqiang Zhao, Jianrui Wang, Chongzhen Zhang, Qiyu Sun, Weixing Zheng, Wenli Du, Feng Qian, and Juergen Kurths.
\newblock Perception and {{Navigation}} in {{Autonomous Systems}} in the {{Era}} of {{Learning}}: {{A Survey}}.
\newblock \emph{IEEE Transactions on Neural Networks and Learning Systems}, 34\penalty0 (12):\penalty0 9604--9624, December 2023.
\newblock ISSN 2162-237X, 2162-2388.
\newblock \doi{10.1109/TNNLS.2022.3167688}.
\bibitem[Team et~al.(2024)Team, Riviere, Pathak, Sessa, Hardin, Bhupatiraju, Hussenot, Mesnard, Shahriari, Ram{\'e}, Ferret, Liu, Tafti, Friesen, Casbon, Ramos, Kumar, Lan, Jerome, Tsitsulin, Vieillard, Stanczyk, Girgin, Momchev, Hoffman, Thakoor, Grill, Neyshabur, Bachem, Walton, Severyn, Parrish, Ahmad, Hutchison, Abdagic, Carl, Shen, Brock, Coenen, Laforge, Paterson, Bastian, Piot, Wu, Royal, Chen, Kumar, Perry, Welty, {Choquette-Choo}, Sinopalnikov, Weinberger, Vijaykumar, Rogozi{\'n}ska, Herbison, Bandy, Wang, Noland, Moreira, Senter, Eltyshev, Visin, Rasskin, Wei, Cameron, Martins, Hashemi, {Klimczak-Pluci{\'n}ska}, Batra, Dhand, Nardini, Mein, Zhou, Svensson, Stanway, Chan, Zhou, Carrasqueira, Iljazi, Becker, Fernandez, van Amersfoort, Gordon, Lipschultz, Newlan, Ji, Mohamed, Badola, Black, Millican, McDonell, Nguyen, Sodhia, Greene, Sjoesund, Usui, Sifre, Heuermann, Lago, McNealus, Soares, Kilpatrick, Dixon, Martins, Reid, Singh, Iverson, G{\"o}rner, Velloso, Wirth, Davidow, Miller, Rahtz, Watson, Risdal, Kazemi, Moynihan, Zhang, Kahng, Park, Rahman, Khatwani, Dao, Bardoliwalla, Devanathan, Dumai, Chauhan, Wahltinez, Botarda, Barnes, Barham, Michel, Jin, Georgiev, Culliton, Kuppala, Comanescu, Merhej, Jana, Rokni, Agarwal, Mullins, Saadat, Carthy, Perrin, Arnold, Krause, Dai, Garg, Sheth, Ronstrom, Chan, Jordan, Yu, Eccles, Hennigan, Kocisky, Doshi, Jain, Yadav, Meshram, Dharmadhikari, Barkley, Wei, Ye, Han, Kwon, Xu, Shen, Gong, Wei, Cotruta, Kirk, Rao, Giang, Peran, Warkentin, Collins, Barral, Ghahramani, Hadsell, Sculley, Banks, Dragan, Petrov, Vinyals, Dean, Hassabis, Kavukcuoglu, Farabet, Buchatskaya, Borgeaud, Fiedel, Joulin, Kenealy, Dadashi, and Andreev]{teamGemma2Improving2024}
Gemma Team, Morgane Riviere, Shreya Pathak, Pier~Giuseppe Sessa, Cassidy Hardin, Surya Bhupatiraju, L{\'e}onard Hussenot, Thomas Mesnard, Bobak Shahriari, Alexandre Ram{\'e}, Johan Ferret, Peter Liu, Pouya Tafti, Abe Friesen, Michelle Casbon, Sabela Ramos, Ravin Kumar, Charline~Le Lan, Sammy Jerome, Anton Tsitsulin, Nino Vieillard, Piotr Stanczyk, Sertan Girgin, Nikola Momchev, Matt Hoffman, Shantanu Thakoor, Jean-Bastien Grill, Behnam Neyshabur, Olivier Bachem, Alanna Walton, Aliaksei Severyn, Alicia Parrish, Aliya Ahmad, Allen Hutchison, Alvin Abdagic, Amanda Carl, Amy Shen, Andy Brock, Andy Coenen, Anthony Laforge, Antonia Paterson, Ben Bastian, Bilal Piot, Bo~Wu, Brandon Royal, Charlie Chen, Chintu Kumar, Chris Perry, Chris Welty, Christopher~A. {Choquette-Choo}, Danila Sinopalnikov, David Weinberger, Dimple Vijaykumar, Dominika Rogozi{\'n}ska, Dustin Herbison, Elisa Bandy, Emma Wang, Eric Noland, Erica Moreira, Evan Senter, Evgenii Eltyshev, Francesco Visin, Gabriel Rasskin, Gary Wei, Glenn Cameron, Gus Martins, Hadi Hashemi, Hanna {Klimczak-Pluci{\'n}ska}, Harleen Batra, Harsh Dhand, Ivan Nardini, Jacinda Mein, Jack Zhou, James Svensson, Jeff Stanway, Jetha Chan, Jin~Peng Zhou, Joana Carrasqueira, Joana Iljazi, Jocelyn Becker, Joe Fernandez, Joost van Amersfoort, Josh Gordon, Josh Lipschultz, Josh Newlan, Ju-yeong Ji, Kareem Mohamed, Kartikeya Badola, Kat Black, Katie Millican, Keelin McDonell, Kelvin Nguyen, Kiranbir Sodhia, Kish Greene, Lars~Lowe Sjoesund, Lauren Usui, Laurent Sifre, Lena Heuermann, Leticia Lago, Lilly McNealus, Livio~Baldini Soares, Logan Kilpatrick, Lucas Dixon, Luciano Martins, Machel Reid, Manvinder Singh, Mark Iverson, Martin G{\"o}rner, Mat Velloso, Mateo Wirth, Matt Davidow, Matt Miller, Matthew Rahtz, Matthew Watson, Meg Risdal, Mehran Kazemi, Michael Moynihan, Ming Zhang, Minsuk Kahng, Minwoo Park, Mofi Rahman, Mohit Khatwani, Natalie Dao, Nenshad Bardoliwalla, Nesh Devanathan, Neta Dumai, Nilay Chauhan, Oscar Wahltinez, Pankil 
Botarda, Parker Barnes, Paul Barham, Paul Michel, Pengchong Jin, Petko Georgiev, Phil Culliton, Pradeep Kuppala, Ramona Comanescu, Ramona Merhej, Reena Jana, Reza~Ardeshir Rokni, Rishabh Agarwal, Ryan Mullins, Samaneh Saadat, Sara~Mc Carthy, Sarah Perrin, S{\'e}bastien M.~R. Arnold, Sebastian Krause, Shengyang Dai, Shruti Garg, Shruti Sheth, Sue Ronstrom, Susan Chan, Timothy Jordan, Ting Yu, Tom Eccles, Tom Hennigan, Tomas Kocisky, Tulsee Doshi, Vihan Jain, Vikas Yadav, Vilobh Meshram, Vishal Dharmadhikari, Warren Barkley, Wei Wei, Wenming Ye, Woohyun Han, Woosuk Kwon, Xiang Xu, Zhe Shen, Zhitao Gong, Zichuan Wei, Victor Cotruta, Phoebe Kirk, Anand Rao, Minh Giang, Ludovic Peran, Tris Warkentin, Eli Collins, Joelle Barral, Zoubin Ghahramani, Raia Hadsell, D.~Sculley, Jeanine Banks, Anca Dragan, Slav Petrov, Oriol Vinyals, Jeff Dean, Demis Hassabis, Koray Kavukcuoglu, Clement Farabet, Elena Buchatskaya, Sebastian Borgeaud, Noah Fiedel, Armand Joulin, Kathleen Kenealy, Robert Dadashi, and Alek Andreev.
|
| 477 |
+
\newblock Gemma 2: {{Improving Open Language Models}} at a {{Practical Size}}, August 2024.
|
| 478 |
+
|
\bibitem[Tedrake({\natexlab{a}})]{tedrakeRoboticManipulationPerception}
Russ Tedrake.
\newblock Robotic {{Manipulation}}. {{Perception}}, {{Planning}} and {{Control}}., {\natexlab{a}}.

Russ Tedrake.
\newblock Underactuated {{Robotics}}. {{Algorithms}} for {{Walking}}, {{Running}}, {{Swimming}}, {{Flying}}, and {{Manipulation}}, {\natexlab{b}}.

+\bibitem[Tiboni et~al.(2023)Tiboni, Arndt, and Kyrki]{tiboniDROPOSimtoRealTransfer2023}
+Gabriele Tiboni, Karol Arndt, and Ville Kyrki.
+\newblock {{DROPO}}: {{Sim-to-Real Transfer}} with {{Offline Domain Randomization}}, January 2023.
+
+\bibitem[Tiboni et~al.(2024)Tiboni, Klink, Peters, Tommasi, D'Eramo, and Chalvatzaki]{tiboniDomainRandomizationEntropy2024}
+Gabriele Tiboni, Pascal Klink, Jan Peters, Tatiana Tommasi, Carlo D'Eramo, and Georgia Chalvatzaki.
+\newblock Domain {{Randomization}} via {{Entropy Maximization}}, March 2024.
+
+\bibitem[Tobin et~al.(2017)Tobin, Fong, Ray, Schneider, Zaremba, and Abbeel]{tobinDomainRandomizationTransferring2017}
+Josh Tobin, Rachel Fong, Alex Ray, Jonas Schneider, Wojciech Zaremba, and Pieter Abbeel.
+\newblock Domain {{Randomization}} for {{Transferring Deep Neural Networks}} from {{Simulation}} to the {{Real World}}, March 2017.
+
+\bibitem[Tong et~al.(2024)Tong, Brown, Wu, Woo, IYER, Akula, Yang, Yang, Middepogu, Wang, et~al.]{tong2024cambrian}
+Peter Tong, Ellis Brown, Penghao Wu, Sanghyun Woo, Adithya Jairam~Vedagiri IYER, Sai~Charitha Akula, Shusheng Yang, Jihan Yang, Manoj Middepogu, Ziteng Wang, et~al.
+\newblock Cambrian-1: {{A}} fully open, vision-centric exploration of multimodal llms.
+\newblock \emph{Advances in Neural Information Processing Systems}, 37:\penalty0 87310--87356, 2024.
+
+\bibitem[Touvron et~al.(2023)Touvron, Martin, Stone, Albert, Almahairi, Babaei, Bashlykov, Batra, Bhargava, Bhosale, Bikel, Blecher, Ferrer, Chen, Cucurull, Esiobu, Fernandes, Fu, Fu, Fuller, Gao, Goswami, Goyal, Hartshorn, Hosseini, Hou, Inan, Kardas, Kerkez, Khabsa, Kloumann, Korenev, Koura, Lachaux, Lavril, Lee, Liskovich, Lu, Mao, Martinet, Mihaylov, Mishra, Molybog, Nie, Poulton, Reizenstein, Rungta, Saladi, Schelten, Silva, Smith, Subramanian, Tan, Tang, Taylor, Williams, Kuan, Xu, Yan, Zarov, Zhang, Fan, Kambadur, Narang, Rodriguez, Stojnic, Edunov, and Scialom]{touvronLlama2Open2023}
+Hugo Touvron, Louis Martin, Kevin Stone, Peter Albert, Amjad Almahairi, Yasmine Babaei, Nikolay Bashlykov, Soumya Batra, Prajjwal Bhargava, Shruti Bhosale, Dan Bikel, Lukas Blecher, Cristian~Canton Ferrer, Moya Chen, Guillem Cucurull, David Esiobu, Jude Fernandes, Jeremy Fu, Wenyin Fu, Brian Fuller, Cynthia Gao, Vedanuj Goswami, Naman Goyal, Anthony Hartshorn, Saghar Hosseini, Rui Hou, Hakan Inan, Marcin Kardas, Viktor Kerkez, Madian Khabsa, Isabel Kloumann, Artem Korenev, Punit~Singh Koura, Marie-Anne Lachaux, Thibaut Lavril, Jenya Lee, Diana Liskovich, Yinghai Lu, Yuning Mao, Xavier Martinet, Todor Mihaylov, Pushkar Mishra, Igor Molybog, Yixin Nie, Andrew Poulton, Jeremy Reizenstein, Rashi Rungta, Kalyan Saladi, Alan Schelten, Ruan Silva, Eric~Michael Smith, Ranjan Subramanian, Xiaoqing~Ellen Tan, Binh Tang, Ross Taylor, Adina Williams, Jian~Xiang Kuan, Puxin Xu, Zheng Yan, Iliyan Zarov, Yuchen Zhang, Angela Fan, Melanie Kambadur, Sharan Narang, Aurelien Rodriguez, Robert Stojnic, Sergey Edunov, and Thomas Scialom.
+\newblock Llama 2: {{Open Foundation}} and {{Fine-Tuned Chat Models}}, July 2023.
+
+\bibitem[Tsimpoukelli et~al.(2021)Tsimpoukelli, Menick, Cabi, Eslami, Vinyals, and Hill]{tsimpoukelli2021multimodalfrozen}
+Maria Tsimpoukelli, Jacob~L Menick, Serkan Cabi, {\relax SM}~Eslami, Oriol Vinyals, and Felix Hill.
+\newblock Multimodal few-shot learning with frozen language models.
+\newblock \emph{Advances in Neural Information Processing Systems}, 34:\penalty0 200--212, 2021.
+
+\bibitem[Vallaeys et~al.(2024)Vallaeys, Shukor, Cord, and Verbeek]{vallaeys2024improveddepalm}
+Th{\'e}ophane Vallaeys, Mustafa Shukor, Matthieu Cord, and Jakob Verbeek.
+\newblock Improved baselines for data-efficient perceptual augmentation of llms.
+\newblock \emph{arXiv preprint arXiv:2403.13499}, 2024.
+
+\bibitem[Wang et~al.(2025)Wang, Li, Yan, He, Yu, Zeng, Wang, Ma, Huang, Gao, et~al.]{wang2025internvideo2}
+Yi~Wang, Xinhao Li, Ziang Yan, Yinan He, Jiashuo Yu, Xiangyu Zeng, Chenting Wang, Changlian Ma, Haian Huang, Jianfei Gao, et~al.
+\newblock {{InternVideo2}}. 5: {{Empowering}} video mllms with long and rich context modeling.
+\newblock \emph{arXiv preprint arXiv:2501.12386}, 2025.
+
+\bibitem[Yao et~al.(2024)Yao, Yu, Zhang, Wang, Cui, Zhu, Cai, Li, Zhao, He, Chen, Zhou, Zou, Zhang, Hu, Zheng, Zhou, Cai, Han, Zeng, Li, Liu, and Sun]{minicmpv2024}
+Yuan Yao, Tianyu Yu, Ao~Zhang, Chongyi Wang, Junbo Cui, Hongji Zhu, Tianchi Cai, Haoyu Li, Weilin Zhao, Zhihui He, Qianyu Chen, Huarong Zhou, Zhensheng Zou, Haoye Zhang, Shengding Hu, Zhi Zheng, Jie Zhou, Jie Cai, Xu~Han, Guoyang Zeng, Dahai Li, Zhiyuan Liu, and Maosong Sun.
+\newblock {{MiniCPM-v}}: A {{GPT-4V}} level {{MLLM}} on your phone, 2024.
+
+\bibitem[Zhai et~al.(2023)Zhai, Mustafa, Kolesnikov, and Beyer]{zhaiSigmoidLossLanguage2023}
+Xiaohua Zhai, Basil Mustafa, Alexander Kolesnikov, and Lucas Beyer.
+\newblock Sigmoid {{Loss}} for {{Language Image Pre-Training}}, September 2023.
+
+\bibitem[Zhang et~al.(2025)Zhang, Li, Cheng, Hu, Yuan, Chen, Leng, Jiang, Zhang, Li, et~al.]{zhang2025videollama}
+Boqiang Zhang, Kehan Li, Zesen Cheng, Zhiqiang Hu, Yuqian Yuan, Guanzheng Chen, Sicong Leng, Yuming Jiang, Hang Zhang, Xin Li, et~al.
+\newblock {{VideoLLaMA}} 3: {{Frontier}} multimodal foundation models for image and video understanding.
+\newblock \emph{arXiv preprint arXiv:2501.13106}, 2025.
+
+\bibitem[Zhang et~al.(2024)Zhang, Xiao, He, and Shi]{zhangWoCoCoLearningWholeBody2024}
+Chong Zhang, Wenli Xiao, Tairan He, and Guanya Shi.
+\newblock {{WoCoCo}}: {{Learning Whole-Body Humanoid Control}} with {{Sequential Contacts}}, November 2024.
+
+\bibitem[Zhao et~al.(2023)Zhao, Kumar, Levine, and Finn]{zhaoLearningFineGrainedBimanual2023}
+Tony~Z. Zhao, Vikash Kumar, Sergey Levine, and Chelsea Finn.
+\newblock Learning {{Fine-Grained Bimanual Manipulation}} with {{Low-Cost Hardware}}, April 2023.
+
+\bibitem[Zhu et~al.(2024)Zhu, Chen, Shen, Li, and Elhoseiny]{zhu2024minigpt}
+Deyao Zhu, Jun Chen, Xiaoqian Shen, Xiang Li, and Mohamed Elhoseiny.
+\newblock {{MiniGPT-4}}: {{Enhancing}} vision-language understanding with advanced large language models.
+\newblock In \emph{The Twelfth International Conference on Learning Representations}, 2024.
+
+\bibitem[Zhu et~al.(2023)Zhu, Hessel, Awadalla, Gadre, Dodge, Fang, Yu, Schmidt, Wang, and Choi]{MMC4}
+Wanrong Zhu, Jack Hessel, Anas Awadalla, Samir~Yitzhak Gadre, Jesse Dodge, Alex Fang, Youngjae Yu, Ludwig Schmidt, William~Yang Wang, and Yejin Choi.
+\newblock Multimodal {{C4}}: {{An}} open, billion-scale corpus of images interleaved with text.
+\newblock In \emph{Thirty-Seventh Conference on Neural Information Processing Systems Datasets and Benchmarks Track}, 2023.
+
\end{thebibliography}
app/scripts/latex-to-mdx/input/main.bib
CHANGED
@@ -351,17 +351,6 @@
file = {/Users/fracapuano/Zotero/storage/TFZQ6EHJ/Burridge et al. - 1999 - Sequential Composition of Dynamically Dexterous Robot Behaviors.pdf}
}

-@misc{cadene2024lerobot,
-title = {{{LeRobot}}: {{State-of-the-art}} Machine Learning for Real-World Robotics in Pytorch},
-author = {Cadene, Remi and Alibert, Simon and Soare, Alexander and Gallouedec, Quentin and Zouitine, Adil and Palma, Steven and Kooijmans, Pepijn and Aractingi, Michel and Shukor, Mustafa and Aubakirova, Dana and Russi, Martino and Capuano, Francesco and Pascal, Caroline and Choghari, Jade and Moss, Jess and Wolf, Thomas},
-year = {2024}
-}
-
-@misc{cadeneLeRobotStateoftheartMachine,
-title = {{{LeRobot}}: {{State-of-the-art Machine Learning}} for {{Real-World Robotics}} in {{Pytorch}}},
-author = {Cadene, Remi}
-}
-
@misc{cadeneLeRobotStateoftheartMachine2024,
title = {{{LeRobot}}: {{State-of-the-art Machine Learning}} for {{Real-World Robotics}} in {{Pytorch}}},
author = {Cadene, Remi and Alibert, Simon and Soare, Alexander and Galloudec, Quentin and Zouitine, Adil and Palma, Steven and Kooijmans, Pepijn and Aractingi, Michel and Shukor, Mustafa and Aubakirova, Dana and Russi, Martino and Capuano, Francesco and Pascal, Caroline and Chogari, Jade and Moss, Jess and Wolf, Thomas},
@@ -385,15 +374,6 @@
file = {/Users/fracapuano/Zotero/storage/AYIY6DTF/Caron et al. - 2021 - Emerging Properties in Self-Supervised Vision Transformers.pdf;/Users/fracapuano/Zotero/storage/EKA7ZN2P/2104.html}
}

-@inproceedings{chebotar2019closing,
-title = {Closing the Sim-to-Real Loop: {{Adapting}} Simulation Randomization with Real World Experience},
-booktitle = {2019 International Conference on Robotics and Automation ({{ICRA}})},
-author = {Chebotar, Yevgen and Handa, Ankur and Makoviychuk, Viktor and Macklin, Miles and Issac, Jan and Ratliff, Nathan and Fox, Dieter},
-year = {2019},
-pages = {8973--8979},
-publisher = {IEEE}
-}
-
@inproceedings{chebotarClosingSimtorealLoop2019,
title = {Closing the Sim-to-Real Loop: {{Adapting}} Simulation Randomization with Real World Experience},
shorttitle = {Closing the Sim-to-Real Loop},
@@ -441,24 +421,6 @@
file = {/Users/fracapuano/Zotero/storage/7XRY3GJX/Chi et al. - 2024 - Diffusion Policy Visuomotor Policy Learning via Action Diffusion.pdf;/Users/fracapuano/Zotero/storage/BBBPKKMZ/2303.html}
}

-@misc{collaborationOpenXEmbodimentRobotic2025,
-title = {Open {{X-Embodiment}}: {{Robotic Learning Datasets}} and {{RT-X Models}}},
-shorttitle = {Open {{X-Embodiment}}},
-author = {Collaboration, Open X.-Embodiment and O'Neill, Abby and Rehman, Abdul and Gupta, Abhinav and Maddukuri, Abhiram and Gupta, Abhishek and Padalkar, Abhishek and Lee, Abraham and Pooley, Acorn and Gupta, Agrim and Mandlekar, Ajay and Jain, Ajinkya and Tung, Albert and Bewley, Alex and Herzog, Alex and Irpan, Alex and Khazatsky, Alexander and Rai, Anant and Gupta, Anchit and Wang, Andrew and Kolobov, Andrey and Singh, Anikait and Garg, Animesh and Kembhavi, Aniruddha and Xie, Annie and Brohan, Anthony and Raffin, Antonin and Sharma, Archit and Yavary, Arefeh and Jain, Arhan and Balakrishna, Ashwin and Wahid, Ayzaan and {Burgess-Limerick}, Ben and Kim, Beomjoon and Sch{\"o}lkopf, Bernhard and Wulfe, Blake and Ichter, Brian and Lu, Cewu and Xu, Charles and Le, Charlotte and Finn, Chelsea and Wang, Chen and Xu, Chenfeng and Chi, Cheng and Huang, Chenguang and Chan, Christine and Agia, Christopher and Pan, Chuer and Fu, Chuyuan and Devin, Coline and Xu, Danfei and Morton, Daniel and Driess, Danny and Chen, Daphne and Pathak, Deepak and Shah, Dhruv and B{\"u}chler, Dieter and Jayaraman, Dinesh and Kalashnikov, Dmitry and Sadigh, Dorsa and Johns, Edward and Foster, Ethan and Liu, Fangchen and Ceola, Federico and Xia, Fei and Zhao, Feiyu and Frujeri, Felipe Vieira and Stulp, Freek and Zhou, Gaoyue and Sukhatme, Gaurav S. and Salhotra, Gautam and Yan, Ge and Feng, Gilbert and Schiavi, Giulio and Berseth, Glen and Kahn, Gregory and Yang, Guangwen and Wang, Guanzhi and Su, Hao and Fang, Hao-Shu and Shi, Haochen and Bao, Henghui and Amor, Heni Ben and Christensen, Henrik I. and Furuta, Hiroki and Bharadhwaj, Homanga and Walke, Homer and Fang, Hongjie and Ha, Huy and Mordatch, Igor and Radosavovic, Ilija and Leal, Isabel and Liang, Jacky and {Abou-Chakra}, Jad and Kim, Jaehyung and Drake, Jaimyn and Peters, Jan and Schneider, Jan and Hsu, Jasmine and Vakil, Jay and Bohg, Jeannette and Bingham, Jeffrey and Wu, Jeffrey and Gao, Jensen and Hu, Jiaheng and Wu, Jiajun and Wu, Jialin and Sun, Jiankai and Luo, Jianlan and Gu, Jiayuan and Tan, Jie and Oh, Jihoon and Wu, Jimmy and Lu, Jingpei and Yang, Jingyun and Malik, Jitendra and Silv{\'e}rio, Jo{\~a}o and Hejna, Joey and Booher, Jonathan and Tompson, Jonathan and Yang, Jonathan and Salvador, Jordi and Lim, Joseph J. and Han, Junhyek and Wang, Kaiyuan and Rao, Kanishka and Pertsch, Karl and Hausman, Karol and Go, Keegan and Gopalakrishnan, Keerthana and Goldberg, Ken and Byrne, Kendra and Oslund, Kenneth and Kawaharazuka, Kento and Black, Kevin and Lin, Kevin and Zhang, Kevin and Ehsani, Kiana and Lekkala, Kiran and Ellis, Kirsty and Rana, Krishan and Srinivasan, Krishnan and Fang, Kuan and Singh, Kunal Pratap and Zeng, Kuo-Hao and Hatch, Kyle and Hsu, Kyle and Itti, Laurent and Chen, Lawrence Yunliang and Pinto, Lerrel and {Fei-Fei}, Li and Tan, Liam and Fan, Linxi "Jim" and Ott, Lionel and Lee, Lisa and Weihs, Luca and Chen, Magnum and Lepert, Marion and Memmel, Marius and Tomizuka, Masayoshi and Itkina, Masha and Castro, Mateo Guaman and Spero, Max and Du, Maximilian and Ahn, Michael and Yip, Michael C. and Zhang, Mingtong and Ding, Mingyu and Heo, Minho and Srirama, Mohan Kumar and Sharma, Mohit and Kim, Moo Jin and Irshad, Muhammad Zubair and Kanazawa, Naoaki and Hansen, Nicklas and Heess, Nicolas and Joshi, Nikhil J. and Suenderhauf, Niko and Liu, Ning and Palo, Norman Di and Shafiullah, Nur Muhammad Mahi and Mees, Oier and Kroemer, Oliver and Bastani, Osbert and Sanketi, Pannag R. and Miller, Patrick "Tree" and Yin, Patrick and Wohlhart, Paul and Xu, Peng and Fagan, Peter David and Mitrano, Peter and Sermanet, Pierre and Abbeel, Pieter and Sundaresan, Priya and Chen, Qiuyu and Vuong, Quan and Rafailov, Rafael and Tian, Ran and Doshi, Ria and {Mart{\'i}n-Mart{\'i}n}, Roberto and Baijal, Rohan and Scalise, Rosario and Hendrix, Rose and Lin, Roy and Qian, Runjia and Zhang, Ruohan and Mendonca, Russell and Shah, Rutav and Hoque, Ryan and Julian, Ryan and Bustamante, Samuel and Kirmani, Sean and Levine, Sergey and Lin, Shan and Moore, Sherry and Bahl, Shikhar and Dass, Shivin and Sonawani, Shubham and Tulsiani, Shubham and Song, Shuran and Xu, Sichun and Haldar, Siddhant and Karamcheti, Siddharth and Adebola, Simeon and Guist, Simon and Nasiriany, Soroush and Schaal, Stefan and Welker, Stefan and Tian, Stephen and Ramamoorthy, Subramanian and Dasari, Sudeep and Belkhale, Suneel and Park, Sungjae and Nair, Suraj and Mirchandani, Suvir and Osa, Takayuki and Gupta, Tanmay and Harada, Tatsuya and Matsushima, Tatsuya and Xiao, Ted and Kollar, Thomas and Yu, Tianhe and Ding, Tianli and Davchev, Todor and Zhao, Tony Z. and Armstrong, Travis and Darrell, Trevor and Chung, Trinity and Jain, Vidhi and Kumar, Vikash and Vanhoucke, Vincent and Guizilini, Vitor and Zhan, Wei and Zhou, Wenxuan and Burgard, Wolfram and Chen, Xi and Chen, Xiangyu and Wang, Xiaolong and Zhu, Xinghao and Geng, Xinyang and Liu, Xiyuan and Liangwei, Xu and Li, Xuanlin and Pang, Yansong and Lu, Yao and Ma, Yecheng Jason and Kim, Yejin and Chebotar, Yevgen and Zhou, Yifan and Zhu, Yifeng and Wu, Yilin and Xu, Ying and Wang, Yixuan and Bisk, Yonatan and Dou, Yongqiang and Cho, Yoonyoung and Lee, Youngwoon and Cui, Yuchen and Cao, Yue and Wu, Yueh-Hua and Tang, Yujin and Zhu, Yuke and Zhang, Yunchu and Jiang, Yunfan and Li, Yunshuang and Li, Yunzhu and Iwasawa, Yusuke and Matsuo, Yutaka and Ma, Zehan and Xu, Zhuo and Cui, Zichen Jeff and Zhang, Zichen and Fu, Zipeng and Lin, Zipeng},
-year = {2025},
-month = may,
-number = {arXiv:2310.08864},
-eprint = {2310.08864},
-primaryclass = {cs},
-publisher = {arXiv},
-doi = {10.48550/arXiv.2310.08864},
-urldate = {2025-09-08},
-abstract = {Large, high-capacity models trained on diverse datasets have shown remarkable successes on efficiently tackling downstream applications. In domains from NLP to Computer Vision, this has led to a consolidation of pretrained models, with general pretrained backbones serving as a starting point for many applications. Can such a consolidation happen in robotics? Conventionally, robotic learning methods train a separate model for every application, every robot, and even every environment. Can we instead train generalist X-robot policy that can be adapted efficiently to new robots, tasks, and environments? In this paper, we provide datasets in standardized data formats and models to make it possible to explore this possibility in the context of robotic manipulation, alongside experimental results that provide an example of effective X-robot policies. We assemble a dataset from 22 different robots collected through a collaboration between 21 institutions, demonstrating 527 skills (160266 tasks). We show that a high-capacity model trained on this data, which we call RT-X, exhibits positive transfer and improves the capabilities of multiple robots by leveraging experience from other platforms. More details can be found on the project website https://robotics-transformer-x.github.io.},
-archiveprefix = {arXiv},
-keywords = {Computer Science - Robotics},
-file = {/Users/fracapuano/Zotero/storage/2U73MMVN/Collaboration et al. - 2025 - Open X-Embodiment Robotic Learning Datasets and RT-X Models.pdf;/Users/fracapuano/Zotero/storage/PX7IHY32/2310.html}
-}
-
@book{connellRobotLearning1993,
title = {Robot {{Learning}}},
editor = {Connell, Jonathan H. and Mahadevan, Sridhar},
@@ -662,40 +624,6 @@
|
|
| 662 |
file = {/Users/fracapuano/Zotero/storage/SSNAZ6U4/Griffin et al. - 2017 - Walking Stabilization Using Step Timing and Location Adjustment on the Humanoid Robot, Atlas.pdf;/Users/fracapuano/Zotero/storage/VP885PA9/1703.html}
|
| 663 |
}
|
| 664 |
|
| 665 |
-
@misc{haarnojaReinforcementLearningDeep2017,
|
| 666 |
-
title = {Reinforcement {{Learning}} with {{Deep Energy-Based Policies}}},
|
| 667 |
-
author = {Haarnoja, Tuomas and Tang, Haoran and Abbeel, Pieter and Levine, Sergey},
|
| 668 |
-
year = {2017},
|
| 669 |
-
month = jul,
|
| 670 |
-
number = {arXiv:1702.08165},
|
| 671 |
-
eprint = {1702.08165},
|
| 672 |
-
primaryclass = {cs},
|
| 673 |
-
publisher = {arXiv},
|
| 674 |
-
doi = {10.48550/arXiv.1702.08165},
|
| 675 |
-
urldate = {2025-08-31},
|
| 676 |
-
abstract = {We propose a method for learning expressive energy-based policies for continuous states and actions, which has been feasible only in tabular domains before. We apply our method to learning maximum entropy policies, resulting into a new algorithm, called soft Q-learning, that expresses the optimal policy via a Boltzmann distribution. We use the recently proposed amortized Stein variational gradient descent to learn a stochastic sampling network that approximates samples from this distribution. The benefits of the proposed algorithm include improved exploration and compositionality that allows transferring skills between tasks, which we confirm in simulated experiments with swimming and walking robots. We also draw a connection to actor-critic methods, which can be viewed performing approximate inference on the corresponding energy-based model.},
|
| 677 |
-
archiveprefix = {arXiv},
|
| 678 |
-
keywords = {Computer Science - Artificial Intelligence,Computer Science - Machine Learning},
|
| 679 |
-
file = {/Users/fracapuano/Zotero/storage/PXCR4TCT/Haarnoja et al. - 2017 - Reinforcement Learning with Deep Energy-Based Policies.pdf;/Users/fracapuano/Zotero/storage/VUXXX9B7/1702.html}
|
| 680 |
-
}
|
| 681 |
-
|
| 682 |
-
@misc{haarnojaReinforcementLearningDeep2017a,
|
| 683 |
-
title = {Reinforcement {{Learning}} with {{Deep Energy-Based Policies}}},
|
| 684 |
-
author = {Haarnoja, Tuomas and Tang, Haoran and Abbeel, Pieter and Levine, Sergey},
|
| 685 |
-
year = {2017},
|
| 686 |
-
month = jul,
|
| 687 |
-
number = {arXiv:1702.08165},
|
| 688 |
-
eprint = {1702.08165},
|
| 689 |
-
primaryclass = {cs},
|
| 690 |
-
publisher = {arXiv},
|
| 691 |
-
doi = {10.48550/arXiv.1702.08165},
|
| 692 |
-
urldate = {2025-08-31},
|
| 693 |
-
abstract = {We propose a method for learning expressive energy-based policies for continuous states and actions, which has been feasible only in tabular domains before. We apply our method to learning maximum entropy policies, resulting into a new algorithm, called soft Q-learning, that expresses the optimal policy via a Boltzmann distribution. We use the recently proposed amortized Stein variational gradient descent to learn a stochastic sampling network that approximates samples from this distribution. The benefits of the proposed algorithm include improved exploration and compositionality that allows transferring skills between tasks, which we confirm in simulated experiments with swimming and walking robots. We also draw a connection to actor-critic methods, which can be viewed performing approximate inference on the corresponding energy-based model.},
|
| 694 |
-
archiveprefix = {arXiv},
|
| 695 |
-
keywords = {Computer Science - Artificial Intelligence,Computer Science - Machine Learning},
|
| 696 |
-
file = {/Users/fracapuano/Zotero/storage/T84UBYDJ/Haarnoja et al. - 2017 - Reinforcement Learning with Deep Energy-Based Policies.pdf;/Users/fracapuano/Zotero/storage/53SJ2ED8/1702.html}
|
| 697 |
-
}
|
| 698 |
-
|
| 699 |
@inproceedings{haarnojaReinforcementLearningDeep2017b,
|
| 700 |
title = {Reinforcement {{Learning}} with {{Deep Energy-Based Policies}}},
|
| 701 |
booktitle = {Proceedings of the 34th {{International Conference}} on {{Machine Learning}}},
|
|
@@ -787,22 +715,6 @@
file = {/Users/fracapuano/Zotero/storage/DE655AYQ/Ho et al. - 2020 - Denoising Diffusion Probabilistic Models.pdf;/Users/fracapuano/Zotero/storage/NVIS47ZH/2006.html}
}

-@article{hwangboLearningAgileDynamic2019,
-title = {Learning agile and dynamic motor skills for legged robots},
-author = {Hwangbo, Jemin and Lee, Joonho and Dosovitskiy, Alexey and Bellicoso, Dario and Tsounis, Vassilios and Koltun, Vladlen and Hutter, Marco},
-year = {2019},
-month = jan,
-journal = {Science Robotics},
-volume = {4},
-number = {26},
-pages = {eaau5872},
-publisher = {American Association for the Advancement of Science},
-doi = {10.1126/scirobotics.aau5872},
-urldate = {2025-08-27},
-abstract = {Legged robots pose one of the greatest challenges in robotics. Dynamic and agile maneuvers of animals cannot be imitated by existing methods that are crafted by humans. A compelling alternative is reinforcement learning, which requires minimal craftsmanship and promotes the natural evolution of a control policy. However, so far, reinforcement learning research for legged robots is mainly limited to simulation, and only few and comparably simple examples have been deployed on real systems. The primary reason is that training with real robots, particularly with dynamically balancing systems, is complicated and expensive. In the present work, we introduce a method for training a neural network policy in simulation and transferring it to a state-of-the-art legged system, thereby leveraging fast, automated, and cost-effective data generation schemes. The approach is applied to the ANYmal robot, a sophisticated medium-dog--sized quadrupedal system. Using policies trained in simulation, the quadrupedal machine achieves locomotion skills that go beyond what had been achieved with prior methods: ANYmal is capable of precisely and energy-efficiently following high-level body velocity commands, running faster than before, and recovering from falling even in complex configurations.},
-file = {/Users/fracapuano/Zotero/storage/9V3X2F7R/Hwangbo et al. - 2019 - Learning agile and dynamic motor skills for legged robots.pdf}
-}
-
@inproceedings{ImageNet_VSS09,
title = {Construction and Analysis of a Large Scale Image Ontology},
author = {Deng, J. and Li, K. and Do, M. and Su, H. and {Fei-Fei}, L.},
@@ -817,6 +729,24 @@
year = {2023}
}

@misc{jangBCZZeroShotTask2022,
title = {{{BC-Z}}: {{Zero-Shot Task Generalization}} with {{Robotic Imitation Learning}}},
shorttitle = {{{BC-Z}}},
@@ -928,14 +858,6 @@
file = {/Users/fracapuano/Zotero/storage/ZUPECLSW/Ke et al. - 2020 - Grasping with Chopsticks Combating Covariate Shift in Model-free Imitation Learning for Fine Manipu.pdf;/Users/fracapuano/Zotero/storage/X7PX638S/2011.html}
}

-@article{khatibRealTimeObstancleAvoidance1986,
-title = {Real-{{Time Obstancle Avoidance}} for {{Manipulators}} and {{Mobile Robots}}},
-author = {Khatib, Oussama},
-year = {1986},
-journal = {The International Journal of Robotics Research},
-volume = {5}
-}
-
@misc{khazatskyDROIDLargeScaleInTheWild2025,
title = {{{DROID}}: {{A Large-Scale In-The-Wild Robot Manipulation Dataset}}},
shorttitle = {{{DROID}}},
@@ -972,21 +894,14 @@
file = {/Users/fracapuano/Zotero/storage/XR2SX8WG/Kim et al. - 2024 - OpenVLA An Open-Source Vision-Language-Action Model.pdf;/Users/fracapuano/Zotero/storage/63Q96WRV/2406.html}
}

-@
-title = {Auto-
-author = {Kingma, Diederik P
-year = {
-
-number = {arXiv:1312.6114},
eprint = {1312.6114},
-primaryclass = {stat},
-publisher = {arXiv},
-doi = {10.48550/arXiv.1312.6114},
-urldate = {2025-09-02},
abstract = {How can we perform efficient inference and learning in directed probabilistic models, in the presence of continuous latent variables with intractable posterior distributions, and large datasets? We introduce a stochastic variational inference and learning algorithm that scales to large datasets and, under some mild differentiability conditions, even works in the intractable case. Our contributions are two-fold. First, we show that a reparameterization of the variational lower bound yields a lower bound estimator that can be straightforwardly optimized using standard stochastic gradient methods. Second, we show that for i.i.d. datasets with continuous latent variables per datapoint, posterior inference can be made especially efficient by fitting an approximate inference model (also called a recognition model) to the intractable posterior using the proposed lower bound estimator. Theoretical advantages are reflected in experimental results.},
-archiveprefix = {arXiv}
-keywords = {Computer Science - Machine Learning,Statistics - Machine Learning},
-file = {/Users/fracapuano/Zotero/storage/IT7VNQ4U/Kingma and Welling - 2022 - Auto-Encoding Variational Bayes.pdf;/Users/fracapuano/Zotero/storage/HQT22HP5/1312.html}
}

@misc{knightStandardOpenSO100,
@@ -1119,23 +1034,6 @@
file = {/Users/fracapuano/Zotero/storage/8B9EF2CE/Lee et al. - 2020 - Learning Quadrupedal Locomotion over Challenging Terrain.pdf}
}

-@misc{lillicrapContinuousControlDeep2019,
-title = {Continuous control with deep reinforcement learning},
-author = {Lillicrap, Timothy P. and Hunt, Jonathan J. and Pritzel, Alexander and Heess, Nicolas and Erez, Tom and Tassa, Yuval and Silver, David and Wierstra, Daan},
-year = {2019},
-month = jul,
-number = {arXiv:1509.02971},
-eprint = {1509.02971},
-primaryclass = {cs},
-publisher = {arXiv},
-doi = {10.48550/arXiv.1509.02971},
-urldate = {2025-08-31},
-abstract = {We adapt the ideas underlying the success of Deep Q-Learning to the continuous action domain. We present an actor-critic, model-free algorithm based on the deterministic policy gradient that can operate over continuous action spaces. Using the same learning algorithm, network architecture and hyper-parameters, our algorithm robustly solves more than 20 simulated physics tasks, including classic problems such as cartpole swing-up, dexterous manipulation, legged locomotion and car driving. Our algorithm is able to find policies whose performance is competitive with those found by a planning algorithm with full access to the dynamics of the domain and its derivatives. We further demonstrate that for many of the tasks the algorithm can learn policies end-to-end: directly from raw pixel inputs.},
-archiveprefix = {arXiv},
-keywords = {Computer Science - Machine Learning,Statistics - Machine Learning},
-file = {/Users/fracapuano/Zotero/storage/2VN6TMVK/Lillicrap et al. - 2019 - Continuous control with deep reinforcement learning.pdf;/Users/fracapuano/Zotero/storage/4FQ4W5VE/1509.html}
-}
-
@misc{lillicrapContinuousControlDeep2019a,
title = {Continuous control with deep reinforcement learning},
author = {Lillicrap, Timothy P. and Hunt, Jonathan J. and Pritzel, Alexander and Heess, Nicolas and Erez, Tom and Tassa, Yuval and Silver, David and Wierstra, Daan},
@@ -1256,6 +1154,25 @@
file = {/Users/fracapuano/Zotero/storage/IFYQTF4K/Luo et al. - 2025 - SERL A Software Suite for Sample-Efficient Robotic Reinforcement Learning.pdf;/Users/fracapuano/Zotero/storage/5B67QZDM/2401.html}
}

@book{lynchModernRoboticsMechanics2017,
title = {Modern {{Robotics}}: {{Mechanics}}, {{Planning}}, and {{Control}}},
shorttitle = {Modern {{Robotics}}},
@@ -1430,6 +1347,24 @@
   year = {2023}
 }

 @misc{openaiGPT4TechnicalReport2024,
   title = {{{GPT-4 Technical Report}}},
author = {OpenAI and Achiam, Josh and Adler, Steven and Agarwal, Sandhini and Ahmad, Lama and Akkaya, Ilge and Aleman, Florencia Leoni and Almeida, Diogo and Altenschmidt, Janko and Altman, Sam and Anadkat, Shyamal and Avila, Red and Babuschkin, Igor and Balaji, Suchir and Balcom, Valerie and Baltescu, Paul and Bao, Haiming and Bavarian, Mohammad and Belgum, Jeff and Bello, Irwan and Berdine, Jake and {Bernadett-Shapiro}, Gabriel and Berner, Christopher and Bogdonoff, Lenny and Boiko, Oleg and Boyd, Madelaine and Brakman, Anna-Luisa and Brockman, Greg and Brooks, Tim and Brundage, Miles and Button, Kevin and Cai, Trevor and Campbell, Rosie and Cann, Andrew and Carey, Brittany and Carlson, Chelsea and Carmichael, Rory and Chan, Brooke and Chang, Che and Chantzis, Fotis and Chen, Derek and Chen, Sully and Chen, Ruby and Chen, Jason and Chen, Mark and Chess, Ben and Cho, Chester and Chu, Casey and Chung, Hyung Won and Cummings, Dave and Currier, Jeremiah and Dai, Yunxing and Decareaux, Cory and Degry, Thomas and Deutsch, Noah and Deville, Damien and Dhar, Arka and Dohan, David and Dowling, Steve and Dunning, Sheila and Ecoffet, Adrien and Eleti, Atty and Eloundou, Tyna and Farhi, David and Fedus, Liam and Felix, Niko and Fishman, Sim{\'o}n Posada and Forte, Juston and Fulford, Isabella and Gao, Leo and Georges, Elie and Gibson, Christian and Goel, Vik and Gogineni, Tarun and Goh, Gabriel and {Gontijo-Lopes}, Rapha and Gordon, Jonathan and Grafstein, Morgan and Gray, Scott and Greene, Ryan and Gross, Joshua and Gu, Shixiang Shane and Guo, Yufei and Hallacy, Chris and Han, Jesse and Harris, Jeff and He, Yuchen and Heaton, Mike and Heidecke, Johannes and Hesse, Chris and Hickey, Alan and Hickey, Wade and Hoeschele, Peter and Houghton, Brandon and Hsu, Kenny and Hu, Shengli and Hu, Xin and Huizinga, Joost and Jain, Shantanu and Jain, Shawn and Jang, Joanne and Jiang, Angela and Jiang, Roger and Jin, Haozhun and Jin, Denny and Jomoto, Shino and Jonn, Billie and Jun, Heewoo 
and Kaftan, Tomer and Kaiser, {\L}ukasz and Kamali, Ali and Kanitscheider, Ingmar and Keskar, Nitish Shirish and Khan, Tabarak and Kilpatrick, Logan and Kim, Jong Wook and Kim, Christina and Kim, Yongjik and Kirchner, Jan Hendrik and Kiros, Jamie and Knight, Matt and Kokotajlo, Daniel and Kondraciuk, {\L}ukasz and Kondrich, Andrew and Konstantinidis, Aris and Kosic, Kyle and Krueger, Gretchen and Kuo, Vishal and Lampe, Michael and Lan, Ikai and Lee, Teddy and Leike, Jan and Leung, Jade and Levy, Daniel and Li, Chak Ming and Lim, Rachel and Lin, Molly and Lin, Stephanie and Litwin, Mateusz and Lopez, Theresa and Lowe, Ryan and Lue, Patricia and Makanju, Anna and Malfacini, Kim and Manning, Sam and Markov, Todor and Markovski, Yaniv and Martin, Bianca and Mayer, Katie and Mayne, Andrew and McGrew, Bob and McKinney, Scott Mayer and McLeavey, Christine and McMillan, Paul and McNeil, Jake and Medina, David and Mehta, Aalok and Menick, Jacob and Metz, Luke and Mishchenko, Andrey and Mishkin, Pamela and Monaco, Vinnie and Morikawa, Evan and Mossing, Daniel and Mu, Tong and Murati, Mira and Murk, Oleg and M{\'e}ly, David and Nair, Ashvin and Nakano, Reiichiro and Nayak, Rajeev and Neelakantan, Arvind and Ngo, Richard and Noh, Hyeonwoo and Ouyang, Long and O'Keefe, Cullen and Pachocki, Jakub and Paino, Alex and Palermo, Joe and Pantuliano, Ashley and Parascandolo, Giambattista and Parish, Joel and Parparita, Emy and Passos, Alex and Pavlov, Mikhail and Peng, Andrew and Perelman, Adam and Peres, Filipe de Avila Belbute and Petrov, Michael and Pinto, Henrique Ponde de Oliveira and Michael and Pokorny and Pokrass, Michelle and Pong, Vitchyr H. 
and Powell, Tolly and Power, Alethea and Power, Boris and Proehl, Elizabeth and Puri, Raul and Radford, Alec and Rae, Jack and Ramesh, Aditya and Raymond, Cameron and Real, Francis and Rimbach, Kendra and Ross, Carl and Rotsted, Bob and Roussez, Henri and Ryder, Nick and Saltarelli, Mario and Sanders, Ted and Santurkar, Shibani and Sastry, Girish and Schmidt, Heather and Schnurr, David and Schulman, John and Selsam, Daniel and Sheppard, Kyla and Sherbakov, Toki and Shieh, Jessica and Shoker, Sarah and Shyam, Pranav and Sidor, Szymon and Sigler, Eric and Simens, Maddie and Sitkin, Jordan and Slama, Katarina and Sohl, Ian and Sokolowsky, Benjamin and Song, Yang and Staudacher, Natalie and Such, Felipe Petroski and Summers, Natalie and Sutskever, Ilya and Tang, Jie and Tezak, Nikolas and Thompson, Madeleine B. and Tillet, Phil and Tootoonchian, Amin and Tseng, Elizabeth and Tuggle, Preston and Turley, Nick and Tworek, Jerry and Uribe, Juan Felipe Cer{\'o}n and Vallone, Andrea and Vijayvergiya, Arun and Voss, Chelsea and Wainwright, Carroll and Wang, Justin Jay and Wang, Alvin and Wang, Ben and Ward, Jonathan and Wei, Jason and Weinmann, C. J. and Welihinda, Akila and Welinder, Peter and Weng, Jiayi and Weng, Lilian and Wiethoff, Matt and Willner, Dave and Winter, Clemens and Wolrich, Samuel and Wong, Hannah and Workman, Lauren and Wu, Sherwin and Wu, Jeff and Wu, Michael and Xiao, Kai and Xu, Tao and Yoo, Sarah and Yu, Kevin and Yuan, Qiming and Zaremba, Wojciech and Zellers, Rowan and Zhang, Chong and Zhang, Marvin and Zhao, Shengjia and Zheng, Tianhao and Zhuang, Juntang and Zhuk, William and Zoph, Barret},

@@ -1447,15 +1382,6 @@
   file = {/Users/fracapuano/Zotero/storage/9CJAC5WC/OpenAI et al. - 2024 - GPT-4 Technical Report.pdf;/Users/fracapuano/Zotero/storage/8VS6FA7G/2303.html}
 }

-@misc{OpenXEmbodimentRobotic,
-  title = {Open {{X-Embodiment}}: {{Robotic Learning Datasets}} and {{RT-X Models}}},
-  shorttitle = {Open {{X-Embodiment}}},
-  urldate = {2025-08-27},
-  abstract = {Project page for Open X-Embodiment: Robotic Learning Datasets and RT-X Models.},
-  howpublished = {https://robotics-transformer-x.github.io/},
-  file = {/Users/fracapuano/Zotero/storage/5DS9SYCH/robotics-transformer-x.github.io.html}
-}
-
 @misc{oquabDINOv2LearningRobust2024,
   title = {{{DINOv2}}: {{Learning Robust Visual Features}} without {{Supervision}}},
   shorttitle = {{{DINOv2}}},
@@ -1553,19 +1479,6 @@
   file = {/Users/fracapuano/Zotero/storage/BT7UE8MA/Pomerleau - 1988 - ALVINN An Autonomous Land Vehicle in a Neural Network.pdf}
 }

-@inproceedings{pomerleauALVINNAutonomousLand1988a,
-  title = {{{ALVINN}}: {{An Autonomous Land Vehicle}} in a {{Neural Network}}},
-  shorttitle = {{{ALVINN}}},
-  booktitle = {Advances in {{Neural Information Processing Systems}}},
-  author = {Pomerleau, Dean A.},
-  year = {1988},
-  volume = {1},
-  publisher = {Morgan-Kaufmann},
-  urldate = {2025-09-01},
-  abstract = {ALVINN (Autonomous Land Vehicle In a Neural Network) is a 3-layer back-propagation network designed for the task of road following. Currently ALVINN takes images from a camera and a laser range finder as input and produces as output the direction the vehicle should travel in order to follow the road. Training has been conducted using simulated road images. Successful tests on the Carnegie Mellon autonomous navigation test vehicle indicate that the network can effectively follow real roads under certain field conditions. The representation developed to perform the task differs dramatically when the network is trained under various conditions, suggesting the possibility of a novel adaptive autonomous navigation system capable of tailoring its processing to the conditions at hand.},
-  file = {/Users/fracapuano/Zotero/storage/P64K7XYH/Pomerleau - 1988 - ALVINN An Autonomous Land Vehicle in a Neural Network.pdf}
-}
-
 @book{prince2023understanding,
   title = {Understanding Deep Learning},
   author = {Prince, Simon J.D.},
@@ -1727,12 +1640,12 @@
   edition = {1},
   publisher = {Cambridge University Press},
   doi = {10.1017/CBO9781107298019},
-  urldate = {2025-
   abstract = {Machine learning is one of the fastest growing areas of computer science, with far-reaching applications. The aim of this textbook is to introduce machine learning, and the algorithmic paradigms it offers, in a principled way. The book provides a theoretical account of the fundamentals underlying machine learning and the mathematical derivations that transform these principles into practical algorithms. Following a presentation of the basics, the book covers a wide array of central topics unaddressed by previous textbooks. These include a discussion of the computational complexity of learning and the concepts of convexity and stability; important algorithmic paradigms including stochastic gradient descent, neural networks, and structured output learning; and emerging theoretical concepts such as the PAC-Bayes approach and compression-based bounds. Designed for advanced undergraduates or beginning graduates, the text makes the fundamentals and algorithms of machine learning accessible to students and non-expert readers in statistics, computer science, mathematics and engineering.},
   copyright = {https://www.cambridge.org/core/terms},
   isbn = {978-1-107-05713-5 978-1-107-29801-9},
   langid = {english},
-  file = {/Users/fracapuano/Zotero/storage/
 }

 @article{shazeerOUTRAGEOUSLYLARGENEURAL2017,
@@ -1803,61 +1716,6 @@
   file = {/Users/fracapuano/Zotero/storage/JHG94GYG/Siciliano and Khatib - 2016 - Springer Handbook of Robotics.pdf}
 }

-@misc{SignYourAccount,
-  title = {Sign in to Your Account},
-  urldate = {2025-09-02},
-  howpublished = {https://login.microsoftonline.com/cc95de1b-97f5-4f93-b4ba-fe68b852cf91/login},
-  file = {/Users/fracapuano/Zotero/storage/AP6JNKS8/login.html}
-}
-
-@article{silverDeterministicPolicyGradient,
-  title = {Deterministic {{Policy Gradient Algorithms}}},
-  author = {Silver, David and Lever, Guy and Heess, Nicolas and Degris, Thomas and Wierstra, Daan and Riedmiller, Martin},
-  abstract = {In this paper we consider deterministic policy gradient algorithms for reinforcement learning with continuous actions. The deterministic policy gradient has a particularly appealing form: it is the expected gradient of the action-value function. This simple form means that the deterministic policy gradient can be estimated much more efficiently than the usual stochastic policy gradient. To ensure adequate exploration, we introduce an off-policy actor-critic algorithm that learns a deterministic target policy from an exploratory behaviour policy. We demonstrate that deterministic policy gradient algorithms can significantly outperform their stochastic counterparts in high-dimensional action spaces.},
-  langid = {english},
-  file = {/Users/fracapuano/Zotero/storage/IMFSXA3G/Silver et al. - Deterministic Policy Gradient Algorithms.pdf}
-}
-
-@inproceedings{silverDeterministicPolicyGradient2014,
-  title = {Deterministic {{Policy Gradient Algorithms}}},
-  booktitle = {Proceedings of the 31st {{International Conference}} on {{Machine Learning}}},
-  author = {Silver, David and Lever, Guy and Heess, Nicolas and Degris, Thomas and Wierstra, Daan and Riedmiller, Martin},
-  year = {2014},
-  month = jan,
-  pages = {387--395},
-  publisher = {PMLR},
-  issn = {1938-7228},
-  urldate = {2025-08-31},
-  abstract = {In this paper we consider deterministic policy gradient algorithms for reinforcement learning with continuous actions. The deterministic policy gradient has a particularly appealing form: it is the expected gradient of the action-value function. This simple form means that the deterministic policy gradient can be estimated much more efficiently than the usual stochastic policy gradient. To ensure adequate exploration, we introduce an off-policy actor-critic algorithm that learns a deterministic target policy from an exploratory behaviour policy. Deterministic policy gradient algorithms outperformed their stochastic counterparts in several benchmark problems, particularly in high-dimensional action spaces.},
-  langid = {english},
-  file = {/Users/fracapuano/Zotero/storage/YI9JNYPV/Silver et al. - 2014 - Deterministic Policy Gradient Algorithms.pdf}
-}
-
-@article{silverDeterministicPolicyGradienta,
-  title = {Deterministic {{Policy Gradient Algorithms}}},
-  author = {Silver, David and Lever, Guy and Heess, Nicolas and Degris, Thomas and Wierstra, Daan and Riedmiller, Martin},
-  abstract = {In this paper we consider deterministic policy gradient algorithms for reinforcement learning with continuous actions. The deterministic policy gradient has a particularly appealing form: it is the expected gradient of the action-value function. This simple form means that the deterministic policy gradient can be estimated much more efficiently than the usual stochastic policy gradient. To ensure adequate exploration, we introduce an off-policy actor-critic algorithm that learns a deterministic target policy from an exploratory behaviour policy. We demonstrate that deterministic policy gradient algorithms can significantly outperform their stochastic counterparts in high-dimensional action spaces.},
-  langid = {english},
-  file = {/Users/fracapuano/Zotero/storage/VWQNLK9R/Silver et al. - Deterministic Policy Gradient Algorithms.pdf}
-}
-
-@misc{sohl-dicksteinDeepUnsupervisedLearning2015,
-  title = {Deep {{Unsupervised Learning}} Using {{Nonequilibrium Thermodynamics}}},
-  author = {{Sohl-Dickstein}, Jascha and Weiss, Eric A. and Maheswaranathan, Niru and Ganguli, Surya},
-  year = {2015},
-  month = nov,
-  number = {arXiv:1503.03585},
-  eprint = {1503.03585},
-  primaryclass = {cs},
-  publisher = {arXiv},
-  doi = {10.48550/arXiv.1503.03585},
-  urldate = {2025-09-04},
-  abstract = {A central problem in machine learning involves modeling complex data-sets using highly flexible families of probability distributions in which learning, sampling, inference, and evaluation are still analytically or computationally tractable. Here, we develop an approach that simultaneously achieves both flexibility and tractability. The essential idea, inspired by non-equilibrium statistical physics, is to systematically and slowly destroy structure in a data distribution through an iterative forward diffusion process. We then learn a reverse diffusion process that restores structure in data, yielding a highly flexible and tractable generative model of the data. This approach allows us to rapidly learn, sample from, and evaluate probabilities in deep generative models with thousands of layers or time steps, as well as to compute conditional and posterior probabilities under the learned model. We additionally release an open source reference implementation of the algorithm.},
-  archiveprefix = {arXiv},
-  keywords = {Computer Science - Machine Learning,Condensed Matter - Disordered Systems and Neural Networks,Quantitative Biology - Neurons and Cognition,Statistics - Machine Learning},
-  file = {/Users/fracapuano/Zotero/storage/YZ5GBG5Z/Sohl-Dickstein et al. - 2015 - Deep Unsupervised Learning using Nonequilibrium Thermodynamics.pdf;/Users/fracapuano/Zotero/storage/97PKSBVT/1503.html}
-}
-
 @inproceedings{sohnLearningStructuredOutput2015,
   title = {Learning {{Structured Output Representation}} Using {{Deep Conditional Generative Models}}},
   booktitle = {Advances in {{Neural Information Processing Systems}}},
@@ -1893,13 +1751,6 @@
   year = {2018}
 }

-@misc{SuttonBartoBook,
-  title = {Sutton \& {{Barto Book}}: {{Reinforcement Learning}}: {{An Introduction}}},
-  urldate = {2025-08-28},
-  howpublished = {http://incompleteideas.net/book/the-book-2nd.html},
-  file = {/Users/fracapuano/Zotero/storage/A3QZFGPB/the-book-2nd.html}
-}
-
 @inproceedings{suttonPolicyGradientMethods1999,
   title = {Policy {{Gradient Methods}} for {{Reinforcement Learning}} with {{Function Approximation}}},
   booktitle = {Advances in {{Neural Information Processing Systems}}},
@@ -1958,24 +1809,6 @@
   file = {/Users/fracapuano/Zotero/storage/AYWWN7ME/Tancik et al. - 2020 - Fourier Features Let Networks Learn High Frequency Functions in Low Dimensional Domains.pdf;/Users/fracapuano/Zotero/storage/68Q4Y4LM/2006.html}
 }

-@misc{tangDeepReinforcementLearning2024,
-  title = {Deep {{Reinforcement Learning}} for {{Robotics}}: {{A Survey}} of {{Real-World Successes}}},
-  shorttitle = {Deep {{Reinforcement Learning}} for {{Robotics}}},
-  author = {Tang, Chen and Abbatematteo, Ben and Hu, Jiaheng and Chandra, Rohan and {Mart{\'i}n-Mart{\'i}n}, Roberto and Stone, Peter},
-  year = {2024},
-  month = sep,
-  number = {arXiv:2408.03539},
-  eprint = {2408.03539},
-  primaryclass = {cs},
-  publisher = {arXiv},
-  doi = {10.48550/arXiv.2408.03539},
-  urldate = {2025-08-29},
-  abstract = {Reinforcement learning (RL), particularly its combination with deep neural networks referred to as deep RL (DRL), has shown tremendous promise across a wide range of applications, suggesting its potential for enabling the development of sophisticated robotic behaviors. Robotics problems, however, pose fundamental difficulties for the application of RL, stemming from the complexity and cost of interacting with the physical world. This article provides a modern survey of DRL for robotics, with a particular focus on evaluating the real-world successes achieved with DRL in realizing several key robotic competencies. Our analysis aims to identify the key factors underlying those exciting successes, reveal underexplored areas, and provide an overall characterization of the status of DRL in robotics. We highlight several important avenues for future work, emphasizing the need for stable and sample-efficient real-world RL paradigms, holistic approaches for discovering and integrating various competencies to tackle complex long-horizon, open-world tasks, and principled development and evaluation procedures. This survey is designed to offer insights for both RL practitioners and roboticists toward harnessing RL's power to create generally capable real-world robotic systems.},
-  archiveprefix = {arXiv},
-  keywords = {Computer Science - Machine Learning,Computer Science - Robotics},
-  file = {/Users/fracapuano/Zotero/storage/ZTX4VSMA/Tang et al. - 2024 - Deep Reinforcement Learning for Robotics A Survey of Real-World Successes.pdf;/Users/fracapuano/Zotero/storage/WDVGKFL3/2408.html}
-}
-
 @article{tangDeepReinforcementLearning2025,
   title = {Deep {{Reinforcement Learning}} for {{Robotics}}: {{A Survey}} of {{Real-World Successes}}},
   shorttitle = {Deep {{Reinforcement Learning}} for {{Robotics}}},
@@ -2218,29 +2051,9 @@
   file = {/Users/fracapuano/Zotero/storage/4P7GCF3I/Zhao et al. - 2023 - Learning Fine-Grained Bimanual Manipulation with Low-Cost Hardware.pdf;/Users/fracapuano/Zotero/storage/3BC9S3Z2/2304.html}
 }

-@misc{zhongPracticalBlockwiseNeural2018,
-  title = {Practical {{Block-wise Neural Network Architecture Generation}}},
-  author = {Zhong, Zhao and Yan, Junjie and Wu, Wei and Shao, Jing and Liu, Cheng-Lin},
-  year = {2018},
-  month = may,
-  number = {arXiv:1708.05552},
-  eprint = {1708.05552},
-  primaryclass = {cs},
-  publisher = {arXiv},
-  urldate = {2023-05-05},
-  abstract = {Convolutional neural networks have gained a remarkable success in computer vision. However, most usable network architectures are hand-crafted and usually require expertise and elaborate design. In this paper, we provide a block-wise network generation pipeline called BlockQNN which automatically builds high-performance networks using the Q-Learning paradigm with epsilon-greedy exploration strategy. The optimal network block is constructed by the learning agent which is trained sequentially to choose component layers. We stack the block to construct the whole auto-generated network. To accelerate the generation process, we also propose a distributed asynchronous framework and an early stop strategy. The block-wise generation brings unique advantages: (1) it performs competitive results in comparison to the hand-crafted state-of-the-art networks on image classification, additionally, the best network generated by BlockQNN achieves 3.54\% top-1 error rate on CIFAR-10 which beats all existing auto-generate networks. (2) in the meanwhile, it offers tremendous reduction of the search space in designing networks which only spends 3 days with 32 GPUs, and (3) moreover, it has strong generalizability that the network built on CIFAR also performs well on a larger-scale ImageNet dataset.},
-  archiveprefix = {arXiv},
-  keywords = {Computer Science - Computer Vision and Pattern Recognition,Computer Science - Machine Learning},
-  file = {/Users/fracapuano/Zotero/storage/7ZJWPCRW/Zhong et al. - 2018 - Practical Block-wise Neural Network Architecture G.pdf;/Users/fracapuano/Zotero/storage/ZI2R395F/Zhong et al. - 2018 - Practical Block-wise Neural Network Architecture G.html}
-}
-
 @inproceedings{zhu2024minigpt,
   title = {{{MiniGPT-4}}: {{Enhancing}} Vision-Language Understanding with Advanced Large Language Models},
   booktitle = {The Twelfth International Conference on Learning Representations},
   author = {Zhu, Deyao and Chen, Jun and Shen, Xiaoqian and Li, Xiang and Elhoseiny, Mohamed},
   year = {2024}
 }
-
-
-@misc{zotero-item-169,
-  type = {Misc}
-}
   file = {/Users/fracapuano/Zotero/storage/TFZQ6EHJ/Burridge et al. - 1999 - Sequential Composition of Dynamically Dexterous Robot Behaviors.pdf}
 }

 @misc{cadeneLeRobotStateoftheartMachine2024,
   title = {{{LeRobot}}: {{State-of-the-art Machine Learning}} for {{Real-World Robotics}} in {{Pytorch}}},
   author = {Cadene, Remi and Alibert, Simon and Soare, Alexander and Galloudec, Quentin and Zouitine, Adil and Palma, Steven and Kooijmans, Pepijn and Aractingi, Michel and Shukor, Mustafa and Aubakirova, Dana and Russi, Martino and Capuano, Francesco and Pascal, Caroline and Chogari, Jade and Moss, Jess and Wolf, Thomas},

   file = {/Users/fracapuano/Zotero/storage/AYIY6DTF/Caron et al. - 2021 - Emerging Properties in Self-Supervised Vision Transformers.pdf;/Users/fracapuano/Zotero/storage/EKA7ZN2P/2104.html}
 }

 @inproceedings{chebotarClosingSimtorealLoop2019,
   title = {Closing the Sim-to-Real Loop: {{Adapting}} Simulation Randomization with Real World Experience},
   shorttitle = {Closing the Sim-to-Real Loop},

   file = {/Users/fracapuano/Zotero/storage/7XRY3GJX/Chi et al. - 2024 - Diffusion Policy Visuomotor Policy Learning via Action Diffusion.pdf;/Users/fracapuano/Zotero/storage/BBBPKKMZ/2303.html}
 }

 @book{connellRobotLearning1993,
   title = {Robot {{Learning}}},
   editor = {Connell, Jonathan H. and Mahadevan, Sridhar},

   file = {/Users/fracapuano/Zotero/storage/SSNAZ6U4/Griffin et al. - 2017 - Walking Stabilization Using Step Timing and Location Adjustment on the Humanoid Robot, Atlas.pdf;/Users/fracapuano/Zotero/storage/VP885PA9/1703.html}
 }

 @inproceedings{haarnojaReinforcementLearningDeep2017b,
   title = {Reinforcement {{Learning}} with {{Deep Energy-Based Policies}}},
   booktitle = {Proceedings of the 34th {{International Conference}} on {{Machine Learning}}},

   file = {/Users/fracapuano/Zotero/storage/DE655AYQ/Ho et al. - 2020 - Denoising Diffusion Probabilistic Models.pdf;/Users/fracapuano/Zotero/storage/NVIS47ZH/2006.html}
 }

 @inproceedings{ImageNet_VSS09,
   title = {Construction and Analysis of a Large Scale Image Ontology},
   author = {Deng, J. and Li, K. and Do, M. and Su, H. and {Fei-Fei}, L.},

   year = {2023}
 }

+@misc{intelligence$p_05$VisionLanguageActionModel2025,
+  title = {\${$\pi\_$}\{0.5\}\$: A {{Vision-Language-Action Model}} with {{Open-World Generalization}}},
+  shorttitle = {\${$\pi\_$}\{0.5\}\$},
+  author = {Intelligence, Physical and Black, Kevin and Brown, Noah and Darpinian, James and Dhabalia, Karan and Driess, Danny and Esmail, Adnan and Equi, Michael and Finn, Chelsea and Fusai, Niccolo and Galliker, Manuel Y. and Ghosh, Dibya and Groom, Lachy and Hausman, Karol and Ichter, Brian and Jakubczak, Szymon and Jones, Tim and Ke, Liyiming and LeBlanc, Devin and Levine, Sergey and {Li-Bell}, Adrian and Mothukuri, Mohith and Nair, Suraj and Pertsch, Karl and Ren, Allen Z. and Shi, Lucy Xiaoyang and Smith, Laura and Springenberg, Jost Tobias and Stachowicz, Kyle and Tanner, James and Vuong, Quan and Walke, Homer and Walling, Anna and Wang, Haohuan and Yu, Lili and Zhilinsky, Ury},
+  year = {2025},
+  month = apr,
+  number = {arXiv:2504.16054},
+  eprint = {2504.16054},
+  primaryclass = {cs},
+  publisher = {arXiv},
+  doi = {10.48550/arXiv.2504.16054},
+  urldate = {2025-09-12},
+  abstract = {In order for robots to be useful, they must perform practically relevant tasks in the real world, outside of the lab. While vision-language-action (VLA) models have demonstrated impressive results for end-to-end robot control, it remains an open question how far such models can generalize in the wild. We describe \${\textbackslash}pi\_\{0.5\}\$, a new model based on \${\textbackslash}pi\_\{0\}\$ that uses co-training on heterogeneous tasks to enable broad generalization. \${\textbackslash}pi\_\{0.5\}\${\textbackslash} uses data from multiple robots, high-level semantic prediction, web data, and other sources to enable broadly generalizable real-world robotic manipulation. Our system uses a combination of co-training and hybrid multi-modal examples that combine image observations, language commands, object detections, semantic subtask prediction, and low-level actions. Our experiments show that this kind of knowledge transfer is essential for effective generalization, and we demonstrate for the first time that an end-to-end learning-enabled robotic system can perform long-horizon and dexterous manipulation skills, such as cleaning a kitchen or bedroom, in entirely new homes.},
+  archiveprefix = {arXiv},
+  keywords = {Computer Science - Machine Learning,Computer Science - Robotics},
+  file = {/Users/fracapuano/Zotero/storage/UC3RB96R/Intelligence et al. - 2025 - $π_ 0.5 $ a Vision-Language-Action Model with Open-World Generalization.pdf;/Users/fracapuano/Zotero/storage/DSFCCRF3/2504.html}
+}
+
 @misc{jangBCZZeroShotTask2022,
   title = {{{BC-Z}}: {{Zero-Shot Task Generalization}} with {{Robotic Imitation Learning}}},
   shorttitle = {{{BC-Z}}},

   file = {/Users/fracapuano/Zotero/storage/ZUPECLSW/Ke et al. - 2020 - Grasping with Chopsticks Combating Covariate Shift in Model-free Imitation Learning for Fine Manipu.pdf;/Users/fracapuano/Zotero/storage/X7PX638S/2011.html}
 }

 @misc{khazatskyDROIDLargeScaleInTheWild2025,
   title = {{{DROID}}: {{A Large-Scale In-The-Wild Robot Manipulation Dataset}}},
   shorttitle = {{{DROID}}},

   file = {/Users/fracapuano/Zotero/storage/XR2SX8WG/Kim et al. - 2024 - OpenVLA An Open-Source Vision-Language-Action Model.pdf;/Users/fracapuano/Zotero/storage/63Q96WRV/2406.html}
 }

+@article{kingma2013auto,
+  title = {Auto-Encoding Variational Bayes},
+  author = {Kingma, Diederik P and Welling, Max},
+  year = {2013},
+  journal = {arXiv preprint arXiv:1312.6114},
   eprint = {1312.6114},
   abstract = {How can we perform efficient inference and learning in directed probabilistic models, in the presence of continuous latent variables with intractable posterior distributions, and large datasets? We introduce a stochastic variational inference and learning algorithm that scales to large datasets and, under some mild differentiability conditions, even works in the intractable case. Our contributions are two-fold. First, we show that a reparameterization of the variational lower bound yields a lower bound estimator that can be straightforwardly optimized using standard stochastic gradient methods. Second, we show that for i.i.d. datasets with continuous latent variables per datapoint, posterior inference can be made especially efficient by fitting an approximate inference model (also called a recognition model) to the intractable posterior using the proposed lower bound estimator. Theoretical advantages are reflected in experimental results.},
+  archiveprefix = {arXiv}
 }

 @misc{knightStandardOpenSO100,

   file = {/Users/fracapuano/Zotero/storage/8B9EF2CE/Lee et al. - 2020 - Learning Quadrupedal Locomotion over Challenging Terrain.pdf}
 }

 @misc{lillicrapContinuousControlDeep2019a,
   title = {Continuous Control with Deep Reinforcement Learning},
   author = {Lillicrap, Timothy P. and Hunt, Jonathan J. and Pritzel, Alexander and Heess, Nicolas and Erez, Tom and Tassa, Yuval and Silver, David and Wierstra, Daan},
|
|
| 1154 |
file = {/Users/fracapuano/Zotero/storage/IFYQTF4K/Luo et al. - 2025 - SERL A Software Suite for Sample-Efficient Robotic Reinforcement Learning.pdf;/Users/fracapuano/Zotero/storage/5B67QZDM/2401.html}
|
| 1155 |
}
|
| 1156 |
|
| 1157 | + @misc{luoUnderstandingDiffusionModels2022,
| 1158 | +   title = {Understanding {{Diffusion Models}}: {{A Unified Perspective}}},
| 1159 | +   shorttitle = {Understanding {{Diffusion Models}}},
| 1160 | +   author = {Luo, Calvin},
| 1161 | +   year = {2022},
| 1162 | +   month = aug,
| 1163 | +   number = {arXiv:2208.11970},
| 1164 | +   eprint = {2208.11970},
| 1165 | +   primaryclass = {cs},
| 1166 | +   publisher = {arXiv},
| 1167 | +   doi = {10.48550/arXiv.2208.11970},
| 1168 | +   urldate = {2025-09-28},
| 1169 | +   abstract = {Diffusion models have shown incredible capabilities as generative models; indeed, they power the current state-of-the-art models on text-conditioned image generation such as Imagen and DALL-E 2. In this work we review, demystify, and unify the understanding of diffusion models across both variational and score-based perspectives. We first derive Variational Diffusion Models (VDM) as a special case of a Markovian Hierarchical Variational Autoencoder, where three key assumptions enable tractable computation and scalable optimization of the ELBO. We then prove that optimizing a VDM boils down to learning a neural network to predict one of three potential objectives: the original source input from any arbitrary noisification of it, the original source noise from any arbitrarily noisified input, or the score function of a noisified input at any arbitrary noise level. We then dive deeper into what it means to learn the score function, and connect the variational perspective of a diffusion model explicitly with the Score-based Generative Modeling perspective through Tweedie's Formula. Lastly, we cover how to learn a conditional distribution using diffusion models via guidance.},
| 1170 | +   archiveprefix = {arXiv},
| 1171 | +   langid = {english},
| 1172 | +   keywords = {Computer Science - Computer Vision and Pattern Recognition,Computer Science - Machine Learning},
| 1173 | +   file = {/Users/fracapuano/Zotero/storage/3MLGC83L/Luo - 2022 - Understanding Diffusion Models A Unified Perspective.pdf}
| 1174 | + }
| 1175 | +
| 1176 | @book{lynchModernRoboticsMechanics2017,
| 1177 |     title = {Modern {{Robotics}}: {{Mechanics}}, {{Planning}}, and {{Control}}},
| 1178 |     shorttitle = {Modern {{Robotics}}},
| 1347 |     year = {2023}
| 1348 | }
| 1349 |
| 1350 | + @misc{oneillOpenXEmbodimentRobotic2025,
| 1351 | +   title = {Open {{X-Embodiment}}: {{Robotic Learning Datasets}} and {{RT-X Models}}},
| 1352 | +   shorttitle = {Open {{X-Embodiment}}},
| 1353 | +
author = {O'Neill, Abby and Rehman, Abdul and Gupta, Abhinav and Maddukuri, Abhiram and Gupta, Abhishek and Padalkar, Abhishek and Lee, Abraham and Pooley, Acorn and Gupta, Agrim and Mandlekar, Ajay and Jain, Ajinkya and Tung, Albert and Bewley, Alex and Herzog, Alex and Irpan, Alex and Khazatsky, Alexander and Rai, Anant and Gupta, Anchit and Wang, Andrew and Kolobov, Andrey and Singh, Anikait and Garg, Animesh and Kembhavi, Aniruddha and Xie, Annie and Brohan, Anthony and Raffin, Antonin and Sharma, Archit and Yavary, Arefeh and Jain, Arhan and Balakrishna, Ashwin and Wahid, Ayzaan and {Burgess-Limerick}, Ben and Kim, Beomjoon and Sch{\"o}lkopf, Bernhard and Wulfe, Blake and Ichter, Brian and Lu, Cewu and Xu, Charles and Le, Charlotte and Finn, Chelsea and Wang, Chen and Xu, Chenfeng and Chi, Cheng and Huang, Chenguang and Chan, Christine and Agia, Christopher and Pan, Chuer and Fu, Chuyuan and Devin, Coline and Xu, Danfei and Morton, Daniel and Driess, Danny and Chen, Daphne and Pathak, Deepak and Shah, Dhruv and B{\"u}chler, Dieter and Jayaraman, Dinesh and Kalashnikov, Dmitry and Sadigh, Dorsa and Johns, Edward and Foster, Ethan and Liu, Fangchen and Ceola, Federico and Xia, Fei and Zhao, Feiyu and Frujeri, Felipe Vieira and Stulp, Freek and Zhou, Gaoyue and Sukhatme, Gaurav S. and Salhotra, Gautam and Yan, Ge and Feng, Gilbert and Schiavi, Giulio and Berseth, Glen and Kahn, Gregory and Yang, Guangwen and Wang, Guanzhi and Su, Hao and Fang, Hao-Shu and Shi, Haochen and Bao, Henghui and Amor, Heni Ben and Christensen, Henrik I. 
and Furuta, Hiroki and Bharadhwaj, Homanga and Walke, Homer and Fang, Hongjie and Ha, Huy and Mordatch, Igor and Radosavovic, Ilija and Leal, Isabel and Liang, Jacky and {Abou-Chakra}, Jad and Kim, Jaehyung and Drake, Jaimyn and Peters, Jan and Schneider, Jan and Hsu, Jasmine and Vakil, Jay and Bohg, Jeannette and Bingham, Jeffrey and Wu, Jeffrey and Gao, Jensen and Hu, Jiaheng and Wu, Jiajun and Wu, Jialin and Sun, Jiankai and Luo, Jianlan and Gu, Jiayuan and Tan, Jie and Oh, Jihoon and Wu, Jimmy and Lu, Jingpei and Yang, Jingyun and Malik, Jitendra and Silv{\'e}rio, Jo{\~a}o and Hejna, Joey and Booher, Jonathan and Tompson, Jonathan and Yang, Jonathan and Salvador, Jordi and Lim, Joseph J. and Han, Junhyek and Wang, Kaiyuan and Rao, Kanishka and Pertsch, Karl and Hausman, Karol and Go, Keegan and Gopalakrishnan, Keerthana and Goldberg, Ken and Byrne, Kendra and Oslund, Kenneth and Kawaharazuka, Kento and Black, Kevin and Lin, Kevin and Zhang, Kevin and Ehsani, Kiana and Lekkala, Kiran and Ellis, Kirsty and Rana, Krishan and Srinivasan, Krishnan and Fang, Kuan and Singh, Kunal Pratap and Zeng, Kuo-Hao and Hatch, Kyle and Hsu, Kyle and Itti, Laurent and Chen, Lawrence Yunliang and Pinto, Lerrel and {Fei-Fei}, Li and Tan, Liam and Fan, Linxi "Jim" and Ott, Lionel and Lee, Lisa and Weihs, Luca and Chen, Magnum and Lepert, Marion and Memmel, Marius and Tomizuka, Masayoshi and Itkina, Masha and Castro, Mateo Guaman and Spero, Max and Du, Maximilian and Ahn, Michael and Yip, Michael C. and Zhang, Mingtong and Ding, Mingyu and Heo, Minho and Srirama, Mohan Kumar and Sharma, Mohit and Kim, Moo Jin and Irshad, Muhammad Zubair and Kanazawa, Naoaki and Hansen, Nicklas and Heess, Nicolas and Joshi, Nikhil J. and Suenderhauf, Niko and Liu, Ning and Palo, Norman Di and Shafiullah, Nur Muhammad Mahi and Mees, Oier and Kroemer, Oliver and Bastani, Osbert and Sanketi, Pannag R. 
and Miller, Patrick "Tree" and Yin, Patrick and Wohlhart, Paul and Xu, Peng and Fagan, Peter David and Mitrano, Peter and Sermanet, Pierre and Abbeel, Pieter and Sundaresan, Priya and Chen, Qiuyu and Vuong, Quan and Rafailov, Rafael and Tian, Ran and Doshi, Ria and {Mart{\'i}n-Mart{\'i}n}, Roberto and Baijal, Rohan and Scalise, Rosario and Hendrix, Rose and Lin, Roy and Qian, Runjia and Zhang, Ruohan and Mendonca, Russell and Shah, Rutav and Hoque, Ryan and Julian, Ryan and Bustamante, Samuel and Kirmani, Sean and Levine, Sergey and Lin, Shan and Moore, Sherry and Bahl, Shikhar and Dass, Shivin and Sonawani, Shubham and Tulsiani, Shubham and Song, Shuran and Xu, Sichun and Haldar, Siddhant and Karamcheti, Siddharth and Adebola, Simeon and Guist, Simon and Nasiriany, Soroush and Schaal, Stefan and Welker, Stefan and Tian, Stephen and Ramamoorthy, Subramanian and Dasari, Sudeep and Belkhale, Suneel and Park, Sungjae and Nair, Suraj and Mirchandani, Suvir and Osa, Takayuki and Gupta, Tanmay and Harada, Tatsuya and Matsushima, Tatsuya and Xiao, Ted and Kollar, Thomas and Yu, Tianhe and Ding, Tianli and Davchev, Todor and Zhao, Tony Z. and Armstrong, Travis and Darrell, Trevor and Chung, Trinity and Jain, Vidhi and Kumar, Vikash and Vanhoucke, Vincent and Guizilini, Vitor and Zhan, Wei and Zhou, Wenxuan and Burgard, Wolfram and Chen, Xi and Chen, Xiangyu and Wang, Xiaolong and Zhu, Xinghao and Geng, Xinyang and Liu, Xiyuan and Liangwei, Xu and Li, Xuanlin and Pang, Yansong and Lu, Yao and Ma, Yecheng Jason and Kim, Yejin and Chebotar, Yevgen and Zhou, Yifan and Zhu, Yifeng and Wu, Yilin and Xu, Ying and Wang, Yixuan and Bisk, Yonatan and Dou, Yongqiang and Cho, Yoonyoung and Lee, Youngwoon and Cui, Yuchen and Cao, Yue and Wu, Yueh-Hua and Tang, Yujin and Zhu, Yuke and Zhang, Yunchu and Jiang, Yunfan and Li, Yunshuang and Li, Yunzhu and Iwasawa, Yusuke and Matsuo, Yutaka and Ma, Zehan and Xu, Zhuo and Cui, Zichen Jeff and Zhang, Zichen and Fu, Zipeng and Lin, Zipeng},
| 1354 | +   year = {2025},
| 1355 | +   month = may,
| 1356 | +   number = {arXiv:2310.08864},
| 1357 | +   eprint = {2310.08864},
| 1358 | +   primaryclass = {cs},
| 1359 | +   publisher = {arXiv},
| 1360 | +   doi = {10.48550/arXiv.2310.08864},
| 1361 | +   urldate = {2025-09-08},
| 1362 | +   abstract = {Large, high-capacity models trained on diverse datasets have shown remarkable successes on efficiently tackling downstream applications. In domains from NLP to Computer Vision, this has led to a consolidation of pretrained models, with general pretrained backbones serving as a starting point for many applications. Can such a consolidation happen in robotics? Conventionally, robotic learning methods train a separate model for every application, every robot, and even every environment. Can we instead train generalist X-robot policy that can be adapted efficiently to new robots, tasks, and environments? In this paper, we provide datasets in standardized data formats and models to make it possible to explore this possibility in the context of robotic manipulation, alongside experimental results that provide an example of effective X-robot policies. We assemble a dataset from 22 different robots collected through a collaboration between 21 institutions, demonstrating 527 skills (160266 tasks). We show that a high-capacity model trained on this data, which we call RT-X, exhibits positive transfer and improves the capabilities of multiple robots by leveraging experience from other platforms. More details can be found on the project website https://robotics-transformer-x.github.io.},
| 1363 | +   archiveprefix = {arXiv},
| 1364 | +   keywords = {Computer Science - Robotics},
| 1365 | +   file = {/Users/fracapuano/Zotero/storage/2U73MMVN/Collaboration et al. - 2025 - Open X-Embodiment Robotic Learning Datasets and RT-X Models.pdf;/Users/fracapuano/Zotero/storage/PX7IHY32/2310.html}
| 1366 | + }
| 1367 | +
| 1368 | @misc{openaiGPT4TechnicalReport2024,
| 1369 |     title = {{{GPT-4 Technical Report}}},
| 1370 |
author = {OpenAI and Achiam, Josh and Adler, Steven and Agarwal, Sandhini and Ahmad, Lama and Akkaya, Ilge and Aleman, Florencia Leoni and Almeida, Diogo and Altenschmidt, Janko and Altman, Sam and Anadkat, Shyamal and Avila, Red and Babuschkin, Igor and Balaji, Suchir and Balcom, Valerie and Baltescu, Paul and Bao, Haiming and Bavarian, Mohammad and Belgum, Jeff and Bello, Irwan and Berdine, Jake and {Bernadett-Shapiro}, Gabriel and Berner, Christopher and Bogdonoff, Lenny and Boiko, Oleg and Boyd, Madelaine and Brakman, Anna-Luisa and Brockman, Greg and Brooks, Tim and Brundage, Miles and Button, Kevin and Cai, Trevor and Campbell, Rosie and Cann, Andrew and Carey, Brittany and Carlson, Chelsea and Carmichael, Rory and Chan, Brooke and Chang, Che and Chantzis, Fotis and Chen, Derek and Chen, Sully and Chen, Ruby and Chen, Jason and Chen, Mark and Chess, Ben and Cho, Chester and Chu, Casey and Chung, Hyung Won and Cummings, Dave and Currier, Jeremiah and Dai, Yunxing and Decareaux, Cory and Degry, Thomas and Deutsch, Noah and Deville, Damien and Dhar, Arka and Dohan, David and Dowling, Steve and Dunning, Sheila and Ecoffet, Adrien and Eleti, Atty and Eloundou, Tyna and Farhi, David and Fedus, Liam and Felix, Niko and Fishman, Sim{\'o}n Posada and Forte, Juston and Fulford, Isabella and Gao, Leo and Georges, Elie and Gibson, Christian and Goel, Vik and Gogineni, Tarun and Goh, Gabriel and {Gontijo-Lopes}, Rapha and Gordon, Jonathan and Grafstein, Morgan and Gray, Scott and Greene, Ryan and Gross, Joshua and Gu, Shixiang Shane and Guo, Yufei and Hallacy, Chris and Han, Jesse and Harris, Jeff and He, Yuchen and Heaton, Mike and Heidecke, Johannes and Hesse, Chris and Hickey, Alan and Hickey, Wade and Hoeschele, Peter and Houghton, Brandon and Hsu, Kenny and Hu, Shengli and Hu, Xin and Huizinga, Joost and Jain, Shantanu and Jain, Shawn and Jang, Joanne and Jiang, Angela and Jiang, Roger and Jin, Haozhun and Jin, Denny and Jomoto, Shino and Jonn, Billie and Jun, Heewoo 
and Kaftan, Tomer and Kaiser, {\L}ukasz and Kamali, Ali and Kanitscheider, Ingmar and Keskar, Nitish Shirish and Khan, Tabarak and Kilpatrick, Logan and Kim, Jong Wook and Kim, Christina and Kim, Yongjik and Kirchner, Jan Hendrik and Kiros, Jamie and Knight, Matt and Kokotajlo, Daniel and Kondraciuk, {\L}ukasz and Kondrich, Andrew and Konstantinidis, Aris and Kosic, Kyle and Krueger, Gretchen and Kuo, Vishal and Lampe, Michael and Lan, Ikai and Lee, Teddy and Leike, Jan and Leung, Jade and Levy, Daniel and Li, Chak Ming and Lim, Rachel and Lin, Molly and Lin, Stephanie and Litwin, Mateusz and Lopez, Theresa and Lowe, Ryan and Lue, Patricia and Makanju, Anna and Malfacini, Kim and Manning, Sam and Markov, Todor and Markovski, Yaniv and Martin, Bianca and Mayer, Katie and Mayne, Andrew and McGrew, Bob and McKinney, Scott Mayer and McLeavey, Christine and McMillan, Paul and McNeil, Jake and Medina, David and Mehta, Aalok and Menick, Jacob and Metz, Luke and Mishchenko, Andrey and Mishkin, Pamela and Monaco, Vinnie and Morikawa, Evan and Mossing, Daniel and Mu, Tong and Murati, Mira and Murk, Oleg and M{\'e}ly, David and Nair, Ashvin and Nakano, Reiichiro and Nayak, Rajeev and Neelakantan, Arvind and Ngo, Richard and Noh, Hyeonwoo and Ouyang, Long and O'Keefe, Cullen and Pachocki, Jakub and Paino, Alex and Palermo, Joe and Pantuliano, Ashley and Parascandolo, Giambattista and Parish, Joel and Parparita, Emy and Passos, Alex and Pavlov, Mikhail and Peng, Andrew and Perelman, Adam and Peres, Filipe de Avila Belbute and Petrov, Michael and Pinto, Henrique Ponde de Oliveira and Michael and Pokorny and Pokrass, Michelle and Pong, Vitchyr H. 
and Powell, Tolly and Power, Alethea and Power, Boris and Proehl, Elizabeth and Puri, Raul and Radford, Alec and Rae, Jack and Ramesh, Aditya and Raymond, Cameron and Real, Francis and Rimbach, Kendra and Ross, Carl and Rotsted, Bob and Roussez, Henri and Ryder, Nick and Saltarelli, Mario and Sanders, Ted and Santurkar, Shibani and Sastry, Girish and Schmidt, Heather and Schnurr, David and Schulman, John and Selsam, Daniel and Sheppard, Kyla and Sherbakov, Toki and Shieh, Jessica and Shoker, Sarah and Shyam, Pranav and Sidor, Szymon and Sigler, Eric and Simens, Maddie and Sitkin, Jordan and Slama, Katarina and Sohl, Ian and Sokolowsky, Benjamin and Song, Yang and Staudacher, Natalie and Such, Felipe Petroski and Summers, Natalie and Sutskever, Ilya and Tang, Jie and Tezak, Nikolas and Thompson, Madeleine B. and Tillet, Phil and Tootoonchian, Amin and Tseng, Elizabeth and Tuggle, Preston and Turley, Nick and Tworek, Jerry and Uribe, Juan Felipe Cer{\'o}n and Vallone, Andrea and Vijayvergiya, Arun and Voss, Chelsea and Wainwright, Carroll and Wang, Justin Jay and Wang, Alvin and Wang, Ben and Ward, Jonathan and Wei, Jason and Weinmann, C. J. and Welihinda, Akila and Welinder, Peter and Weng, Jiayi and Weng, Lilian and Wiethoff, Matt and Willner, Dave and Winter, Clemens and Wolrich, Samuel and Wong, Hannah and Workman, Lauren and Wu, Sherwin and Wu, Jeff and Wu, Michael and Xiao, Kai and Xu, Tao and Yoo, Sarah and Yu, Kevin and Yuan, Qiming and Zaremba, Wojciech and Zellers, Rowan and Zhang, Chong and Zhang, Marvin and Zhao, Shengjia and Zheng, Tianhao and Zhuang, Juntang and Zhuk, William and Zoph, Barret},
|
| 1382 |     file = {/Users/fracapuano/Zotero/storage/9CJAC5WC/OpenAI et al. - 2024 - GPT-4 Technical Report.pdf;/Users/fracapuano/Zotero/storage/8VS6FA7G/2303.html}
| 1383 | }
| 1384 |
| 1385 | @misc{oquabDINOv2LearningRobust2024,
| 1386 |     title = {{{DINOv2}}: {{Learning Robust Visual Features}} without {{Supervision}}},
| 1387 |     shorttitle = {{{DINOv2}}},
| 1479 |     file = {/Users/fracapuano/Zotero/storage/BT7UE8MA/Pomerleau - 1988 - ALVINN An Autonomous Land Vehicle in a Neural Network.pdf}
| 1480 | }
| 1481 |
| 1482 | @book{prince2023understanding,
| 1483 |     title = {Understanding Deep Learning},
| 1484 |     author = {Prince, Simon J.D.},
| 1640 |     edition = {1},
| 1641 |     publisher = {Cambridge University Press},
| 1642 |     doi = {10.1017/CBO9781107298019},
| 1643 | +   urldate = {2025-10-10},
| 1644 |     abstract = {Machine learning is one of the fastest growing areas of computer science, with far-reaching applications. The aim of this textbook is to introduce machine learning, and the algorithmic paradigms it offers, in a principled way. The book provides a theoretical account of the fundamentals underlying machine learning and the mathematical derivations that transform these principles into practical algorithms. Following a presentation of the basics, the book covers a wide array of central topics unaddressed by previous textbooks. These include a discussion of the computational complexity of learning and the concepts of convexity and stability; important algorithmic paradigms including stochastic gradient descent, neural networks, and structured output learning; and emerging theoretical concepts such as the PAC-Bayes approach and compression-based bounds. Designed for advanced undergraduates or beginning graduates, the text makes the fundamentals and algorithms of machine learning accessible to students and non-expert readers in statistics, computer science, mathematics and engineering.},
| 1645 |     copyright = {https://www.cambridge.org/core/terms},
| 1646 |     isbn = {978-1-107-05713-5 978-1-107-29801-9},
| 1647 |     langid = {english},
| 1648 | +   file = {/Users/fracapuano/Zotero/storage/H2QY9ZK9/Shalev-Shwartz and Ben-David - 2014 - Understanding Machine Learning From Theory to Algorithms.pdf}
| 1649 | }
| 1650 |
|
| 1651 | @article{shazeerOUTRAGEOUSLYLARGENEURAL2017,
| 1716 |     file = {/Users/fracapuano/Zotero/storage/JHG94GYG/Siciliano and Khatib - 2016 - Springer Handbook of Robotics.pdf}
| 1717 | }
| 1718 |
| 1719 | @inproceedings{sohnLearningStructuredOutput2015,
| 1720 |     title = {Learning {{Structured Output Representation}} Using {{Deep Conditional Generative Models}}},
| 1721 |     booktitle = {Advances in {{Neural Information Processing Systems}}},
| 1751 |     year = {2018}
| 1752 | }
| 1753 |
| 1754 | @inproceedings{suttonPolicyGradientMethods1999,
| 1755 |     title = {Policy {{Gradient Methods}} for {{Reinforcement Learning}} with {{Function Approximation}}},
| 1756 |     booktitle = {Advances in {{Neural Information Processing Systems}}},
| 1809 |     file = {/Users/fracapuano/Zotero/storage/AYWWN7ME/Tancik et al. - 2020 - Fourier Features Let Networks Learn High Frequency Functions in Low Dimensional Domains.pdf;/Users/fracapuano/Zotero/storage/68Q4Y4LM/2006.html}
| 1810 | }
| 1811 |
| 1812 | @article{tangDeepReinforcementLearning2025,
| 1813 |     title = {Deep {{Reinforcement Learning}} for {{Robotics}}: {{A Survey}} of {{Real-World Successes}}},
| 1814 |     shorttitle = {Deep {{Reinforcement Learning}} for {{Robotics}}},
| 2051 |     file = {/Users/fracapuano/Zotero/storage/4P7GCF3I/Zhao et al. - 2023 - Learning Fine-Grained Bimanual Manipulation with Low-Cost Hardware.pdf;/Users/fracapuano/Zotero/storage/3BC9S3Z2/2304.html}
| 2052 | }
| 2053 |
| 2054 | @inproceedings{zhu2024minigpt,
| 2055 |     title = {{{MiniGPT-4}}: {{Enhancing}} Vision-Language Understanding with Advanced Large Language Models},
| 2056 |     booktitle = {The Twelfth International Conference on Learning Representations},
| 2057 |     author = {Zhu, Deyao and Chen, Jun and Shen, Xiaoqian and Li, Xiang and Elhoseiny, Mohamed},
| 2058 |     year = {2024}
| 2059 | }
app/scripts/latex-to-mdx/input/main.blg
ADDED
@@ -0,0 +1,63 @@
| 1 | + This is BibTeX, Version 0.99d (TeX Live 2025)
| 2 | + Capacity: max_strings=200000, hash_size=200000, hash_prime=170003
| 3 | + The top-level auxiliary file: main.aux
| 4 | + The style file: hfstyle/plainnat.bst
| 5 | + Database file #1: main.bib
| 6 | + Reallocated str_pool (elt_size=1) to 130000 items from 65000.
| 7 | + Warning--empty journal in SpinningUp2018
| 8 | + Warning--empty journal in agrawalComputationalSensorimotorLearning
| 9 | + Warning--empty year in agrawalComputationalSensorimotorLearning
| 10 | + Warning--empty year in agrawalComputationalSensorimotorLearning
| 11 | + Warning--empty journal in aldacoALOHA2Enhanced
| 12 | + Warning--empty year in aldacoALOHA2Enhanced
| 13 | + Warning--empty year in aldacoALOHA2Enhanced
| 14 | + Warning--empty booktitle in ImageNet_VSS09
| 15 | + Warning--empty journal in fujitaDevelopmentRobotsNuclear2020
| 16 | + Warning--empty year in knightStandardOpenSO100
| 17 | + Warning--empty journal in koberReinforcementLearningRobotics
| 18 | + Warning--empty year in koberReinforcementLearningRobotics
| 19 | + Warning--empty year in koberReinforcementLearningRobotics
| 20 | + Warning--empty year in tedrakeRoboticManipulationPerception
| 21 | + Warning--empty year in tedrakeUnderactuatedRoboticsAlgorithms
| 22 | + You've used 120 entries,
| 23 | + 2773 wiz_defined-function locations,
| 24 | + 1321 strings with 69311 characters,
| 25 | + and the built_in function-call counts, 152564 in all, are:
| 26 | + = -- 10303
| 27 | + > -- 21678
| 28 | + < -- 28
| 29 | + + -- 7234
| 30 | + - -- 7111
| 31 | + * -- 16383
| 32 | + := -- 25959
| 33 | + add.period$ -- 317
| 34 | + call.type$ -- 120
| 35 | + change.case$ -- 2723
| 36 | + chr.to.int$ -- 117
| 37 | + cite$ -- 255
| 38 | + duplicate$ -- 2255
| 39 | + empty$ -- 4262
| 40 | + format.name$ -- 7247
| 41 | + if$ -- 28994
| 42 | + int.to.chr$ -- 4
| 43 | + int.to.str$ -- 1
| 44 | + missing$ -- 48
| 45 | + newline$ -- 554
| 46 | + num.names$ -- 486
| 47 | + pop$ -- 5804
| 48 | + preamble$ -- 1
| 49 | + purify$ -- 2606
| 50 | + quote$ -- 0
| 51 | + skip$ -- 3913
| 52 | + stack$ -- 0
| 53 | + substring$ -- 763
| 54 | + swap$ -- 306
| 55 | + text.length$ -- 11
| 56 | + text.prefix$ -- 0
| 57 | + top$ -- 0
| 58 | + type$ -- 1290
| 59 | + warning$ -- 15
| 60 | + while$ -- 401
| 61 | + width$ -- 0
| 62 | + write$ -- 1375
| 63 | + (There were 15 warnings)
app/scripts/latex-to-mdx/input/main.fdb_latexmk
ADDED
@@ -0,0 +1,464 @@
| 1 |
+
# Fdb version 4
|
| 2 |
+
["bibtex main"] 1760375737.27684 "main.aux" "main.bbl" "main" 1760378129.31317 0
|
| 3 |
+
"./hfstyle/plainnat.bst" 1757605857.52274 26811 5a50053214cba5f9650f9d3e85dfb915 ""
|
| 4 |
+
"./main.bib" 1760138076.6384 244485 4566853749bf7b0f6d192adb3ef3ad7d ""
|
| 5 |
+
"main.aux" 1760378128.81232 95069 e38c0810d72c43c3020821b1066734a2 "pdflatex"
|
| 6 |
+
(generated)
|
| 7 |
+
"main.bbl"
|
| 8 |
+
"main.blg"
|
| 9 |
+
(rewritten before read)
|
| 10 |
+
["pdflatex"] 1760378110.64369 "/Users/fracapuano/Desktop/robots-tutorial/robot-learning-tutorial/main.tex" "main.pdf" "main" 1760378129.31351 0
|
| 11 |
+
"/Users/fracapuano/Desktop/robots-tutorial/robot-learning-tutorial/main.tex" 1760378109.98581 6215 dc9e4d14873a65119a75a93ee8fa4891 ""
|
| 12 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/enc/dvips/cm-super/cm-super-t1.enc" 1136849721 2971 def0b6c1f0b107b3b936def894055589 ""
|
| 13 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/enc/dvips/cm-super/cm-super-ts1.enc" 1136849721 2900 1537cc8184ad1792082cd229ecc269f4 ""
|
| 14 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/enc/ttf2pk/base/T1-WGL4.enc" 1136246938 3667 797dd419deb79396824beeb9558df721 ""
|
| 15 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/map/fontname/texfonts.map" 1577235249 3524 cb3e574dea2d1052e39280babc910dc8 ""
|
| 16 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/adobe/symbol/psyr.tfm" 1136768653 1408 5937f58aa508ea2cea4901c07d10f5fe ""
|
| 17 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/adobe/zapfding/pzdr.tfm" 1136768653 1528 f853c4d1b4e0550255e02831fdc8496f ""
|
| 18 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/jknappen/ec/ecbx0900.tfm" 1136768653 3584 1a7de6c99457381c64abc1a7c545505f ""
|
| 19 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/jknappen/ec/ecbx1000.tfm" 1136768653 3584 2d666ecf6d466d8b007246bc2f94d9da ""
|
| 20 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/jknappen/ec/eccc0500.tfm" 1136768653 3072 01a75ef80b618de34a6e0cce64bc8aef ""
|
| 21 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/jknappen/ec/eccc0700.tfm" 1136768653 3072 c25a5e08cb36d69ee7e65fe231221f6d ""
|
| 22 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/jknappen/ec/eccc1000.tfm" 1136768653 3072 0e825b3b77feef5bf88d50fba1209c90 ""
|
| 23 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/jknappen/ec/ecrm0500.tfm" 1136768653 3584 178baad7ffca7f5d3428a83bd7cc64c3 ""
|
| 24 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/jknappen/ec/ecrm0600.tfm" 1136768653 3584 291a5713401683441e0a8c8f4417b17b ""
|
| 25 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/jknappen/ec/ecrm0700.tfm" 1136768653 3584 cf973739aac7ab6247f9150296af7954 ""
|
| 26 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/jknappen/ec/ecrm0800.tfm" 1136768653 3584 49064b465390a8e316a3c8417a050403 ""
|
| 27 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/jknappen/ec/ecrm0900.tfm" 1136768653 3584 d3d8ac8b25ca19c0a40b86a5db1e8ccc ""
|
| 28 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/jknappen/ec/ecrm1000.tfm" 1136768653 3584 adb004a0c8e7c46ee66cad73671f37b4 ""
|
| 29 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/jknappen/ec/ecrm1095.tfm" 1136768653 3584 929cdff2b7a8c11bd4d49fd68cb0ae70 ""
|
| 30 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/jknappen/ec/ecrm1200.tfm" 1136768653 3584 f80ddd985bd00e29e9a6047ebd9d4781 ""
|
| 31 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/jknappen/ec/ecrm1440.tfm" 1136768653 3584 3169d30142b88a27d4ab0e3468e963a2 ""
|
| 32 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/jknappen/ec/ecrm2488.tfm" 1136768653 3584 406ad7b70d9a41f7833f92b6313150c8 ""
|
| 33 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/jknappen/ec/ecti0800.tfm" 1136768653 3072 828ba0ea87cf5b727c4bfd6367195ec2 ""
|
| 34 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/jknappen/ec/ecti0900.tfm" 1136768653 3072 a603fa6d934ebc72197ed1c389943d86 ""
|
| 35 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/jknappen/ec/ecti1000.tfm" 1136768653 3072 3bce340d4c075dffe6d4ec732b4c32fe ""
|
| 36 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/jknappen/ec/ectt0800.tfm" 1136768653 1536 0b0b8ca286de6a006b681926403f35cd ""
|
| 37 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/jknappen/ec/ectt0900.tfm" 1136768653 1536 ae7aab2f8a4bc9edfce2899f53ba88c3 ""
|
| 38 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/jknappen/ec/ectt1000.tfm" 1136768653 1536 06717a2b50de47d4087ac0e6cd759455 ""
|
| 39 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/jknappen/ec/ectt1200.tfm" 1136768653 1536 487c9b46984a816c7ed238d0674595c7 ""
|
| 40 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/jknappen/ec/tcrm1000.tfm" 1136768653 1536 e07581a4bb3136ece9eeb4c3ffab8233 ""
|
| 41 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/jknappen/ec/tctt0800.tfm" 1136768653 1536 3d9caa7b59d4d3f96d272d1de68d8742 ""
|
| 42 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/cmextra/cmbsy5.tfm" 1246382020 1120 1e8878807317373affa7f7bba4cf2f6a ""
|
| 43 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/cmextra/cmbsy6.tfm" 1246382020 1124 14ccf5552bc7f77ca02a8a402bea8bfb ""
|
| 44 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/cmextra/cmbsy7.tfm" 1246382020 1120 7f9f170e8aa57527ad6c49feafd45d54 ""
|
| 45 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/cmextra/cmbsy8.tfm" 1246382020 1120 200be8b775682cdf80acad4be5ef57e4 ""
|
| 46 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/cmextra/cmbsy9.tfm" 1246382020 1112 cbc11b646ccc26599775160605aaee3a ""
|
| 47 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/cmextra/cmex7.tfm" 1246382020 1004 54797486969f23fa377b128694d548df ""
|
| 48 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/cmextra/cmex8.tfm" 1246382020 988 bdf658c3bfc2d96d3c8b02cfc1c94c20 ""
|
| 49 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/cmextra/cmex9.tfm" 1246382020 996 a18840b13b499c08ac2de96a99eda4bc ""
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/cmextra/cmmib5.tfm" 1246382020 1496 c79f6914c6d39ffb3759967363d1be79 ""
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/cmextra/cmmib6.tfm" 1246382020 1516 a3bf6a5e7ec4401b1f52092dfaaed242 ""
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/cmextra/cmmib7.tfm" 1246382020 1508 6e807ff901c35a5f1fde0ca275533df8 ""
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/cmextra/cmmib8.tfm" 1246382020 1528 dab402b9d3774ca98baa037071cee7ae ""
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/cmextra/cmmib9.tfm" 1246382020 1528 159d57adcba064aab4277245c826577d ""
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/symbols/msam10.tfm" 1246382020 916 f87d7c45f9c908e672703b83b72241a3 ""
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/symbols/msam5.tfm" 1246382020 924 9904cf1d39e9767e7a3622f2a125a565 ""
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/symbols/msam7.tfm" 1246382020 928 2dc8d444221b7a635bb58038579b861a ""
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/symbols/msbm10.tfm" 1246382020 908 2921f8a10601f252058503cc6570e581 ""
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/symbols/msbm5.tfm" 1246382020 940 75ac932a52f80982a9f8ea75d03a34cf ""
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/symbols/msbm7.tfm" 1246382020 940 228d6584342e91276bf566bcf9716b83 ""
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmbsy10.tfm" 1136768653 1116 4e6ba9d7914baa6482fd69f67d126380 ""
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmbx10.tfm" 1136768653 1328 c834bbb027764024c09d3d2bf908b5f0 ""
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmbx12.tfm" 1136768653 1324 c910af8c371558dc20f2d7822f66fe64 ""
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmbx5.tfm" 1136768653 1332 f817c21a1ba54560425663374f1b651a ""
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmbx6.tfm" 1136768653 1344 8a0be4fe4d376203000810ad4dc81558 ""
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmbx7.tfm" 1136768653 1336 3125ccb448c1a09074e3aa4a9832f130 ""
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmbx8.tfm" 1136768653 1332 1fde11373e221473104d6cc5993f046e ""
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmbx9.tfm" 1136768653 1328 5442e22a7072966dbaf88ca900acf3f0 ""
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmex10.tfm" 1136768653 992 662f679a0b3d2d53c1b94050fdaa3f50 ""
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmmi10.tfm" 1136768653 1528 abec98dbc43e172678c11b3b9031252a ""
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmmi12.tfm" 1136768653 1524 4414a8315f39513458b80dfc63bff03a ""
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmmi6.tfm" 1136768653 1512 f21f83efb36853c0b70002322c1ab3ad ""
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmmi8.tfm" 1136768653 1520 eccf95517727cb11801f4f1aee3a21b4 ""
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmmi9.tfm" 1136768653 1524 d89e2d087a9828407a196f428428ef4a ""
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmmib10.tfm" 1136768653 1524 554068197b70979a55370e6c6495f441 ""
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmr10.tfm" 1136768653 1296 45809c5a464d5f32c8f98ba97c1bb47f ""
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmr12.tfm" 1136768653 1288 655e228510b4c2a1abe905c368440826 ""
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmr5.tfm" 1136768653 1220 ad296dff3c8796c18053ab7b9f86ad7c ""
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmr6.tfm" 1136768653 1300 b62933e007d01cfd073f79b963c01526 ""
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmr8.tfm" 1136768653 1292 21c1c5bfeaebccffdb478fd231a0997d ""
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmr9.tfm" 1136768653 1292 6b21b9c2c7bebb38aa2273f7ca0fb3af ""
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmsy10.tfm" 1136768653 1124 6c73e740cf17375f03eec0ee63599741 ""
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmsy6.tfm" 1136768653 1116 933a60c408fc0a863a92debe84b2d294 ""
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmsy8.tfm" 1136768653 1120 8b7d695260f3cff42e636090a8002094 ""
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmsy9.tfm" 1136768653 1116 25a7bf822c58caf309a702ef79f4afbb ""
"/usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmbx10.pfb" 1248133631 34811 78b52f49e893bcba91bd7581cdc144c0 ""
"/usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmbx7.pfb" 1248133631 32007 e8fa0078355f39467039935974716569 ""
"/usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmex10.pfb" 1248133631 30251 6afa5cb1d0204815a708a080681d4674 ""
"/usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmmi10.pfb" 1248133631 36299 5f9df58c2139e7edcf37c8fca4bd384d ""
"/usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmmi12.pfb" 1248133631 36741 fa121aac0049305630cf160b86157ee4 ""
"/usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmmi5.pfb" 1248133631 37912 77d683123f92148345f3fc36a38d9ab1 ""
"/usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmmi6.pfb" 1248133631 37166 8ab3487cbe3ab49ebce74c29ea2418db ""
"/usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmmi7.pfb" 1248133631 36281 c355509802a035cadc5f15869451dcee ""
"/usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmmi9.pfb" 1248133631 36094 798f80770b3b148ceedd006d487db67c ""
"/usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmmib10.pfb" 1248133631 36912 b448ef9ad9d7228ec3c6e71005136d55 ""
"/usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmr10.pfb" 1248133631 35752 024fb6c41858982481f6968b5fc26508 ""
"/usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmr5.pfb" 1248133631 31809 8670ca339bf94e56da1fc21c80635e2a ""
"/usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmr6.pfb" 1248133631 32734 69e00a6b65cedb993666e42eedb3d48f ""
"/usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmr7.pfb" 1248133631 32762 224316ccc9ad3ca0423a14971cfa7fc1 ""
"/usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmr8.pfb" 1248133631 32726 0a1aea6fcd6468ee2cf64d891f5c43c8 ""
"/usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmr9.pfb" 1248133631 33993 9b89b85fd2d9df0482bd47194d1d3bf3 ""
"/usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmsy10.pfb" 1248133631 32569 5e5ddc8df908dea60932f3c484a54c0d ""
"/usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmsy5.pfb" 1248133631 32915 7bf7720c61a5b3a7ff25b0964421c9b6 ""
"/usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmsy7.pfb" 1248133631 32716 08e384dc442464e7285e891af9f45947 ""
"/usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmsy9.pfb" 1248133631 32442 c975af247b6702f7ca0c299af3616b80 ""
"/usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cmextra/cmex7.pfb" 1248133631 30457 bc0868ebece724ed7c3d37e3d9bff7bd ""
"/usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/symbols/msbm10.pfb" 1248133631 34694 ad62b13721ee8eda1dcc8993c8bd7041 ""
"/usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/symbols/msbm7.pfb" 1248133631 35309 940e81a5b9e04201a07e8b33a3ae6e64 ""
"/usr/local/texlive/2025/texmf-dist/fonts/type1/public/cm-super/sfbx1000.pfb" 1215737283 145408 43d44302ca7d82d487f511f83e309505 ""
"/usr/local/texlive/2025/texmf-dist/fonts/type1/public/cm-super/sfcc1000.pfb" 1215737283 109713 30d6420127e1c84309fa7008a69b7170 ""
"/usr/local/texlive/2025/texmf-dist/fonts/type1/public/cm-super/sfrm0500.pfb" 1215737283 180418 5fa231fda2d05a9c0eca9fd483cc7254 ""
"/usr/local/texlive/2025/texmf-dist/fonts/type1/public/cm-super/sfrm0600.pfb" 1215737283 162624 9dcc92cd3b1dfe2ecc80e6da7f2eb6bd ""
"/usr/local/texlive/2025/texmf-dist/fonts/type1/public/cm-super/sfrm0700.pfb" 1215737283 154599 ded6d7c21788a8930eadc7fef7518942 ""
"/usr/local/texlive/2025/texmf-dist/fonts/type1/public/cm-super/sfrm0800.pfb" 1215737283 164227 3df942b4ff2124425d8fb1b6d3e01c7a ""
"/usr/local/texlive/2025/texmf-dist/fonts/type1/public/cm-super/sfrm0900.pfb" 1215737283 149037 995a6f1e12c1d647b99b1cf55db78699 ""
"/usr/local/texlive/2025/texmf-dist/fonts/type1/public/cm-super/sfrm1000.pfb" 1215737283 138258 6525c253f16cededa14c7fd0da7f67b2 ""
"/usr/local/texlive/2025/texmf-dist/fonts/type1/public/cm-super/sfti0800.pfb" 1215737283 187625 f02a8c2c788e6490af6fc4f5ed857a0c ""
"/usr/local/texlive/2025/texmf-dist/fonts/type1/public/cm-super/sfti0900.pfb" 1215737283 183673 6df73819bb3e1246a6315a4913a2d331 ""
"/usr/local/texlive/2025/texmf-dist/fonts/type1/public/cm-super/sfti1000.pfb" 1215737283 186554 e8f0fa8ca05e038f257a06405232745f ""
"/usr/local/texlive/2025/texmf-dist/fonts/type1/public/cm-super/sftt0800.pfb" 1215737283 175641 e30a691dc1402b08fd6535cf8a31e5b7 ""
"/usr/local/texlive/2025/texmf-dist/fonts/type1/public/cm-super/sftt0900.pfb" 1215737283 170827 2e4b634de7b58578eae1dc93e51dfe48 ""
"/usr/local/texlive/2025/texmf-dist/fonts/type1/public/cm-super/sftt1000.pfb" 1215737283 169201 9ebf99020dde51a5086e186761a34e8f ""
"/usr/local/texlive/2025/texmf-dist/fonts/type1/public/cm-super/sftt1200.pfb" 1215737283 167085 68de377d2744a68a88fa40a1f610615a ""
"/usr/local/texlive/2025/texmf-dist/tex/context/base/mkii/supp-pdf.mkii" 1461363279 71627 94eb9990bed73c364d7f53f960cc8c5b ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/atbegshi/atbegshi.sty" 1575674566 24708 5584a51a7101caf7e6bbf1fc27d8f7b1 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/babel-english/english.ldf" 1496785618 7008 9ff5fdcc865b01beca2b0fe4a46231d4 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/babel-latin/latin.ldf" 1624912547 25417 d32c800f31057b7c3047104f8182c4b8 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/babel/babel.sty" 1739572459 143388 b008e1666520ff43d0878397f2926242 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/babel/locale/en/babel-en.ini" 1661803479 3966 caeee5a9e5771d4446aa1ca9015ba1b2 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/babel/locale/en/babel-english.tex" 1498512262 336 ed676b5e7dfd862bc78d634f6a973f37 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/babel/locale/la/babel-la.ini" 1701897282 4046 5df115e15ba828b8e76756e74a19ca94 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/babel/locale/la/babel-latin.tex" 1663444672 424 ce3651b39928f2f307916dc782fcee35 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/babel/txtbabel.def" 1735765002 6945 a248d839b1f26b388440c973966647b5 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/bigintcalc/bigintcalc.sty" 1576625341 40635 c40361e206be584d448876bba8a64a3b ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/bitset/bitset.sty" 1576016050 33961 6b5c75130e435b2bfdb9f480a09a39f9 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/catchfile/catchfile.sty" 1576016007 8622 63834878edeb14dd71d58d8f22bc3e06 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/etexcmds/etexcmds.sty" 1576625273 7734 b98cbb34c81f667027c1e3ebdbfce34b ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/gettitlestring/gettitlestring.sty" 1576625223 8371 9d55b8bd010bc717624922fb3477d92e ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/iftex/iftex.sty" 1734129479 7984 7dbb9280f03c0a315425f1b4f35d43ee ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/iftex/ifvtex.sty" 1572645307 1057 525c2192b5febbd8c1f662c9468335bb ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/iftex/ifxetex.sty" 1572645307 488 4565444a3e75e59cb2702dc42e18f482 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/infwarerr/infwarerr.sty" 1575499628 8356 7bbb2c2373aa810be568c29e333da8ed ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/intcalc/intcalc.sty" 1576625065 31769 002a487f55041f8e805cfbf6385ffd97 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/kvdefinekeys/kvdefinekeys.sty" 1576878844 5412 d5a2436094cd7be85769db90f29250a6 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/ltxcmds/ltxcmds.sty" 1701727651 17865 1a9bd36b4f98178fa551aca822290953 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pdfescape/pdfescape.sty" 1576015897 19007 15924f7228aca6c6d184b115f4baa231 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pdftexcmds/pdftexcmds.sty" 1593379760 20089 80423eac55aa175305d35b49e04fe23b ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcore.code.tex" 1673816307 1016 1c2b89187d12a2768764b83b4945667c ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorearrows.code.tex" 1601326656 43820 1fef971b75380574ab35a0d37fd92608 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcoreexternal.code.tex" 1601326656 19324 f4e4c6403dd0f1605fd20ed22fa79dea ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcoregraphicstate.code.tex" 1601326656 6038 ccb406740cc3f03bbfb58ad504fe8c27 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcoreimage.code.tex" 1673816307 6911 f6d4cf5a3fef5cc879d668b810e82868 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorelayers.code.tex" 1601326656 4883 42daaf41e27c3735286e23e48d2d7af9 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcoreobjects.code.tex" 1601326656 2544 8c06d2a7f0f469616ac9e13db6d2f842 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorepathconstruct.code.tex" 1601326656 44195 5e390c414de027626ca5e2df888fa68d ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorepathprocessing.code.tex" 1601326656 17311 2ef6b2e29e2fc6a2fc8d6d652176e257 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorepathusage.code.tex" 1601326656 21302 788a79944eb22192a4929e46963a3067 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorepatterns.code.tex" 1673816307 9691 3d42d89522f4650c2f3dc616ca2b925e ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorepoints.code.tex" 1601326656 33335 dd1fa4814d4e51f18be97d88bf0da60c ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorequick.code.tex" 1601326656 2965 4c2b1f4e0826925746439038172e5d6f ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorerdf.code.tex" 1601326656 5196 2cc249e0ee7e03da5f5f6589257b1e5b ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorescopes.code.tex" 1673816307 20821 7579108c1e9363e61a0b1584778804aa ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcoreshade.code.tex" 1601326656 35249 abd4adf948f960299a4b3d27c5dddf46 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcoretransformations.code.tex" 1673816307 22012 81b34a0aa8fa1a6158cc6220b00e4f10 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcoretransparency.code.tex" 1601326656 8893 e851de2175338fdf7c17f3e091d94618 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/frontendlayer/tikz/libraries/tikzlibraryfadings.code.tex" 1601326656 1179 5483d86c1582c569e665c74efab6281f ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/frontendlayer/tikz/libraries/tikzlibrarypositioning.code.tex" 1601326656 3937 3f208572dd82c71103831da976d74f1a ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/frontendlayer/tikz/libraries/tikzlibrarytopaths.code.tex" 1608933718 11518 738408f795261b70ce8dd47459171309 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/frontendlayer/tikz/tikz.code.tex" 1673816307 186782 af500404a9edec4d362912fe762ded92 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/libraries/pgflibraryfadings.code.tex" 1601326656 2563 d5b174eb7709fd6bdcc2f70953dbdf8e ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/libraries/pgflibraryplothandlers.code.tex" 1601326656 32995 ac577023e12c0e4bd8aa420b2e852d1a ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfint.code.tex" 1557692582 3063 8c415c68a0f3394e45cfeca0b65f6ee6 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmath.code.tex" 1673816307 949 cea70942e7b7eddabfb3186befada2e6 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathcalc.code.tex" 1673816307 13270 2e54f2ce7622437bf37e013d399743e3 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfloat.code.tex" 1673816307 104717 9b2393fbf004a0ce7fa688dbce423848 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.base.code.tex" 1601326656 10165 cec5fa73d49da442e56efc2d605ef154 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.basic.code.tex" 1601326656 28178 41c17713108e0795aac6fef3d275fbca ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.code.tex" 1673816307 9649 85779d3d8d573bfd2cd4137ba8202e60 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.comparison.code.tex" 1601326656 3865 ac538ab80c5cf82b345016e474786549 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.integerarithmetics.code.tex" 1557692582 3177 27d85c44fbfe09ff3b2cf2879e3ea434 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.misc.code.tex" 1621110968 11024 0179538121bc2dba172013a3ef89519f ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.random.code.tex" 1673816307 7890 0a86dbf4edfd88d022e0d889ec78cc03 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.round.code.tex" 1601326656 3379 781797a101f647bab82741a99944a229 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.trigonometric.code.tex" 1601326656 92405 f515f31275db273f97b9d8f52e1b0736 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathparser.code.tex" 1673816307 37466 97b0a1ba732e306a1a2034f5a73e239f ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathutil.code.tex" 1601326656 8471 c2883569d03f69e8e1cabfef4999cfd7 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/modules/pgfmodulematrix.code.tex" 1673816307 21211 1e73ec76bd73964d84197cc3d2685b01 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/modules/pgfmoduleplot.code.tex" 1601326656 16121 346f9013d34804439f7436ff6786cef7 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/modules/pgfmoduleshapes.code.tex" 1673816307 44792 271e2e1934f34c759f4dedb1e14a5015 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/pgf.revision.tex" 1673816307 114 e6d443369d0673933b38834bf99e422d ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgf.cfg" 1601326656 926 2963ea0dcf6cc6c0a770b69ec46a477b ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgfsys-common-pdf.def" 1673816307 5542 32f75a31ea6c3a7e1148cd6d5e93dbb7 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgfsys-pdftex.def" 1673816307 12612 7774ba67bfd72e593c4436c2de6201e3 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgfsys.code.tex" 1673816307 61351 bc5f86e0355834391e736e97a61abced ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgfsysprotocol.code.tex" 1601326656 1896 b8e0ca0ac371d74c0ca05583f6313c91 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgfsyssoftpath.code.tex" 1601326656 7778 53c8b5623d80238f6a20aa1df1868e63 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgffor.code.tex" 1673816307 24033 d8893a1ec4d1bfa101b172754743d340 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgfkeys.code.tex" 1673816307 39784 414c54e866ebab4b801e2ad81d9b21d8 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgfkeyslibraryfiltered.code.tex" 1673816307 37433 940bc6d409f1ffd298adfdcaf125dd86 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgfrcs.code.tex" 1673816307 4385 510565c2f07998c8a0e14f0ec07ff23c ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgfutil-common.tex" 1673816307 29239 22e8c7516012992a49873eff0d868fed ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgfutil-latex.def" 1673816307 6950 8524a062d82b7afdc4a88a57cb377784 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/soul/soul-ori.sty" 1686773644 25449 100b0515cc1d8ea1e0366560a3b7ad0c ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/soul/soul.sty" 1686773644 16881 d8b4a12a106b66e472bfd2de30eb0314 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/stringenc/stringenc.sty" 1575152242 21514 b7557edcee22835ef6b03ede1802dad4 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/uniquecounter/uniquecounter.sty" 1576624663 7008 f92eaa0a3872ed622bbf538217cd2ab7 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/xkeyval/xkeyval.tex" 1655411236 19231 27205ee17aaa2902aea3e0c07a3cfc65 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/xkeyval/xkvutils.tex" 1655411236 7677 9cb1a74d945bc9331f2181c0a59ff34a ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/adjustbox/adjcalc.sty" 1666037967 5598 c49b91713cbe5e50a1fabefb733eda0d ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/adjustbox/adjustbox.sty" 1740604409 56907 b74d2bd6fed8dc761953edb2fbea781b ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/adjustbox/tc-pdftex.def" 1740604409 4304 461724faa0dfbdec2d80de16c11f407c ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/adjustbox/trimclip.sty" 1740176375 7245 2bf1779563af51e666da8f26ea1f8455 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/algorithmicx/algorithmicx.sty" 1160617237 26750 ce139c05a983e19ddca355b43e29c395 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/algorithmicx/algpseudocode.sty" 1160617237 3457 d9077efe6b74c5a094199256af8d7d9a ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/algorithms/algorithm.sty" 1251330371 3249 15763257e50278eef5db1952ccde229c ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/amsfonts/amsfonts.sty" 1359763108 5949 3f3fd50a8cc94c3d4cbf4fc66cd3df1c ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/amsfonts/amssymb.sty" 1359763108 13829 94730e64147574077f8ecfea9bb69af4 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/amsfonts/umsa.fd" 1359763108 961 6518c6525a34feb5e8250ffa91731cff ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/amsfonts/umsb.fd" 1359763108 961 d02606146ba5601b5645f987c92e6193 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/amsmath/amsbsy.sty" 1717359999 2222 2166a1f7827be30ddc30434e5efcee1b ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/amsmath/amsgen.sty" 1717359999 4173 d22509bc0c91281d991b2de7c88720dd ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/amsmath/amsmath.sty" 1730928152 88370 c780f23aea0ece6add91e09b44dca2cd ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/amsmath/amsopn.sty" 1717359999 4474 23ca1d3a79a57b405388059456d0a8df ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/amsmath/amstext.sty" 1717359999 2444 71618ea5f2377e33b04fb97afdd0eac2 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/arydshln/arydshln.sty" 1550877088 46329 ffecd4a08bc2c823f27a610f95aa12d3 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/atveryend/atveryend.sty" 1728505250 1695 be6b4d13b33db697fd3fd30b24716c1a ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/base/article.cls" 1738182759 20144 63d8bacaf52e5abf4db3bc322373e1d4 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/base/atbegshi-ltx.sty" 1738182759 2963 d8ec5a1b4e0a106c5c737900202763e4 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/base/atveryend-ltx.sty" 1738182759 2378 14b657ee5031da98cf91648f19642694 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/base/fontenc.sty" 1738182759 5275 0d62fb62162c7ab056e941ef18c5076d ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/base/ifthen.sty" 1738182759 5525 9dced5929f36b19fa837947f5175b331 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/base/inputenc.sty" 1738182759 5048 0270515b828149155424600fd2d58ac5 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/base/size10.clo" 1738182759 8448 5cf247d4bd0c7d5d711bbbdf111fae2e ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/base/textcomp.sty" 1738182759 2846 e26604d3d895e65d874c07f30c291f3f ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/bold-extra/bold-extra.sty" 1266435381 1923 d0cfef8b32c8df31c8dc244f7e40f3dc ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/booktabs/booktabs.sty" 1579038678 6078 f1cb470c9199e7110a27851508ed7a5c ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/caption/caption.sty" 1696191071 56128 c2ccf1a29d78c33bc553880402e4fb9a ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/caption/caption3.sty" 1696191071 72619 ee90b6612147680fd73c3b1406a74245 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/caption/ltcaption.sty" 1645391520 7418 021d7c4eb11bde94592761855a3d046e ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/caption/subcaption.sty" 1690576852 12494 0c0cdb824278a4d51cefeb2e79901315 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/carlisle/scalefnt.sty" 1137109962 1360 df2086bf924b14b72d6121fe9502fcdb ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/cleveref/cleveref.sty" 1525128982 329481 7fc6b003158402a4c694bc0a1b729308 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/collectbox/collectbox.sty" 1666037909 9124 59c3b56f1a073de66e3eea35f9c173c8 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/colortbl/colortbl.sty" 1720383029 12726 67708fc852a887b2ba598148f60c3756 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/csquotes/csquotes.cfg" 1429144587 7068 06f8d141725d114847527a66439066b6 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/csquotes/csquotes.def" 1712263026 22135 0975a49eeaed232aa861e9425ffb2e7c ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/csquotes/csquotes.sty" 1712263026 62767 e79d6d7a989e7da62dcf3d0a65c1faee ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/enumitem/enumitem.sty" 1738874546 52272 63d293bc0d496619edb57585740861a2 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/environ/environ.sty" 1399239813 4378 f429f0da968c278653359293040a8f52 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/epigraph/epigraph.sty" 1578002819 4602 e947be1727d6ac747322008f8359ee17 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/epstopdf-pkg/epstopdf-base.sty" 1579991033 13886 d1306dcf79a944f6988e688c1785f9ce ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/etoolbox/etoolbox.sty" 1739306980 46850 d87daedc2abdc653769a6f1067849fe0 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/fancyvrb/fancyvrb.sty" 1705955721 43712 c3d93734f3bc56e03c21b3dc69268d3c ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/float/float.sty" 1137110151 6749 16d2656a1984957e674b149555f1ea1d ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/fvextra/fvextra.sty" 1741210328 114780 8f122e4cec6e0ef004709dbe67456f27 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/geometry/geometry.sty" 1578002852 41601 9cf6c5257b1bc7af01a58859749dd37a ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/graphics-cfg/color.cfg" 1459978653 1213 620bba36b25224fa9b7e1ccb4ecb76fd ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/graphics-cfg/graphics.cfg" 1465944070 1224 978390e9c2234eab29404bc21b268d1e ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/graphics-def/pdftex.def" 1713382759 19440 9da9dcbb27470349a580fca7372d454b ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/graphics/color.sty" 1730496337 7245 57f7defed4fb41562dc4b6ca13958ca9 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/graphics/dvipsnam.def" 1717359999 5009 1ca6c92de20f17acac654d4e0598c200 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/graphics/graphics.sty" 1730496337 18363 dee506cb8d56825d8a4d020f5d5f8704 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/graphics/graphicx.sty" 1717359999 8010 6f2ad8c2b2ffbd607af6475441c7b5e4 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/graphics/keyval.sty" 1717359999 2671 70891d50dac933918b827d326687c6e8 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/graphics/mathcolor.ltx" 1667332637 2885 9c645d672ae17285bba324998918efd8 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/graphics/trig.sty" 1717359999 4023 2c9f39712cf7b43d3eb93a8bbd5c8f67 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/hycolor/hycolor.sty" 1580250785 17914 4c28a13fc3d975e6e81c9bea1d697276 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/hyperref/hpdftex.def" 1730838014 48154 82da9991b9f0390b3a9d3af6c8618af4 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/hyperref/hyperref.sty" 1730838014 222112 c22dbd2288f89f7ba942ac22f7d00f11 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/hyperref/nameref.sty" 1705871765 11026 182c63f139a71afd30a28e5f1ed2cd1c ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/hyperref/pd1enc.def" 1730838014 14249 ff700eb13ce975a424b2dd99b1a83044 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/hyperref/puenc.def" 1730838014 117112 7533bff456301d32e6d6356fad15f543 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/hyphenat/hyphenat.sty" 1252025529 5798 0437b031e663035b68539cf7ac7c8eeb ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/ifoddpage/ifoddpage.sty" 1666126449 2142 eae42205b97b7a3ad0e58db5fe99e3e6 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/kvoptions/kvoptions.sty" 1655478651 22555 6d8e155cfef6d82c3d5c742fea7c992e ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/kvsetkeys/kvsetkeys.sty" 1665067230 13815 760b0c02f691ea230f5359c4e1de23a7 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/l3backend/l3backend-pdftex.def" 1716410060 29785 9f93ab201fe5dd053afcc6c1bcf7d266 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/l3kernel/expl3.sty" 1738271527 6565 f51d809db6193fae7b06c1bc26ca8f75 ""
|
| 279 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/l3packages/l3keys2e/l3keys2e.sty" 1724879202 4674 22943918cc84173478a588d6efbc800b ""
|
| 280 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/l3packages/xparse/xparse.sty" 1724879202 9783 ab4bee47700c04aadedb8da27591b0ab ""
|
| 281 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/latex2pydata/latex2pydata.sty" 1741122543 18441 b520f246c834e8d86f0d9856e5992a5a ""
|
| 282 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/latexconfig/epstopdf-sys.cfg" 1279039959 678 4792914a8f45be57bb98413425e4c7af ""
|
| 283 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/lineno/lineno.sty" 1738182647 154882 28bfba3d27ac868f944c98d1dd51fd65 ""
|
| 284 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/lipsum/lipsum.ltd.tex" 1632168149 95525 6fd0552101a6b1f9b7a84b402ec435ba ""
|
| 285 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/lipsum/lipsum.sty" 1632168149 14690 c2c754218a7108db7823a4839c1bc3cd ""
|
| 286 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/listings/listings.cfg" 1727126400 1865 301ae3c26fb8c0243307b619a6aa2dd3 ""
|
| 287 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/listings/listings.sty" 1727126400 81640 997090b6c021dc4af9ee00a97b85c5b4 ""
|
| 288 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/listings/lstlang1.sty" 1727126400 206518 4eb59a801ad842a713fa168c34227290 ""
|
| 289 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/listings/lstmisc.sty" 1727126400 77051 be68720e5402397a830abb9eed5a2cb4 ""
|
| 290 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/listings/lstpatch.sty" 1710360531 353 9024412f43e92cd5b21fe9ded82d0610 ""
|
| 291 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/listingsutf8/listingsutf8.sty" 1576101256 5148 1baf596b2560b44d9ff1889dc1d7564e ""
|
| 292 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/makecell/makecell.sty" 1249334690 15773 2dd7dde1ec1c2a3d0c85bc3b273e04d8 ""
|
| 293 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/microtype/microtype-pdftex.def" 1739394495 49650 26a5e891c8da4553198575ba0517c0e5 ""
|
| 294 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/microtype/microtype.cfg" 1739394495 27015 bd167d0154f271c424b157d8894ae4a4 ""
|
| 295 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/microtype/microtype.sty" 1739394495 102775 6624742dafeb6f262a13657f9f77f048 ""
|
| 296 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/microtype/mt-cmr.cfg" 1739394495 22906 b0be544692bb405a84d147a96af5d777 ""
|
| 297 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/microtype/mt-msa.cfg" 1739394495 5929 11976688d7d8ed4d0d05efd1b3d5a7e9 ""
|
| 298 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/microtype/mt-msb.cfg" 1739394495 5594 d52d5015abe666fae752ed7674c2dd73 ""
|
| 299 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/minted/minted.sty" 1741296711 71211 7bd410e5b0fefdc9042144b76fd52a29 ""
|
| 300 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/multirow/bigdelim.sty" 1731446765 1725 a426f77bbc71842be3506e781a61f41a ""
|
| 301 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/multirow/multirow.sty" 1731446765 6696 886c9f3087d0b973ed2c19aa79cb3023 ""
|
| 302 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/nextpage/nextpage.sty" 1252088423 1745 04b0f50af5d59a9cf3c17f3f4452ed12 ""
|
| 303 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/nicematrix/nicematrix.sty" 1741210318 403700 b392b1675aed7fc914022d52517c501d ""
|
| 304 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/ninecolors/ninecolors.sty" 1644787553 24994 276ffd1a92e69ec7402483b3bf3318d2 ""
|
| 305 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/parskip/parskip.sty" 1615762720 4288 94714aa7f535440f33181fec52a31963 ""
|
| 306 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/pdfcol/pdfcol.sty" 1663877585 8086 ac143843b6ea88d172677dc3ed532925 ""
|
| 307 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/pgf-pie/pgf-pie.sty" 1655323805 235 27eb33457ea75445bb99c64f6d935690 ""
|
| 308 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/pgf-pie/tikzlibrarypie.code.tex" 1655323805 14000 29db17eb16436bea851cf45e6d962444 ""
|
| 309 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/pgf/basiclayer/pgf.sty" 1601326656 1090 bae35ef70b3168089ef166db3e66f5b2 ""
|
| 310 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/pgf/basiclayer/pgfcore.sty" 1673816307 373 00b204b1d7d095b892ad31a7494b0373 ""
|
| 311 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/pgf/compatibility/pgfcomp-version-0-65.sty" 1601326656 21013 f4ff83d25bb56552493b030f27c075ae ""
|
| 312 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/pgf/compatibility/pgfcomp-version-1-18.sty" 1601326656 989 c49c8ae06d96f8b15869da7428047b1e ""
|
| 313 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/pgf/frontendlayer/tikz.sty" 1601326656 339 c2e180022e3afdb99c7d0ea5ce469b7d ""
|
| 314 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/pgf/math/pgfmath.sty" 1601326656 306 c56a323ca5bf9242f54474ced10fca71 ""
|
| 315 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/pgf/systemlayer/pgfsys.sty" 1601326656 443 8c872229db56122037e86bcda49e14f3 ""
|
| 316 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/pgf/utilities/pgffor.sty" 1601326656 348 ee405e64380c11319f0e249fed57e6c5 ""
|
| 317 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/pgf/utilities/pgfkeys.sty" 1601326656 274 5ae372b7df79135d240456a1c6f2cf9a ""
|
| 318 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/pgf/utilities/pgfrcs.sty" 1601326656 325 f9f16d12354225b7dd52a3321f085955 ""
|
| 319 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/pgfopts/pgfopts.sty" 1405118212 5540 d5c60cf09c59da351aa4023ed084e4eb ""
|
| 320 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/placeins/placeins.sty" 1137110565 4087 636308456f60d2b31cbf97867db5708d ""
|
| 321 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/psnfss/pifont.sty" 1586716065 2283 62e73848f29fd8cd37fb7974c7cf2221 ""
|
| 322 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/psnfss/upsy.fd" 1137110629 148 2da0acd77cba348f34823f44cabf0058 ""
|
| 323 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/psnfss/upzd.fd" 1137110629 148 b2a94082cb802f90d3daf6dd0c7188a0 ""
|
| 324 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/refcount/refcount.sty" 1576624809 9878 9e94e8fa600d95f9c7731bb21dfb67a4 ""
|
| 325 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/rerunfilecheck/rerunfilecheck.sty" 1657483315 9714 ba3194bd52c8499b3f1e3eb91d409670 ""
|
| 326 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/setspace/setspace.sty" 1670275497 22490 8cac309b79a4c53a4ffce4b1b07aead0 ""
|
| 327 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/siunitx/siunitx.sty" 1740687122 340829 1b2c1d1f1f03e683fab4091869acffd3 ""
|
| 328 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/tabularray/tabularray.sty" 1708117440 255677 5fa41b80e5de95b8eaa15ecb137e1cf0 ""
|
| 329 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/tcolorbox/tcbbreakable.code.tex" 1729629551 34681 5ad5619477798e0585ce97e92ba075bc ""
|
| 330 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/tcolorbox/tcbexternal.code.tex" 1729629551 9105 5a50b7066ed23d09205aa37fe69e951c ""
|
| 331 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/tcolorbox/tcbfitting.code.tex" 1729629551 17164 002bdbaf09e605e895c644057c268091 ""
|
| 332 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/tcolorbox/tcbhooks.code.tex" 1729629551 10373 24c63e0637d2172d23a50c61238bcc41 ""
|
| 333 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/tcolorbox/tcblistings.code.tex" 1729629551 3414 a8758eb0339def475b3e375fe7335593 ""
|
| 334 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/tcolorbox/tcblistingscore.code.tex" 1729629551 16147 2a7437ae9362c29025ce183991034565 ""
|
| 335 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/tcolorbox/tcblistingsutf8.code.tex" 1729629551 1414 3fc2bfc308b8836fb8873071647d1e16 ""
|
| 336 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/tcolorbox/tcbmagazine.code.tex" 1729629551 5636 42acbec9b01b769d4ce0f204522d04cc ""
|
| 337 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/tcolorbox/tcbminted.code.tex" 1729629551 3452 f78637a51b08f5625a29eae0d24d6756 ""
|
| 338 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/tcolorbox/tcbposter.code.tex" 1729629551 12459 57c45692fee879446df2f871fdc3cb2c ""
|
| 339 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/tcolorbox/tcbprocessing.code.tex" 1729629551 2349 2e9f0fbe0a111b50863d27c97793e439 ""
|
| 340 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/tcolorbox/tcbraster.code.tex" 1729629551 9373 adb4c84606f9c1f78be1266d37bd8578 ""
|
| 341 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/tcolorbox/tcbskins.code.tex" 1729629551 83599 50e7ac6ad41eaf61a7736beda9e3416b ""
|
| 342 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/tcolorbox/tcbskinsjigsaw.code.tex" 1729629551 10040 cd3f87486a5ead1213ec367836f72332 ""
|
| 343 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/tcolorbox/tcbtheorems.code.tex" 1729629551 13125 46ca7b1b209d14b980b66450466ae92d ""
|
| 344 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/tcolorbox/tcbvignette.code.tex" 1729629551 12747 cd7c5d326d905561160a07ad889812b5 ""
|
| 345 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/tcolorbox/tcolorbox.sty" 1729629551 105823 554671f5b2d97c7efbbaf6dae11c29a8 ""
|
| 346 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/textpos/textpos.sty" 1658612320 12842 35403bda336a71a37a5edc37b78e3ec3 ""
|
| 347 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/tikzfill/tikzfill-common.sty" 1691524336 2573 42712c9e0a2df004e43df5b3c95f0c1e ""
|
| 348 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/tikzfill/tikzfill.image.sty" 1691524336 1120 ba535da48caa03bf1fd3f03ea87779f8 ""
|
| 349 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/tikzfill/tikzlibraryfill.image.code.tex" 1691524336 10931 717eb52299f416934beb8b2b7cd8cee6 ""
|
| 350 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/titlesec/titlesec.sty" 1736023606 48766 87a17a4ef312a39cd43896e34a679a56 ""
|
| 351 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/tocloft/tocloft.sty" 1578692495 36103 3e78d14f0f4b1a30560fea5e04de805d ""
|
| 352 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/todonotes/todonotes.sty" 1704576842 21404 916e19cbd009b6d289c8194b313d3895 ""
|
| 353 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/tools/array.sty" 1730496337 14552 27664839421e418b87f56fa4c6f66b1a ""
|
| 354 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/tools/bm.sty" 1717359999 13236 1aba485b47e3c79a47dc0906cc971820 ""
|
| 355 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/tools/calc.sty" 1717359999 10214 61188260d324e94bc2f66825d7d3fdf4 ""
|
| 356 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/tools/longtable.sty" 1730496337 15900 3cb191e576c7a313634d2813c55d4bf1 ""
|
| 357 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/tools/shellesc.sty" 1717359999 4121 6039ae6d0916154d7ba5f20a77b9ab2c ""
|
| 358 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/tools/verbatim.sty" 1717359999 7532 d2e1111e17bfebb1607d8ffa779705ec ""
|
| 359 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/tools/xspace.sty" 1717359999 4545 e3f4de576c914e2000f07f69a891c071 ""
|
| 360 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/translations/translations-basic-dictionary-english.trsl" 1644096163 5588 0c1628daf15f4411ff1f463114c634a3 ""
|
| 361 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/translations/translations.sty" 1644096163 44247 2188b95d0ee74e31eca4d316263c58a7 ""
|
| 362 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/trimspaces/trimspaces.sty" 1253232110 1380 971a51b00a14503ddf754cab24c3f209 ""
|
| 363 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/units/nicefrac.sty" 1137111039 4029 0462ee5ab265cf59dc15a41a3b883101 ""
|
| 364 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/upquote/upquote.sty" 1334873510 1048 517e01cde97c1c0baf72e69d43aa5a2e ""
|
| 365 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/url/url.sty" 1388531844 12796 8edb7d69a20b857904dd0ea757c14ec9 ""
|
| 366 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/varwidth/varwidth.sty" 1238697683 10894 d359a13923460b2a73d4312d613554c8 ""
|
| 367 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/wrapfig/wrapfig.sty" 1137111090 26220 3701aebf80ccdef248c0c20dd062fea9 ""
|
| 368 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/xcolor/xcolor.sty" 1727642399 55384 b454dec21c2d9f45ec0b793f0995b992 ""
|
| 369 |
+
"/usr/local/texlive/2025/texmf-dist/tex/latex/xkeyval/xkeyval.sty" 1655411236 4937 4ce600ce9bd4ec84d0250eb6892fcf4f ""
|
| 370 |
+
"/usr/local/texlive/2025/texmf-dist/web2c/texmf.cnf" 1739380943 42148 61becc7c670cd061bb319c643c27fdd4 ""
|
| 371 |
+
"/usr/local/texlive/2025/texmf-var/fonts/map/pdftex/updmap/pdftex.map" 1756208942 5467155 19efa205003f9ecad95fbbaa6ff24da1 ""
|
| 372 |
+
"/usr/local/texlive/2025/texmf-var/web2c/pdftex/pdflatex.fmt" 1756208917 3345738 bbbb93a25a0c937f0c0915ef8b1d5cd7 ""
|
| 373 |
+
"/usr/local/texlive/2025/texmf.cnf" 1741450484 577 418a7058ec8e006d8704f60ecd22c938 ""
|
| 374 |
+
"fancyhdr.sty" 1757605857.51878 20521 e5d13d98d57bd53d4fed3aa61bd29c86 ""
|
| 375 |
+
"figures/ch1/ch1-lerobot-figure1.png" 1756736223.17924 2861318 f777846f5ac6f44218febd1394af5ff1 ""
|
| 376 |
+
"figures/ch2/ch2-approaches.png" 1756363243.15974 93262 005742026e1246cb2a02a7da3940d913 ""
|
| 377 |
+
"figures/ch2/ch2-classical-limitations.png" 1756388501.89346 4739243 309d2936bdce81de99e0816abd9460cb ""
|
| 378 |
+
"figures/ch2/ch2-cost-accessibility.png" 1756363243.12453 1962263 e56e539ef7a9776a4df096fefb65e212 ""
|
| 379 |
+
"figures/ch2/ch2-planar-manipulator-floor-box.png" 1756370334.87657 93114 98656618311d026f229a13938a62a442 ""
|
| 380 |
+
"figures/ch2/ch2-planar-manipulator-floor-shelf.png" 1756370334.88893 83589 636f3aecbfef1a4ff0f523e94ec62867 ""
|
| 381 |
+
"figures/ch2/ch2-planar-manipulator-floor.png" 1756370334.90719 58946 620a23d7e28c643dd425c3ea8b531e92 ""
|
| 382 |
+
"figures/ch2/ch2-planar-manipulator-free.png" 1756370334.8963 44656 e3b188009bad9b8d06a191322466cce2 ""
|
| 383 |
+
"figures/ch2/ch2-platforms.png" 1756736223.20117 3616534 13cb402bdc05892634fd8498d9dc3c23 ""
|
| 384 |
+
"figures/ch2/ch2-so100-to-planar-manipulator.png" 1756363243.11776 1555756 843b1df6b743a6f6037a8493f150c4e0 ""
|
| 385 |
+
"figures/ch3/ch3-agent-env.png" 1756714001.90601 42614 fbb34a5f9704f44f1fbe344839b87bf9 ""
|
| 386 |
+
"figures/ch3/ch3-duck-sim-vs-real.png" 1756711221.43685 1762155 1c16f3d559f80f6e171168eaf98ce161 ""
|
| 387 |
+
"figures/ch3/ch3-hil-serl-architecture.png" 1759499113.72045 1209537 1bc20e1b022364846b03ccab15879e84 ""
|
| 388 |
+
"figures/ch3/ch3-hil-serl-examples.png" 1756736287.2838 7216604 c75d7456f5658d28ebb5440da3e32d1f ""
|
| 389 |
+
"figures/ch3/ch3-learning-atlas.png" 1756711221.4577 178001 1941a85b04c9505239c179f189188f5a ""
|
| 390 |
+
"figures/ch3/ch3-learning-benefits.png" 1756711221.46811 6936585 268a364b96875d3d7ec3cfcdefb96f0a ""
|
| 391 |
+
"figures/ch3/ch3-many-ducks.png" 1756711221.43204 4872198 bc4bf3f702712cc7f1dad98f1314e495 ""
|
| 392 |
+
"figures/ch3/ch3-rl-algorithms-atlas.png" 1756711221.43945 194522 120a794a3308557b5c395ff0e593c620 ""
|
| 393 |
+
"figures/ch3/ch3-rl-examples.png" 1756713355.00909 9051359 1b3051a0dd497f57a8996b1305236078 ""
|
| 394 |
+
"figures/ch4/ch4-act-decoder.png" 1757204717.08331 3180391 466bc761acbd5b9974fd329a5e487b74 ""
|
| 395 |
+
"figures/ch4/ch4-act-encoder.png" 1757204032.58975 874336 3adc8d94d9b91627a9a5535daeefecd5 ""
|
| 396 |
+
"figures/ch4/ch4-act.png" 1757204032.59541 1517348 5d9e8566480d037513b80dbeb4f2726c ""
|
| 397 |
+
"figures/ch4/ch4-action-vs-observation-distribution.png" 1757202420.77891 274240 acccab7444f28bcd3090da86b66fb068 ""
|
| 398 |
+
"figures/ch4/ch4-async-inference.png" 1757205296.44907 282300 8154ec14144ade3da7fdcc1fea9e3bad ""
|
| 399 |
+
"figures/ch4/ch4-bc-trajectories.png" 1757188307.41911 2253030 d566d8cd4dd7e8ab54a50ee159c70d28 ""
|
| 400 |
+
"figures/ch4/ch4-diffusion-policy.png" 1760040846.00598 2790820 4428a5bf6de5cc4494d3e422b8b0aed0 ""
|
| 401 |
+
"figures/ch4/ch4-diffusion-robot-actions.png" 1759092616.57656 8924912 8b70ff3c0a0d28137e70cd8627e97ac7 ""
|
| 402 |
+
"figures/ch4/ch4-diffusion-vs-flowmatching.png" 1757437508.98817 189022 6f1db167af6aa55798ed7d1f2fe6e780 ""
|
| 403 |
+
"figures/ch4/ch4-issues-with-bc.png" 1757191594.69529 1560808 59922e5281d23b31061163c476d77186 ""
|
| 404 |
+
"figures/ch4/ch4-latent-variable-model.png" 1757199992.59971 983775 43de010534f17e72e9f84921125cd3ab ""
|
| 405 |
+
"figures/ch4/ch4-many-latents.png" 1757199815.14582 222323 b1b55e72af2573576190e4585f88f060 ""
|
| 406 |
+
"figures/ch4/ch4-normalizing-flows.png" 1759498593.07487 4730614 ce1e41f128cee20733ac913f1b215249 ""
|
| 407 |
+
"figures/ch4/ch4-observation-action-mapping.png" 1757189370.85243 2081981 adb062a7b57a4356512cd0f53b19add6 ""
|
| 408 |
+
"figures/ch4/ch4-queues.png" 1757204032.54175 1971787 69278f0aa3239f04020c064969fd364e ""
|
| 409 |
+
"figures/ch4/ch4-task-effect-on-pairs.png" 1757192350.81608 1186204 787dd13399cb253fdc3663eebc6e1b38 ""
|
| 410 |
+
"figures/ch5/ch5-generalist-policies-timeline.png" 1757433878.16249 121521 55aa78250041db99cbb14853c1380a32 ""
|
| 411 |
+
"figures/ch5/ch5-ml-vs-robotics-foundation.png" 1757433647.43249 3389240 3cda02737587b6a7d9b8fe8d99c97a29 ""
|
| 412 |
+
"figures/ch5/ch5-pi0-sampling-timesteps.png" 1757435073.27621 186917 07c23a07b0d4bd64f4223e4d4d62937c ""
|
| 413 |
+
"figures/ch5/ch5-pi0.png" 1757434767.53001 1242717 1b00b6373d665a6028eb1435344f68f1 ""
|
| 414 |
+
"figures/ch5/ch5-smolvla.png" 1760132303.20073 1630415 aed7c52ca8144dad9f5c1791c69b642a ""
|
| 415 |
+
"figures/ch5/ch5-trends.png" 1757436505.93352 636731 d854e9b2a7df50b0eb372e952bb84abe ""
|
| 416 |
+
"handles.tex" 1759014025.98527 1729 5c2ed8279ada1ee6a07f7e544f930a2d ""
|
| 417 |
+
"hfstyle/assets/huggingface.pdf" 1757605857.52077 69646 8a153444d2c6ab4e542a2695178efe4a ""
|
| 418 |
+
"hfstyle/hf.cls" 1760367736.69266 10479 25a831e3b1923ceeb9fe4725093957b3 ""
|
| 419 |
+
"hfstyle/manrope.sty" 1757605857.52166 971 f4c964b399569ba117cf90a94f5f5267 ""
|
| 420 |
+
"hfstyle/manrope/Manrope-Bold.ttf" 1757605857.52217 96800 69258532ce99ef9abf8220e0276fff04 ""
|
| 421 |
+
"hfstyle/manrope/Manrope-Regular.ttf" 1757605857.52245 96832 f8105661cf5923464f0db8290746d2f9 ""
|
| 422 |
+
"logos/hf.pdf" 1757444600.08131 24570 821623074dcfbebaa29bc6a5c197dcdf ""
|
| 423 |
+
"logos/oxford_logo.png" 1759406839.00857 28687 a8e304254708565a7d8f03c48947006c ""
|
| 424 |
+
"main.aux" 1760378128.81232 95069 e38c0810d72c43c3020821b1066734a2 "pdflatex"
|
| 425 |
+
"main.bbl" 1760375737.43923 77303 a0bf56955842a07e35a27c1c992d2127 "bibtex main"
|
| 426 |
+
"main.out" 1760378128.8196 7176 c1a1034ae4eaf3f94db7f44cb26742a6 "pdflatex"
|
| 427 |
+
"main.tex" 1760378109.98581 6215 dc9e4d14873a65119a75a93ee8fa4891 ""
|
| 428 |
+
"main.toc" 1760378128.82008 3718 715a5db383fc8f6917d529cc0d9ee8d6 "pdflatex"
|
| 429 |
+
"manropebold.tfm" 1757605857.7002 1788 abd9026e2aecbe440ae420e7ad862bf8 ""
|
| 430 |
+
"manroperegular.tfm" 1757605857.70029 1788 c1682e75ce1aeb8ac320bb860b080a3d ""
|
| 431 |
+
"math_commands.tex" 1757605857.70096 13793 9a9c465a67ea1fca63600ef21e1787ff ""
|
| 432 |
+
"natbib.sty" 1754813587.69376 45154 40088ce024445d7ee2bca59e704cdd01 ""
|
| 433 |
+
"preamble.tex" 1757605857.70186 1567 4d07a9279351a29b2c402a92c5fc1b59 ""
|
| 434 |
+
"sections/00_abstract.tex" 1760136366.22253 949 ccaf2b8ee7252fb1eb71aa92614ff54a ""
|
| 435 |
+
"sections/01_introduction.tex" 1760376020.94003 14310 2d0fcbb3f3e67538a37034d930ae868f ""
|
| 436 |
+
"sections/02_classic_robotics.tex" 1760376023.61357 27286 cac72b9ab689d691a091eebd9ac9d5ee ""
|
| 437 |
+
"sections/03_reinforcement_learning.tex" 1760376024.31931 44367 92b2485390e0a39760f1556474727a54 ""
|
| 438 |
+
"sections/04_imitation_learning.tex" 1760376025.09868 75600 0e577b80b47cd919def5d86b9fc43e33 ""
|
| 439 |
+
"sections/05_foundation_models.tex" 1760376025.75253 37090 d22ffddee1c853e7bc6991515927f61e ""
|
| 440 |
+
"sections/07_conclusions.tex" 1760307166.09216 2977 671e6ba6dd75e340ccd28d3c917e159b ""
|
| 441 |
+
"sections/A_foreword.tex" 1760375212.92164 2651 cf0f7ecd69914177b7a56260dd9798ed ""
|
| 442 |
+
"snippets/ch1/01_datasets.py" 1760373469.41334 1548 eca683e393c1acb6946ddd3863c02468 ""
|
| 443 |
+
"snippets/ch1/02_record_data.py" 1760372430.03241 3444 210755119a1a7589d5d21c88d2162e6f ""
|
| 444 |
+
"snippets/ch3/01_reward_classifier.py" 1760374516.95257 2045 8a8c4b3c0bd6934ca641c81cee126713 ""
|
| 445 |
+
"snippets/ch3/02_actor.py" 1760374614.85175 4864 8c2c37f5c64be4a159b8768819c80be9 ""
|
| 446 |
+
"snippets/ch3/03_learner.py" 1760374661.30793 3681 586c133a982ad143a52be1fbbabd048c ""
|
| 447 |
+
"snippets/ch3/04_hil_serl.py" 1759513266.41943 4375 9dc61131752297116fac667ab3d4566f ""
|
| 448 |
+
"snippets/ch4/01_training_act.py" 1760374765.87454 3217 b9da86544f588671b963bb75036000da ""
|
| 449 |
+
"snippets/ch4/02_using_act.py" 1760374798.48847 2238 b53c21d7c156a546253f2fec6a028c53 ""
|
| 450 |
+
"snippets/ch4/03_training_diffusion.py" 1760374818.18355 3399 e572dfe258b3f70f0888228b1dd16c42 ""
|
| 451 |
+
"snippets/ch4/04_using_diffusion.py" 1760374836.58533 2279 51c7c1bfd0390520a830ea6822bf7aad ""
|
| 452 |
+
"snippets/ch4/05_policy_server.py" 1760046148.84102 302 e66b51fdf7e6112b684a3b8efc3e08c6 ""
|
| 453 |
+
"snippets/ch4/06_robot_client.py" 1760374855.75835 2040 82517a544804019aeef14491af04a95c ""
|
| 454 |
+
"snippets/ch5/01_using_pi0.py" 1760375032.92844 2705 83064852e2c54e38d76ff9437ab70b61 ""
|
| 455 |
+
"snippets/ch5/02_using_smolvla.py" 1760374890.46153 2620 3bdf3021c829e3ea41b768bc70d27007 ""
|
| 456 |
+
"snippets/code_specs.tex" 1760373809.91287 870 932340ce248c1454e6c0b84e4a587b57 ""
|
| 457 |
+
"t1manrope.fd" 1757605857.70591 737 fb89db35a388f068c58d61d8beb6150d ""
|
| 458 |
+
(generated)
|
| 459 |
+
"main.aux"
|
| 460 |
+
"main.log"
|
| 461 |
+
"main.out"
|
| 462 |
+
"main.pdf"
|
| 463 |
+
"main.toc"
|
| 464 |
+
(rewritten before read)
|
app/scripts/latex-to-mdx/input/main.fls
ADDED
@@ -0,0 +1,1008 @@
+PWD /Users/fracapuano/Desktop/robots-tutorial/robot-learning-tutorial
+INPUT /usr/local/texlive/2025/texmf.cnf
+INPUT /usr/local/texlive/2025/texmf-dist/web2c/texmf.cnf
+INPUT /usr/local/texlive/2025/texmf-var/web2c/pdftex/pdflatex.fmt
+INPUT /Users/fracapuano/Desktop/robots-tutorial/robot-learning-tutorial/main.tex
+OUTPUT main.log
+INPUT ./hfstyle/hf.cls
+INPUT hfstyle/hf.cls
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/base/article.cls
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/base/article.cls
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/base/size10.clo
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/base/size10.clo
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/base/size10.clo
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/geometry/geometry.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/geometry/geometry.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/graphics/keyval.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/graphics/keyval.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/iftex/ifvtex.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/iftex/ifvtex.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/iftex/iftex.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/iftex/iftex.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/microtype/microtype.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/microtype/microtype.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/etoolbox/etoolbox.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/etoolbox/etoolbox.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/microtype/microtype-pdftex.def
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/microtype/microtype-pdftex.def
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/microtype/microtype-pdftex.def
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/microtype/microtype.cfg
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/microtype/microtype.cfg
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/microtype/microtype.cfg
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/placeins/placeins.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/placeins/placeins.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/hyphenat/hyphenat.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/hyphenat/hyphenat.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/setspace/setspace.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/setspace/setspace.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/parskip/parskip.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/parskip/parskip.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/kvoptions/kvoptions.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/kvoptions/kvoptions.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/ltxcmds/ltxcmds.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/ltxcmds/ltxcmds.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/kvsetkeys/kvsetkeys.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/kvsetkeys/kvsetkeys.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/babel/babel.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/babel/babel.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/babel/txtbabel.def
+INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/babel-latin/latin.ldf
+INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/babel-latin/latin.ldf
+INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/babel-latin/latin.ldf
+INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/babel-english/english.ldf
+INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/babel-english/english.ldf
+INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/babel-english/english.ldf
+INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/babel/locale/en/babel-english.tex
+INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/babel/locale/en/babel-english.tex
+INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/babel/locale/en/babel-english.tex
+INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/babel/locale/en/babel-en.ini
+INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/babel/locale/la/babel-latin.tex
+INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/babel/locale/la/babel-latin.tex
+INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/babel/locale/la/babel-latin.tex
+INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/babel/locale/la/babel-la.ini
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/lipsum/lipsum.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/lipsum/lipsum.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/l3packages/l3keys2e/l3keys2e.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/l3packages/l3keys2e/l3keys2e.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/l3kernel/expl3.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/l3kernel/expl3.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/l3backend/l3backend-pdftex.def
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/l3backend/l3backend-pdftex.def
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/lipsum/lipsum.ltd.tex
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/lipsum/lipsum.ltd.tex
+INPUT ./fancyhdr.sty
+INPUT fancyhdr.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/graphics/graphicx.sty
|
| 76 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/graphics/graphicx.sty
|
| 77 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/graphics/graphics.sty
|
| 78 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/graphics/graphics.sty
|
| 79 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/graphics/trig.sty
|
| 80 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/graphics/trig.sty
|
| 81 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/graphics-cfg/graphics.cfg
|
| 82 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/graphics-cfg/graphics.cfg
|
| 83 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/graphics-cfg/graphics.cfg
|
| 84 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/graphics-def/pdftex.def
|
| 85 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/graphics-def/pdftex.def
|
| 86 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/graphics-def/pdftex.def
|
| 87 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/caption/subcaption.sty
|
| 88 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/caption/subcaption.sty
|
| 89 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/caption/caption.sty
|
| 90 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/caption/caption.sty
|
| 91 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/caption/caption3.sty
|
| 92 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/caption/caption3.sty
|
| 93 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/booktabs/booktabs.sty
|
| 94 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/booktabs/booktabs.sty
|
| 95 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/nicematrix/nicematrix.sty
|
| 96 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/nicematrix/nicematrix.sty
|
| 97 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/pgf/basiclayer/pgfcore.sty
|
| 98 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/pgf/basiclayer/pgfcore.sty
|
| 99 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/pgf/systemlayer/pgfsys.sty
|
| 100 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/pgf/systemlayer/pgfsys.sty
|
| 101 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/pgf/utilities/pgfrcs.sty
|
| 102 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/pgf/utilities/pgfrcs.sty
|
| 103 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgfutil-common.tex
|
| 104 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgfutil-latex.def
|
| 105 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgfrcs.code.tex
|
| 106 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgfrcs.code.tex
|
| 107 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgfrcs.code.tex
|
| 108 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/pgf.revision.tex
|
| 109 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/pgf.revision.tex
|
| 110 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgfsys.code.tex
|
| 111 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgfsys.code.tex
|
| 112 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgfsys.code.tex
|
| 113 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgfkeys.code.tex
|
| 114 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgfkeyslibraryfiltered.code.tex
|
| 115 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgf.cfg
|
| 116 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgfsys-pdftex.def
|
| 117 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgfsys-pdftex.def
|
| 118 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgfsys-common-pdf.def
|
| 119 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgfsyssoftpath.code.tex
|
| 120 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgfsyssoftpath.code.tex
|
| 121 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgfsyssoftpath.code.tex
|
| 122 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgfsysprotocol.code.tex
|
| 123 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgfsysprotocol.code.tex
|
| 124 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgfsysprotocol.code.tex
|
| 125 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/xcolor/xcolor.sty
|
| 126 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/xcolor/xcolor.sty
|
| 127 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/graphics-cfg/color.cfg
|
| 128 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/graphics-cfg/color.cfg
|
| 129 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/graphics-cfg/color.cfg
|
| 130 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/graphics/mathcolor.ltx
|
| 131 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/graphics/mathcolor.ltx
|
| 132 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/graphics/mathcolor.ltx
|
| 133 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/colortbl/colortbl.sty
|
| 134 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/colortbl/colortbl.sty
|
| 135 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/tools/array.sty
|
| 136 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/tools/array.sty
|
| 137 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/graphics/color.sty
|
| 138 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcore.code.tex
|
| 139 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcore.code.tex
|
| 140 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcore.code.tex
|
| 141 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmath.code.tex
|
| 142 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathutil.code.tex
|
| 143 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathparser.code.tex
|
| 144 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.code.tex
|
| 145 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.basic.code.tex
|
| 146 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.trigonometric.code.tex
|
| 147 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.random.code.tex
|
| 148 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.comparison.code.tex
|
| 149 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.base.code.tex
|
| 150 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.round.code.tex
|
| 151 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.misc.code.tex
|
| 152 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.integerarithmetics.code.tex
|
| 153 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathcalc.code.tex
|
| 154 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfloat.code.tex
|
| 155 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfint.code.tex
|
| 156 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorepoints.code.tex
|
| 157 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorepathconstruct.code.tex
|
| 158 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorepathusage.code.tex
|
| 159 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorescopes.code.tex
|
| 160 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcoregraphicstate.code.tex
|
| 161 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcoretransformations.code.tex
|
| 162 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorequick.code.tex
|
| 163 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcoreobjects.code.tex
|
| 164 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorepathprocessing.code.tex
|
| 165 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorearrows.code.tex
|
| 166 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcoreshade.code.tex
|
| 167 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcoreimage.code.tex
|
| 168 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcoreexternal.code.tex
|
| 169 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorelayers.code.tex
|
| 170 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcoretransparency.code.tex
|
| 171 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorepatterns.code.tex
|
| 172 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorerdf.code.tex
|
| 173 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/modules/pgfmoduleshapes.code.tex
|
| 174 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/amsmath/amsmath.sty
|
| 175 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/amsmath/amsmath.sty
|
| 176 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/amsmath/amsopn.sty
|
| 177 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/amsmath/amstext.sty
|
| 178 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/amsmath/amstext.sty
|
| 179 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/amsmath/amsgen.sty
|
| 180 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/amsmath/amsgen.sty
|
| 181 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/amsmath/amsbsy.sty
|
| 182 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/amsmath/amsbsy.sty
|
| 183 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/amsmath/amsopn.sty
|
| 184 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/multirow/multirow.sty
|
| 185 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/multirow/multirow.sty
|
| 186 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/tools/bm.sty
|
| 187 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/tools/bm.sty
|
| 188 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/tcolorbox/tcolorbox.sty
|
| 189 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/tcolorbox/tcolorbox.sty
|
| 190 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/pgf/frontendlayer/tikz.sty
|
| 191 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/pgf/frontendlayer/tikz.sty
|
| 192 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/pgf/basiclayer/pgf.sty
|
| 193 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/pgf/basiclayer/pgf.sty
|
| 194 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/modules/pgfmoduleplot.code.tex
|
| 195 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/pgf/compatibility/pgfcomp-version-0-65.sty
|
| 196 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/pgf/compatibility/pgfcomp-version-0-65.sty
|
| 197 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/pgf/compatibility/pgfcomp-version-1-18.sty
|
| 198 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/pgf/compatibility/pgfcomp-version-1-18.sty
|
| 199 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/pgf/utilities/pgffor.sty
|
| 200 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/pgf/utilities/pgffor.sty
|
| 201 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/pgf/utilities/pgfkeys.sty
|
| 202 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/pgf/utilities/pgfkeys.sty
|
| 203 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgfkeys.code.tex
|
| 204 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgfkeys.code.tex
|
| 205 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgfkeys.code.tex
|
| 206 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/pgf/math/pgfmath.sty
|
| 207 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/pgf/math/pgfmath.sty
|
| 208 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmath.code.tex
|
| 209 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmath.code.tex
|
| 210 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmath.code.tex
|
| 211 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgffor.code.tex
|
| 212 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgffor.code.tex
|
| 213 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgffor.code.tex
|
| 214 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/frontendlayer/tikz/tikz.code.tex
|
| 215 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/frontendlayer/tikz/tikz.code.tex
|
| 216 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/frontendlayer/tikz/tikz.code.tex
|
| 217 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/libraries/pgflibraryplothandlers.code.tex
|
| 218 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/libraries/pgflibraryplothandlers.code.tex
|
| 219 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/modules/pgfmodulematrix.code.tex
|
| 220 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/frontendlayer/tikz/libraries/tikzlibrarytopaths.code.tex
|
| 221 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/frontendlayer/tikz/libraries/tikzlibrarytopaths.code.tex
|
| 222 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/tools/verbatim.sty
|
| 223 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/tools/verbatim.sty
|
| 224 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/environ/environ.sty
|
| 225 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/environ/environ.sty
|
| 226 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/trimspaces/trimspaces.sty
|
| 227 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/trimspaces/trimspaces.sty
|
| 228 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/tcolorbox/tcbraster.code.tex
|
| 229 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/tcolorbox/tcbskins.code.tex
|
| 230 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/tikzfill/tikzfill.image.sty
|
| 231 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/tikzfill/tikzfill.image.sty
|
| 232 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/tikzfill/tikzfill-common.sty
|
| 233 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/tikzfill/tikzfill-common.sty
|
| 234 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/tikzfill/tikzlibraryfill.image.code.tex
|
| 235 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/tikzfill/tikzlibraryfill.image.code.tex
|
| 236 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/tcolorbox/tcbskinsjigsaw.code.tex
|
| 237 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/tcolorbox/tcbbreakable.code.tex
|
| 238 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/pdfcol/pdfcol.sty
|
| 239 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/pdfcol/pdfcol.sty
|
| 240 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/infwarerr/infwarerr.sty
|
| 241 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/infwarerr/infwarerr.sty
|
| 242 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/tcolorbox/tcbhooks.code.tex
|
| 243 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/tcolorbox/tcbtheorems.code.tex
|
| 244 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/tcolorbox/tcbfitting.code.tex
|
| 245 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/tcolorbox/tcblistingsutf8.code.tex
|
| 246 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/tcolorbox/tcblistings.code.tex
|
| 247 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/listings/listings.sty
|
| 248 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/listings/listings.sty
|
| 249 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/listings/lstpatch.sty
|
| 250 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/listings/lstpatch.sty
|
| 251 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/listings/lstpatch.sty
|
| 252 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/listings/lstmisc.sty
|
| 253 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/listings/lstmisc.sty
|
| 254 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/listings/lstmisc.sty
|
| 255 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/listings/listings.cfg
|
| 256 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/listings/listings.cfg
|
| 257 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/listings/listings.cfg
|
| 258 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/tcolorbox/tcblistingscore.code.tex
|
| 259 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/tcolorbox/tcbprocessing.code.tex
|
| 260 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/listingsutf8/listingsutf8.sty
|
| 261 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/listingsutf8/listingsutf8.sty
|
| 262 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pdftexcmds/pdftexcmds.sty
|
| 263 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pdftexcmds/pdftexcmds.sty
|
| 264 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/stringenc/stringenc.sty
|
| 265 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/stringenc/stringenc.sty
|
| 266 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pdfescape/pdfescape.sty
|
| 267 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pdfescape/pdfescape.sty
|
| 268 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/tcolorbox/tcbexternal.code.tex
|
| 269 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/tcolorbox/tcbmagazine.code.tex
|
| 270 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/tcolorbox/tcbvignette.code.tex
|
| 271 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/frontendlayer/tikz/libraries/tikzlibraryfadings.code.tex
|
| 272 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/frontendlayer/tikz/libraries/tikzlibraryfadings.code.tex
|
| 273 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/libraries/pgflibraryfadings.code.tex
|
| 274 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/libraries/pgflibraryfadings.code.tex
|
| 275 |
+
OUTPUT main.pdf
|
| 276 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/tcolorbox/tcbposter.code.tex
|
| 277 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/hyperref/hyperref.sty
|
| 278 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/hyperref/hyperref.sty
|
| 279 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/kvdefinekeys/kvdefinekeys.sty
|
| 280 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/kvdefinekeys/kvdefinekeys.sty
|
| 281 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/hycolor/hycolor.sty
|
| 282 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/hycolor/hycolor.sty
|
| 283 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/hyperref/nameref.sty
|
| 284 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/hyperref/nameref.sty
|
| 285 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/refcount/refcount.sty
|
| 286 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/refcount/refcount.sty
|
| 287 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/gettitlestring/gettitlestring.sty
|
| 288 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/gettitlestring/gettitlestring.sty
|
| 289 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/hyperref/pd1enc.def
|
| 290 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/hyperref/pd1enc.def
|
| 291 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/hyperref/pd1enc.def
|
| 292 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/intcalc/intcalc.sty
|
| 293 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/intcalc/intcalc.sty
|
| 294 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/hyperref/puenc.def
|
| 295 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/hyperref/puenc.def
|
| 296 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/hyperref/puenc.def
|
| 297 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/url/url.sty
|
| 298 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/url/url.sty
|
| 299 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/bitset/bitset.sty
|
| 300 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/bitset/bitset.sty
|
| 301 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/bigintcalc/bigintcalc.sty
|
| 302 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/bigintcalc/bigintcalc.sty
|
| 303 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/atbegshi/atbegshi.sty
|
| 304 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/base/atbegshi-ltx.sty
|
| 305 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/base/atbegshi-ltx.sty
|
| 306 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/hyperref/hpdftex.def
|
| 307 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/hyperref/hpdftex.def
|
| 308 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/hyperref/hpdftex.def
|
| 309 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/atveryend/atveryend.sty
|
| 310 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/base/atveryend-ltx.sty
|
| 311 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/base/atveryend-ltx.sty
|
| 312 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/rerunfilecheck/rerunfilecheck.sty
|
| 313 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/rerunfilecheck/rerunfilecheck.sty
|
| 314 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/uniquecounter/uniquecounter.sty
|
| 315 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/uniquecounter/uniquecounter.sty
|
| 316 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/cleveref/cleveref.sty
|
| 317 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/cleveref/cleveref.sty
|
| 318 |
+
INPUT ./natbib.sty
|
| 319 |
+
INPUT natbib.sty
|
| 320 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/titlesec/titlesec.sty
|
| 321 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/titlesec/titlesec.sty
|
| 322 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/iftex/ifxetex.sty
|
| 323 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/iftex/ifxetex.sty
|
| 324 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/base/fontenc.sty
|
| 325 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/base/fontenc.sty
|
| 326 |
+
INPUT /usr/local/texlive/2025/texmf-dist/fonts/map/fontname/texfonts.map
|
| 327 |
+
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/jknappen/ec/ecrm1000.tfm
|
| 328 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/base/fontenc.sty
|
| 329 |
+
INPUT ./hfstyle/manrope.sty
|
| 330 |
+
INPUT hfstyle/manrope.sty
|
| 331 |
+
INPUT ./t1manrope.fd
|
| 332 |
+
INPUT t1manrope.fd
|
| 333 |
+
INPUT /usr/local/texlive/2025/texmf-var/fonts/map/pdftex/updmap/pdftex.map
|
| 334 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/tocloft/tocloft.sty
|
| 335 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/tocloft/tocloft.sty
|
| 336 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/base/inputenc.sty
|
| 337 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/base/inputenc.sty
|
| 338 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/base/fontenc.sty
|
| 339 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/lineno/lineno.sty
|
| 340 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/lineno/lineno.sty
|
| 341 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/enumitem/enumitem.sty
|
| 342 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/enumitem/enumitem.sty
|
| 343 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/amsfonts/amsfonts.sty
|
| 344 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/amsfonts/amsfonts.sty
|
| 345 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/amsfonts/amssymb.sty
|
| 346 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/amsfonts/amssymb.sty
|
| 347 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/units/nicefrac.sty
|
| 348 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/units/nicefrac.sty
|
| 349 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/base/ifthen.sty
|
| 350 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/base/ifthen.sty
|
| 351 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/siunitx/siunitx.sty
|
| 352 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/siunitx/siunitx.sty
|
| 353 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/translations/translations.sty
|
| 354 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/translations/translations.sty
|
| 355 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/multirow/bigdelim.sty
|
| 356 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/multirow/bigdelim.sty
|
| 357 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/tools/longtable.sty
|
| 358 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/tools/longtable.sty
|
| 359 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/tabularray/tabularray.sty
|
| 360 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/tabularray/tabularray.sty
|
| 361 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/wrapfig/wrapfig.sty
|
| 362 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/wrapfig/wrapfig.sty
|
| 363 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/makecell/makecell.sty
|
| 364 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/makecell/makecell.sty
|
| 365 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/adjustbox/adjustbox.sty
|
| 366 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/adjustbox/adjustbox.sty
|
| 367 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/xkeyval/xkeyval.sty
|
| 368 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/xkeyval/xkeyval.sty
|
| 369 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/xkeyval/xkeyval.tex
|
| 370 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/xkeyval/xkvutils.tex
|
| 371 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/adjustbox/adjcalc.sty
|
| 372 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/adjustbox/adjcalc.sty
|
| 373 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/adjustbox/trimclip.sty
|
| 374 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/adjustbox/trimclip.sty
|
| 375 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/collectbox/collectbox.sty
|
| 376 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/collectbox/collectbox.sty
|
| 377 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/adjustbox/tc-pdftex.def
|
| 378 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/adjustbox/tc-pdftex.def
|
| 379 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/adjustbox/tc-pdftex.def
|
| 380 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/ifoddpage/ifoddpage.sty
|
| 381 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/ifoddpage/ifoddpage.sty
|
| 382 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/ifoddpage/ifoddpage.sty
|
| 383 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/varwidth/varwidth.sty
|
| 384 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/varwidth/varwidth.sty
|
| 385 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/varwidth/varwidth.sty
|
| 386 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/graphics/dvipsnam.def
|
| 387 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/graphics/dvipsnam.def
|
| 388 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/graphics/dvipsnam.def
|
| 389 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/tools/xspace.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/tools/xspace.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/soul/soul.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/soul/soul.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/soul/soul-ori.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/soul/soul-ori.sty
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/jknappen/ec/ectt1000.tfm
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/etexcmds/etexcmds.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/etexcmds/etexcmds.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/csquotes/csquotes.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/csquotes/csquotes.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/csquotes/csquotes.def
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/csquotes/csquotes.def
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/csquotes/csquotes.def
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/csquotes/csquotes.cfg
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/csquotes/csquotes.cfg
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/csquotes/csquotes.cfg
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/arydshln/arydshln.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/arydshln/arydshln.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/todonotes/todonotes.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/todonotes/todonotes.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/frontendlayer/tikz/libraries/tikzlibrarypositioning.code.tex
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/frontendlayer/tikz/libraries/tikzlibrarypositioning.code.tex
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/tools/calc.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/tools/calc.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/textpos/textpos.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/textpos/textpos.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/psnfss/pifont.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/psnfss/pifont.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/psnfss/upzd.fd
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/psnfss/upzd.fd
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/psnfss/upzd.fd
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/adobe/zapfding/pzdr.tfm
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/psnfss/upsy.fd
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/psnfss/upsy.fd
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/psnfss/upsy.fd
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/adobe/symbol/psyr.tfm
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/bold-extra/bold-extra.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/bold-extra/bold-extra.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/pgf-pie/pgf-pie.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/pgf-pie/pgf-pie.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/pgf-pie/tikzlibrarypie.code.tex
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/pgf-pie/tikzlibrarypie.code.tex
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/carlisle/scalefnt.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/carlisle/scalefnt.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/epigraph/epigraph.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/epigraph/epigraph.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/nextpage/nextpage.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/nextpage/nextpage.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/algorithms/algorithm.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/algorithms/algorithm.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/float/float.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/float/float.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/algorithmicx/algpseudocode.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/algorithmicx/algpseudocode.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/algorithmicx/algorithmicx.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/algorithmicx/algorithmicx.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/tcolorbox/tcbminted.code.tex
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/minted/minted.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/minted/minted.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/catchfile/catchfile.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/catchfile/catchfile.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/fvextra/fvextra.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/fvextra/fvextra.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/fancyvrb/fancyvrb.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/fancyvrb/fancyvrb.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/upquote/upquote.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/upquote/upquote.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/base/textcomp.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/base/textcomp.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/latex2pydata/latex2pydata.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/latex2pydata/latex2pydata.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/pgfopts/pgfopts.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/pgfopts/pgfopts.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/tools/shellesc.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/tools/shellesc.sty
INPUT ./preamble.tex
INPUT ./preamble.tex
INPUT ./preamble.tex
INPUT ./preamble.tex
INPUT preamble.tex
INPUT ./math_commands.tex
INPUT ./math_commands.tex
INPUT ./math_commands.tex
INPUT ./math_commands.tex
INPUT math_commands.tex
INPUT ./handles.tex
INPUT ./handles.tex
INPUT ./handles.tex
INPUT ./handles.tex
INPUT handles.tex
INPUT ./snippets/code_specs.tex
INPUT ./snippets/code_specs.tex
INPUT ./snippets/code_specs.tex
INPUT ./snippets/code_specs.tex
INPUT snippets/code_specs.tex
INPUT ./main.aux
INPUT ./main.aux
INPUT main.aux
OUTPUT main.aux
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/microtype/mt-cmr.cfg
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/microtype/mt-cmr.cfg
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/microtype/mt-cmr.cfg
INPUT /usr/local/texlive/2025/texmf-dist/tex/context/base/mkii/supp-pdf.mkii
INPUT /usr/local/texlive/2025/texmf-dist/tex/context/base/mkii/supp-pdf.mkii
INPUT /usr/local/texlive/2025/texmf-dist/tex/context/base/mkii/supp-pdf.mkii
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/epstopdf-pkg/epstopdf-base.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/epstopdf-pkg/epstopdf-base.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/latexconfig/epstopdf-sys.cfg
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/latexconfig/epstopdf-sys.cfg
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/latexconfig/epstopdf-sys.cfg
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/caption/ltcaption.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/caption/ltcaption.sty
INPUT ./main.out
INPUT ./main.out
INPUT main.out
INPUT main.out
INPUT ./main.out
INPUT ./main.out
OUTPUT main.out
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/translations/translations-basic-dictionary-english.trsl
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/translations/translations-basic-dictionary-english.trsl
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/translations/translations-basic-dictionary-english.trsl
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/ninecolors/ninecolors.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/ninecolors/ninecolors.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/l3packages/xparse/xparse.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/l3packages/xparse/xparse.sty
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/jknappen/ec/ecrm2488.tfm
INPUT ./t1manrope.fd
INPUT ./t1manrope.fd
INPUT t1manrope.fd
INPUT manroperegular.tfm
INPUT manropebold.tfm
INPUT manroperegular.tfm
INPUT manropebold.tfm
INPUT ./logos/oxford_logo.png
INPUT ./logos/oxford_logo.png
INPUT ./logos/oxford_logo.png
INPUT ./logos/oxford_logo.png
INPUT ./logos/oxford_logo.png
INPUT ./logos/hf.pdf
INPUT ./logos/hf.pdf
INPUT ./logos/hf.pdf
INPUT ./logos/hf.pdf
INPUT ./logos/hf.pdf
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/cmextra/cmex7.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/cmextra/cmex7.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmbx10.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmbx7.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmbx5.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmmib10.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/cmextra/cmmib7.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/cmextra/cmmib5.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmbsy10.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/cmextra/cmbsy7.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/cmextra/cmbsy5.tfm
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/amsfonts/umsa.fd
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/amsfonts/umsa.fd
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/amsfonts/umsa.fd
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/symbols/msam10.tfm
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/microtype/mt-msa.cfg
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/microtype/mt-msa.cfg
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/microtype/mt-msa.cfg
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/symbols/msam7.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/symbols/msam5.tfm
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/amsfonts/umsb.fd
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/amsfonts/umsb.fd
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/amsfonts/umsb.fd
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/symbols/msbm10.tfm
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/microtype/mt-msb.cfg
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/microtype/mt-msb.cfg
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/microtype/mt-msb.cfg
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/symbols/msbm7.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/symbols/msbm5.tfm
INPUT ./logos/hf.pdf
INPUT ./logos/hf.pdf
INPUT ./logos/hf.pdf
INPUT ./logos/hf.pdf
INPUT ./logos/hf.pdf
INPUT ./logos/hf.pdf
INPUT ./logos/hf.pdf
INPUT ./logos/hf.pdf
INPUT ./logos/hf.pdf
INPUT ./logos/hf.pdf
INPUT ./logos/hf.pdf
INPUT ./logos/hf.pdf
INPUT ./logos/hf.pdf
INPUT ./logos/hf.pdf
INPUT ./logos/hf.pdf
INPUT ./logos/hf.pdf
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/jknappen/ec/ecrm0900.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmr9.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmr6.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmmi9.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmmi6.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmsy9.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmsy6.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/cmextra/cmex9.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/cmextra/cmex7.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmbx9.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmbx6.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/cmextra/cmmib9.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/cmextra/cmmib6.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/cmextra/cmbsy9.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/cmextra/cmbsy6.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/symbols/msam10.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/symbols/msam7.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/symbols/msbm10.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/symbols/msbm7.tfm
INPUT ./logos/oxford_logo.png
INPUT ./logos/oxford_logo.png
INPUT ./logos/oxford_logo.png
INPUT ./logos/oxford_logo.png
INPUT ./logos/hf.pdf
INPUT ./logos/hf.pdf
INPUT ./logos/hf.pdf
INPUT ./logos/hf.pdf
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/jknappen/ec/ecrm1440.tfm
INPUT manroperegular.tfm
INPUT manropebold.tfm
INPUT ./sections/00_abstract.tex
INPUT ./sections/00_abstract.tex
INPUT ./sections/00_abstract.tex
INPUT ./sections/00_abstract.tex
INPUT sections/00_abstract.tex
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/jknappen/ec/ecbx0900.tfm
INPUT manropebold.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/jknappen/ec/ectt0900.tfm
INPUT ./hfstyle/assets/huggingface.pdf
INPUT ./hfstyle/assets/huggingface.pdf
INPUT ./hfstyle/assets/huggingface.pdf
INPUT ./hfstyle/assets/huggingface.pdf
INPUT ./hfstyle/assets/huggingface.pdf
INPUT ./main.toc
INPUT ./main.toc
INPUT main.toc
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/jknappen/ec/ecbx1000.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/enc/ttf2pk/base/T1-WGL4.enc
INPUT /usr/local/texlive/2025/texmf-dist/fonts/enc/dvips/cm-super/cm-super-t1.enc
OUTPUT main.toc
INPUT ./sections/A_foreword.tex
INPUT ./sections/A_foreword.tex
INPUT sections/A_foreword.tex
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/jknappen/ec/tcrm1000.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/jknappen/ec/ecti1000.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/enc/dvips/cm-super/cm-super-ts1.enc
INPUT ./sections/01_introduction.tex
INPUT ./sections/01_introduction.tex
INPUT ./sections/01_introduction.tex
INPUT ./sections/01_introduction.tex
INPUT sections/01_introduction.tex
INPUT ./figures/ch1/ch1-lerobot-figure1.png
INPUT ./figures/ch1/ch1-lerobot-figure1.png
INPUT ./figures/ch1/ch1-lerobot-figure1.png
INPUT ./figures/ch1/ch1-lerobot-figure1.png
INPUT ./figures/ch1/ch1-lerobot-figure1.png
INPUT manroperegular.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/jknappen/ec/ecrm1200.tfm
INPUT manroperegular.tfm
INPUT manropebold.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/jknappen/ec/ectt1200.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/jknappen/ec/ecrm1095.tfm
INPUT manroperegular.tfm
INPUT manropebold.tfm
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/listings/lstlang1.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/listings/lstlang1.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/listings/lstlang1.sty
INPUT ./snippets/ch1/01_datasets.py
INPUT ./snippets/ch1/01_datasets.py
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/jknappen/ec/ectt0800.tfm
INPUT snippets/ch1/01_datasets.py
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/jknappen/ec/ecrm0800.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/jknappen/ec/ecrm0500.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/jknappen/ec/tctt0800.tfm
INPUT ./snippets/ch1/02_record_data.py
INPUT ./snippets/ch1/02_record_data.py
INPUT snippets/ch1/02_record_data.py
INPUT ./sections/02_classic_robotics.tex
INPUT ./sections/02_classic_robotics.tex
INPUT ./sections/02_classic_robotics.tex
INPUT ./sections/02_classic_robotics.tex
INPUT sections/02_classic_robotics.tex
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/jknappen/ec/ecti0900.tfm
INPUT ./figures/ch2/ch2-approaches.png
INPUT ./figures/ch2/ch2-approaches.png
INPUT ./figures/ch2/ch2-approaches.png
INPUT ./figures/ch2/ch2-approaches.png
INPUT ./figures/ch2/ch2-approaches.png
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/jknappen/ec/ecrm0700.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmr8.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmmi8.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmsy8.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/cmextra/cmex8.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmbx8.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/cmextra/cmmib8.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/cmextra/cmbsy8.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/symbols/msam10.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/symbols/msbm10.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/jknappen/ec/ecrm0600.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/jknappen/ec/ecti0800.tfm
INPUT ./figures/ch2/ch2-platforms.png
INPUT ./figures/ch2/ch2-platforms.png
INPUT ./figures/ch2/ch2-platforms.png
INPUT ./figures/ch2/ch2-platforms.png
INPUT ./figures/ch2/ch2-platforms.png
INPUT ./figures/ch2/ch2-cost-accessibility.png
INPUT ./figures/ch2/ch2-cost-accessibility.png
INPUT ./figures/ch2/ch2-cost-accessibility.png
INPUT ./figures/ch2/ch2-cost-accessibility.png
INPUT ./figures/ch2/ch2-cost-accessibility.png
INPUT ./figures/ch2/ch2-so100-to-planar-manipulator.png
INPUT ./figures/ch2/ch2-so100-to-planar-manipulator.png
INPUT ./figures/ch2/ch2-so100-to-planar-manipulator.png
INPUT ./figures/ch2/ch2-so100-to-planar-manipulator.png
INPUT ./figures/ch2/ch2-so100-to-planar-manipulator.png
INPUT ./figures/ch2/ch2-planar-manipulator-free.png
INPUT ./figures/ch2/ch2-planar-manipulator-free.png
INPUT ./figures/ch2/ch2-planar-manipulator-free.png
INPUT ./figures/ch2/ch2-planar-manipulator-free.png
INPUT ./figures/ch2/ch2-planar-manipulator-free.png
INPUT ./figures/ch2/ch2-planar-manipulator-floor.png
INPUT ./figures/ch2/ch2-planar-manipulator-floor.png
INPUT ./figures/ch2/ch2-planar-manipulator-floor.png
INPUT ./figures/ch2/ch2-planar-manipulator-floor.png
INPUT ./figures/ch2/ch2-planar-manipulator-floor.png
INPUT ./figures/ch2/ch2-planar-manipulator-floor-shelf.png
INPUT ./figures/ch2/ch2-planar-manipulator-floor-shelf.png
INPUT ./figures/ch2/ch2-planar-manipulator-floor-shelf.png
INPUT ./figures/ch2/ch2-planar-manipulator-floor-shelf.png
INPUT ./figures/ch2/ch2-planar-manipulator-floor-shelf.png
INPUT ./figures/ch2/ch2-planar-manipulator-floor-box.png
INPUT ./figures/ch2/ch2-planar-manipulator-floor-box.png
INPUT ./figures/ch2/ch2-planar-manipulator-floor-box.png
INPUT ./figures/ch2/ch2-planar-manipulator-floor-box.png
INPUT ./figures/ch2/ch2-planar-manipulator-floor-box.png
INPUT ./figures/ch2/ch2-classical-limitations.png
INPUT ./figures/ch2/ch2-classical-limitations.png
INPUT ./figures/ch2/ch2-classical-limitations.png
INPUT ./figures/ch2/ch2-classical-limitations.png
INPUT ./figures/ch2/ch2-classical-limitations.png
INPUT ./sections/03_reinforcement_learning.tex
INPUT ./sections/03_reinforcement_learning.tex
INPUT sections/03_reinforcement_learning.tex
INPUT ./figures/ch3/ch3-learning-benefits.png
INPUT ./figures/ch3/ch3-learning-benefits.png
INPUT ./figures/ch3/ch3-learning-benefits.png
INPUT ./figures/ch3/ch3-learning-benefits.png
INPUT ./figures/ch3/ch3-learning-benefits.png
INPUT ./figures/ch3/ch3-learning-atlas.png
INPUT ./figures/ch3/ch3-learning-atlas.png
INPUT ./figures/ch3/ch3-learning-atlas.png
INPUT ./figures/ch3/ch3-learning-atlas.png
INPUT ./figures/ch3/ch3-learning-atlas.png
INPUT ./figures/ch3/ch3-rl-examples.png
INPUT ./figures/ch3/ch3-rl-examples.png
INPUT ./figures/ch3/ch3-rl-examples.png
INPUT ./figures/ch3/ch3-rl-examples.png
INPUT ./figures/ch3/ch3-rl-examples.png
INPUT ./figures/ch3/ch3-agent-env.png
INPUT ./figures/ch3/ch3-agent-env.png
INPUT ./figures/ch3/ch3-agent-env.png
INPUT ./figures/ch3/ch3-agent-env.png
INPUT ./figures/ch3/ch3-agent-env.png
INPUT ./figures/ch3/ch3-rl-algorithms-atlas.png
INPUT ./figures/ch3/ch3-rl-algorithms-atlas.png
INPUT ./figures/ch3/ch3-rl-algorithms-atlas.png
INPUT ./figures/ch3/ch3-rl-algorithms-atlas.png
INPUT ./figures/ch3/ch3-rl-algorithms-atlas.png
INPUT ./figures/ch3/ch3-duck-sim-vs-real.png
INPUT ./figures/ch3/ch3-duck-sim-vs-real.png
INPUT ./figures/ch3/ch3-duck-sim-vs-real.png
INPUT ./figures/ch3/ch3-duck-sim-vs-real.png
INPUT ./figures/ch3/ch3-duck-sim-vs-real.png
INPUT ./figures/ch3/ch3-many-ducks.png
INPUT ./figures/ch3/ch3-many-ducks.png
INPUT ./figures/ch3/ch3-many-ducks.png
INPUT ./figures/ch3/ch3-many-ducks.png
INPUT ./figures/ch3/ch3-many-ducks.png
INPUT ./figures/ch3/ch3-hil-serl-examples.png
INPUT ./figures/ch3/ch3-hil-serl-examples.png
INPUT ./figures/ch3/ch3-hil-serl-examples.png
INPUT ./figures/ch3/ch3-hil-serl-examples.png
INPUT ./figures/ch3/ch3-hil-serl-examples.png
INPUT ./figures/ch3/ch3-hil-serl-architecture.png
INPUT ./figures/ch3/ch3-hil-serl-architecture.png
INPUT ./figures/ch3/ch3-hil-serl-architecture.png
INPUT ./figures/ch3/ch3-hil-serl-architecture.png
INPUT ./figures/ch3/ch3-hil-serl-architecture.png
INPUT ./snippets/ch3/01_reward_classifier.py
INPUT ./snippets/ch3/01_reward_classifier.py
INPUT snippets/ch3/01_reward_classifier.py
INPUT ./snippets/ch3/02_actor.py
INPUT ./snippets/ch3/02_actor.py
INPUT snippets/ch3/02_actor.py
INPUT ./snippets/ch3/03_learner.py
INPUT ./snippets/ch3/03_learner.py
INPUT snippets/ch3/03_learner.py
INPUT ./snippets/ch3/04_hil_serl.py
INPUT ./snippets/ch3/04_hil_serl.py
INPUT snippets/ch3/04_hil_serl.py
INPUT ./sections/04_imitation_learning.tex
INPUT ./sections/04_imitation_learning.tex
INPUT sections/04_imitation_learning.tex
INPUT ./figures/ch4/ch4-bc-trajectories.png
INPUT ./figures/ch4/ch4-bc-trajectories.png
INPUT ./figures/ch4/ch4-bc-trajectories.png
INPUT ./figures/ch4/ch4-bc-trajectories.png
INPUT ./figures/ch4/ch4-bc-trajectories.png
INPUT ./figures/ch4/ch4-observation-action-mapping.png
INPUT ./figures/ch4/ch4-observation-action-mapping.png
INPUT ./figures/ch4/ch4-observation-action-mapping.png
INPUT ./figures/ch4/ch4-observation-action-mapping.png
INPUT ./figures/ch4/ch4-observation-action-mapping.png
INPUT ./figures/ch4/ch4-issues-with-bc.png
INPUT ./figures/ch4/ch4-issues-with-bc.png
INPUT ./figures/ch4/ch4-issues-with-bc.png
INPUT ./figures/ch4/ch4-issues-with-bc.png
INPUT ./figures/ch4/ch4-issues-with-bc.png
INPUT ./figures/ch4/ch4-task-effect-on-pairs.png
INPUT ./figures/ch4/ch4-task-effect-on-pairs.png
INPUT ./figures/ch4/ch4-task-effect-on-pairs.png
INPUT ./figures/ch4/ch4-task-effect-on-pairs.png
INPUT ./figures/ch4/ch4-task-effect-on-pairs.png
INPUT ./figures/ch4/ch4-latent-variable-model.png
INPUT ./figures/ch4/ch4-latent-variable-model.png
INPUT ./figures/ch4/ch4-latent-variable-model.png
INPUT ./figures/ch4/ch4-latent-variable-model.png
INPUT ./figures/ch4/ch4-latent-variable-model.png
INPUT ./figures/ch4/ch4-many-latents.png
INPUT ./figures/ch4/ch4-many-latents.png
INPUT ./figures/ch4/ch4-many-latents.png
INPUT ./figures/ch4/ch4-many-latents.png
INPUT ./figures/ch4/ch4-many-latents.png
INPUT ./figures/ch4/ch4-diffusion-robot-actions.png
INPUT ./figures/ch4/ch4-diffusion-robot-actions.png
INPUT ./figures/ch4/ch4-diffusion-robot-actions.png
INPUT ./figures/ch4/ch4-diffusion-robot-actions.png
INPUT ./figures/ch4/ch4-diffusion-robot-actions.png
INPUT ./figures/ch4/ch4-action-vs-observation-distribution.png
INPUT ./figures/ch4/ch4-action-vs-observation-distribution.png
INPUT ./figures/ch4/ch4-action-vs-observation-distribution.png
INPUT ./figures/ch4/ch4-action-vs-observation-distribution.png
INPUT ./figures/ch4/ch4-action-vs-observation-distribution.png
INPUT ./figures/ch4/ch4-normalizing-flows.png
INPUT ./figures/ch4/ch4-normalizing-flows.png
INPUT ./figures/ch4/ch4-normalizing-flows.png
INPUT ./figures/ch4/ch4-normalizing-flows.png
INPUT ./figures/ch4/ch4-normalizing-flows.png
INPUT ./figures/ch4/ch4-diffusion-vs-flowmatching.png
INPUT ./figures/ch4/ch4-diffusion-vs-flowmatching.png
INPUT ./figures/ch4/ch4-diffusion-vs-flowmatching.png
INPUT ./figures/ch4/ch4-diffusion-vs-flowmatching.png
INPUT ./figures/ch4/ch4-diffusion-vs-flowmatching.png
INPUT ./figures/ch4/ch4-act-encoder.png
INPUT ./figures/ch4/ch4-act-encoder.png
INPUT ./figures/ch4/ch4-act-encoder.png
INPUT ./figures/ch4/ch4-act-encoder.png
INPUT ./figures/ch4/ch4-act-encoder.png
INPUT ./figures/ch4/ch4-act-decoder.png
INPUT ./figures/ch4/ch4-act-decoder.png
INPUT ./figures/ch4/ch4-act-decoder.png
INPUT ./figures/ch4/ch4-act-decoder.png
INPUT ./figures/ch4/ch4-act-decoder.png
INPUT ./figures/ch4/ch4-act.png
INPUT ./figures/ch4/ch4-act.png
INPUT ./figures/ch4/ch4-act.png
INPUT ./figures/ch4/ch4-act.png
INPUT ./figures/ch4/ch4-act.png
INPUT ./snippets/ch4/01_training_act.py
INPUT ./snippets/ch4/01_training_act.py
INPUT snippets/ch4/01_training_act.py
INPUT ./snippets/ch4/02_using_act.py
INPUT ./snippets/ch4/02_using_act.py
INPUT snippets/ch4/02_using_act.py
INPUT ./figures/ch4/ch4-diffusion-policy.png
INPUT ./figures/ch4/ch4-diffusion-policy.png
INPUT ./figures/ch4/ch4-diffusion-policy.png
INPUT ./figures/ch4/ch4-diffusion-policy.png
INPUT ./figures/ch4/ch4-diffusion-policy.png
INPUT ./snippets/ch4/03_training_diffusion.py
INPUT ./snippets/ch4/03_training_diffusion.py
INPUT snippets/ch4/03_training_diffusion.py
INPUT ./snippets/ch4/04_using_diffusion.py
INPUT ./snippets/ch4/04_using_diffusion.py
INPUT snippets/ch4/04_using_diffusion.py
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/jknappen/ec/eccc1000.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/jknappen/ec/eccc0700.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/jknappen/ec/eccc0500.tfm
INPUT ./figures/ch4/ch4-async-inference.png
INPUT ./figures/ch4/ch4-async-inference.png
INPUT ./figures/ch4/ch4-async-inference.png
INPUT ./figures/ch4/ch4-async-inference.png
INPUT ./figures/ch4/ch4-async-inference.png
INPUT ./figures/ch4/ch4-queues.png
INPUT ./figures/ch4/ch4-queues.png
INPUT ./figures/ch4/ch4-queues.png
INPUT ./figures/ch4/ch4-queues.png
INPUT ./figures/ch4/ch4-queues.png
INPUT ./snippets/ch4/05_policy_server.py
INPUT ./snippets/ch4/05_policy_server.py
INPUT snippets/ch4/05_policy_server.py
INPUT ./snippets/ch4/06_robot_client.py
INPUT ./snippets/ch4/06_robot_client.py
INPUT snippets/ch4/06_robot_client.py
INPUT ./sections/05_foundation_models.tex
INPUT ./sections/05_foundation_models.tex
INPUT sections/05_foundation_models.tex
INPUT ./figures/ch5/ch5-ml-vs-robotics-foundation.png
INPUT ./figures/ch5/ch5-ml-vs-robotics-foundation.png
INPUT ./figures/ch5/ch5-ml-vs-robotics-foundation.png
INPUT ./figures/ch5/ch5-ml-vs-robotics-foundation.png
INPUT ./figures/ch5/ch5-ml-vs-robotics-foundation.png
INPUT ./figures/ch5/ch5-generalist-policies-timeline.png
INPUT ./figures/ch5/ch5-generalist-policies-timeline.png
INPUT ./figures/ch5/ch5-generalist-policies-timeline.png
INPUT ./figures/ch5/ch5-generalist-policies-timeline.png
INPUT ./figures/ch5/ch5-generalist-policies-timeline.png
INPUT ./figures/ch5/ch5-trends.png
INPUT ./figures/ch5/ch5-trends.png
INPUT ./figures/ch5/ch5-trends.png
INPUT ./figures/ch5/ch5-trends.png
INPUT ./figures/ch5/ch5-trends.png
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmr12.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmmi12.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmsy10.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmex10.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmbx12.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmmib10.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmbsy10.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/symbols/msam10.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/symbols/msbm10.tfm
INPUT ./figures/ch5/ch5-pi0.png
INPUT ./figures/ch5/ch5-pi0.png
INPUT ./figures/ch5/ch5-pi0.png
INPUT ./figures/ch5/ch5-pi0.png
INPUT ./figures/ch5/ch5-pi0.png
INPUT ./figures/ch5/ch5-pi0-sampling-timesteps.png
INPUT ./figures/ch5/ch5-pi0-sampling-timesteps.png
INPUT ./figures/ch5/ch5-pi0-sampling-timesteps.png
INPUT ./figures/ch5/ch5-pi0-sampling-timesteps.png
INPUT ./figures/ch5/ch5-pi0-sampling-timesteps.png
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmr10.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmr8.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmr5.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmmi10.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmsy10.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmex10.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/cmextra/cmex8.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/cmextra/cmex7.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmbx10.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmbx8.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmbx5.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmmib10.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmbsy10.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/symbols/msam10.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/symbols/msam7.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/symbols/msam5.tfm
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/symbols/msbm10.tfm
|
| 947 |
+
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/symbols/msbm7.tfm
|
| 948 |
+
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/symbols/msbm5.tfm
|
| 949 |
+
INPUT ./snippets/ch5/01_using_pi0.py
|
| 950 |
+
INPUT ./snippets/ch5/01_using_pi0.py
|
| 951 |
+
INPUT snippets/ch5/01_using_pi0.py
|
| 952 |
+
INPUT ./figures/ch5/ch5-smolvla.png
|
| 953 |
+
INPUT ./figures/ch5/ch5-smolvla.png
|
| 954 |
+
INPUT ./figures/ch5/ch5-smolvla.png
|
| 955 |
+
INPUT ./figures/ch5/ch5-smolvla.png
|
| 956 |
+
INPUT ./figures/ch5/ch5-smolvla.png
|
| 957 |
+
INPUT ./snippets/ch5/02_using_smolvla.py
|
| 958 |
+
INPUT ./snippets/ch5/02_using_smolvla.py
|
| 959 |
+
INPUT snippets/ch5/02_using_smolvla.py
|
| 960 |
+
INPUT ./sections/07_conclusions.tex
|
| 961 |
+
INPUT ./sections/07_conclusions.tex
|
| 962 |
+
INPUT sections/07_conclusions.tex
|
| 963 |
+
INPUT ./main.bbl
|
| 964 |
+
INPUT ./main.bbl
|
| 965 |
+
INPUT main.bbl
|
| 966 |
+
INPUT main.aux
|
| 967 |
+
INPUT ./main.out
|
| 968 |
+
INPUT ./main.out
|
| 969 |
+
INPUT hfstyle/manrope/Manrope-Regular.ttf
|
| 970 |
+
INPUT hfstyle/manrope/Manrope-Bold.ttf
|
| 971 |
+
INPUT /usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmbx10.pfb
|
| 972 |
+
INPUT /usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmbx7.pfb
|
| 973 |
+
INPUT /usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmex10.pfb
|
| 974 |
+
INPUT /usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cmextra/cmex7.pfb
|
| 975 |
+
INPUT /usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmmi10.pfb
|
| 976 |
+
INPUT /usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmmi12.pfb
|
| 977 |
+
INPUT /usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmmi5.pfb
|
| 978 |
+
INPUT /usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmmi6.pfb
|
| 979 |
+
INPUT /usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmmi7.pfb
|
| 980 |
+
INPUT /usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmmi9.pfb
|
| 981 |
+
INPUT /usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmmib10.pfb
|
| 982 |
+
INPUT /usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmr10.pfb
|
| 983 |
+
INPUT /usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmr5.pfb
|
| 984 |
+
INPUT /usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmr6.pfb
|
| 985 |
+
INPUT /usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmr7.pfb
|
| 986 |
+
INPUT /usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmr8.pfb
|
| 987 |
+
INPUT /usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmr9.pfb
|
| 988 |
+
INPUT /usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmsy10.pfb
|
| 989 |
+
INPUT /usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmsy5.pfb
|
| 990 |
+
INPUT /usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmsy7.pfb
|
| 991 |
+
INPUT /usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmsy9.pfb
|
| 992 |
+
INPUT /usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/symbols/msbm10.pfb
|
| 993 |
+
INPUT /usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/symbols/msbm7.pfb
|
| 994 |
+
INPUT /usr/local/texlive/2025/texmf-dist/fonts/type1/public/cm-super/sfbx1000.pfb
|
| 995 |
+
INPUT /usr/local/texlive/2025/texmf-dist/fonts/type1/public/cm-super/sfcc1000.pfb
|
| 996 |
+
INPUT /usr/local/texlive/2025/texmf-dist/fonts/type1/public/cm-super/sfrm0500.pfb
|
| 997 |
+
INPUT /usr/local/texlive/2025/texmf-dist/fonts/type1/public/cm-super/sfrm0600.pfb
|
| 998 |
+
INPUT /usr/local/texlive/2025/texmf-dist/fonts/type1/public/cm-super/sfrm0700.pfb
|
| 999 |
+
INPUT /usr/local/texlive/2025/texmf-dist/fonts/type1/public/cm-super/sfrm0800.pfb
|
| 1000 |
+
INPUT /usr/local/texlive/2025/texmf-dist/fonts/type1/public/cm-super/sfrm0900.pfb
|
| 1001 |
+
INPUT /usr/local/texlive/2025/texmf-dist/fonts/type1/public/cm-super/sfrm1000.pfb
|
| 1002 |
+
INPUT /usr/local/texlive/2025/texmf-dist/fonts/type1/public/cm-super/sfti0800.pfb
|
| 1003 |
+
INPUT /usr/local/texlive/2025/texmf-dist/fonts/type1/public/cm-super/sfti0900.pfb
|
| 1004 |
+
INPUT /usr/local/texlive/2025/texmf-dist/fonts/type1/public/cm-super/sfti1000.pfb
|
| 1005 |
+
INPUT /usr/local/texlive/2025/texmf-dist/fonts/type1/public/cm-super/sftt0800.pfb
|
| 1006 |
+
INPUT /usr/local/texlive/2025/texmf-dist/fonts/type1/public/cm-super/sftt0900.pfb
|
| 1007 |
+
INPUT /usr/local/texlive/2025/texmf-dist/fonts/type1/public/cm-super/sftt1000.pfb
|
| 1008 |
+
INPUT /usr/local/texlive/2025/texmf-dist/fonts/type1/public/cm-super/sftt1200.pfb
|
app/scripts/latex-to-mdx/input/main.log
ADDED
|
@@ -0,0 +1,2070 @@
+ This is pdfTeX, Version 3.141592653-2.6-1.40.27 (TeX Live 2025) (preloaded format=pdflatex 2025.8.26) 13 OCT 2025 18:55
+ entering extended mode
+  restricted \write18 enabled.
+  file:line:error style messages enabled.
+  %&-line parsing enabled.
+ **/Users/fracapuano/Desktop/robots-tutorial/robot-learning-tutorial/main.tex
+ (/Users/fracapuano/Desktop/robots-tutorial/robot-learning-tutorial/main.tex
+ LaTeX2e <2024-11-01> patch level 2
+ L3 programming layer <2025-01-18>
+ (./hfstyle/hf.cls
+ Document Class: hfstyle/hf
+ (/usr/local/texlive/2025/texmf-dist/tex/latex/base/article.cls
+ Document Class: article 2024/06/29 v1.4n Standard LaTeX document class
+ (/usr/local/texlive/2025/texmf-dist/tex/latex/base/size10.clo
+ File: size10.clo 2024/06/29 v1.4n Standard LaTeX file (size option)
+ )
+ \c@part=\count196
+ \c@section=\count197
+ \c@subsection=\count198
+ \c@subsubsection=\count199
+ \c@paragraph=\count266
+ \c@subparagraph=\count267
+ \c@figure=\count268
+ \c@table=\count269
+ \abovecaptionskip=\skip49
+ \belowcaptionskip=\skip50
+ \bibindent=\dimen141
+ ) (/usr/local/texlive/2025/texmf-dist/tex/latex/geometry/geometry.sty
+ Package: geometry 2020/01/02 v5.9 Page Geometry
+ (/usr/local/texlive/2025/texmf-dist/tex/latex/graphics/keyval.sty
+ Package: keyval 2022/05/29 v1.15 key=value parser (DPC)
+ \KV@toks@=\toks17
+ ) (/usr/local/texlive/2025/texmf-dist/tex/generic/iftex/ifvtex.sty
+ Package: ifvtex 2019/10/25 v1.7 ifvtex legacy package. Use iftex instead.
+ (/usr/local/texlive/2025/texmf-dist/tex/generic/iftex/iftex.sty
+ Package: iftex 2024/12/12 v1.0g TeX engine tests
+ ))
+ \Gm@cnth=\count270
+ \Gm@cntv=\count271
+ \c@Gm@tempcnt=\count272
+ \Gm@bindingoffset=\dimen142
+ \Gm@wd@mp=\dimen143
+ \Gm@odd@mp=\dimen144
+ \Gm@even@mp=\dimen145
+ \Gm@layoutwidth=\dimen146
+ \Gm@layoutheight=\dimen147
+ \Gm@layouthoffset=\dimen148
+ \Gm@layoutvoffset=\dimen149
+ \Gm@dimlist=\toks18
+ ) (/usr/local/texlive/2025/texmf-dist/tex/latex/microtype/microtype.sty
+ Package: microtype 2025/02/11 v3.2a Micro-typographical refinements (RS)
+ (/usr/local/texlive/2025/texmf-dist/tex/latex/etoolbox/etoolbox.sty
+ Package: etoolbox 2025/02/11 v2.5l e-TeX tools for LaTeX (JAW)
+ \etb@tempcnta=\count273
+ )
+ \MT@toks=\toks19
+ \MT@tempbox=\box52
+ \MT@count=\count274
+ LaTeX Info: Redefining \noprotrusionifhmode on input line 1087.
+ LaTeX Info: Redefining \leftprotrusion on input line 1088.
+ \MT@prot@toks=\toks20
+ LaTeX Info: Redefining \rightprotrusion on input line 1107.
+ LaTeX Info: Redefining \textls on input line 1449.
+ \MT@outer@kern=\dimen150
+ LaTeX Info: Redefining \microtypecontext on input line 2053.
+ LaTeX Info: Redefining \textmicrotypecontext on input line 2070.
+ \MT@listname@count=\count275
+ (/usr/local/texlive/2025/texmf-dist/tex/latex/microtype/microtype-pdftex.def
+ File: microtype-pdftex.def 2025/02/11 v3.2a Definitions specific to pdftex (RS)
+ LaTeX Info: Redefining \lsstyle on input line 944.
+ LaTeX Info: Redefining \lslig on input line 944.
+ \MT@outer@space=\skip51
+ )
+ Package microtype Info: Loading configuration file microtype.cfg.
+ (/usr/local/texlive/2025/texmf-dist/tex/latex/microtype/microtype.cfg
+ File: microtype.cfg 2025/02/11 v3.2a microtype main configuration file (RS)
+ )
+ LaTeX Info: Redefining \microtypesetup on input line 3065.
+ ) (/usr/local/texlive/2025/texmf-dist/tex/latex/placeins/placeins.sty
+ Package: placeins 2005/04/18 v 2.2
+ ) (/usr/local/texlive/2025/texmf-dist/tex/latex/hyphenat/hyphenat.sty
+ Package: hyphenat 2009/09/02 v2.3c hyphenation utilities
+ \langwohyphens=\language90
+ LaTeX Info: Redefining \_ on input line 43.
+ ) (/usr/local/texlive/2025/texmf-dist/tex/latex/setspace/setspace.sty
+ Package: setspace 2022/12/04 v6.7b set line spacing
+ ) (/usr/local/texlive/2025/texmf-dist/tex/latex/parskip/parskip.sty
+ Package: parskip 2021-03-14 v2.0h non-zero parskip adjustments
+ (/usr/local/texlive/2025/texmf-dist/tex/latex/kvoptions/kvoptions.sty
+ Package: kvoptions 2022-06-15 v3.15 Key value format for package options (HO)
+ (/usr/local/texlive/2025/texmf-dist/tex/generic/ltxcmds/ltxcmds.sty
+ Package: ltxcmds 2023-12-04 v1.26 LaTeX kernel commands for general use (HO)
+ ) (/usr/local/texlive/2025/texmf-dist/tex/latex/kvsetkeys/kvsetkeys.sty
+ Package: kvsetkeys 2022-10-05 v1.19 Key value parser (HO)
+ ))) (/usr/local/texlive/2025/texmf-dist/tex/generic/babel/babel.sty
+ Package: babel 2025/02/14 v25.4 The multilingual framework for pdfLaTeX, LuaLaTeX and XeLaTeX
+ \babel@savecnt=\count276
+ \U@D=\dimen151
+ \l@unhyphenated=\language91
+ (/usr/local/texlive/2025/texmf-dist/tex/generic/babel/txtbabel.def)
+ \bbl@readstream=\read2
+ \bbl@dirlevel=\count277
+ (/usr/local/texlive/2025/texmf-dist/tex/generic/babel-latin/latin.ldf
+ Language: latin 2021-06-27 v4.0 Latin support from the babel system
+ Package babel Info: Making " an active character on input line 135.
+ ) (/usr/local/texlive/2025/texmf-dist/tex/generic/babel-english/english.ldf
+ Language: english 2017/06/06 v3.3r English support from the babel system
+ Package babel Info: Hyphen rules for 'canadian' set to \l@english
+ (babel)             (\language0). Reported on input line 102.
+ Package babel Info: Hyphen rules for 'australian' set to \l@ukenglish
+ (babel)             (\language22). Reported on input line 105.
+ Package babel Info: Hyphen rules for 'newzealand' set to \l@ukenglish
+ (babel)             (\language22). Reported on input line 108.
+ )) (/usr/local/texlive/2025/texmf-dist/tex/generic/babel/locale/en/babel-english.tex
+ Package babel Info: Importing font and identification data for english
+ (babel)             from babel-en.ini. Reported on input line 11.
+ ) (/usr/local/texlive/2025/texmf-dist/tex/generic/babel/locale/la/babel-latin.tex
+ Package babel Info: Importing font and identification data for latin
+ (babel)             from babel-la.ini. Reported on input line 9.
+ ) (/usr/local/texlive/2025/texmf-dist/tex/latex/lipsum/lipsum.sty (/usr/local/texlive/2025/texmf-dist/tex/latex/l3packages/l3keys2e/l3keys2e.sty (/usr/local/texlive/2025/texmf-dist/tex/latex/l3kernel/expl3.sty
+ Package: expl3 2025-01-18 L3 programming layer (loader)
+ (/usr/local/texlive/2025/texmf-dist/tex/latex/l3backend/l3backend-pdftex.def
+ File: l3backend-pdftex.def 2024-05-08 L3 backend support: PDF output (pdfTeX)
+ \l__color_backend_stack_int=\count278
+ \l__pdf_internal_box=\box53
+ ))
+ Package: l3keys2e 2024-08-16 LaTeX2e option processing using LaTeX3 keys
+ )
+ Package: lipsum 2021-09-20 v2.7 150 paragraphs of Lorem Ipsum dummy text
+ \g__lipsum_par_int=\count279
+ \l__lipsum_a_int=\count280
+ \l__lipsum_b_int=\count281
+ (/usr/local/texlive/2025/texmf-dist/tex/latex/lipsum/lipsum.ltd.tex)) (./fancyhdr.sty
+ \fancy@headwidth=\skip52
+ \f@ncyO@elh=\skip53
+ \f@ncyO@erh=\skip54
+ \f@ncyO@olh=\skip55
+ \f@ncyO@orh=\skip56
+ \f@ncyO@elf=\skip57
+ \f@ncyO@erf=\skip58
+ \f@ncyO@olf=\skip59
+ \f@ncyO@orf=\skip60
+ ) (/usr/local/texlive/2025/texmf-dist/tex/latex/graphics/graphicx.sty
+ Package: graphicx 2021/09/16 v1.2d Enhanced LaTeX Graphics (DPC,SPQR)
+ (/usr/local/texlive/2025/texmf-dist/tex/latex/graphics/graphics.sty
+ Package: graphics 2024/08/06 v1.4g Standard LaTeX Graphics (DPC,SPQR)
+ (/usr/local/texlive/2025/texmf-dist/tex/latex/graphics/trig.sty
+ Package: trig 2023/12/02 v1.11 sin cos tan (DPC)
+ ) (/usr/local/texlive/2025/texmf-dist/tex/latex/graphics-cfg/graphics.cfg
+ File: graphics.cfg 2016/06/04 v1.11 sample graphics configuration
+ )
+ Package graphics Info: Driver file: pdftex.def on input line 106.
+ (/usr/local/texlive/2025/texmf-dist/tex/latex/graphics-def/pdftex.def
+ File: pdftex.def 2024/04/13 v1.2c Graphics/color driver for pdftex
+ ))
+ \Gin@req@height=\dimen152
+ \Gin@req@width=\dimen153
+ ) (/usr/local/texlive/2025/texmf-dist/tex/latex/caption/subcaption.sty
+ Package: subcaption 2023/07/28 v1.6b Sub-captions (AR)
+ (/usr/local/texlive/2025/texmf-dist/tex/latex/caption/caption.sty
+ Package: caption 2023/08/05 v3.6o Customizing captions (AR)
+ (/usr/local/texlive/2025/texmf-dist/tex/latex/caption/caption3.sty
+ Package: caption3 2023/07/31 v2.4d caption3 kernel (AR)
+ \caption@tempdima=\dimen154
+ \captionmargin=\dimen155
+ \caption@leftmargin=\dimen156
+ \caption@rightmargin=\dimen157
+ \caption@width=\dimen158
+ \caption@indent=\dimen159
+ \caption@parindent=\dimen160
+ \caption@hangindent=\dimen161
+ Package caption Info: Standard document class detected.
+ )
+ \c@caption@flags=\count282
+ \c@continuedfloat=\count283
+ )
+ Package caption Info: New subtype `subfigure' on input line 238.
+ \c@subfigure=\count284
+ Package caption Info: New subtype `subtable' on input line 238.
+ \c@subtable=\count285
+ ) (/usr/local/texlive/2025/texmf-dist/tex/latex/booktabs/booktabs.sty
+ Package: booktabs 2020/01/12 v1.61803398 Publication quality tables
+ \heavyrulewidth=\dimen162
+ \lightrulewidth=\dimen163
+ \cmidrulewidth=\dimen164
+ \belowrulesep=\dimen165
+ \belowbottomsep=\dimen166
+ \aboverulesep=\dimen167
+ \abovetopsep=\dimen168
+ \cmidrulesep=\dimen169
+ \cmidrulekern=\dimen170
+ \defaultaddspace=\dimen171
+ \@cmidla=\count286
+ \@cmidlb=\count287
+ \@aboverulesep=\dimen172
+ \@belowrulesep=\dimen173
+ \@thisruleclass=\count288
+ \@lastruleclass=\count289
+ \@thisrulewidth=\dimen174
+ ) (/usr/local/texlive/2025/texmf-dist/tex/latex/nicematrix/nicematrix.sty (/usr/local/texlive/2025/texmf-dist/tex/latex/pgf/basiclayer/pgfcore.sty (/usr/local/texlive/2025/texmf-dist/tex/latex/pgf/systemlayer/pgfsys.sty (/usr/local/texlive/2025/texmf-dist/tex/latex/pgf/utilities/pgfrcs.sty (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgfutil-common.tex
+ \pgfutil@everybye=\toks21
+ \pgfutil@tempdima=\dimen175
+ \pgfutil@tempdimb=\dimen176
+ ) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgfutil-latex.def
+ \pgfutil@abb=\box54
+ ) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgfrcs.code.tex (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/pgf.revision.tex)
+ Package: pgfrcs 2023-01-15 v3.1.10 (3.1.10)
+ )) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgfsys.code.tex
+ Package: pgfsys 2023-01-15 v3.1.10 (3.1.10)
+ (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgfkeys.code.tex
+ \pgfkeys@pathtoks=\toks22
+ \pgfkeys@temptoks=\toks23
+ (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgfkeyslibraryfiltered.code.tex
+ \pgfkeys@tmptoks=\toks24
+ ))
+ \pgf@x=\dimen177
+ \pgf@y=\dimen178
+ \pgf@xa=\dimen179
+ \pgf@ya=\dimen180
+ \pgf@xb=\dimen181
+ \pgf@yb=\dimen182
+ \pgf@xc=\dimen183
+ \pgf@yc=\dimen184
+ \pgf@xd=\dimen185
+ \pgf@yd=\dimen186
+ \w@pgf@writea=\write3
+ \r@pgf@reada=\read3
+ \c@pgf@counta=\count290
+ \c@pgf@countb=\count291
+ \c@pgf@countc=\count292
+ \c@pgf@countd=\count293
+ \t@pgf@toka=\toks25
+ \t@pgf@tokb=\toks26
+ \t@pgf@tokc=\toks27
+ \pgf@sys@id@count=\count294
+ (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgf.cfg
+ File: pgf.cfg 2023-01-15 v3.1.10 (3.1.10)
+ )
+ Driver file for pgf: pgfsys-pdftex.def
+ (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgfsys-pdftex.def
+ File: pgfsys-pdftex.def 2023-01-15 v3.1.10 (3.1.10)
+ (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgfsys-common-pdf.def
+ File: pgfsys-common-pdf.def 2023-01-15 v3.1.10 (3.1.10)
+ ))) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgfsyssoftpath.code.tex
+ File: pgfsyssoftpath.code.tex 2023-01-15 v3.1.10 (3.1.10)
+ \pgfsyssoftpath@smallbuffer@items=\count295
+ \pgfsyssoftpath@bigbuffer@items=\count296
+ ) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgfsysprotocol.code.tex
+ File: pgfsysprotocol.code.tex 2023-01-15 v3.1.10 (3.1.10)
+ )) (/usr/local/texlive/2025/texmf-dist/tex/latex/xcolor/xcolor.sty
+ Package: xcolor 2024/09/29 v3.02 LaTeX color extensions (UK)
+ (/usr/local/texlive/2025/texmf-dist/tex/latex/graphics-cfg/color.cfg
+ File: color.cfg 2016/01/02 v1.6 sample color configuration
+ )
+ Package xcolor Info: Driver file: pdftex.def on input line 274.
+ (/usr/local/texlive/2025/texmf-dist/tex/latex/graphics/mathcolor.ltx)
+ Package xcolor Info: Model `cmy' substituted by `cmy0' on input line 1349.
+ Package xcolor Info: Model `hsb' substituted by `rgb' on input line 1353.
| 259 |
+
Package xcolor Info: Model `RGB' extended on input line 1365.
|
| 260 |
+
Package xcolor Info: Model `HTML' substituted by `rgb' on input line 1367.
|
| 261 |
+
Package xcolor Info: Model `Hsb' substituted by `hsb' on input line 1368.
|
| 262 |
+
Package xcolor Info: Model `tHsb' substituted by `hsb' on input line 1369.
|
| 263 |
+
Package xcolor Info: Model `HSB' substituted by `hsb' on input line 1370.
|
| 264 |
+
Package xcolor Info: Model `Gray' substituted by `gray' on input line 1371.
|
| 265 |
+
Package xcolor Info: Model `wave' substituted by `hsb' on input line 1372.
|
| 266 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/latex/colortbl/colortbl.sty
|
| 267 |
+
Package: colortbl 2024/07/06 v1.0i Color table columns (DPC)
|
| 268 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/tools/array.sty
|
| 269 |
+
Package: array 2024/10/17 v2.6g Tabular extension package (FMi)
|
| 270 |
+
\col@sep=\dimen187
|
| 271 |
+
\ar@mcellbox=\box55
|
| 272 |
+
\extrarowheight=\dimen188
|
| 273 |
+
\NC@list=\toks28
|
| 274 |
+
\extratabsurround=\skip61
|
| 275 |
+
\backup@length=\skip62
|
| 276 |
+
\ar@cellbox=\box56
|
| 277 |
+
)
|
| 278 |
+
\everycr=\toks29
|
| 279 |
+
\minrowclearance=\skip63
|
| 280 |
+
\rownum=\count297
|
| 281 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcore.code.tex
|
| 282 |
+
Package: pgfcore 2023-01-15 v3.1.10 (3.1.10)
|
| 283 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmath.code.tex (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathutil.code.tex) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathparser.code.tex
|
| 284 |
+
\pgfmath@dimen=\dimen189
|
| 285 |
+
\pgfmath@count=\count298
|
| 286 |
+
\pgfmath@box=\box57
|
| 287 |
+
\pgfmath@toks=\toks30
|
| 288 |
+
\pgfmath@stack@operand=\toks31
|
| 289 |
+
\pgfmath@stack@operation=\toks32
|
| 290 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.code.tex) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.basic.code.tex) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.trigonometric.code.tex) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.random.code.tex) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.comparison.code.tex) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.base.code.tex) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.round.code.tex) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.misc.code.tex) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.integerarithmetics.code.tex) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathcalc.code.tex) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfloat.code.tex
|
| 291 |
+
\c@pgfmathroundto@lastzeros=\count299
|
| 292 |
+
)) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfint.code.tex) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorepoints.code.tex
|
| 293 |
+
File: pgfcorepoints.code.tex 2023-01-15 v3.1.10 (3.1.10)
|
| 294 |
+
\pgf@picminx=\dimen190
|
| 295 |
+
\pgf@picmaxx=\dimen191
|
| 296 |
+
\pgf@picminy=\dimen192
|
| 297 |
+
\pgf@picmaxy=\dimen193
|
| 298 |
+
\pgf@pathminx=\dimen194
|
| 299 |
+
\pgf@pathmaxx=\dimen195
|
| 300 |
+
\pgf@pathminy=\dimen196
|
| 301 |
+
\pgf@pathmaxy=\dimen197
|
| 302 |
+
\pgf@xx=\dimen198
|
| 303 |
+
\pgf@xy=\dimen199
|
| 304 |
+
\pgf@yx=\dimen256
|
| 305 |
+
\pgf@yy=\dimen257
|
| 306 |
+
\pgf@zx=\dimen258
|
| 307 |
+
\pgf@zy=\dimen259
|
| 308 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorepathconstruct.code.tex
|
| 309 |
+
File: pgfcorepathconstruct.code.tex 2023-01-15 v3.1.10 (3.1.10)
|
| 310 |
+
\pgf@path@lastx=\dimen260
|
| 311 |
+
\pgf@path@lasty=\dimen261
|
| 312 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorepathusage.code.tex
|
| 313 |
+
File: pgfcorepathusage.code.tex 2023-01-15 v3.1.10 (3.1.10)
|
| 314 |
+
\pgf@shorten@end@additional=\dimen262
|
| 315 |
+
\pgf@shorten@start@additional=\dimen263
|
| 316 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorescopes.code.tex
|
| 317 |
+
File: pgfcorescopes.code.tex 2023-01-15 v3.1.10 (3.1.10)
|
| 318 |
+
\pgfpic=\box58
|
| 319 |
+
\pgf@hbox=\box59
|
| 320 |
+
\pgf@layerbox@main=\box60
|
| 321 |
+
\pgf@picture@serial@count=\count300
|
| 322 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcoregraphicstate.code.tex
|
| 323 |
+
File: pgfcoregraphicstate.code.tex 2023-01-15 v3.1.10 (3.1.10)
|
| 324 |
+
\pgflinewidth=\dimen264
|
| 325 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcoretransformations.code.tex
|
| 326 |
+
File: pgfcoretransformations.code.tex 2023-01-15 v3.1.10 (3.1.10)
|
| 327 |
+
\pgf@pt@x=\dimen265
|
| 328 |
+
\pgf@pt@y=\dimen266
|
| 329 |
+
\pgf@pt@temp=\dimen267
|
| 330 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorequick.code.tex
|
| 331 |
+
File: pgfcorequick.code.tex 2023-01-15 v3.1.10 (3.1.10)
|
| 332 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcoreobjects.code.tex
|
| 333 |
+
File: pgfcoreobjects.code.tex 2023-01-15 v3.1.10 (3.1.10)
|
| 334 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorepathprocessing.code.tex
|
| 335 |
+
File: pgfcorepathprocessing.code.tex 2023-01-15 v3.1.10 (3.1.10)
|
| 336 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorearrows.code.tex
|
| 337 |
+
File: pgfcorearrows.code.tex 2023-01-15 v3.1.10 (3.1.10)
|
| 338 |
+
\pgfarrowsep=\dimen268
|
| 339 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcoreshade.code.tex
|
| 340 |
+
File: pgfcoreshade.code.tex 2023-01-15 v3.1.10 (3.1.10)
|
| 341 |
+
\pgf@max=\dimen269
|
| 342 |
+
\pgf@sys@shading@range@num=\count301
|
| 343 |
+
\pgf@shadingcount=\count302
|
| 344 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcoreimage.code.tex
|
| 345 |
+
File: pgfcoreimage.code.tex 2023-01-15 v3.1.10 (3.1.10)
|
| 346 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcoreexternal.code.tex
|
| 347 |
+
File: pgfcoreexternal.code.tex 2023-01-15 v3.1.10 (3.1.10)
|
| 348 |
+
\pgfexternal@startupbox=\box61
|
| 349 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorelayers.code.tex
|
| 350 |
+
File: pgfcorelayers.code.tex 2023-01-15 v3.1.10 (3.1.10)
|
| 351 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcoretransparency.code.tex
|
| 352 |
+
File: pgfcoretransparency.code.tex 2023-01-15 v3.1.10 (3.1.10)
|
| 353 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorepatterns.code.tex
|
| 354 |
+
File: pgfcorepatterns.code.tex 2023-01-15 v3.1.10 (3.1.10)
|
| 355 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorerdf.code.tex
|
| 356 |
+
File: pgfcorerdf.code.tex 2023-01-15 v3.1.10 (3.1.10)
|
| 357 |
+
))) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/modules/pgfmoduleshapes.code.tex
|
| 358 |
+
File: pgfmoduleshapes.code.tex 2023-01-15 v3.1.10 (3.1.10)
|
| 359 |
+
\pgfnodeparttextbox=\box62
|
| 360 |
+
)
|
| 361 |
+
Package: nicematrix 2025/03/04 v7.1a Enhanced arrays with the help of PGF/TikZ
|
| 362 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/amsmath/amsmath.sty
|
| 363 |
+
Package: amsmath 2024/11/05 v2.17t AMS math features
|
| 364 |
+
\@mathmargin=\skip64
|
| 365 |
+
|
| 366 |
+
For additional information on amsmath, use the `?' option.
|
| 367 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/amsmath/amstext.sty
|
| 368 |
+
Package: amstext 2021/08/26 v2.01 AMS text
|
| 369 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/amsmath/amsgen.sty
|
| 370 |
+
File: amsgen.sty 1999/11/30 v2.0 generic functions
|
| 371 |
+
\@emptytoks=\toks33
|
| 372 |
+
\ex@=\dimen270
|
| 373 |
+
)) (/usr/local/texlive/2025/texmf-dist/tex/latex/amsmath/amsbsy.sty
|
| 374 |
+
Package: amsbsy 1999/11/29 v1.2d Bold Symbols
|
| 375 |
+
\pmbraise@=\dimen271
|
| 376 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/latex/amsmath/amsopn.sty
|
| 377 |
+
Package: amsopn 2022/04/08 v2.04 operator names
|
| 378 |
+
)
|
| 379 |
+
\inf@bad=\count303
|
| 380 |
+
LaTeX Info: Redefining \frac on input line 233.
|
| 381 |
+
\uproot@=\count304
|
| 382 |
+
\leftroot@=\count305
|
| 383 |
+
LaTeX Info: Redefining \overline on input line 398.
|
| 384 |
+
LaTeX Info: Redefining \colon on input line 409.
|
| 385 |
+
\classnum@=\count306
|
| 386 |
+
\DOTSCASE@=\count307
|
| 387 |
+
LaTeX Info: Redefining \ldots on input line 495.
|
| 388 |
+
LaTeX Info: Redefining \dots on input line 498.
|
| 389 |
+
LaTeX Info: Redefining \cdots on input line 619.
|
| 390 |
+
\Mathstrutbox@=\box63
|
| 391 |
+
\strutbox@=\box64
|
| 392 |
+
LaTeX Info: Redefining \big on input line 721.
|
| 393 |
+
LaTeX Info: Redefining \Big on input line 722.
|
| 394 |
+
LaTeX Info: Redefining \bigg on input line 723.
|
| 395 |
+
LaTeX Info: Redefining \Bigg on input line 724.
|
| 396 |
+
\big@size=\dimen272
|
| 397 |
+
LaTeX Font Info: Redeclaring font encoding OML on input line 742.
|
| 398 |
+
LaTeX Font Info: Redeclaring font encoding OMS on input line 743.
|
| 399 |
+
\macc@depth=\count308
|
| 400 |
+
LaTeX Info: Redefining \bmod on input line 904.
|
| 401 |
+
LaTeX Info: Redefining \pmod on input line 909.
|
| 402 |
+
LaTeX Info: Redefining \smash on input line 939.
|
| 403 |
+
LaTeX Info: Redefining \relbar on input line 969.
|
| 404 |
+
LaTeX Info: Redefining \Relbar on input line 970.
|
| 405 |
+
\c@MaxMatrixCols=\count309
|
| 406 |
+
\dotsspace@=\muskip17
|
| 407 |
+
\c@parentequation=\count310
|
| 408 |
+
\dspbrk@lvl=\count311
|
| 409 |
+
\tag@help=\toks34
|
| 410 |
+
\row@=\count312
|
| 411 |
+
\column@=\count313
|
| 412 |
+
\maxfields@=\count314
|
| 413 |
+
\andhelp@=\toks35
|
| 414 |
+
\eqnshift@=\dimen273
|
| 415 |
+
\alignsep@=\dimen274
|
| 416 |
+
\tagshift@=\dimen275
|
| 417 |
+
\tagwidth@=\dimen276
|
| 418 |
+
\totwidth@=\dimen277
|
| 419 |
+
\lineht@=\dimen278
|
| 420 |
+
\@envbody=\toks36
|
| 421 |
+
\multlinegap=\skip65
|
| 422 |
+
\multlinetaggap=\skip66
|
| 423 |
+
\mathdisplay@stack=\toks37
|
| 424 |
+
LaTeX Info: Redefining \[ on input line 2953.
|
| 425 |
+
LaTeX Info: Redefining \] on input line 2954.
|
| 426 |
+
)
|
| 427 |
+
\g__nicematrix_env_int=\count315
|
| 428 |
+
\g__nicematrix_NiceMatrixBlock_int=\count316
|
| 429 |
+
\g__nicematrix_notes_caption_int=\count317
|
| 430 |
+
\l__nicematrix_columns_width_dim=\dimen279
|
| 431 |
+
\l__nicematrix_col_width_dim=\dimen280
|
| 432 |
+
\g__nicematrix_row_total_int=\count318
|
| 433 |
+
\g__nicematrix_col_total_int=\count319
|
| 434 |
+
\g__nicematrix_last_row_node_int=\count320
|
| 435 |
+
\l__nicematrix_key_nb_rows_int=\count321
|
| 436 |
+
\g__nicematrix_blocks_wd_dim=\dimen281
|
| 437 |
+
\g__nicematrix_blocks_ht_dim=\dimen282
|
| 438 |
+
\g__nicematrix_blocks_dp_dim=\dimen283
|
| 439 |
+
\l__nicematrix_width_dim=\dimen284
|
| 440 |
+
\l__nicematrix_tabular_width_dim=\dimen285
|
| 441 |
+
\l__nicematrix_rule_width_dim=\dimen286
|
| 442 |
+
\l__nicematrix_old_iRow_int=\count322
|
| 443 |
+
\l__nicematrix_old_jCol_int=\count323
|
| 444 |
+
\g__nicematrix_total_X_weight_int=\count324
|
| 445 |
+
\l__nicematrix_X_columns_dim=\dimen287
|
| 446 |
+
\l__nicematrix_x_initial_dim=\dimen288
|
| 447 |
+
\l__nicematrix_y_initial_dim=\dimen289
|
| 448 |
+
\l__nicematrix_x_final_dim=\dimen290
|
| 449 |
+
\l__nicematrix_y_final_dim=\dimen291
|
| 450 |
+
\l__nicematrix_tmpc_dim=\dimen292
|
| 451 |
+
\l__nicematrix_tmpd_dim=\dimen293
|
| 452 |
+
\l__nicematrix_tmpe_dim=\dimen294
|
| 453 |
+
\l__nicematrix_tmpf_dim=\dimen295
|
| 454 |
+
\g__nicematrix_dp_row_zero_dim=\dimen296
|
| 455 |
+
\g__nicematrix_ht_row_zero_dim=\dimen297
|
| 456 |
+
\g__nicematrix_ht_row_one_dim=\dimen298
|
| 457 |
+
\g__nicematrix_dp_ante_last_row_dim=\dimen299
|
| 458 |
+
\g__nicematrix_ht_last_row_dim=\dimen300
|
| 459 |
+
\g__nicematrix_dp_last_row_dim=\dimen301
|
| 460 |
+
\g__nicematrix_width_last_col_dim=\dimen302
|
| 461 |
+
\g__nicematrix_width_first_col_dim=\dimen303
|
| 462 |
+
\l__nicematrix_row_min_int=\count325
|
| 463 |
+
\l__nicematrix_row_max_int=\count326
|
| 464 |
+
\l__nicematrix_col_min_int=\count327
|
| 465 |
+
\l__nicematrix_col_max_int=\count328
|
| 466 |
+
\l__nicematrix_start_int=\count329
|
| 467 |
+
\l__nicematrix_end_int=\count330
|
| 468 |
+
\l__nicematrix_local_start_int=\count331
|
| 469 |
+
\l__nicematrix_local_end_int=\count332
|
| 470 |
+
\g__nicematrix_static_num_of_col_int=\count333
|
| 471 |
+
\l__nicematrix_rounded_corners_dim=\dimen304
|
| 472 |
+
\l__nicematrix_tab_rounded_corners_dim=\dimen305
|
| 473 |
+
\l__nicematrix_offset_dim=\dimen306
|
| 474 |
+
\l__nicematrix_line_width_dim=\dimen307
|
| 475 |
+
\g__nicematrix_block_box_int=\count334
|
| 476 |
+
\l__nicematrix_submatrix_extra_height_dim=\dimen308
|
| 477 |
+
\l__nicematrix_submatrix_left_xshift_dim=\dimen309
|
| 478 |
+
\l__nicematrix_submatrix_right_xshift_dim=\dimen310
|
| 479 |
+
\l__nicematrix_first_row_int=\count335
|
| 480 |
+
\l__nicematrix_first_col_int=\count336
|
| 481 |
+
\l__nicematrix_last_row_int=\count337
|
| 482 |
+
\l__nicematrix_last_col_int=\count338
|
| 483 |
+
\c@tabularnote=\count339
|
| 484 |
+
\g__nicematrix_tabularnote_int=\count340
|
| 485 |
+
\c@nicematrix_draft=\count341
|
| 486 |
+
\l__nicematrix_cell_space_top_limit_dim=\dimen311
|
| 487 |
+
\l__nicematrix_cell_space_bottom_limit_dim=\dimen312
|
| 488 |
+
\l__nicematrix_xdots_inter_dim=\dimen313
|
| 489 |
+
\l__nicematrix_xdots_shorten_start_dim=\dimen314
|
| 490 |
+
\l__nicematrix_xdots_shorten_end_dim=\dimen315
|
| 491 |
+
\l__nicematrix_xdots_radius_dim=\dimen316
|
| 492 |
+
\l__nicematrix_notes_above_space_dim=\dimen317
|
| 493 |
+
\l__nicematrix_left_margin_dim=\dimen318
|
| 494 |
+
\l__nicematrix_right_margin_dim=\dimen319
|
| 495 |
+
\l__nicematrix_extra_left_margin_dim=\dimen320
|
| 496 |
+
\l__nicematrix_extra_right_margin_dim=\dimen321
|
| 497 |
+
\c__nicematrix_max_l_dim=\dimen322
|
| 498 |
+
\l__nicematrix_position_int=\count342
|
| 499 |
+
\l__nicematrix_multiplicity_int=\count343
|
| 500 |
+
\l__nicematrix_brace_yshift_dim=\dimen323
|
| 501 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/latex/multirow/multirow.sty
|
| 502 |
+
Package: multirow 2024/11/12 v2.9 Span multiple rows of a table
|
| 503 |
+
\multirow@colwidth=\skip67
|
| 504 |
+
\multirow@cntb=\count344
|
| 505 |
+
\multirow@dima=\skip68
|
| 506 |
+
\bigstrutjot=\dimen324
|
| 507 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/latex/tools/bm.sty
|
| 508 |
+
Package: bm 2023/12/19 v1.2f Bold Symbol Support (DPC/FMi)
|
| 509 |
+
\symboldoperators=\mathgroup4
|
| 510 |
+
\symboldletters=\mathgroup5
|
| 511 |
+
\symboldsymbols=\mathgroup6
|
| 512 |
+
Package bm Info: No bold for \OMX/cmex/m/n, using \pmb.
|
| 513 |
+
LaTeX Font Info: Redeclaring math alphabet \mathbf on input line 149.
|
| 514 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/latex/tcolorbox/tcolorbox.sty
|
| 515 |
+
Package: tcolorbox 2024/10/22 version 6.4.1 text color boxes
|
| 516 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/pgf/frontendlayer/tikz.sty (/usr/local/texlive/2025/texmf-dist/tex/latex/pgf/basiclayer/pgf.sty
|
| 517 |
+
Package: pgf 2023-01-15 v3.1.10 (3.1.10)
|
| 518 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/modules/pgfmoduleplot.code.tex
|
| 519 |
+
File: pgfmoduleplot.code.tex 2023-01-15 v3.1.10 (3.1.10)
|
| 520 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/latex/pgf/compatibility/pgfcomp-version-0-65.sty
|
| 521 |
+
Package: pgfcomp-version-0-65 2023-01-15 v3.1.10 (3.1.10)
|
| 522 |
+
\pgf@nodesepstart=\dimen325
|
| 523 |
+
\pgf@nodesepend=\dimen326
|
| 524 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/latex/pgf/compatibility/pgfcomp-version-1-18.sty
|
| 525 |
+
Package: pgfcomp-version-1-18 2023-01-15 v3.1.10 (3.1.10)
|
| 526 |
+
)) (/usr/local/texlive/2025/texmf-dist/tex/latex/pgf/utilities/pgffor.sty (/usr/local/texlive/2025/texmf-dist/tex/latex/pgf/utilities/pgfkeys.sty (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgfkeys.code.tex)) (/usr/local/texlive/2025/texmf-dist/tex/latex/pgf/math/pgfmath.sty (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmath.code.tex)) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgffor.code.tex
|
| 527 |
+
Package: pgffor 2023-01-15 v3.1.10 (3.1.10)
|
| 528 |
+
\pgffor@iter=\dimen327
|
| 529 |
+
\pgffor@skip=\dimen328
|
| 530 |
+
\pgffor@stack=\toks38
|
| 531 |
+
\pgffor@toks=\toks39
|
| 532 |
+
)) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/frontendlayer/tikz/tikz.code.tex
|
| 533 |
+
Package: tikz 2023-01-15 v3.1.10 (3.1.10)
|
| 534 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/libraries/pgflibraryplothandlers.code.tex
|
| 535 |
+
File: pgflibraryplothandlers.code.tex 2023-01-15 v3.1.10 (3.1.10)
|
| 536 |
+
\pgf@plot@mark@count=\count345
|
| 537 |
+
\pgfplotmarksize=\dimen329
|
| 538 |
+
)
|
| 539 |
+
\tikz@lastx=\dimen330
|
| 540 |
+
\tikz@lasty=\dimen331
|
| 541 |
+
\tikz@lastxsaved=\dimen332
|
| 542 |
+
\tikz@lastysaved=\dimen333
|
| 543 |
+
\tikz@lastmovetox=\dimen334
|
| 544 |
+
\tikz@lastmovetoy=\dimen335
|
| 545 |
+
\tikzleveldistance=\dimen336
|
| 546 |
+
\tikzsiblingdistance=\dimen337
|
| 547 |
+
\tikz@figbox=\box65
|
| 548 |
+
\tikz@figbox@bg=\box66
|
| 549 |
+
\tikz@tempbox=\box67
|
| 550 |
+
\tikz@tempbox@bg=\box68
|
| 551 |
+
\tikztreelevel=\count346
|
| 552 |
+
\tikznumberofchildren=\count347
|
| 553 |
+
\tikznumberofcurrentchild=\count348
|
| 554 |
+
\tikz@fig@count=\count349
|
| 555 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/modules/pgfmodulematrix.code.tex
|
| 556 |
+
File: pgfmodulematrix.code.tex 2023-01-15 v3.1.10 (3.1.10)
|
| 557 |
+
\pgfmatrixcurrentrow=\count350
|
| 558 |
+
\pgfmatrixcurrentcolumn=\count351
|
| 559 |
+
\pgf@matrix@numberofcolumns=\count352
|
| 560 |
+
)
|
| 561 |
+
\tikz@expandcount=\count353
|
| 562 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/frontendlayer/tikz/libraries/tikzlibrarytopaths.code.tex
|
| 563 |
+
File: tikzlibrarytopaths.code.tex 2023-01-15 v3.1.10 (3.1.10)
|
| 564 |
+
))) (/usr/local/texlive/2025/texmf-dist/tex/latex/tools/verbatim.sty
|
| 565 |
+
Package: verbatim 2024-01-22 v1.5x LaTeX2e package for verbatim enhancements
|
| 566 |
+
\every@verbatim=\toks40
|
| 567 |
+
\verbatim@line=\toks41
|
| 568 |
+
\verbatim@in@stream=\read4
|
| 569 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/latex/environ/environ.sty
|
| 570 |
+
Package: environ 2014/05/04 v0.3 A new way to define environments
|
| 571 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/trimspaces/trimspaces.sty
|
| 572 |
+
Package: trimspaces 2009/09/17 v1.1 Trim spaces around a token list
|
| 573 |
+
))
|
| 574 |
+
\tcb@titlebox=\box69
|
| 575 |
+
\tcb@upperbox=\box70
|
| 576 |
+
\tcb@lowerbox=\box71
|
| 577 |
+
\tcb@phantombox=\box72
|
| 578 |
+
\c@tcbbreakpart=\count354
|
| 579 |
+
\c@tcblayer=\count355
|
| 580 |
+
\c@tcolorbox@number=\count356
|
| 581 |
+
\l__tcobox_tmpa_box=\box73
|
| 582 |
+
\l__tcobox_tmpa_dim=\dimen338
|
| 583 |
+
\tcb@temp=\box74
|
| 584 |
+
\tcb@temp=\box75
|
| 585 |
+
\tcb@temp=\box76
|
| 586 |
+
\tcb@temp=\box77
|
| 587 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/tcolorbox/tcbraster.code.tex
|
| 588 |
+
Library (tcolorbox): 'tcbraster.code.tex' version '6.4.1'
|
| 589 |
+
\c@tcbrastercolumn=\count357
|
| 590 |
+
\c@tcbrasterrow=\count358
|
| 591 |
+
\c@tcbrasternum=\count359
|
| 592 |
+
\c@tcbraster=\count360
|
| 593 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/latex/tcolorbox/tcbskins.code.tex
|
| 594 |
+
Library (tcolorbox): 'tcbskins.code.tex' version '6.4.1'
|
| 595 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/tikzfill/tikzfill.image.sty
|
| 596 |
+
Package: tikzfill.image 2023/08/08 v1.0.1 Image filling library for TikZ
|
| 597 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/tikzfill/tikzfill-common.sty
|
| 598 |
+
Package: tikzfill-common 2023/08/08 v1.0.1 Auxiliary code for tikzfill
|
| 599 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/latex/tikzfill/tikzlibraryfill.image.code.tex
|
| 600 |
+
File: tikzlibraryfill.image.code.tex 2023/08/08 v1.0.1 Image filling library
|
| 601 |
+
\l__tikzfill_img_box=\box78
|
| 602 |
+
)) (/usr/local/texlive/2025/texmf-dist/tex/latex/tcolorbox/tcbskinsjigsaw.code.tex
|
| 603 |
+
Library (tcolorbox): 'tcbskinsjigsaw.code.tex' version '6.4.1'
|
| 604 |
+
)) (/usr/local/texlive/2025/texmf-dist/tex/latex/tcolorbox/tcbbreakable.code.tex
|
| 605 |
+
Library (tcolorbox): 'tcbbreakable.code.tex' version '6.4.1'
|
| 606 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/pdfcol/pdfcol.sty
|
| 607 |
+
Package: pdfcol 2022-09-21 v1.7 Handle new color stacks for pdfTeX (HO)
|
| 608 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/infwarerr/infwarerr.sty
|
| 609 |
+
Package: infwarerr 2019/12/03 v1.5 Providing info/warning/error messages (HO)
|
| 610 |
+
))
|
| 611 |
+
Package pdfcol Info: New color stack `tcb@breakable' = 1 on input line 23.
|
| 612 |
+
\tcb@testbox=\box79
|
| 613 |
+
\tcb@totalupperbox=\box80
|
| 614 |
+
\tcb@totallowerbox=\box81
|
| 615 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/latex/tcolorbox/tcbhooks.code.tex
|
| 616 |
+
Library (tcolorbox): 'tcbhooks.code.tex' version '6.4.1'
|
| 617 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/latex/tcolorbox/tcbtheorems.code.tex
|
| 618 |
+
Library (tcolorbox): 'tcbtheorems.code.tex' version '6.4.1'
|
| 619 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/latex/tcolorbox/tcbfitting.code.tex
|
| 620 |
+
Library (tcolorbox): 'tcbfitting.code.tex' version '6.4.1'
|
| 621 |
+
\tcbfitdim=\dimen339
|
| 622 |
+
\tcb@lowerfitdim=\dimen340
|
| 623 |
+
\tcb@upperfitdim=\dimen341
|
| 624 |
+
\tcb@cur@hbadness=\count361
|
| 625 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/latex/tcolorbox/tcblistingsutf8.code.tex
|
| 626 |
+
Library (tcolorbox): 'tcblistingsutf8.code.tex' version '6.4.1'
|
| 627 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/tcolorbox/tcblistings.code.tex
|
| 628 |
+
Library (tcolorbox): 'tcblistings.code.tex' version '6.4.1'
|
| 629 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/listings/listings.sty
|
| 630 |
+
\lst@mode=\count362
|
| 631 |
+
\lst@gtempboxa=\box82
|
| 632 |
+
\lst@token=\toks42
|
| 633 |
+
\lst@length=\count363
|
| 634 |
+
\lst@currlwidth=\dimen342
|
| 635 |
+
\lst@column=\count364
|
| 636 |
+
\lst@pos=\count365
|
| 637 |
+
\lst@lostspace=\dimen343
|
| 638 |
+
\lst@width=\dimen344
|
| 639 |
+
\lst@newlines=\count366
|
| 640 |
+
\lst@lineno=\count367
|
| 641 |
+
\lst@maxwidth=\dimen345
|
| 642 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/listings/lstpatch.sty
|
| 643 |
+
File: lstpatch.sty 2024/09/23 1.10c (Carsten Heinz)
|
| 644 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/latex/listings/lstmisc.sty
|
| 645 |
+
File: lstmisc.sty 2024/09/23 1.10c (Carsten Heinz)
|
| 646 |
+
\c@lstnumber=\count368
|
| 647 |
+
\lst@skipnumbers=\count369
|
| 648 |
+
\lst@framebox=\box83
|
| 649 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/latex/listings/listings.cfg
|
| 650 |
+
File: listings.cfg 2024/09/23 1.10c listings configuration
|
| 651 |
+
))
|
| 652 |
+
Package: listings 2024/09/23 1.10c (Carsten Heinz)
|
| 653 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/tcolorbox/tcblistingscore.code.tex
|
| 654 |
+
Library (tcolorbox): 'tcblistingscore.code.tex' version '6.4.1'
|
| 655 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/tcolorbox/tcbprocessing.code.tex
|
| 656 |
+
Library (tcolorbox): 'tcbprocessing.code.tex' version '6.4.1'
|
| 657 |
+
)
|
| 658 |
+
\c@tcblisting=\count370
|
| 659 |
+
)) (/usr/local/texlive/2025/texmf-dist/tex/latex/listingsutf8/listingsutf8.sty
|
| 660 |
+
Package: listingsutf8 2019-12-10 v1.5 Allow UTF-8 in listings input (HO)
|
| 661 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pdftexcmds/pdftexcmds.sty
|
| 662 |
+
Package: pdftexcmds 2020-06-27 v0.33 Utility functions of pdfTeX for LuaTeX (HO)
|
| 663 |
+
Package pdftexcmds Info: \pdf@primitive is available.
|
| 664 |
+
Package pdftexcmds Info: \pdf@ifprimitive is available.
|
| 665 |
+
Package pdftexcmds Info: \pdfdraftmode found.
|
| 666 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/generic/stringenc/stringenc.sty
|
| 667 |
+
Package: stringenc 2019/11/29 v1.12 Convert strings between diff. encodings (HO)
|
| 668 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pdfescape/pdfescape.sty
|
| 669 |
+
Package: pdfescape 2019/12/09 v1.15 Implements pdfTeX's escape features (HO)
|
| 670 |
+
)))) (/usr/local/texlive/2025/texmf-dist/tex/latex/tcolorbox/tcbexternal.code.tex
|
| 671 |
+
Library (tcolorbox): 'tcbexternal.code.tex' version '6.4.1'
|
| 672 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/latex/tcolorbox/tcbmagazine.code.tex
|
| 673 |
+
Library (tcolorbox): 'tcbmagazine.code.tex' version '6.4.1'
|
| 674 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/latex/tcolorbox/tcbvignette.code.tex
|
| 675 |
+
Library (tcolorbox): 'tcbvignette.code.tex' version '6.4.1'
|
| 676 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/frontendlayer/tikz/libraries/tikzlibraryfadings.code.tex
|
| 677 |
+
File: tikzlibraryfadings.code.tex 2023-01-15 v3.1.10 (3.1.10)
|
| 678 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/libraries/pgflibraryfadings.code.tex
|
| 679 |
+
File: pgflibraryfadings.code.tex 2023-01-15 v3.1.10 (3.1.10)
|
| 680 |
+
))) (/usr/local/texlive/2025/texmf-dist/tex/latex/tcolorbox/tcbposter.code.tex
|
| 681 |
+
Library (tcolorbox): 'tcbposter.code.tex' version '6.4.1'
|
| 682 |
+
)) (/usr/local/texlive/2025/texmf-dist/tex/latex/hyperref/hyperref.sty
|
| 683 |
+
Package: hyperref 2024-11-05 v7.01l Hypertext links for LaTeX
|
| 684 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/kvdefinekeys/kvdefinekeys.sty
|
| 685 |
+
Package: kvdefinekeys 2019-12-19 v1.6 Define keys (HO)
|
| 686 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/latex/hycolor/hycolor.sty
|
| 687 |
+
Package: hycolor 2020-01-27 v1.10 Color options for hyperref/bookmark (HO)
|
| 688 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/latex/hyperref/nameref.sty
|
| 689 |
+
Package: nameref 2023-11-26 v2.56 Cross-referencing by name of section
|
| 690 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/refcount/refcount.sty
|
| 691 |
+
Package: refcount 2019/12/15 v3.6 Data extraction from label references (HO)
|
| 692 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/generic/gettitlestring/gettitlestring.sty
|
| 693 |
+
Package: gettitlestring 2019/12/15 v1.6 Cleanup title references (HO)
|
| 694 |
+
)
|
| 695 |
+
\c@section@level=\count371
|
| 696 |
+
)
|
| 697 |
+
\@linkdim=\dimen346
|
| 698 |
+
\Hy@linkcounter=\count372
|
| 699 |
+
\Hy@pagecounter=\count373
|
| 700 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/hyperref/pd1enc.def
|
| 701 |
+
File: pd1enc.def 2024-11-05 v7.01l Hyperref: PDFDocEncoding definition (HO)
|
| 702 |
+
Now handling font encoding PD1 ...
|
| 703 |
+
... no UTF-8 mapping file for font encoding PD1
|
| 704 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/generic/intcalc/intcalc.sty
|
| 705 |
+
Package: intcalc 2019/12/15 v1.3 Expandable calculations with integers (HO)
|
| 706 |
+
)
|
| 707 |
+
\Hy@SavedSpaceFactor=\count374
|
| 708 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/hyperref/puenc.def
|
| 709 |
+
File: puenc.def 2024-11-05 v7.01l Hyperref: PDF Unicode definition (HO)
|
| 710 |
+
Now handling font encoding PU ...
|
| 711 |
+
... no UTF-8 mapping file for font encoding PU
|
| 712 |
+
)
|
| 713 |
+
Package hyperref Info: Hyper figures OFF on input line 4157.
|
| 714 |
+
Package hyperref Info: Link nesting OFF on input line 4162.
|
| 715 |
+
Package hyperref Info: Hyper index ON on input line 4165.
|
| 716 |
+
Package hyperref Info: Plain pages OFF on input line 4172.
|
| 717 |
+
Package hyperref Info: Backreferencing OFF on input line 4177.
|
| 718 |
+
Package hyperref Info: Implicit mode ON; LaTeX internals redefined.
|
| 719 |
+
Package hyperref Info: Bookmarks ON on input line 4424.
|
| 720 |
+
\c@Hy@tempcnt=\count375
(/usr/local/texlive/2025/texmf-dist/tex/latex/url/url.sty
\Urlmuskip=\muskip18
Package: url 2013/09/16 ver 3.4 Verb mode for urls, etc.
)
LaTeX Info: Redefining \url on input line 4763.
\XeTeXLinkMargin=\dimen347
(/usr/local/texlive/2025/texmf-dist/tex/generic/bitset/bitset.sty
Package: bitset 2019/12/09 v1.3 Handle bit-vector datatype (HO)
(/usr/local/texlive/2025/texmf-dist/tex/generic/bigintcalc/bigintcalc.sty
Package: bigintcalc 2019/12/15 v1.5 Expandable calculations on big integers (HO)
))
\Fld@menulength=\count376
\Field@Width=\dimen348
\Fld@charsize=\dimen349
Package hyperref Info: Hyper figures OFF on input line 6042.
Package hyperref Info: Link nesting OFF on input line 6047.
Package hyperref Info: Hyper index ON on input line 6050.
Package hyperref Info: backreferencing OFF on input line 6057.
Package hyperref Info: Link coloring OFF on input line 6062.
Package hyperref Info: Link coloring with OCG OFF on input line 6067.
Package hyperref Info: PDF/A mode OFF on input line 6072.
(/usr/local/texlive/2025/texmf-dist/tex/latex/base/atbegshi-ltx.sty
Package: atbegshi-ltx 2021/01/10 v1.0c Emulation of the original atbegshi
package with kernel methods
)
\Hy@abspage=\count377
\c@Item=\count378
\c@Hfootnote=\count379
)
Package hyperref Info: Driver (autodetected): hpdftex.
(/usr/local/texlive/2025/texmf-dist/tex/latex/hyperref/hpdftex.def
File: hpdftex.def 2024-11-05 v7.01l Hyperref driver for pdfTeX
(/usr/local/texlive/2025/texmf-dist/tex/latex/base/atveryend-ltx.sty
Package: atveryend-ltx 2020/08/19 v1.0a Emulation of the original atveryend package
with kernel methods
)
\Fld@listcount=\count380
\c@bookmark@seq@number=\count381
(/usr/local/texlive/2025/texmf-dist/tex/latex/rerunfilecheck/rerunfilecheck.sty
Package: rerunfilecheck 2022-07-10 v1.10 Rerun checks for auxiliary files (HO)
(/usr/local/texlive/2025/texmf-dist/tex/generic/uniquecounter/uniquecounter.sty
Package: uniquecounter 2019/12/15 v1.4 Provide unlimited unique counter (HO)
)
Package uniquecounter Info: New unique counter `rerunfilecheck' on input line 285.
)
\Hy@SectionHShift=\skip69
)
Package hyperref Info: Option `colorlinks' set `true' on input line 66.
(/usr/local/texlive/2025/texmf-dist/tex/latex/cleveref/cleveref.sty
Package: cleveref 2018/03/27 v0.21.4 Intelligent cross-referencing
Package cleveref Info: `hyperref' support loaded on input line 2370.
LaTeX Info: Redefining \cref on input line 2370.
LaTeX Info: Redefining \Cref on input line 2370.
LaTeX Info: Redefining \crefrange on input line 2370.
LaTeX Info: Redefining \Crefrange on input line 2370.
LaTeX Info: Redefining \cpageref on input line 2370.
LaTeX Info: Redefining \Cpageref on input line 2370.
LaTeX Info: Redefining \cpagerefrange on input line 2370.
LaTeX Info: Redefining \Cpagerefrange on input line 2370.
LaTeX Info: Redefining \labelcref on input line 2370.
LaTeX Info: Redefining \labelcpageref on input line 2370.
Package cleveref Info: `listings' support loaded on input line 3131.
Package cleveref Info: include cross-reference names in hyperlinks on input line 7836.
Package cleveref Info: no abbreviation of names on input line 7852.
Package cleveref Info: include cross-reference names in hyperlinks on input line 7852.
) (./natbib.sty
Package: natbib 2009/07/16 8.31 (PWD, AO)
\bibhang=\skip70
\bibsep=\skip71
LaTeX Info: Redefining \cite on input line 694.
\c@NAT@ctr=\count382
) (/usr/local/texlive/2025/texmf-dist/tex/latex/titlesec/titlesec.sty
Package: titlesec 2025/01/04 v2.17 Sectioning titles
\ttl@box=\box84
\beforetitleunit=\skip72
\aftertitleunit=\skip73
\ttl@plus=\dimen350
\ttl@minus=\dimen351
\ttl@toksa=\toks43
\titlewidth=\dimen352
\titlewidthlast=\dimen353
\titlewidthfirst=\dimen354
)
LaTeX Info: Redefining \textbf on input line 113.
(/usr/local/texlive/2025/texmf-dist/tex/generic/iftex/ifxetex.sty
Package: ifxetex 2019/10/25 v0.7 ifxetex legacy package. Use iftex instead.
) (/usr/local/texlive/2025/texmf-dist/tex/latex/base/fontenc.sty
Package: fontenc 2021/04/29 v2.0v Standard LaTeX package
) (/usr/local/texlive/2025/texmf-dist/tex/latex/base/fontenc.sty
Package: fontenc 2021/04/29 v2.0v Standard LaTeX package
) (./hfstyle/manrope.sty
Package: hfstyle/manrope Provides the manrope font

LaTeX Info: File `t1manrope.fd' already exists on the system.
Not generating it from this source.

{/usr/local/texlive/2025/texmf-var/fonts/map/pdftex/updmap/pdftex.map}) (/usr/local/texlive/2025/texmf-dist/tex/latex/tocloft/tocloft.sty
Package: tocloft 2017/08/31 v2.3i parameterised ToC, etc., typesetting
Package tocloft Info: The document has section divisions on input line 51.
\cftparskip=\skip74
\cftbeforetoctitleskip=\skip75
\cftaftertoctitleskip=\skip76
\cftbeforepartskip=\skip77
\cftpartnumwidth=\skip78
\cftpartindent=\skip79
\cftbeforesecskip=\skip80
\cftsecindent=\skip81
\cftsecnumwidth=\skip82
\cftbeforesubsecskip=\skip83
\cftsubsecindent=\skip84
\cftsubsecnumwidth=\skip85
\cftbeforesubsubsecskip=\skip86
\cftsubsubsecindent=\skip87
\cftsubsubsecnumwidth=\skip88
\cftbeforeparaskip=\skip89
\cftparaindent=\skip90
\cftparanumwidth=\skip91
\cftbeforesubparaskip=\skip92
\cftsubparaindent=\skip93
\cftsubparanumwidth=\skip94
\cftbeforeloftitleskip=\skip95
\cftafterloftitleskip=\skip96
\cftbeforefigskip=\skip97
\cftfigindent=\skip98
\cftfignumwidth=\skip99
\c@lofdepth=\count383
\c@lotdepth=\count384
\cftbeforelottitleskip=\skip100
\cftafterlottitleskip=\skip101
\cftbeforetabskip=\skip102
\cfttabindent=\skip103
\cfttabnumwidth=\skip104

Package tocloft Warning: \@starttoc has already been redefined; tocloft bailing out. on input line 1156.

)) (/usr/local/texlive/2025/texmf-dist/tex/latex/base/inputenc.sty
Package: inputenc 2024/02/08 v1.3d Input encoding file
\inpenc@prehook=\toks44
\inpenc@posthook=\toks45
) (/usr/local/texlive/2025/texmf-dist/tex/latex/base/fontenc.sty
Package: fontenc 2021/04/29 v2.0v Standard LaTeX package
) (/usr/local/texlive/2025/texmf-dist/tex/latex/lineno/lineno.sty
Package: lineno 2025/01/29 line numbers on paragraphs v5.4
\linenopenalty=\count385
\output=\toks46
\linenoprevgraf=\count386
\linenumbersep=\dimen355
\linenumberwidth=\dimen356
\c@linenumber=\count387
\c@pagewiselinenumber=\count388
\c@LN@truepage=\count389
\c@internallinenumber=\count390
\c@internallinenumbers=\count391
\quotelinenumbersep=\dimen357
\bframerule=\dimen358
\bframesep=\dimen359
\bframebox=\box85
\@LN@amsmath@ams@eqpen=\count392
LaTeX Info: Redefining \\ on input line 3187.
) (/usr/local/texlive/2025/texmf-dist/tex/latex/enumitem/enumitem.sty
Package: enumitem 2025/02/06 v3.11 Customized lists
\labelindent=\skip105
\enit@outerparindent=\dimen360
\enit@toks=\toks47
\enit@inbox=\box86
\enit@count@id=\count393
\enitdp@description=\count394
) (/usr/local/texlive/2025/texmf-dist/tex/latex/amsfonts/amsfonts.sty
Package: amsfonts 2013/01/14 v3.01 Basic AMSFonts support
\symAMSa=\mathgroup7
\symAMSb=\mathgroup8
LaTeX Font Info: Redeclaring math symbol \hbar on input line 98.
LaTeX Font Info: Overwriting math alphabet `\mathfrak' in version `bold'
(Font) U/euf/m/n --> U/euf/b/n on input line 106.
) (/usr/local/texlive/2025/texmf-dist/tex/latex/amsfonts/amssymb.sty
Package: amssymb 2013/01/14 v3.01 AMS font symbols
) (/usr/local/texlive/2025/texmf-dist/tex/latex/units/nicefrac.sty
Package: nicefrac 1998/08/04 v0.9b Nice fractions
\L@UnitsRaiseDisplaystyle=\skip106
\L@UnitsRaiseTextstyle=\skip107
\L@UnitsRaiseScriptstyle=\skip108
(/usr/local/texlive/2025/texmf-dist/tex/latex/base/ifthen.sty
Package: ifthen 2024/03/16 v1.1e Standard LaTeX ifthen package (DPC)
)) (/usr/local/texlive/2025/texmf-dist/tex/latex/siunitx/siunitx.sty
Package: siunitx 2025-02-27 v3.4.6 A comprehensive (SI) units package
\l__siunitx_number_uncert_offset_int=\count395
\l__siunitx_number_exponent_fixed_int=\count396
\l__siunitx_number_min_decimal_int=\count397
\l__siunitx_number_min_integer_int=\count398
\l__siunitx_number_round_precision_int=\count399
\l__siunitx_number_lower_threshold_int=\count400
\l__siunitx_number_upper_threshold_int=\count401
\l__siunitx_number_group_first_int=\count402
\l__siunitx_number_group_size_int=\count403
\l__siunitx_number_group_minimum_int=\count404
\l__siunitx_angle_tmp_dim=\dimen361
\l__siunitx_angle_marker_box=\box87
\l__siunitx_angle_unit_box=\box88
\l__siunitx_compound_count_int=\count405
(/usr/local/texlive/2025/texmf-dist/tex/latex/translations/translations.sty
Package: translations 2022/02/05 v1.12 internationalization of LaTeX2e packages (CN)
)
\l__siunitx_table_tmp_box=\box89
\l__siunitx_table_tmp_dim=\dimen362
\l__siunitx_table_column_width_dim=\dimen363
\l__siunitx_table_integer_box=\box90
\l__siunitx_table_decimal_box=\box91
\l__siunitx_table_uncert_box=\box92
\l__siunitx_table_before_box=\box93
\l__siunitx_table_after_box=\box94
\l__siunitx_table_before_dim=\dimen364
\l__siunitx_table_carry_dim=\dimen365
\l__siunitx_unit_tmp_int=\count406
\l__siunitx_unit_position_int=\count407
\l__siunitx_unit_total_int=\count408
) (/usr/local/texlive/2025/texmf-dist/tex/latex/multirow/bigdelim.sty
Package: bigdelim 2024/11/12 v2.9 Create big delimiters in tabular or array
) (/usr/local/texlive/2025/texmf-dist/tex/latex/tools/longtable.sty
Package: longtable 2024-10-27 v4.22 Multi-page Table package (DPC)
\LTleft=\skip109
\LTright=\skip110
\LTpre=\skip111
\LTpost=\skip112
\LTchunksize=\count409
\LTcapwidth=\dimen366
\LT@head=\box95
\LT@firsthead=\box96
\LT@foot=\box97
\LT@lastfoot=\box98
\LT@gbox=\box99
\LT@cols=\count410
\LT@rows=\count411
\c@LT@tables=\count412
\c@LT@chunks=\count413
\LT@p@ftn=\toks48
) (/usr/local/texlive/2025/texmf-dist/tex/latex/tabularray/tabularray.sty
Package: tabularray 2024-02-16 v2024A Typeset tabulars and arrays with LaTeX3
\l__tblr_a_int=\count414
\l__tblr_c_int=\count415
\l__tblr_r_int=\count416
\l__tblr_d_dim=\dimen367
\l__tblr_h_dim=\dimen368
\l__tblr_o_dim=\dimen369
\l__tblr_p_dim=\dimen370
\l__tblr_q_dim=\dimen371
\l__tblr_r_dim=\dimen372
\l__tblr_s_dim=\dimen373
\l__tblr_t_dim=\dimen374
\l__tblr_v_dim=\dimen375
\l__tblr_w_dim=\dimen376
\l__tblr_a_box=\box100
\l__tblr_b_box=\box101
\l__tblr_c_box=\box102
\l__tblr_d_box=\box103
\g__tblr_table_count_int=\count417
\c@colnum=\count418
\c@rowcount=\count419
\c@colcount=\count420
\abovesep=\dimen377
\belowsep=\dimen378
\leftsep=\dimen379
\rightsep=\dimen380
\g_tblr_level_int=\count421
\g__tblr_data_row_key_count_int=\count422
\g__tblr_data_column_key_count_int=\count423
\g__tblr_data_cell_key_count_int=\count424
\g__tblr_array_int=\count425
\l__tblr_key_count_int=\count426
\l__tblr_key_quotient_int=\count427
\l__tblr_key_quotient_two_int=\count428
\l__tblr_key_remainder_int=\count429
\g__tblr_data_str_value_count_int=\count430
\rulewidth=\dimen381
\l__tblr_strut_dp_dim=\dimen382
\l__tblr_strut_ht_dim=\dimen383
\g__tblr_cell_wd_dim=\dimen384
\g__tblr_cell_ht_dim=\dimen385
\g__tblr_cell_head_dim=\dimen386
\g__tblr_cell_foot_dim=\dimen387
\l__column_target_dim=\dimen388
\l__tblr_caption_box=\box104
\l__tblr_caption_left_box=\box105
\l__tblr_row_head_box=\box106
\l__tblr_row_foot_box=\box107
\l__tblr_row_head_foot_dim=\dimen389
\tablewidth=\dimen390
\l__tblr_table_firsthead_box=\box108
\l__tblr_table_middlehead_box=\box109
\l__tblr_table_lasthead_box=\box110
\l__tblr_table_firstfoot_box=\box111
\l__tblr_table_middlefoot_box=\box112
\l__tblr_table_lastfoot_box=\box113
\l__tblr_remain_height_dim=\dimen391
\l__tblr_long_from_int=\count431
\l__tblr_long_to_int=\count432
\l__tblr_curr_i_int=\count433
\l__tblr_prev_i_int=\count434
\l__tblr_table_page_int=\count435
\l__tblr_table_head_box=\box114
\l__tblr_table_foot_box=\box115
\l__tblr_table_head_foot_dim=\dimen392
\l__tblr_table_head_body_foot_dim=\dimen393
\l__tblr_table_box=\box116
\l__tblr_table_hlines_box=\box117
\l__tblr_hline_box=\box118
\l__tblr_row_box=\box119
\l__tblr_col_o_wd_dim=\dimen394
\l__tblr_col_b_wd_dim=\dimen395
\l__tblr_hline_leftskip_dim=\dimen396
\l__tblr_hline_rightskip_dim=\dimen397
\l__tblr_row_ht_dim=\dimen398
\l__tblr_row_dp_dim=\dimen399
\l__tblr_row_abovesep_dim=\dimen400
\l__tblr_row_belowsep_dim=\dimen401
\l__tblr_row_vlines_box=\box120
\l__tblr_vline_box=\box121
\l__tblr_cell_box=\box122
\l__row_upper_dim=\dimen402
\l__row_lower_dim=\dimen403
\l__row_vpace_dim=\dimen404
\l__tblr_vline_aboveskip_dim=\dimen405
\l__tblr_vline_belowskip_dim=\dimen406
\l__tblr_cell_wd_dim=\dimen407
\l__tblr_cell_ht_dim=\dimen408
\l__tblr_diag_box=\box123
) (/usr/local/texlive/2025/texmf-dist/tex/latex/wrapfig/wrapfig.sty
\wrapoverhang=\dimen409
\WF@size=\dimen410
\c@WF@wrappedlines=\count436
\WF@box=\box124
\WF@everypar=\toks49
Package: wrapfig 2003/01/31 v 3.6
) (/usr/local/texlive/2025/texmf-dist/tex/latex/makecell/makecell.sty
Package: makecell 2009/08/03 V0.1e Managing of Tab Column Heads and Cells
\rotheadsize=\dimen411
\c@nlinenum=\count437
\TeXr@lab=\toks50
) (/usr/local/texlive/2025/texmf-dist/tex/latex/adjustbox/adjustbox.sty
Package: adjustbox 2025/02/26 v1.3c Adjusting TeX boxes (trim, clip, ...)
(/usr/local/texlive/2025/texmf-dist/tex/latex/xkeyval/xkeyval.sty
Package: xkeyval 2022/06/16 v2.9 package option processing (HA)
(/usr/local/texlive/2025/texmf-dist/tex/generic/xkeyval/xkeyval.tex (/usr/local/texlive/2025/texmf-dist/tex/generic/xkeyval/xkvutils.tex
\XKV@toks=\toks51
\XKV@tempa@toks=\toks52
)
\XKV@depth=\count438
File: xkeyval.tex 2014/12/03 v2.7a key=value parser (HA)
)) (/usr/local/texlive/2025/texmf-dist/tex/latex/adjustbox/adjcalc.sty
Package: adjcalc 2012/05/16 v1.1 Provides advanced setlength with multiple back-ends (calc, etex, pgfmath)
) (/usr/local/texlive/2025/texmf-dist/tex/latex/adjustbox/trimclip.sty
Package: trimclip 2025/02/21 v1.2a Trim and clip general TeX material
(/usr/local/texlive/2025/texmf-dist/tex/latex/collectbox/collectbox.sty
Package: collectbox 2022/10/17 v0.4c Collect macro arguments as boxes
\collectedbox=\box125
)
\tc@llx=\dimen412
\tc@lly=\dimen413
\tc@urx=\dimen414
\tc@ury=\dimen415
Package trimclip Info: Using driver 'tc-pdftex.def'.
(/usr/local/texlive/2025/texmf-dist/tex/latex/adjustbox/tc-pdftex.def
File: tc-pdftex.def 2025/02/26 v2.3 Clipping driver for pdftex
))
\adjbox@Width=\dimen416
\adjbox@Height=\dimen417
\adjbox@Depth=\dimen418
\adjbox@Totalheight=\dimen419
\adjbox@pwidth=\dimen420
\adjbox@pheight=\dimen421
\adjbox@pdepth=\dimen422
\adjbox@ptotalheight=\dimen423
(/usr/local/texlive/2025/texmf-dist/tex/latex/ifoddpage/ifoddpage.sty
Package: ifoddpage 2022/10/18 v1.2 Conditionals for odd/even page detection
\c@checkoddpage=\count439
) (/usr/local/texlive/2025/texmf-dist/tex/latex/varwidth/varwidth.sty
Package: varwidth 2009/03/30 ver 0.92; Variable-width minipages
\@vwid@box=\box126
\sift@deathcycles=\count440
\@vwid@loff=\dimen424
\@vwid@roff=\dimen425
))

LaTeX Warning: Package "xcolor" has already been loaded: ignoring load-time
(LaTeX) option "xcdraw".

(/usr/local/texlive/2025/texmf-dist/tex/latex/graphics/dvipsnam.def
File: dvipsnam.def 2016/06/17 v3.0m Driver-dependent file (DPC,SPQR)
) (/usr/local/texlive/2025/texmf-dist/tex/latex/tools/xspace.sty
Package: xspace 2014/10/28 v1.13 Space after command names (DPC,MH)
) (/usr/local/texlive/2025/texmf-dist/tex/generic/soul/soul.sty
Package: soul 2023-06-14 v3.1 Permit use of UTF-8 characters in soul (HO)
(/usr/local/texlive/2025/texmf-dist/tex/generic/soul/soul-ori.sty
Package: soul-ori 2023-06-14 v3.1 letterspacing/underlining (mf)
\SOUL@word=\toks53
\SOUL@lasttoken=\toks54
\SOUL@syllable=\toks55
\SOUL@cmds=\toks56
\SOUL@buffer=\toks57
\SOUL@token=\toks58
\SOUL@syllgoal=\dimen426
\SOUL@syllwidth=\dimen427
\SOUL@charkern=\dimen428
\SOUL@hyphkern=\dimen429
\SOUL@dimen=\dimen430
\SOUL@dimeni=\dimen431
\SOUL@minus=\count441
\SOUL@comma=\count442
\SOUL@apo=\count443
\SOUL@grave=\count444
\SOUL@spaceskip=\skip113
\SOUL@ttwidth=\dimen432
\SOUL@uldp=\dimen433
\SOUL@ulht=\dimen434
) (/usr/local/texlive/2025/texmf-dist/tex/generic/etexcmds/etexcmds.sty
Package: etexcmds 2019/12/15 v1.7 Avoid name clashes with e-TeX commands (HO)
)) (/usr/local/texlive/2025/texmf-dist/tex/latex/csquotes/csquotes.sty
Package: csquotes 2024-04-04 v5.2o context-sensitive quotations (JAW)
\csq@reset=\count445
\csq@gtype=\count446
\csq@glevel=\count447
\csq@qlevel=\count448
\csq@maxlvl=\count449
\csq@tshold=\count450
\csq@ltx@everypar=\toks59
(/usr/local/texlive/2025/texmf-dist/tex/latex/csquotes/csquotes.def
File: csquotes.def 2024-04-04 v5.2o csquotes generic definitions (JAW)
)
Package csquotes Info: Trying to load configuration file 'csquotes.cfg'...
Package csquotes Info: ... configuration file loaded successfully.
(/usr/local/texlive/2025/texmf-dist/tex/latex/csquotes/csquotes.cfg
File: csquotes.cfg
)) (/usr/local/texlive/2025/texmf-dist/tex/latex/arydshln/arydshln.sty
Package: arydshln 2019/02/21 v1.76
\dashlinedash=\dimen435
\dashlinegap=\dimen436
\adl@box=\box127
\adl@height=\dimen437
\adl@heightsave=\dimen438
\adl@depth=\dimen439
\adl@depthsave=\dimen440
\adl@finaldepth=\dimen441
\adl@columns=\count451
\adl@ncol=\count452
\adl@currentcolumn=\count453
\adl@currentcolumnsave=\count454
\adl@totalheight=\count455
\adl@totalheightsave=\count456
\adl@dash=\count457
\adl@gap=\count458
\adl@cla=\count459
\adl@clb=\count460
\adl@everyvbox=\toks60
\adl@LTpagetotal=\dimen442
) (/usr/local/texlive/2025/texmf-dist/tex/latex/todonotes/todonotes.sty
Package: todonotes 2024/01/05 v1.1.7 Todonotes source and documentation.
Package: todonotes 2024/01/05
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/frontendlayer/tikz/libraries/tikzlibrarypositioning.code.tex
File: tikzlibrarypositioning.code.tex 2023-01-15 v3.1.10 (3.1.10)
) (/usr/local/texlive/2025/texmf-dist/tex/latex/tools/calc.sty
Package: calc 2023/07/08 v4.3 Infix arithmetic (KKT,FJ)
\calc@Acount=\count461
\calc@Bcount=\count462
\calc@Adimen=\dimen443
\calc@Bdimen=\dimen444
\calc@Askip=\skip114
\calc@Bskip=\skip115
LaTeX Info: Redefining \setlength on input line 80.
LaTeX Info: Redefining \addtolength on input line 81.
\calc@Ccount=\count463
\calc@Cskip=\skip116
)
\c@@todonotes@numberoftodonotes=\count464
) (/usr/local/texlive/2025/texmf-dist/tex/latex/textpos/textpos.sty
Package: textpos 2022/07/23 v1.10.1
Package textpos Info: choosing support for LaTeX3 on input line 60.
\TP@textbox=\box128
\TP@holdbox=\box129
\TPHorizModule=\dimen445
\TPVertModule=\dimen446
\TP@margin=\dimen447
\TP@absmargin=\dimen448

Grid set 16 x 16 = 38.39343pt x 49.68562pt
\TPboxrulesize=\dimen449
\TP@ox=\dimen450
\TP@oy=\dimen451
\TP@tbargs=\toks61
TextBlockOrigin set to 0pt x 0pt
) (/usr/local/texlive/2025/texmf-dist/tex/latex/psnfss/pifont.sty
Package: pifont 2020/03/25 PSNFSS-v9.3 Pi font support (SPQR)
LaTeX Font Info: Trying to load font information for U+pzd on input line 63.
(/usr/local/texlive/2025/texmf-dist/tex/latex/psnfss/upzd.fd
File: upzd.fd 2001/06/04 font definitions for U/pzd.
)
LaTeX Font Info: Trying to load font information for U+psy on input line 64.
(/usr/local/texlive/2025/texmf-dist/tex/latex/psnfss/upsy.fd
File: upsy.fd 2001/06/04 font definitions for U/psy.
)) (/usr/local/texlive/2025/texmf-dist/tex/latex/bold-extra/bold-extra.sty
Package: bold-extra 2001/11/13 v0.1 Use fonts from cm/mf-extra/bold
) (/usr/local/texlive/2025/texmf-dist/tex/latex/pgf-pie/pgf-pie.sty
Package: pgf-pie 2022/06/14 v0.7 Some LaTeX macros for pie chart by using PGF/Tikz package.
(/usr/local/texlive/2025/texmf-dist/tex/latex/pgf-pie/tikzlibrarypie.code.tex (/usr/local/texlive/2025/texmf-dist/tex/latex/carlisle/scalefnt.sty)
\pgfpie@angleEnd=\dimen452
\pgfpie@explodeLength=\count465
\pgfpie@colorLength=\count466
\pgfpie@sliceLength=\count467
)) (/usr/local/texlive/2025/texmf-dist/tex/latex/epigraph/epigraph.sty
Package: epigraph 2020/01/02 v1.5e typesetting epigraphs
(/usr/local/texlive/2025/texmf-dist/tex/latex/nextpage/nextpage.sty
Package: nextpage 2009/09/03 v1.1a additional page commands
)
\beforeepigraphskip=\skip117
\afterepigraphskip=\skip118
\epigraphwidth=\skip119
\epigraphrule=\skip120
) (/usr/local/texlive/2025/texmf-dist/tex/latex/algorithms/algorithm.sty
Package: algorithm 2009/08/24 v0.1 Document Style `algorithm' - floating environment
(/usr/local/texlive/2025/texmf-dist/tex/latex/float/float.sty
Package: float 2001/11/08 v1.3d Float enhancements (AL)
\c@float@type=\count468
\float@exts=\toks62
\float@box=\box130
\@float@everytoks=\toks63
\@floatcapt=\box131
)
\@float@every@algorithm=\toks64
\c@algorithm=\count469
) (/usr/local/texlive/2025/texmf-dist/tex/latex/algorithmicx/algpseudocode.sty
Package: algpseudocode
(/usr/local/texlive/2025/texmf-dist/tex/latex/algorithmicx/algorithmicx.sty
Package: algorithmicx 2005/04/27 v1.2 Algorithmicx

Document Style algorithmicx 1.2 - a greatly improved `algorithmic' style
\c@ALG@line=\count470
\c@ALG@rem=\count471
\c@ALG@nested=\count472
\ALG@tlm=\skip121
\ALG@thistlm=\skip122
\c@ALG@Lnr=\count473
\c@ALG@blocknr=\count474
\c@ALG@storecount=\count475
\c@ALG@tmpcounter=\count476
\ALG@tmplength=\skip123
)
Document Style - pseudocode environments for use with the `algorithmicx' style
)
Package hyperref Info: Option `colorlinks' set `true' on input line 64.
defining Unicode char U+2212 (decimal 8722)
(/usr/local/texlive/2025/texmf-dist/tex/latex/tcolorbox/tcbminted.code.tex
Library (tcolorbox): 'tcbminted.code.tex' version '6.4.1'
(/usr/local/texlive/2025/texmf-dist/tex/latex/minted/minted.sty
Package: minted 2025/03/06 v3.6.0 Yet another Pygments shim for LaTeX
(/usr/local/texlive/2025/texmf-dist/tex/generic/catchfile/catchfile.sty
Package: catchfile 2019/12/09 v1.8 Catch the contents of a file (HO)
) (/usr/local/texlive/2025/texmf-dist/tex/latex/fvextra/fvextra.sty
Package: fvextra 2025/03/04 v1.12.0 fvextra - extensions and patches for fancyvrb
(/usr/local/texlive/2025/texmf-dist/tex/latex/fancyvrb/fancyvrb.sty
Package: fancyvrb 2024/01/20 4.5c verbatim text (tvz,hv)
\FV@CodeLineNo=\count477
\FV@InFile=\read5
\FV@TabBox=\box132
\c@FancyVerbLine=\count478
\FV@StepNumber=\count479
\FV@OutFile=\write4
) (/usr/local/texlive/2025/texmf-dist/tex/latex/upquote/upquote.sty
Package: upquote 2012/04/19 v1.3 upright-quote and grave-accent glyphs in verbatim
(/usr/local/texlive/2025/texmf-dist/tex/latex/base/textcomp.sty
Package: textcomp 2024/04/24 v2.1b Standard LaTeX package
))

Package fvextra Warning: csquotes should be loaded after fvextra, to avoid a warning from the lineno package on input line 37.

\c@FancyVerbWriteLine=\count480
\c@FancyVerbBufferLine=\count481
\c@FV@TrueTabGroupLevel=\count482
\c@FV@TrueTabCounter=\count483
\FV@TabBox@Group=\box133
\FV@bgcolorstructbox=\box134
\FV@TmpLength=\skip124
\c@FV@HighlightLinesStart=\count484
\c@FV@HighlightLinesStop=\count485
\FV@LoopCount=\count486
\FV@NCharsBox=\box135
\FV@BreakIndent=\dimen453
\FV@BreakIndentNChars=\count487
\FV@BreakSymbolSepLeft=\dimen454
\FV@BreakSymbolSepLeftNChars=\count488
\FV@BreakSymbolSepRight=\dimen455
\FV@BreakSymbolSepRightNChars=\count489
\FV@BreakSymbolIndentLeft=\dimen456
\FV@BreakSymbolIndentLeftNChars=\count490
\FV@BreakSymbolIndentRight=\dimen457
\FV@BreakSymbolIndentRightNChars=\count491
\c@FancyVerbLineBreakLast=\count492
\FV@LineBox=\box136
\FV@LineIndentBox=\box137
\c@FV@BreakBufferDepth=\count493
\FV@LineWidth=\dimen458
) (/usr/local/texlive/2025/texmf-dist/tex/latex/latex2pydata/latex2pydata.sty
Package: latex2pydata 2025/03/03 v0.5.0 latex2pydata - write data to file in Python literal format
) (/usr/local/texlive/2025/texmf-dist/tex/latex/pgfopts/pgfopts.sty
Package: pgfopts 2014/07/10 v2.1a LaTeX package options with pgfkeys
\pgfopts@list@add@a@toks=\toks65
\pgfopts@list@add@b@toks=\toks66
) (/usr/local/texlive/2025/texmf-dist/tex/latex/tools/shellesc.sty
Package: shellesc 2023/07/08 v1.0d unified shell escape interface for LaTeX
Package shellesc Info: Restricted shell escape enabled on input line 77.
)
\c@minted@FancyVerbLineTemp=\count494
\@float@every@listing=\toks67
\c@listing=\count495
))
\c@tcb@cnt@pbox=\count496
(./preamble.tex
\savewidth=\skip125
\thinwidth=\skip126
) (./math_commands.tex
LaTeX Font Info: Overwriting math alphabet `\mathsfit' in version `bold'
(Font) T1/manrope/m/sl --> T1/manrope/bx/n on input line 377.
) (./handles.tex) (./snippets/code_specs.tex)
Package csquotes Info: Checking for multilingual support...
Package csquotes Info: ... found 'babel' package.
Package csquotes Info: Adjusting default style.
Package csquotes Info: Redefining alias 'default' -> 'english'.
(./main.aux)
\openout1 = `main.aux'.

LaTeX Font Info: Checking defaults for OML/cmm/m/it on input line 222.
LaTeX Font Info: ... okay on input line 222.
LaTeX Font Info: Checking defaults for OMS/cmsy/m/n on input line 222.
LaTeX Font Info: ... okay on input line 222.
LaTeX Font Info: Checking defaults for OT1/cmr/m/n on input line 222.
LaTeX Font Info: ... okay on input line 222.
LaTeX Font Info: Checking defaults for T1/cmr/m/n on input line 222.
LaTeX Font Info: ... okay on input line 222.
LaTeX Font Info: Checking defaults for TS1/cmr/m/n on input line 222.
LaTeX Font Info: ... okay on input line 222.
LaTeX Font Info: Checking defaults for OMX/cmex/m/n on input line 222.
LaTeX Font Info: ... okay on input line 222.
LaTeX Font Info: Checking defaults for U/cmr/m/n on input line 222.
LaTeX Font Info: ... okay on input line 222.
LaTeX Font Info: Checking defaults for PD1/pdf/m/n on input line 222.
LaTeX Font Info: ... okay on input line 222.
LaTeX Font Info: Checking defaults for PU/pdf/m/n on input line 222.
LaTeX Font Info: ... okay on input line 222.

*geometry* driver: auto-detecting
*geometry* detected driver: pdftex
*geometry* verbose mode - [ preamble ] result:
* driver: pdftex
* paper: <default>
* layout: <same size as paper>
* layoutoffset:(h,v)=(0.0pt,0.0pt)
* modes:
* h-part:(L,W,R)=(54.06006pt, 506.17488pt, 54.06006pt)
* v-part:(T,H,B)=(54.06006pt, 686.84987pt, 54.06006pt)
* \paperwidth=614.295pt
* \paperheight=794.96999pt
* \textwidth=506.17488pt
* \textheight=686.84987pt
* \oddsidemargin=-18.20993pt
* \evensidemargin=-18.20993pt
* \topmargin=-55.20993pt
* \headheight=12.0pt
* \headsep=25.0pt
* \topskip=10.0pt
* \footskip=30.0pt
* \marginparwidth=65.0pt
* \marginparsep=11.0pt
* \columnsep=18.49411pt
* \skip\footins=9.0pt plus 4.0pt minus 2.0pt
* \hoffset=0.0pt
* \voffset=0.0pt
* \mag=1000
* \@twocolumnfalse
* \@twosidefalse
* \@mparswitchfalse
* \@reversemarginfalse
* (1in=72.27pt=25.4mm, 1cm=28.453pt)

Package microtype Info: Patching varwidth to enable character protrusion.
\MT@vwid@leftmargin=\dimen459
\MT@vwid@rightmargin=\dimen460
LaTeX Info: Redefining \microtypecontext on input line 222.
Package microtype Info: Applying patch `item' on input line 222.
Package microtype Info: Applying patch `toc' on input line 222.
Package microtype Info: Applying patch `eqnum' on input line 222.
Package microtype Info: Applying patch `footnote' on input line 222.
Package microtype Info: Applying patch `verbatim' on input line 222.
LaTeX Info: Redefining \microtypesetup on input line 222.
Package microtype Info: Generating PDF output.
Package microtype Info: Character protrusion enabled (level 2).
Package microtype Info: Using default protrusion set `alltext'.
Package microtype Info: Automatic font expansion enabled (level 2),
(microtype) stretch: 20, shrink: 20, step: 1, non-selected.
Package microtype Info: Using default expansion set `alltext-nott'.
LaTeX Info: Redefining \showhyphens on input line 222.
|
| 1420 |
+
Package microtype Info: No adjustment of tracking.
|
| 1421 |
+
Package microtype Info: No adjustment of interword spacing.
|
| 1422 |
+
Package microtype Info: No adjustment of character kerning.
|
| 1423 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/microtype/mt-cmr.cfg
|
| 1424 |
+
File: mt-cmr.cfg 2013/05/19 v2.2 microtype config. file: Computer Modern Roman (RS)
|
| 1425 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/context/base/mkii/supp-pdf.mkii
|
| 1426 |
+
[Loading MPS to PDF converter (version 2006.09.02).]
|
| 1427 |
+
\scratchcounter=\count497
|
| 1428 |
+
\scratchdimen=\dimen461
|
| 1429 |
+
\scratchbox=\box138
|
| 1430 |
+
\nofMPsegments=\count498
|
| 1431 |
+
\nofMParguments=\count499
|
| 1432 |
+
\everyMPshowfont=\toks68
|
| 1433 |
+
\MPscratchCnt=\count500
|
| 1434 |
+
\MPscratchDim=\dimen462
|
| 1435 |
+
\MPnumerator=\count501
|
| 1436 |
+
\makeMPintoPDFobject=\count502
|
| 1437 |
+
\everyMPtoPDFconversion=\toks69
|
| 1438 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/latex/epstopdf-pkg/epstopdf-base.sty
|
| 1439 |
+
Package: epstopdf-base 2020-01-24 v2.11 Base part for package epstopdf
|
| 1440 |
+
Package epstopdf-base Info: Redefining graphics rule for `.eps' on input line 485.
|
| 1441 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/latexconfig/epstopdf-sys.cfg
|
| 1442 |
+
File: epstopdf-sys.cfg 2010/07/13 v1.3 Configuration of (r)epstopdf for TeX Live
|
| 1443 |
+
))
|
| 1444 |
+
Package caption Info: Begin \AtBeginDocument code.
|
| 1445 |
+
Package caption Info: float package is loaded.
|
| 1446 |
+
Package caption Info: hyperref package is loaded.
|
| 1447 |
+
Package caption Info: listings package is loaded.
|
| 1448 |
+
Package caption Info: longtable package is loaded.
|
| 1449 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/caption/ltcaption.sty
|
| 1450 |
+
Package: ltcaption 2021/01/08 v1.4c longtable captions (AR)
|
| 1451 |
+
)
|
| 1452 |
+
Package caption Info: wrapfig package is loaded.
|
| 1453 |
+
Package caption Info: End \AtBeginDocument code.
|
| 1454 |
+
\c__nicematrix_shift_Ldots_last_row_dim=\dimen463
|
| 1455 |
+
\c__nicematrix_shift_exterior_Vdots_dim=\dimen464
|
| 1456 |
+
\c__nicematrix_innersep_middle_dim=\dimen465
|
| 1457 |
+
\c@tabularnotesi=\count503
|
| 1458 |
+
\enitdp@tabularnotes=\count504
|
| 1459 |
+
\c@tabularnotes*i=\count505
|
| 1460 |
+
\enitdp@tabularnotes*=\count506
|
| 1461 |
+
\c@lstlisting=\count507
|
| 1462 |
+
Package hyperref Info: Link coloring ON on input line 222.
|
| 1463 |
+
(./main.out) (./main.out)
|
| 1464 |
+
\@outlinefile=\write5
|
| 1465 |
+
\openout5 = `main.out'.
|
| 1466 |
+
|
| 1467 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/translations/translations-basic-dictionary-english.trsl
|
| 1468 |
+
File: translations-basic-dictionary-english.trsl (english translation file `translations-basic-dictionary')
|
| 1469 |
+
)
|
| 1470 |
+
Package translations Info: loading dictionary `translations-basic-dictionary' for `english'. on input line 222.
|
| 1471 |
+
Package translations Info: no dictionary file `translations-basic-dictionary-latin.trsl' found. on input line 222.
|
| 1472 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/ninecolors/ninecolors.sty
|
| 1473 |
+
Package: ninecolors 2022-02-13 v2022D Select colors with proper color contrast
|
| 1474 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/l3packages/xparse/xparse.sty
|
| 1475 |
+
Package: xparse 2024-08-16 L3 Experimental document command parser
|
| 1476 |
+
))
|
| 1477 |
+
LaTeX Font Info: Trying to load font information for T1+manrope on input line 225.
|
| 1478 |
+
(./t1manrope.fd
|
| 1479 |
+
File: t1manrope.fd Font definitions for T1/manrope.
|
| 1480 |
+
)
|
| 1481 |
+
Package microtype Info: Loading generic protrusion settings for font family
|
| 1482 |
+
(microtype) `manrope' (encoding: T1).
|
| 1483 |
+
(microtype) For optimal results, create family-specific settings.
|
| 1484 |
+
(microtype) See the microtype manual for details.
|
| 1485 |
+
<logos/oxford_logo.png, id=187, 301.125pt x 301.125pt>
|
| 1486 |
+
File: logos/oxford_logo.png Graphic file (type png)
|
| 1487 |
+
<use logos/oxford_logo.png>
|
| 1488 |
+
Package pdftex.def Info: logos/oxford_logo.png used on input line 225.
|
| 1489 |
+
(pdftex.def) Requested size: 9.99826pt x 10.0pt.
|
| 1490 |
+
<logos/hf.pdf, id=188, 1027.84pt x 1027.84pt>
|
| 1491 |
+
File: logos/hf.pdf Graphic file (type pdf)
|
| 1492 |
+
<use logos/hf.pdf>
|
| 1493 |
+
Package pdftex.def Info: logos/hf.pdf used on input line 225.
|
| 1494 |
+
(pdftex.def) Requested size: 9.99042pt x 10.0pt.
|
| 1495 |
+
LaTeX Font Info: Trying to load font information for U+msa on input line 225.
|
| 1496 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/amsfonts/umsa.fd
|
| 1497 |
+
File: umsa.fd 2013/01/14 v3.01 AMS symbols A
|
| 1498 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/latex/microtype/mt-msa.cfg
|
| 1499 |
+
File: mt-msa.cfg 2006/02/04 v1.1 microtype config. file: AMS symbols (a) (RS)
|
| 1500 |
+
)
|
| 1501 |
+
LaTeX Font Info: Trying to load font information for U+msb on input line 225.
|
| 1502 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/amsfonts/umsb.fd
|
| 1503 |
+
File: umsb.fd 2013/01/14 v3.01 AMS symbols B
|
| 1504 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/latex/microtype/mt-msb.cfg
|
| 1505 |
+
File: mt-msb.cfg 2005/06/01 v1.0 microtype config. file: AMS symbols (b) (RS)
|
| 1506 |
+
)
|
| 1507 |
+
File: logos/hf.pdf Graphic file (type pdf)
|
| 1508 |
+
<use logos/hf.pdf>
|
| 1509 |
+
Package pdftex.def Info: logos/hf.pdf used on input line 225.
|
| 1510 |
+
(pdftex.def) Requested size: 9.99042pt x 10.0pt.
|
| 1511 |
+
File: logos/hf.pdf Graphic file (type pdf)
|
| 1512 |
+
<use logos/hf.pdf>
|
| 1513 |
+
Package pdftex.def Info: logos/hf.pdf used on input line 225.
|
| 1514 |
+
(pdftex.def) Requested size: 9.99042pt x 10.0pt.
|
| 1515 |
+
File: logos/hf.pdf Graphic file (type pdf)
|
| 1516 |
+
<use logos/hf.pdf>
|
| 1517 |
+
Package pdftex.def Info: logos/hf.pdf used on input line 225.
|
| 1518 |
+
(pdftex.def) Requested size: 9.99042pt x 10.0pt.
|
| 1519 |
+
File: logos/hf.pdf Graphic file (type pdf)
|
| 1520 |
+
<use logos/hf.pdf>
|
| 1521 |
+
Package pdftex.def Info: logos/hf.pdf used on input line 225.
|
| 1522 |
+
(pdftex.def) Requested size: 9.99042pt x 10.0pt.
|
| 1523 |
+
File: logos/oxford_logo.png Graphic file (type png)
|
| 1524 |
+
<use logos/oxford_logo.png>
|
| 1525 |
+
Package pdftex.def Info: logos/oxford_logo.png used on input line 225.
|
| 1526 |
+
(pdftex.def) Requested size: 9.2447pt x 9.24774pt.
|
| 1527 |
+
File: logos/hf.pdf Graphic file (type pdf)
|
| 1528 |
+
<use logos/hf.pdf>
|
| 1529 |
+
Package pdftex.def Info: logos/hf.pdf used on input line 225.
|
| 1530 |
+
(pdftex.def) Requested size: 9.23761pt x 9.24774pt.
|
| 1531 |
+
(./sections/00_abstract.tex
|
| 1532 |
+
Package microtype Info: Loading generic protrusion settings for font family
|
| 1533 |
+
(microtype) `cmtt' (encoding: T1).
|
| 1534 |
+
(microtype) For optimal results, create family-specific settings.
|
| 1535 |
+
(microtype) See the microtype manual for details.
|
| 1536 |
+
|
| 1537 |
+
Underfull \hbox (badness 10000) in paragraph at lines 1--6
|
| 1538 |
+
|
| 1539 |
+
[]
|
| 1540 |
+
|
| 1541 |
+
LaTeX Font Info: Font shape `T1/cmtt/bx/n' in size <9> not available
|
| 1542 |
+
(Font) Font shape `T1/cmtt/m/n' tried instead on input line 7.
|
| 1543 |
+
)
|
| 1544 |
+
<hfstyle/assets/huggingface.pdf, id=190, 966.66142pt x 256.97005pt>
|
| 1545 |
+
File: hfstyle/assets/huggingface.pdf Graphic file (type pdf)
|
| 1546 |
+
<use hfstyle/assets/huggingface.pdf>
|
| 1547 |
+
Package pdftex.def Info: hfstyle/assets/huggingface.pdf used on input line 225.
|
| 1548 |
+
(pdftex.def) Requested size: 101.57654pt x 27.00029pt.
|
| 1549 |
+
(./main.toc
|
| 1550 |
+
|
| 1551 |
+
[1
|
| 1552 |
+
|
| 1553 |
+
{/usr/local/texlive/2025/texmf-dist/fonts/enc/ttf2pk/base/T1-WGL4.enc}{/usr/local/texlive/2025/texmf-dist/fonts/enc/dvips/cm-super/cm-super-t1.enc} <./logos/oxford_logo.png> <./logos/hf.pdf> <./hfstyle/assets/huggingface.pdf>])
|
| 1554 |
+
\tf@toc=\write6
|
| 1555 |
+
\openout6 = `main.toc'.
|
| 1556 |
+
|
| 1557 |
+
(./sections/A_foreword.tex)
|
| 1558 |
+
|
| 1559 |
+
[2{/usr/local/texlive/2025/texmf-dist/fonts/enc/dvips/cm-super/cm-super-ts1.enc}] (./sections/01_introduction.tex
|
| 1560 |
+
<figures/ch1/ch1-lerobot-figure1.png, id=278, 5524.64pt x 1405.25pt>
|
| 1561 |
+
File: figures/ch1/ch1-lerobot-figure1.png Graphic file (type png)
|
| 1562 |
+
<use figures/ch1/ch1-lerobot-figure1.png>
|
| 1563 |
+
Package pdftex.def Info: figures/ch1/ch1-lerobot-figure1.png used on input line 5.
|
| 1564 |
+
(pdftex.def) Requested size: 506.17488pt x 128.73993pt.
|
| 1565 |
+
|
| 1566 |
+
|
| 1567 |
+
[3 <./figures/ch1/ch1-lerobot-figure1.png>]
|
| 1568 |
+
LaTeX Font Info: Font shape `T1/cmtt/bx/n' in size <12> not available
|
| 1569 |
+
(Font) Font shape `T1/cmtt/m/n' tried instead on input line 43.
|
| 1570 |
+
|
| 1571 |
+
|
| 1572 |
+
[4] (/usr/local/texlive/2025/texmf-dist/tex/latex/listings/lstlang1.sty
|
| 1573 |
+
File: lstlang1.sty 2024/09/23 1.10c listings language file
|
| 1574 |
+
)
|
| 1575 |
+
Package hyperref Info: bookmark level for unknown lstlisting defaults to 0 on input line 97.
|
| 1576 |
+
(./snippets/ch1/01_datasets.py
|
| 1577 |
+
Package microtype Info: Loading generic protrusion settings for font family
|
| 1578 |
+
(microtype) `cmtt' (encoding: TS1).
|
| 1579 |
+
(microtype) For optimal results, create family-specific settings.
|
| 1580 |
+
(microtype) See the microtype manual for details.
|
| 1581 |
+
)
|
| 1582 |
+
|
| 1583 |
+
[5] (./snippets/ch1/02_record_data.py)
|
| 1584 |
+
|
| 1585 |
+
[6]
|
| 1586 |
+
|
| 1587 |
+
[7])
|
| 1588 |
+
|
| 1589 |
+
[8] (./sections/02_classic_robotics.tex
|
| 1590 |
+
<figures/ch2/ch2-approaches.png, id=491, 1090.0725pt x 921.4425pt>
|
| 1591 |
+
File: figures/ch2/ch2-approaches.png Graphic file (type png)
|
| 1592 |
+
<use figures/ch2/ch2-approaches.png>
|
| 1593 |
+
Package pdftex.def Info: figures/ch2/ch2-approaches.png used on input line 14.
|
| 1594 |
+
(pdftex.def) Requested size: 253.08743pt x 213.92384pt.
|
| 1595 |
+
|
| 1596 |
+
|
| 1597 |
+
[9 <./figures/ch2/ch2-approaches.png>]
|
| 1598 |
+
<figures/ch2/ch2-platforms.png, id=527, 3131.7pt x 1172.38pt>
|
| 1599 |
+
File: figures/ch2/ch2-platforms.png Graphic file (type png)
|
| 1600 |
+
<use figures/ch2/ch2-platforms.png>
|
| 1601 |
+
Package pdftex.def Info: figures/ch2/ch2-platforms.png used on input line 37.
|
| 1602 |
+
(pdftex.def) Requested size: 354.32086pt x 132.62946pt.
|
| 1603 |
+
|
| 1604 |
+
|
| 1605 |
+
[10 <./figures/ch2/ch2-platforms.png>]
|
| 1606 |
+
<figures/ch2/ch2-cost-accessibility.png, id=574, 1445.4pt x 1087.06125pt>
|
| 1607 |
+
File: figures/ch2/ch2-cost-accessibility.png Graphic file (type png)
|
| 1608 |
+
<use figures/ch2/ch2-cost-accessibility.png>
|
| 1609 |
+
Package pdftex.def Info: figures/ch2/ch2-cost-accessibility.png used on input line 68.
|
| 1610 |
+
(pdftex.def) Requested size: 202.46686pt x 152.27046pt.
|
| 1611 |
+
<figures/ch2/ch2-so100-to-planar-manipulator.png, id=582, 1487.5575pt x 532.99126pt>
|
| 1612 |
+
File: figures/ch2/ch2-so100-to-planar-manipulator.png Graphic file (type png)
|
| 1613 |
+
<use figures/ch2/ch2-so100-to-planar-manipulator.png>
|
| 1614 |
+
Package pdftex.def Info: figures/ch2/ch2-so100-to-planar-manipulator.png used on input line 79.
|
| 1615 |
+
(pdftex.def) Requested size: 354.32086pt x 126.95271pt.
|
| 1616 |
+
|
| 1617 |
+
|
| 1618 |
+
[11 <./figures/ch2/ch2-cost-accessibility.png> <./figures/ch2/ch2-so100-to-planar-manipulator.png>]
|
| 1619 |
+
<figures/ch2/ch2-planar-manipulator-free.png, id=604, 707.64375pt x 622.325pt>
|
| 1620 |
+
File: figures/ch2/ch2-planar-manipulator-free.png Graphic file (type png)
|
| 1621 |
+
<use figures/ch2/ch2-planar-manipulator-free.png>
|
| 1622 |
+
Package pdftex.def Info: figures/ch2/ch2-planar-manipulator-free.png used on input line 101.
|
| 1623 |
+
(pdftex.def) Requested size: 103.52892pt x 91.0467pt.
|
| 1624 |
+
<figures/ch2/ch2-planar-manipulator-floor.png, id=605, 983.675pt x 627.34375pt>
|
| 1625 |
+
File: figures/ch2/ch2-planar-manipulator-floor.png Graphic file (type png)
|
| 1626 |
+
<use figures/ch2/ch2-planar-manipulator-floor.png>
|
| 1627 |
+
Package pdftex.def Info: figures/ch2/ch2-planar-manipulator-floor.png used on input line 107.
|
| 1628 |
+
(pdftex.def) Requested size: 142.7568pt x 91.04387pt.
|
| 1629 |
+
<figures/ch2/ch2-planar-manipulator-floor-shelf.png, id=606, 1033.8625pt x 622.325pt>
|
| 1630 |
+
File: figures/ch2/ch2-planar-manipulator-floor-shelf.png Graphic file (type png)
|
| 1631 |
+
<use figures/ch2/ch2-planar-manipulator-floor-shelf.png>
|
| 1632 |
+
Package pdftex.def Info: figures/ch2/ch2-planar-manipulator-floor-shelf.png used on input line 113.
|
| 1633 |
+
(pdftex.def) Requested size: 151.25502pt x 91.0467pt.
|
| 1634 |
+
|
| 1635 |
+
|
| 1636 |
+
[12 <./figures/ch2/ch2-planar-manipulator-free.png> <./figures/ch2/ch2-planar-manipulator-floor.png> <./figures/ch2/ch2-planar-manipulator-floor-shelf.png>]
|
| 1637 |
+
<figures/ch2/ch2-planar-manipulator-floor-box.png, id=628, 1134.2375pt x 622.325pt>
|
| 1638 |
+
File: figures/ch2/ch2-planar-manipulator-floor-box.png Graphic file (type png)
|
| 1639 |
+
<use figures/ch2/ch2-planar-manipulator-floor-box.png>
|
| 1640 |
+
Package pdftex.def Info: figures/ch2/ch2-planar-manipulator-floor-box.png used on input line 180.
|
| 1641 |
+
(pdftex.def) Requested size: 151.854pt x 83.31705pt.
|
| 1642 |
+
|
| 1643 |
+
|
| 1644 |
+
[13 <./figures/ch2/ch2-planar-manipulator-floor-box.png>]
|
| 1645 |
+
<figures/ch2/ch2-classical-limitations.png, id=648, 3091.55pt x 2368.85pt>
|
| 1646 |
+
File: figures/ch2/ch2-classical-limitations.png Graphic file (type png)
|
| 1647 |
+
<use figures/ch2/ch2-classical-limitations.png>
|
| 1648 |
+
Package pdftex.def Info: figures/ch2/ch2-classical-limitations.png used on input line 205.
|
| 1649 |
+
(pdftex.def) Requested size: 455.55429pt x 349.05896pt.
|
| 1650 |
+
|
| 1651 |
+
|
| 1652 |
+
[14 <./figures/ch2/ch2-classical-limitations.png>])
|
| 1653 |
+
|
| 1654 |
+
[15] (./sections/03_reinforcement_learning.tex
|
| 1655 |
+
<figures/ch3/ch3-learning-benefits.png, id=677, 2930.95pt x 2091.815pt>
|
| 1656 |
+
File: figures/ch3/ch3-learning-benefits.png Graphic file (type png)
|
| 1657 |
+
<use figures/ch3/ch3-learning-benefits.png>
|
| 1658 |
+
Package pdftex.def Info: figures/ch3/ch3-learning-benefits.png used on input line 12.
|
| 1659 |
+
(pdftex.def) Requested size: 455.55429pt x 325.12169pt.
|
| 1660 |
+
|
| 1661 |
+
|
| 1662 |
+
[16 <./figures/ch3/ch3-learning-benefits.png>]
|
| 1663 |
+
<figures/ch3/ch3-learning-atlas.png, id=688, 1369.115pt x 1529.715pt>
|
| 1664 |
+
File: figures/ch3/ch3-learning-atlas.png Graphic file (type png)
|
| 1665 |
+
<use figures/ch3/ch3-learning-atlas.png>
|
| 1666 |
+
Package pdftex.def Info: figures/ch3/ch3-learning-atlas.png used on input line 33.
|
| 1667 |
+
(pdftex.def) Requested size: 151.854pt x 169.64632pt.
|
| 1668 |
+
|
| 1669 |
+
Underfull \hbox (badness 1221) in paragraph at lines 34--34
|
| 1670 |
+
[]\T1/manrope/b/n/9 (+20) Figure 10 $\OMS/cmsy/m/n/9 j$ |\T1/cmr/m/n/9 (+20) Overview of the
|
| 1671 |
+
[]
|
| 1672 |
+
|
| 1673 |
+
<figures/ch3/ch3-rl-examples.png, id=723, 3220.03pt x 1015.795pt>
|
| 1674 |
+
File: figures/ch3/ch3-rl-examples.png Graphic file (type png)
|
| 1675 |
+
<use figures/ch3/ch3-rl-examples.png>
|
| 1676 |
+
Package pdftex.def Info: figures/ch3/ch3-rl-examples.png used on input line 46.
|
| 1677 |
+
(pdftex.def) Requested size: 404.94144pt x 127.73355pt.
|
| 1678 |
+
<figures/ch3/ch3-agent-env.png, id=731, 1107.13625pt x 466.74374pt>
|
| 1679 |
+
File: figures/ch3/ch3-agent-env.png Graphic file (type png)
|
| 1680 |
+
<use figures/ch3/ch3-agent-env.png>
|
| 1681 |
+
Package pdftex.def Info: figures/ch3/ch3-agent-env.png used on input line 65.
|
| 1682 |
+
(pdftex.def) Requested size: 253.08743pt x 106.69357pt.
|
| 1683 |
+
|
| 1684 |
+
|
| 1685 |
+
[17 <./figures/ch3/ch3-rl-examples.png> <./figures/ch3/ch3-learning-atlas.png>]
|
| 1686 |
+
|
| 1687 |
+
[18 <./figures/ch3/ch3-agent-env.png>]
|
| 1688 |
+
<figures/ch3/ch3-rl-algorithms-atlas.png, id=767, 1489.565pt x 1730.465pt>
|
| 1689 |
+
File: figures/ch3/ch3-rl-algorithms-atlas.png Graphic file (type png)
|
| 1690 |
+
<use figures/ch3/ch3-rl-algorithms-atlas.png>
|
| 1691 |
+
Package pdftex.def Info: figures/ch3/ch3-rl-algorithms-atlas.png used on input line 139.
|
| 1692 |
+
(pdftex.def) Requested size: 202.46686pt x 235.18698pt.
|
| 1693 |
+
|
| 1694 |
+
|
| 1695 |
+
[19 <./figures/ch3/ch3-rl-algorithms-atlas.png>]
|
| 1696 |
+
<figures/ch3/ch3-duck-sim-vs-real.png, id=805, 1850.915pt x 1043.9pt>
|
| 1697 |
+
File: figures/ch3/ch3-duck-sim-vs-real.png Graphic file (type png)
|
| 1698 |
+
<use figures/ch3/ch3-duck-sim-vs-real.png>
|
| 1699 |
+
Package pdftex.def Info: figures/ch3/ch3-duck-sim-vs-real.png used on input line 162.
|
| 1700 |
+
(pdftex.def) Requested size: 354.32086pt x 199.82442pt.
|
| 1701 |
+
<figures/ch3/ch3-many-ducks.png, id=818, 3203.97pt x 682.55pt>
|
| 1702 |
+
File: figures/ch3/ch3-many-ducks.png Graphic file (type png)
|
| 1703 |
+
<use figures/ch3/ch3-many-ducks.png>
|
| 1704 |
+
Package pdftex.def Info: figures/ch3/ch3-many-ducks.png used on input line 177.
|
| 1705 |
+
(pdftex.def) Requested size: 455.55429pt x 97.04567pt.
|
| 1706 |
+
|
| 1707 |
+
|
| 1708 |
+
[20 <./figures/ch3/ch3-duck-sim-vs-real.png>]
|
| 1709 |
+
|
| 1710 |
+
[21 <./figures/ch3/ch3-many-ducks.png>]
|
| 1711 |
+
|
| 1712 |
+
[22]
|
| 1713 |
+
<figures/ch3/ch3-hil-serl-examples.png, id=921, 3741.98pt x 2095.83pt>
|
| 1714 |
+
File: figures/ch3/ch3-hil-serl-examples.png Graphic file (type png)
|
| 1715 |
+
<use figures/ch3/ch3-hil-serl-examples.png>
|
| 1716 |
+
Package pdftex.def Info: figures/ch3/ch3-hil-serl-examples.png used on input line 306.
|
| 1717 |
+
(pdftex.def) Requested size: 404.94144pt x 226.80038pt.
|
| 1718 |
+
|
| 1719 |
+
|
| 1720 |
+
[23]
|
| 1721 |
+
<figures/ch3/ch3-hil-serl-architecture.png, id=941, 2971.1pt x 1533.73pt>
|
| 1722 |
+
File: figures/ch3/ch3-hil-serl-architecture.png Graphic file (type png)
|
| 1723 |
+
<use figures/ch3/ch3-hil-serl-architecture.png>
|
| 1724 |
+
Package pdftex.def Info: figures/ch3/ch3-hil-serl-architecture.png used on input line 324.
|
| 1725 |
+
(pdftex.def) Requested size: 455.55429pt x 235.15138pt.
|
| 1726 |
+
|
| 1727 |
+
|
| 1728 |
+
[24 <./figures/ch3/ch3-hil-serl-examples.png>] (./snippets/ch3/01_reward_classifier.py
|
| 1729 |
+
Overfull \hbox (9.50912pt too wide) in paragraph at lines 27--30
|
| 1730 |
+
[][][][][][][][][][][][][][][][][][][][][][][][][][][]
|
| 1731 |
+
[]
|
| 1732 |
+
|
| 1733 |
+
|
| 1734 |
+
Overfull \hbox (4.41032pt too wide) in paragraph at lines 59--60
|
| 1735 |
+
[][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][]
|
| 1736 |
+
[]
|
| 1737 |
+
|
| 1738 |
+
)
|
| 1739 |
+
|
| 1740 |
+
[25 <./figures/ch3/ch3-hil-serl-architecture.png>] (./snippets/ch3/02_actor.py
|
| 1741 |
+
Overfull \hbox (14.60793pt too wide) in paragraph at lines 18--19
|
| 1742 |
+
[][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][]
|
| 1743 |
+
[]
|
| 1744 |
+
|
| 1745 |
+
)
|
| 1746 |
+
|
| 1747 |
+
[26]
|
| 1748 |
+
|
| 1749 |
+
[27] (./snippets/ch3/03_learner.py)
|
| 1750 |
+
|
| 1751 |
+
[28]
|
| 1752 |
+
|
| 1753 |
+
[29] (./snippets/ch3/04_hil_serl.py)
|
| 1754 |
+
|
| 1755 |
+
[30]
|
| 1756 |
+
|
| 1757 |
+
[31])
|
| 1758 |
+
|
| 1759 |
+
[32] (./sections/04_imitation_learning.tex
|
| 1760 |
+
<figures/ch4/ch4-bc-trajectories.png, id=1448, 4187.645pt x 1533.73pt>
|
| 1761 |
+
File: figures/ch4/ch4-bc-trajectories.png Graphic file (type png)
|
| 1762 |
+
<use figures/ch4/ch4-bc-trajectories.png>
|
| 1763 |
+
Package pdftex.def Info: figures/ch4/ch4-bc-trajectories.png used on input line 17.
|
| 1764 |
+
(pdftex.def) Requested size: 404.94144pt x 148.30357pt.
|
| 1765 |
+
<figures/ch4/ch4-observation-action-mapping.png, id=1454, 5468.43pt x 1585.925pt>
|
| 1766 |
+
File: figures/ch4/ch4-observation-action-mapping.png Graphic file (type png)
|
| 1767 |
+
<use figures/ch4/ch4-observation-action-mapping.png>
|
| 1768 |
+
Package pdftex.def Info: figures/ch4/ch4-observation-action-mapping.png used on input line 40.
|
| 1769 |
+
(pdftex.def) Requested size: 455.55429pt x 132.10362pt.
|
| 1770 |
+
|
| 1771 |
+
|
| 1772 |
+
[33 <./figures/ch4/ch4-bc-trajectories.png>]
|
| 1773 |
+
<figures/ch4/ch4-issues-with-bc.png, id=1473, 2188.175pt x 859.21pt>
|
| 1774 |
+
File: figures/ch4/ch4-issues-with-bc.png Graphic file (type png)
|
| 1775 |
+
<use figures/ch4/ch4-issues-with-bc.png>
|
| 1776 |
+
Package pdftex.def Info: figures/ch4/ch4-issues-with-bc.png used on input line 69.
|
| 1777 |
+
(pdftex.def) Requested size: 404.94144pt x 159.0038pt.
|
| 1778 |
+
|
| 1779 |
+
|
| 1780 |
+
[34 <./figures/ch4/ch4-observation-action-mapping.png>]
|
| 1781 |
+
<figures/ch4/ch4-task-effect-on-pairs.png, id=1507, 2493.315pt x 971.63pt>
|
| 1782 |
+
File: figures/ch4/ch4-task-effect-on-pairs.png Graphic file (type png)
|
| 1783 |
+
<use figures/ch4/ch4-task-effect-on-pairs.png>
|
| 1784 |
+
Package pdftex.def Info: figures/ch4/ch4-task-effect-on-pairs.png used on input line 94.
|
| 1785 |
+
(pdftex.def) Requested size: 404.94144pt x 157.79163pt.
|
| 1786 |
+
|
| 1787 |
+
|
| 1788 |
+
[35 <./figures/ch4/ch4-issues-with-bc.png> <./figures/ch4/ch4-task-effect-on-pairs.png>]
|
| 1789 |
+
<figures/ch4/ch4-latent-variable-model.png, id=1521, 4516.875pt x 1104.125pt>
|
| 1790 |
+
File: figures/ch4/ch4-latent-variable-model.png Graphic file (type png)
|
| 1791 |
+
<use figures/ch4/ch4-latent-variable-model.png>
|
| 1792 |
+
Package pdftex.def Info: figures/ch4/ch4-latent-variable-model.png used on input line 113.
|
| 1793 |
+
(pdftex.def) Requested size: 455.55429pt x 111.34558pt.
|
| 1794 |
+
|
| 1795 |
+
|
| 1796 |
+
[36 <./figures/ch4/ch4-latent-variable-model.png>]
|
| 1797 |
+
|
| 1798 |
+
[37]
|
| 1799 |
+
<figures/ch4/ch4-many-latents.png, id=1568, 2485.285pt x 1308.89pt>
|
| 1800 |
+
File: figures/ch4/ch4-many-latents.png Graphic file (type png)
|
| 1801 |
+
<use figures/ch4/ch4-many-latents.png>
|
| 1802 |
+
Package pdftex.def Info: figures/ch4/ch4-many-latents.png used on input line 189.
|
| 1803 |
+
(pdftex.def) Requested size: 253.08743pt x 133.27333pt.
|
| 1804 |
+
|
| 1805 |
+
|
| 1806 |
+
[38 <./figures/ch4/ch4-many-latents.png>]
|
| 1807 |
+
<figures/ch4/ch4-diffusion-robot-actions.png, id=1596, 2637.855pt x 3131.7pt>
|
| 1808 |
+
File: figures/ch4/ch4-diffusion-robot-actions.png Graphic file (type png)
|
| 1809 |
+
<use figures/ch4/ch4-diffusion-robot-actions.png>
|
| 1810 |
+
Package pdftex.def Info: figures/ch4/ch4-diffusion-robot-actions.png used on input line 251.
|
| 1811 |
+
(pdftex.def) Requested size: 455.55429pt x 540.84006pt.
|
| 1812 |
+
|
| 1813 |
+
|
| 1814 |
+
[39]
|
| 1815 |
+
|
| 1816 |
+
[40 <./figures/ch4/ch4-diffusion-robot-actions.png>]
|
| 1817 |
+
<figures/ch4/ch4-action-vs-observation-distribution.png, id=1633, 1232.605pt x 987.69pt>
|
| 1818 |
+
File: figures/ch4/ch4-action-vs-observation-distribution.png Graphic file (type png)
|
| 1819 |
+
<use figures/ch4/ch4-action-vs-observation-distribution.png>
|
| 1820 |
+
Package pdftex.def Info: figures/ch4/ch4-action-vs-observation-distribution.png used on input line 273.
|
| 1821 |
+
(pdftex.def) Requested size: 151.854pt x 121.6675pt.
|
| 1822 |
+
|
| 1823 |
+
|
| 1824 |
+
[41 <./figures/ch4/ch4-action-vs-observation-distribution.png>]
|
| 1825 |
+
<figures/ch4/ch4-normalizing-flows.png, id=1663, 3966.82pt x 1473.505pt>
|
| 1826 |
+
File: figures/ch4/ch4-normalizing-flows.png Graphic file (type png)
|
| 1827 |
+
<use figures/ch4/ch4-normalizing-flows.png>
|
| 1828 |
+
Package pdftex.def Info: figures/ch4/ch4-normalizing-flows.png used on input line 318.
|
| 1829 |
+
(pdftex.def) Requested size: 455.55429pt x 169.21342pt.
|
| 1830 |
+
|
| 1831 |
+
|
| 1832 |
+
[42 <./figures/ch4/ch4-normalizing-flows.png>]
|
| 1833 |
+
<figures/ch4/ch4-diffusion-vs-flowmatching.png, id=1680, 1084.05pt x 361.35pt>
|
| 1834 |
+
File: figures/ch4/ch4-diffusion-vs-flowmatching.png Graphic file (type png)
|
| 1835 |
+
<use figures/ch4/ch4-diffusion-vs-flowmatching.png>
|
| 1836 |
+
Package pdftex.def Info: figures/ch4/ch4-diffusion-vs-flowmatching.png used on input line 332.
|
| 1837 |
+
(pdftex.def) Requested size: 506.17488pt x 168.72626pt.
|
| 1838 |
+
|
| 1839 |
+
|
| 1840 |
+
[43 <./figures/ch4/ch4-diffusion-vs-flowmatching.png>]
|
| 1841 |
+
<figures/ch4/ch4-act-encoder.png, id=1728, 4131.435pt x 2750.275pt>
|
| 1842 |
+
File: figures/ch4/ch4-act-encoder.png Graphic file (type png)
|
| 1843 |
+
<use figures/ch4/ch4-act-encoder.png>
|
| 1844 |
+
Package pdftex.def Info: figures/ch4/ch4-act-encoder.png used on input line 387.
|
| 1845 |
+
(pdftex.def) Requested size: 379.63115pt x 252.71782pt.
|
| 1846 |
+
<figures/ch4/ch4-act-decoder.png, id=1731, 5994.395pt x 3830.31pt>
|
| 1847 |
+
File: figures/ch4/ch4-act-decoder.png Graphic file (type png)
|
| 1848 |
+
<use figures/ch4/ch4-act-decoder.png>
|
| 1849 |
+
Package pdftex.def Info: figures/ch4/ch4-act-decoder.png used on input line 399.
|
| 1850 |
+
(pdftex.def) Requested size: 379.63115pt x 242.54985pt.
|
| 1851 |
+
<figures/ch4/ch4-act.png, id=1732, 3926.67pt x 1698.345pt>
|
| 1852 |
+
File: figures/ch4/ch4-act.png Graphic file (type png)
|
| 1853 |
+
<use figures/ch4/ch4-act.png>
|
| 1854 |
+
Package pdftex.def Info: figures/ch4/ch4-act.png used on input line 408.
|
| 1855 |
+
(pdftex.def) Requested size: 455.55429pt x 197.02888pt.
|
| 1856 |
+
(./snippets/ch4/01_training_act.py)
|
| 1857 |
+
|
| 1858 |
+
[44]
|
| 1859 |
+
|
| 1860 |
+
[45 <./figures/ch4/ch4-act-encoder.png> <./figures/ch4/ch4-act-decoder.png>]
|
| 1861 |
+
|
| 1862 |
+
[46 <./figures/ch4/ch4-act.png>] (./snippets/ch4/02_using_act.py)
|
| 1863 |
+
|
| 1864 |
+
[47]
|
| 1865 |
+
|
| 1866 |
+
[48]
|
| 1867 |
+
<figures/ch4/ch4-diffusion-policy.png, id=1944, 6114.845pt x 2971.1pt>
|
| 1868 |
+
File: figures/ch4/ch4-diffusion-policy.png Graphic file (type png)
|
| 1869 |
+
<use figures/ch4/ch4-diffusion-policy.png>
|
| 1870 |
+
Package pdftex.def Info: figures/ch4/ch4-diffusion-policy.png used on input line 442.
|
| 1871 |
+
(pdftex.def) Requested size: 455.55429pt x 221.32681pt.
|
| 1872 |
+
|
| 1873 |
+
|
| 1874 |
+
[49 <./figures/ch4/ch4-diffusion-policy.png>]
|
| 1875 |
+
Underfull \hbox (badness 10000) in paragraph at lines 465--465
|
| 1876 |
+
[][]$\T1/cmtt/m/n/9 https : / / github . com / fracapuano / robot-[]learning-[]tutorial / blob / main / snippets / ch4 / 03 _ training _ diffusion .
|
| 1877 |
+
[]
|
| 1878 |
+
|
| 1879 |
+
(./snippets/ch4/03_training_diffusion.py)
|
| 1880 |
+
|
| 1881 |
+
[50] (./snippets/ch4/04_using_diffusion.py)
|
| 1882 |
+
|
| 1883 |
+
[51]
|
| 1884 |
+
|
| 1885 |
+
[52]
|
| 1886 |
+
<figures/ch4/ch4-async-inference.png, id=2173, 2413.015pt x 1200.485pt>
|
| 1887 |
+
File: figures/ch4/ch4-async-inference.png Graphic file (type png)
|
| 1888 |
+
<use figures/ch4/ch4-async-inference.png>
|
| 1889 |
+
Package pdftex.def Info: figures/ch4/ch4-async-inference.png used on input line 497.
|
| 1890 |
+
(pdftex.def) Requested size: 455.55429pt x 226.62909pt.
|
| 1891 |
+
Package hyperref Info: bookmark level for unknown algorithm defaults to 0 on input line 505.
|
| 1892 |
+
|
| 1893 |
+
|
| 1894 |
+
[53 <./figures/ch4/ch4-async-inference.png>]
|
| 1895 |
+
<figures/ch4/ch4-queues.png, id=2186, 3292.3pt x 983.675pt>
|
| 1896 |
+
File: figures/ch4/ch4-queues.png Graphic file (type png)
|
| 1897 |
+
<use figures/ch4/ch4-queues.png>
|
| 1898 |
+
Package pdftex.def Info: figures/ch4/ch4-queues.png used on input line 550.
|
| 1899 |
+
(pdftex.def) Requested size: 501.1159pt x 149.72128pt.

[54 <./figures/ch4/ch4-queues.png>] (./snippets/ch4/05_policy_server.py) (./snippets/ch4/06_robot_client.py
Overfull \hbox (14.60793pt too wide) in paragraph at lines 9--10
[][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][][]
[]

)

[55])

[56] (./sections/05_foundation_models.tex
<figures/ch5/ch5-ml-vs-robotics-foundation.png, id=2290, 1995.455pt x 1114.1625pt>
File: figures/ch5/ch5-ml-vs-robotics-foundation.png Graphic file (type png)
<use figures/ch5/ch5-ml-vs-robotics-foundation.png>
Package pdftex.def Info: figures/ch5/ch5-ml-vs-robotics-foundation.png used on input line 18.
(pdftex.def) Requested size: 455.55429pt x 254.3479pt.

[57 <./figures/ch5/ch5-ml-vs-robotics-foundation.png>]
<figures/ch5/ch5-generalist-policies-timeline.png, id=2304, 1469.49pt x 548.0475pt>
File: figures/ch5/ch5-generalist-policies-timeline.png Graphic file (type png)
<use figures/ch5/ch5-generalist-policies-timeline.png>
Package pdftex.def Info: figures/ch5/ch5-generalist-policies-timeline.png used on input line 36.
(pdftex.def) Requested size: 404.94144pt x 151.01875pt.

[58 <./figures/ch5/ch5-generalist-policies-timeline.png>]
<figures/ch5/ch5-trends.png, id=2368, 3411.74625pt x 888.31876pt>
File: figures/ch5/ch5-trends.png Graphic file (type png)
<use figures/ch5/ch5-trends.png>
Package pdftex.def Info: figures/ch5/ch5-trends.png used on input line 71.
(pdftex.def) Requested size: 455.55429pt x 118.60306pt.

[59 <./figures/ch5/ch5-trends.png>]

Package hyperref Warning: Token not allowed in a PDF string (Unicode):
(hyperref) removing `math shift' on input line 107.

Package hyperref Warning: Token not allowed in a PDF string (Unicode):
(hyperref) removing `\pi' on input line 107.

Package hyperref Warning: Token not allowed in a PDF string (Unicode):
(hyperref) removing `subscript' on input line 107.

[60]
<figures/ch5/ch5-pi0.png, id=2498, 2854.665pt x 1427.3325pt>
File: figures/ch5/ch5-pi0.png Graphic file (type png)
<use figures/ch5/ch5-pi0.png>
Package pdftex.def Info: figures/ch5/ch5-pi0.png used on input line 115.
(pdftex.def) Requested size: 455.55429pt x 227.76804pt.

[61 <./figures/ch5/ch5-pi0.png>]
<figures/ch5/ch5-pi0-sampling-timesteps.png, id=2524, 761.84625pt x 520.94624pt>
File: figures/ch5/ch5-pi0-sampling-timesteps.png Graphic file (type png)
<use figures/ch5/ch5-pi0-sampling-timesteps.png>
Package pdftex.def Info: figures/ch5/ch5-pi0-sampling-timesteps.png used on input line 174.
(pdftex.def) Requested size: 202.46686pt x 138.43959pt.

[62 <./figures/ch5/ch5-pi0-sampling-timesteps.png>]
LaTeX Font Info: Calculating math sizes for size <11> on input line 195.

LaTeX Font Warning: Font shape `OT1/cmr/m/n' in size <5.5> not available
(Font) size <5> substituted on input line 195.

LaTeX Font Warning: Font shape `OML/cmm/m/it' in size <5.5> not available
(Font) size <5> substituted on input line 195.

LaTeX Font Warning: Font shape `OMS/cmsy/m/n' in size <5.5> not available
(Font) size <5> substituted on input line 195.

LaTeX Font Warning: Font shape `OT1/cmr/bx/n' in size <5.5> not available
(Font) size <5> substituted on input line 195.

LaTeX Font Warning: Font shape `OML/cmm/b/it' in size <5.5> not available
(Font) size <5> substituted on input line 195.

LaTeX Font Warning: Font shape `OMS/cmsy/b/n' in size <5.5> not available
(Font) size <5> substituted on input line 195.

Package hyperref Warning: Token not allowed in a PDF string (Unicode):
(hyperref) removing `math shift' on input line 195.

Package hyperref Warning: Token not allowed in a PDF string (Unicode):
(hyperref) removing `\pi' on input line 195.

Package hyperref Warning: Token not allowed in a PDF string (Unicode):
(hyperref) removing `subscript' on input line 195.

(./snippets/ch5/01_using_pi0.py)

[63]
<figures/ch5/ch5-smolvla.png, id=2609, 4215.75pt x 2071.74pt>
File: figures/ch5/ch5-smolvla.png Graphic file (type png)
<use figures/ch5/ch5-smolvla.png>
Package pdftex.def Info: figures/ch5/ch5-smolvla.png used on input line 206.
(pdftex.def) Requested size: 455.55429pt x 223.84575pt.

[64 <./figures/ch5/ch5-smolvla.png>] (./snippets/ch5/02_using_smolvla.py)

[65])

[66] (./sections/07_conclusions.tex) (./main.bbl

[67]

[68]

[69]

[70]

[71]

[72]

[73]

[74]

[75])

[76]
runsystem(latexminted cleantemp --timestamp 20251013185510 FAD58DE7366495DB4650CFEFAC2FCD61)...executed safely (allowed).

(./main.aux)
***********
LaTeX2e <2024-11-01> patch level 2
L3 programming layer <2025-01-18>
***********

LaTeX Font Warning: Size substitutions with differences
(Font) up to 0.5pt have occurred.

Package rerunfilecheck Info: File `main.out' has not changed.
(rerunfilecheck) Checksum: C1A1034AE4EAF3F94DB7F44CB26742A6;7176.
)
Here is how much of TeX's memory you used:
 57004 strings out of 473190
 1236963 string characters out of 5715800
 2180559 words of memory out of 5000000
 77309 multiletter control sequences out of 15000+600000
 619795 words of font info for 422 fonts, out of 8000000 for 9000
 1141 hyphenation exceptions out of 8191
 101i,15n,112p,8408b,1846s stack positions out of 10000i,1000n,20000p,200000b,200000s
<hfstyle/manrope/Manrope-Regular.ttf><hfstyle/manrope/Manrope-Bold.ttf></usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmbx10.pfb></usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmbx7.pfb></usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmex10.pfb></usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cmextra/cmex7.pfb></usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmmi10.pfb></usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmmi12.pfb></usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmmi5.pfb></usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmmi6.pfb></usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmmi7.pfb></usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmmi9.pfb></usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmmib10.pfb></usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmr10.pfb></usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmr5.pfb></usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmr6.pfb></usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmr7.pfb></usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmr8.pfb></usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmr9.pfb></usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmsy10.pfb></usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmsy5.pfb></usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmsy7.pfb></usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmsy9.pfb></usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/symbols/msbm10.pfb></usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/symbols/msbm7.pfb></usr/local/texlive/2025/texmf-dist/fonts/type1/public/cm-super/sfbx1000.pfb></usr/local/texlive/2025/texmf-dist/fonts/type1/public/cm-super/sfcc1000.pfb></usr/local/texlive/2025/texmf-dist/fonts/type1/public/cm-super/sfrm0500.pfb></usr/local/texlive/2025/texmf-dist/fonts/type1/public/cm-super/sfrm0600.pfb></usr/local/texlive/2025/texmf-dist/fonts/type1/public/cm-super/sfrm0700.pfb></usr/local/texlive/2025/texmf-dist/fonts/type1/public/cm-super/sfrm0800.pfb></usr/local/texlive/2025/texmf-dist/fonts/type1/public/cm-super/sfrm0900.pfb></usr/local/texlive/2025/texmf-dist/fonts/type1/public/cm-super/sfrm1000.pfb></usr/local/texlive/2025/texmf-dist/fonts/type1/public/cm-super/sfti0800.pfb></usr/local/texlive/2025/texmf-dist/fonts/type1/public/cm-super/sfti0900.pfb></usr/local/texlive/2025/texmf-dist/fonts/type1/public/cm-super/sfti1000.pfb></usr/local/texlive/2025/texmf-dist/fonts/type1/public/cm-super/sftt0800.pfb></usr/local/texlive/2025/texmf-dist/fonts/type1/public/cm-super/sftt0900.pfb></usr/local/texlive/2025/texmf-dist/fonts/type1/public/cm-super/sftt1000.pfb></usr/local/texlive/2025/texmf-dist/fonts/type1/public/cm-super/sftt1200.pfb>
Output written on main.pdf (76 pages, 36567388 bytes).
PDF statistics:
 3252 PDF objects out of 3580 (max. 8388607)
 2950 compressed objects within 30 object streams
 1474 named destinations out of 1728 (max. 500000)
 107673 words of extra memory for PDF output out of 128383 (max. 10000000)
app/scripts/latex-to-mdx/input/main.out
ADDED
@@ -0,0 +1,36 @@
\BOOKMARK [1][-]{section.1}{\376\377\000I\000n\000t\000r\000o\000d\000u\000c\000t\000i\000o\000n}{}% 1
\BOOKMARK [2][-]{subsection.1.1}{\376\377\000L\000e\000R\000o\000b\000o\000t\000D\000a\000t\000a\000s\000e\000t}{section.1}% 2
\BOOKMARK [3][-]{subsubsection.1.1.1}{\376\377\000T\000h\000e\000\040\000d\000a\000t\000a\000s\000e\000t\000\040\000c\000l\000a\000s\000s\000\040\000d\000e\000s\000i\000g\000n}{subsection.1.1}% 3
\BOOKMARK [2][-]{subsection.1.2}{\376\377\000C\000o\000d\000e\000\040\000E\000x\000a\000m\000p\000l\000e\000:\000\040\000B\000a\000t\000c\000h\000i\000n\000g\000\040\000a\000\040\000\050\000S\000t\000r\000e\000a\000m\000i\000n\000g\000\051\000\040\000D\000a\000t\000a\000s\000e\000t}{section.1}% 4
\BOOKMARK [2][-]{subsection.1.3}{\376\377\000C\000o\000d\000e\000\040\000E\000x\000a\000m\000p\000l\000e\000:\000\040\000C\000o\000l\000l\000e\000c\000t\000i\000n\000g\000\040\000D\000a\000t\000a}{section.1}% 5
\BOOKMARK [1][-]{section.2}{\376\377\000C\000l\000a\000s\000s\000i\000c\000a\000l\000\040\000R\000o\000b\000o\000t\000i\000c\000s}{}% 6
\BOOKMARK [2][-]{subsection.2.1}{\376\377\000E\000x\000p\000l\000i\000c\000i\000t\000\040\000a\000n\000d\000\040\000I\000m\000p\000l\000i\000c\000i\000t\000\040\000M\000o\000d\000e\000l\000s}{section.2}% 7
\BOOKMARK [2][-]{subsection.2.2}{\376\377\000D\000i\000f\000f\000e\000r\000e\000n\000t\000\040\000T\000y\000p\000e\000s\000\040\000o\000f\000\040\000M\000o\000t\000i\000o\000n}{section.2}% 8
\BOOKMARK [2][-]{subsection.2.3}{\376\377\000E\000x\000a\000m\000p\000l\000e\000:\000\040\000P\000l\000a\000n\000a\000r\000\040\000M\000a\000n\000i\000p\000u\000l\000a\000t\000i\000o\000n}{section.2}% 9
\BOOKMARK [3][-]{subsubsection.2.3.1}{\376\377\000A\000d\000d\000i\000n\000g\000\040\000F\000e\000e\000d\000b\000a\000c\000k\000\040\000L\000o\000o\000p\000s}{subsection.2.3}% 10
\BOOKMARK [2][-]{subsection.2.4}{\376\377\000L\000i\000m\000i\000t\000a\000t\000i\000o\000n\000s\000\040\000o\000f\000\040\000D\000y\000n\000a\000m\000i\000c\000s\000-\000b\000a\000s\000e\000d\000\040\000R\000o\000b\000o\000t\000i\000c\000s}{section.2}% 11
\BOOKMARK [1][-]{section.3}{\376\377\000R\000o\000b\000o\000t\000\040\000\050\000R\000e\000i\000n\000f\000o\000r\000c\000e\000m\000e\000n\000t\000\051\000\040\000L\000e\000a\000r\000n\000i\000n\000g}{}% 12
\BOOKMARK [2][-]{subsection.3.1}{\376\377\000A\000\040\000\050\000C\000o\000n\000c\000i\000s\000e\000\051\000\040\000I\000n\000t\000r\000o\000d\000u\000c\000t\000i\000o\000n\000\040\000t\000o\000\040\000R\000L}{section.3}% 13
\BOOKMARK [2][-]{subsection.3.2}{\376\377\000R\000e\000a\000l\000-\000w\000o\000r\000l\000d\000\040\000R\000L\000\040\000f\000o\000r\000\040\000R\000o\000b\000o\000t\000i\000c\000s}{section.3}% 14
\BOOKMARK [3][-]{subsubsection.3.2.1}{\376\377\000C\000o\000d\000e\000\040\000E\000x\000a\000m\000p\000l\000e\000:\000\040\000R\000e\000a\000l\000-\000w\000o\000r\000l\000d\000\040\000R\000L}{subsection.3.2}% 15
\BOOKMARK [3][-]{subsubsection.3.2.2}{\376\377\000L\000i\000m\000i\000t\000a\000t\000i\000o\000n\000s\000\040\000o\000f\000\040\000R\000L\000\040\000i\000n\000\040\000R\000e\000a\000l\000-\000W\000o\000r\000l\000d\000\040\000R\000o\000b\000o\000t\000i\000c\000s\000:\000\040\000S\000i\000m\000u\000l\000a\000t\000o\000r\000s\000\040\000a\000n\000d\000\040\000R\000e\000w\000a\000r\000d\000\040\000D\000e\000s\000i\000g\000n}{subsection.3.2}% 16
\BOOKMARK [1][-]{section.4}{\376\377\000R\000o\000b\000o\000t\000\040\000\050\000I\000m\000i\000t\000a\000t\000i\000o\000n\000\051\000\040\000L\000e\000a\000r\000n\000i\000n\000g}{}% 17
\BOOKMARK [2][-]{subsection.4.1}{\376\377\000A\000\040\000\050\000C\000o\000n\000c\000i\000s\000e\000\051\000\040\000I\000n\000t\000r\000o\000d\000u\000c\000t\000i\000o\000n\000\040\000t\000o\000\040\000G\000e\000n\000e\000r\000a\000t\000i\000v\000e\000\040\000M\000o\000d\000e\000l\000s}{section.4}% 18
\BOOKMARK [3][-]{subsubsection.4.1.1}{\376\377\000V\000a\000r\000i\000a\000t\000i\000o\000n\000a\000l\000\040\000A\000u\000t\000o\000-\000E\000n\000c\000o\000d\000e\000r\000s}{subsection.4.1}% 19
\BOOKMARK [3][-]{subsubsection.4.1.2}{\376\377\000D\000i\000f\000f\000u\000s\000i\000o\000n\000\040\000M\000o\000d\000e\000l\000s}{subsection.4.1}% 20
\BOOKMARK [3][-]{subsubsection.4.1.3}{\376\377\000F\000l\000o\000w\000\040\000M\000a\000t\000c\000h\000i\000n\000g}{subsection.4.1}% 21
\BOOKMARK [2][-]{subsection.4.2}{\376\377\000A\000c\000t\000i\000o\000n\000\040\000C\000h\000u\000n\000k\000i\000n\000g\000\040\000w\000i\000t\000h\000\040\000T\000r\000a\000n\000s\000f\000o\000r\000m\000e\000r\000s}{section.4}% 22
\BOOKMARK [3][-]{subsubsection.4.2.1}{\376\377\000C\000o\000d\000e\000\040\000E\000x\000a\000m\000p\000l\000e\000:\000\040\000T\000r\000a\000i\000n\000i\000n\000g\000\040\000a\000n\000d\000\040\000U\000s\000i\000n\000g\000\040\000A\000C\000T\000\040\000i\000n\000\040\000P\000r\000a\000c\000t\000i\000c\000e}{subsection.4.2}% 23
\BOOKMARK [2][-]{subsection.4.3}{\376\377\000D\000i\000f\000f\000u\000s\000i\000o\000n\000\040\000P\000o\000l\000i\000c\000y}{section.4}% 24
\BOOKMARK [3][-]{subsubsection.4.3.1}{\376\377\000C\000o\000d\000e\000\040\000E\000x\000a\000m\000p\000l\000e\000:\000\040\000T\000r\000a\000i\000n\000i\000n\000g\000\040\000a\000n\000d\000\040\000U\000s\000i\000n\000g\000\040\000D\000i\000f\000f\000u\000s\000i\000o\000n\000\040\000P\000o\000l\000i\000c\000i\000e\000s\000\040\000i\000n\000\040\000P\000r\000a\000c\000t\000i\000c\000e}{subsection.4.3}% 25
\BOOKMARK [2][-]{subsection.4.4}{\376\377\000O\000p\000t\000i\000m\000i\000z\000e\000d\000\040\000I\000n\000f\000e\000r\000e\000n\000c\000e}{section.4}% 26
\BOOKMARK [3][-]{subsubsection.4.4.1}{\376\377\000C\000o\000d\000e\000\040\000E\000x\000a\000m\000p\000l\000e\000:\000\040\000U\000s\000i\000n\000g\000\040\000A\000s\000y\000n\000c\000\040\000I\000n\000f\000e\000r\000e\000n\000c\000e}{subsection.4.4}% 27
\BOOKMARK [1][-]{section.5}{\376\377\000G\000e\000n\000e\000r\000a\000l\000i\000s\000t\000\040\000R\000o\000b\000o\000t\000\040\000P\000o\000l\000i\000c\000i\000e\000s}{}% 28
\BOOKMARK [2][-]{subsection.5.1}{\376\377\000P\000r\000e\000l\000i\000m\000i\000n\000a\000r\000i\000e\000s\000:\000\040\000M\000o\000d\000e\000l\000s\000\040\000a\000n\000d\000\040\000D\000a\000t\000a}{section.5}% 29
\BOOKMARK [2][-]{subsection.5.2}{\376\377\000V\000L\000A\000s}{section.5}% 30
\BOOKMARK [3][-]{subsubsection.5.2.1}{\376\377\000V\000L\000M\000s\000\040\000f\000o\000r\000\040\000V\000L\000A\000s}{subsection.5.2}% 31
\BOOKMARK [2][-]{subsection.5.3}{\376\377\000\040\0000\000\040}{section.5}% 32
\BOOKMARK [3][-]{subsubsection.5.3.1}{\376\377\000C\000o\000d\000e\000\040\000E\000x\000a\000m\000p\000l\000e\000:\000\040\000U\000s\000i\000n\000g\000\040\000\040\0000\000\040}{subsection.5.3}% 33
\BOOKMARK [2][-]{subsection.5.4}{\376\377\000S\000m\000o\000l\000V\000L\000A}{section.5}% 34
\BOOKMARK [3][-]{subsubsection.5.4.1}{\376\377\000C\000o\000d\000e\000\040\000E\000x\000a\000m\000p\000l\000e\000:\000\040\000U\000s\000i\000n\000g\000\040\000S\000m\000o\000l\000V\000L\000A}{subsection.5.4}% 35
\BOOKMARK [1][-]{section.6}{\376\377\000C\000o\000n\000c\000l\000u\000s\000i\000o\000n\000s}{}% 36
app/scripts/latex-to-mdx/input/main.tex
CHANGED
@@ -28,9 +28,9 @@
 \usepackage{makecell}
 \usepackage{adjustbox}
 
-% Color and
+% Color, boxes and tables
 \usepackage[most]{tcolorbox}
-\usepackage{xcolor}
+\usepackage[table,xcdraw,dvipsnames]{xcolor}
 
 % Text and formatting
 \usepackage{xspace}
@@ -138,12 +138,12 @@
 showtabs=false,
 tabsize=2
 }
-
+
 \lstset{style=mycodestyle}
 
 
 \usepackage{setspace}
-
+
 \usepackage{nicematrix}
 \newcolumntype{L}[1]{>{\raggedright\let\newline\\\arraybackslash\hspace{0pt}}m{#1}}
 \newcolumntype{C}[1]{>{\centering\let\newline\\\arraybackslash\hspace{0pt}}m{#1}}
@@ -155,6 +155,7 @@
 
 \newcommand{\orr}[1]{\textcolor{red}{[OZ:#1]}}
 
+
 \tcbuselibrary{minted}
 \usemintedstyle{colorful}
 
@@ -174,6 +175,7 @@
 
 
 \newtcolorbox[auto counter]{pbox}[2][]{
+breakable,
 colback=white,
 title=\textbf{Code~\thetcbcounter: #2},
 #1,fonttitle=\sffamily,
@@ -189,6 +191,7 @@
 \input{preamble}
 \input{math_commands}
 \input{handles}
+\input{snippets/code_specs}
 
 \title{
 Robot Learning: A Tutorial
@@ -197,17 +200,17 @@ Robot Learning: A Tutorial
 \newcommand{\huggingface}{\raisebox{-1.5pt}{\includegraphics[height=1.05em]{logos/hf.pdf}}\xspace}
 \newcommand{\coreContrib}{\raisebox{.33em}{\hspace{.05em}\includegraphics[height=.5em]{logos/core.png}}\xspace}
 
-\newcommand{\hf}{\raisebox{.28em}{\hspace{.05em}\includegraphics[height
+\newcommand{\hf}{\raisebox{.28em}{\hspace{.05em}\includegraphics[height=1em]{logos/hf.pdf}}\xspace}
 \newcommand{\ensps}{\raisebox{.3em}{\hspace{.05em}\includegraphics[height=.65em]{logos/ensps_logo.pdf}}\xspace}
+\newcommand{\oxford}{\raisebox{.3em}{\hspace{.05em}\includegraphics[height=1em]{logos/oxford_logo.png}}\xspace}
 
-\authorOne[]{Francesco Capuano \
-\authorOne[]{
+\authorOne[]{Francesco Capuano \oxford \hf}
+\authorOne[]{Caroline Pascal\hf}
 \authorOne[]{Adil Zouitine\hf}
-\authorOne[]{Pepijn Kooijmans\hf}
 \authorOne[]{Thomas Wolf\hf}
 \authorOne[]{Michel Aractingi\hf}
 
-\contribution[]{\
+\contribution[]{\oxford University of Oxford, \hf Hugging Face}
 
 \newcommand{\fix}{\marginpar{FIX}}
 \newcommand{\new}{\marginpar{NEW}}
@@ -227,6 +230,7 @@
 \newpage
 \input{sections/01_introduction}
 
+\newpage
 \input{sections/02_classic_robotics}
 
 \newpage
app/scripts/latex-to-mdx/input/main.toc
ADDED
@@ -0,0 +1,40 @@
\babel@toc {english}{}\relax
\contentsline {section}{\numberline {1}Introduction}{3}{section.1}%
\contentsline {subsection}{\numberline {1.1}\texttt {LeRobotDataset}}{4}{subsection.1.1}%
\contentsline {subsubsection}{\numberline {1.1.1}The dataset class design}{4}{subsubsection.1.1.1}%
\contentsline {subsection}{\numberline {1.2}Code Example: Batching a (Streaming) Dataset}{5}{subsection.1.2}%
\contentsline {subsection}{\numberline {1.3}Code Example: Collecting Data}{6}{subsection.1.3}%
\contentsline {section}{\numberline {2}Classical Robotics}{9}{section.2}%
\contentsline {subsection}{\numberline {2.1}Explicit and Implicit Models}{9}{subsection.2.1}%
\contentsline {subsection}{\numberline {2.2}Different Types of Motion}{10}{subsection.2.2}%
\contentsline {subsection}{\numberline {2.3}Example: Planar Manipulation}{10}{subsection.2.3}%
\contentsline {subsubsection}{\numberline {2.3.1}Adding Feedback Loops}{13}{subsubsection.2.3.1}%
\contentsline {subsection}{\numberline {2.4}Limitations of Dynamics-based Robotics}{13}{subsection.2.4}%
\contentsline {section}{\numberline {3}Robot (Reinforcement) Learning}{16}{section.3}%
\contentsline {subsection}{\numberline {3.1}A (Concise) Introduction to RL}{17}{subsection.3.1}%
\contentsline {subsection}{\numberline {3.2}Real-world RL for Robotics}{20}{subsection.3.2}%
\contentsline {paragraph}{Sample-efficient RL}{22}{figure.caption.15}%
\contentsline {paragraph}{Sample-efficient, data-driven RL}{23}{equation.17}%
\contentsline {paragraph}{Sample-efficient, data-driven, real-world RL}{23}{equation.17}%
\contentsline {subsubsection}{\numberline {3.2.1}Code Example: Real-world RL}{24}{subsubsection.3.2.1}%
\contentsline {subsubsection}{\numberline {3.2.2}Limitations of RL in Real-World Robotics: Simulators and Reward Design}{32}{subsubsection.3.2.2}%
\contentsline {section}{\numberline {4}Robot (Imitation) Learning}{33}{section.4}%
\contentsline {subsection}{\numberline {4.1}A (Concise) Introduction to Generative Models}{35}{subsection.4.1}%
\contentsline {subsubsection}{\numberline {4.1.1}Variational Auto-Encoders}{35}{subsubsection.4.1.1}%
\contentsline {subsubsection}{\numberline {4.1.2}Diffusion Models}{37}{subsubsection.4.1.2}%
\contentsline {subsubsection}{\numberline {4.1.3}Flow Matching}{41}{subsubsection.4.1.3}%
\contentsline {subsection}{\numberline {4.2}Action Chunking with Transformers}{43}{subsection.4.2}%
\contentsline {subsubsection}{\numberline {4.2.1}Code Example: Training and Using ACT in Practice}{46}{subsubsection.4.2.1}%
\contentsline {subsection}{\numberline {4.3}Diffusion Policy}{48}{subsection.4.3}%
\contentsline {subsubsection}{\numberline {4.3.1}Code Example: Training and Using Diffusion Policies in Practice}{50}{subsubsection.4.3.1}%
\contentsline {subsection}{\numberline {4.4}Optimized Inference}{52}{subsection.4.4}%
\contentsline {subsubsection}{\numberline {4.4.1}Code Example: Using Async Inference}{55}{subsubsection.4.4.1}%
\contentsline {section}{\numberline {5}Generalist Robot Policies}{57}{section.5}%
\contentsline {subsection}{\numberline {5.1}Preliminaries: Models and Data}{58}{subsection.5.1}%
\contentsline {subsection}{\numberline {5.2}VLAs}{60}{subsection.5.2}%
\contentsline {subsubsection}{\numberline {5.2.1}VLMs for VLAs}{60}{subsubsection.5.2.1}%
\contentsline {subsection}{\numberline {5.3}\( \pi _0 \)}{61}{subsection.5.3}%
\contentsline {subsubsection}{\numberline {5.3.1}Code Example: Using \( \pi _0 \)}{63}{subsubsection.5.3.1}%
\contentsline {subsection}{\numberline {5.4}SmolVLA}{64}{subsection.5.4}%
\contentsline {subsubsection}{\numberline {5.4.1}Code Example: Using SmolVLA}{65}{subsubsection.5.4.1}%
\contentsline {section}{\numberline {6}Conclusions}{67}{section.6}%
app/scripts/latex-to-mdx/input/presentation.aux
ADDED
@@ -0,0 +1,4 @@
\relax
\providecommand\hyper@newdestlabel[2]{}
\providecommand\HyField@AuxAddToFields[1]{}
\providecommand\HyField@AuxAddToCoFields[2]{}
app/scripts/latex-to-mdx/input/presentation.log
ADDED
@@ -0,0 +1,895 @@
This is pdfTeX, Version 3.141592653-2.6-1.40.27 (TeX Live 2025) (preloaded format=pdflatex 2025.8.26) 12 SEP 2025 23:52
entering extended mode
restricted \write18 enabled.
%&-line parsing enabled.
**slides/presentation.tex
(./slides/presentation.tex
LaTeX2e <2024-11-01> patch level 2
L3 programming layer <2025-01-18>
(/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamer.cls
Document Class: beamer 2025/02/04 v3.72 A class for typesetting presentations
(/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasemodes.sty
(/usr/local/texlive/2025/texmf-dist/tex/latex/etoolbox/etoolbox.sty
Package: etoolbox 2025/02/11 v2.5l e-TeX tools for LaTeX (JAW)
\etb@tempcnta=\count196
)
\beamer@tempbox=\box52
\beamer@tempcount=\count197
\c@beamerpauses=\count198

(/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasedecode.sty
\beamer@slideinframe=\count199
\beamer@minimum=\count266
\beamer@decode@box=\box53
)
\beamer@commentbox=\box54
\beamer@modecount=\count267
)
(/usr/local/texlive/2025/texmf-dist/tex/generic/iftex/iftex.sty
Package: iftex 2024/12/12 v1.0g TeX engine tests
)
\headdp=\dimen141
\footheight=\dimen142
\sidebarheight=\dimen143
\beamer@tempdim=\dimen144
\beamer@finalheight=\dimen145
\beamer@animht=\dimen146
\beamer@animdp=\dimen147
\beamer@animwd=\dimen148
\beamer@leftmargin=\dimen149
\beamer@rightmargin=\dimen150
\beamer@leftsidebar=\dimen151
\beamer@rightsidebar=\dimen152
\beamer@boxsize=\dimen153
\beamer@vboxoffset=\dimen154
\beamer@descdefault=\dimen155
\beamer@descriptionwidth=\dimen156
\beamer@lastskip=\skip49
\beamer@areabox=\box55
\beamer@animcurrent=\box56
\beamer@animshowbox=\box57
\beamer@sectionbox=\box58
\beamer@logobox=\box59
\beamer@linebox=\box60
\beamer@sectioncount=\count268
|
| 55 |
+
\beamer@subsubsectionmax=\count269
|
| 56 |
+
\beamer@subsectionmax=\count270
|
| 57 |
+
\beamer@sectionmax=\count271
|
| 58 |
+
\beamer@totalheads=\count272
|
| 59 |
+
\beamer@headcounter=\count273
|
| 60 |
+
\beamer@partstartpage=\count274
|
| 61 |
+
\beamer@sectionstartpage=\count275
|
| 62 |
+
\beamer@subsectionstartpage=\count276
|
| 63 |
+
\beamer@animationtempa=\count277
|
| 64 |
+
\beamer@animationtempb=\count278
|
| 65 |
+
\beamer@xpos=\count279
|
| 66 |
+
\beamer@ypos=\count280
|
| 67 |
+
\beamer@ypos@offset=\count281
|
| 68 |
+
\beamer@showpartnumber=\count282
|
| 69 |
+
\beamer@currentsubsection=\count283
|
| 70 |
+
\beamer@coveringdepth=\count284
|
| 71 |
+
\beamer@sectionadjust=\count285
|
| 72 |
+
\beamer@toclastsection=\count286
|
| 73 |
+
\beamer@tocsectionnumber=\count287
|
| 74 |
+
|
| 75 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbaseoptions.sty
|
| 76 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/graphics/keyval.sty
|
| 77 |
+
Package: keyval 2022/05/29 v1.15 key=value parser (DPC)
|
| 78 |
+
\KV@toks@=\toks17
|
| 79 |
+
))
|
| 80 |
+
\beamer@paperwidth=\skip50
|
| 81 |
+
\beamer@paperheight=\skip51
|
| 82 |
+
|
| 83 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/geometry/geometry.sty
|
| 84 |
+
Package: geometry 2020/01/02 v5.9 Page Geometry
|
| 85 |
+
|
| 86 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/iftex/ifvtex.sty
|
| 87 |
+
Package: ifvtex 2019/10/25 v1.7 ifvtex legacy package. Use iftex instead.
|
| 88 |
+
)
|
| 89 |
+
\Gm@cnth=\count288
|
| 90 |
+
\Gm@cntv=\count289
|
| 91 |
+
\c@Gm@tempcnt=\count290
|
| 92 |
+
\Gm@bindingoffset=\dimen157
|
| 93 |
+
\Gm@wd@mp=\dimen158
|
| 94 |
+
\Gm@odd@mp=\dimen159
|
| 95 |
+
\Gm@even@mp=\dimen160
|
| 96 |
+
\Gm@layoutwidth=\dimen161
|
| 97 |
+
\Gm@layoutheight=\dimen162
|
| 98 |
+
\Gm@layouthoffset=\dimen163
|
| 99 |
+
\Gm@layoutvoffset=\dimen164
|
| 100 |
+
\Gm@dimlist=\toks18
|
| 101 |
+
)
|
| 102 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/pgf/math/pgfmath.sty
|
| 103 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/pgf/utilities/pgfrcs.sty
|
| 104 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgfutil-common.te
|
| 105 |
+
x
|
| 106 |
+
\pgfutil@everybye=\toks19
|
| 107 |
+
\pgfutil@tempdima=\dimen165
|
| 108 |
+
\pgfutil@tempdimb=\dimen166
|
| 109 |
+
)
|
| 110 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgfutil-latex.def
|
| 111 |
+
\pgfutil@abb=\box61
|
| 112 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgfrcs.code.tex
|
| 113 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/pgf.revision.tex)
|
| 114 |
+
Package: pgfrcs 2023-01-15 v3.1.10 (3.1.10)
|
| 115 |
+
))
|
| 116 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/pgf/utilities/pgfkeys.sty
|
| 117 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgfkeys.code.tex
|
| 118 |
+
\pgfkeys@pathtoks=\toks20
|
| 119 |
+
\pgfkeys@temptoks=\toks21
|
| 120 |
+
|
| 121 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgfkeyslibraryfil
|
| 122 |
+
tered.code.tex
|
| 123 |
+
\pgfkeys@tmptoks=\toks22
|
| 124 |
+
)))
|
| 125 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmath.code.tex
|
| 126 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathutil.code.tex
|
| 127 |
+
\pgf@x=\dimen167
|
| 128 |
+
\pgf@xa=\dimen168
|
| 129 |
+
\pgf@xb=\dimen169
|
| 130 |
+
\pgf@xc=\dimen170
|
| 131 |
+
\pgf@y=\dimen171
|
| 132 |
+
\pgf@ya=\dimen172
|
| 133 |
+
\pgf@yb=\dimen173
|
| 134 |
+
\pgf@yc=\dimen174
|
| 135 |
+
\c@pgf@counta=\count291
|
| 136 |
+
\c@pgf@countb=\count292
|
| 137 |
+
\c@pgf@countc=\count293
|
| 138 |
+
\c@pgf@countd=\count294
|
| 139 |
+
)
|
| 140 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathparser.code.tex
|
| 141 |
+
\pgfmath@dimen=\dimen175
|
| 142 |
+
\pgfmath@count=\count295
|
| 143 |
+
\pgfmath@box=\box62
|
| 144 |
+
\pgfmath@toks=\toks23
|
| 145 |
+
\pgfmath@stack@operand=\toks24
|
| 146 |
+
\pgfmath@stack@operation=\toks25
|
| 147 |
+
)
|
| 148 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.code.
|
| 149 |
+
tex)
|
| 150 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.basic
|
| 151 |
+
.code.tex)
|
| 152 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.trigo
|
| 153 |
+
nometric.code.tex)
|
| 154 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.rando
|
| 155 |
+
m.code.tex)
|
| 156 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.compa
|
| 157 |
+
rison.code.tex)
|
| 158 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.base.
|
| 159 |
+
code.tex)
|
| 160 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.round
|
| 161 |
+
.code.tex)
|
| 162 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.misc.
|
| 163 |
+
code.tex)
|
| 164 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.integ
|
| 165 |
+
erarithmetics.code.tex)
|
| 166 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathcalc.code.tex)
|
| 167 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfloat.code.tex
|
| 168 |
+
\c@pgfmathroundto@lastzeros=\count296
|
| 169 |
+
))) (/usr/local/texlive/2025/texmf-dist/tex/latex/base/size11.clo
|
| 170 |
+
File: size11.clo 2024/06/29 v1.4n Standard LaTeX file (size option)
|
| 171 |
+
)
|
| 172 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/pgf/basiclayer/pgfcore.sty
|
| 173 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/graphics/graphicx.sty
|
| 174 |
+
Package: graphicx 2021/09/16 v1.2d Enhanced LaTeX Graphics (DPC,SPQR)
|
| 175 |
+
|
| 176 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/graphics/graphics.sty
|
| 177 |
+
Package: graphics 2024/08/06 v1.4g Standard LaTeX Graphics (DPC,SPQR)
|
| 178 |
+
|
| 179 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/graphics/trig.sty
|
| 180 |
+
Package: trig 2023/12/02 v1.11 sin cos tan (DPC)
|
| 181 |
+
)
|
| 182 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/graphics-cfg/graphics.cfg
|
| 183 |
+
File: graphics.cfg 2016/06/04 v1.11 sample graphics configuration
|
| 184 |
+
)
|
| 185 |
+
Package graphics Info: Driver file: pdftex.def on input line 106.
|
| 186 |
+
|
| 187 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/graphics-def/pdftex.def
|
| 188 |
+
File: pdftex.def 2024/04/13 v1.2c Graphics/color driver for pdftex
|
| 189 |
+
))
|
| 190 |
+
\Gin@req@height=\dimen176
|
| 191 |
+
\Gin@req@width=\dimen177
|
| 192 |
+
)
|
| 193 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/pgf/systemlayer/pgfsys.sty
|
| 194 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgfsys.code.tex
|
| 195 |
+
Package: pgfsys 2023-01-15 v3.1.10 (3.1.10)
|
| 196 |
+
\pgf@x=\dimen178
|
| 197 |
+
\pgf@y=\dimen179
|
| 198 |
+
\pgf@xa=\dimen180
|
| 199 |
+
\pgf@ya=\dimen181
|
| 200 |
+
\pgf@xb=\dimen182
|
| 201 |
+
\pgf@yb=\dimen183
|
| 202 |
+
\pgf@xc=\dimen184
|
| 203 |
+
\pgf@yc=\dimen185
|
| 204 |
+
\pgf@xd=\dimen186
|
| 205 |
+
\pgf@yd=\dimen187
|
| 206 |
+
\w@pgf@writea=\write3
|
| 207 |
+
\r@pgf@reada=\read2
|
| 208 |
+
\c@pgf@counta=\count297
|
| 209 |
+
\c@pgf@countb=\count298
|
| 210 |
+
\c@pgf@countc=\count299
|
| 211 |
+
\c@pgf@countd=\count300
|
| 212 |
+
\t@pgf@toka=\toks26
|
| 213 |
+
\t@pgf@tokb=\toks27
|
| 214 |
+
\t@pgf@tokc=\toks28
|
| 215 |
+
\pgf@sys@id@count=\count301
|
| 216 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgf.cfg
|
| 217 |
+
File: pgf.cfg 2023-01-15 v3.1.10 (3.1.10)
|
| 218 |
+
)
|
| 219 |
+
Driver file for pgf: pgfsys-pdftex.def
|
| 220 |
+
|
| 221 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgfsys-pdftex.d
|
| 222 |
+
ef
|
| 223 |
+
File: pgfsys-pdftex.def 2023-01-15 v3.1.10 (3.1.10)
|
| 224 |
+
|
| 225 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgfsys-common-p
|
| 226 |
+
df.def
|
| 227 |
+
File: pgfsys-common-pdf.def 2023-01-15 v3.1.10 (3.1.10)
|
| 228 |
+
)))
|
| 229 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgfsyssoftpath.
|
| 230 |
+
code.tex
|
| 231 |
+
File: pgfsyssoftpath.code.tex 2023-01-15 v3.1.10 (3.1.10)
|
| 232 |
+
\pgfsyssoftpath@smallbuffer@items=\count302
|
| 233 |
+
\pgfsyssoftpath@bigbuffer@items=\count303
|
| 234 |
+
)
|
| 235 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgfsysprotocol.
|
| 236 |
+
code.tex
|
| 237 |
+
File: pgfsysprotocol.code.tex 2023-01-15 v3.1.10 (3.1.10)
|
| 238 |
+
)) (/usr/local/texlive/2025/texmf-dist/tex/latex/xcolor/xcolor.sty
|
| 239 |
+
Package: xcolor 2024/09/29 v3.02 LaTeX color extensions (UK)
|
| 240 |
+
|
| 241 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/graphics-cfg/color.cfg
|
| 242 |
+
File: color.cfg 2016/01/02 v1.6 sample color configuration
|
| 243 |
+
)
|
| 244 |
+
Package xcolor Info: Driver file: pdftex.def on input line 274.
|
| 245 |
+
|
| 246 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/graphics/mathcolor.ltx)
|
| 247 |
+
Package xcolor Info: Model `cmy' substituted by `cmy0' on input line 1349.
|
| 248 |
+
Package xcolor Info: Model `hsb' substituted by `rgb' on input line 1353.
|
| 249 |
+
Package xcolor Info: Model `RGB' extended on input line 1365.
|
| 250 |
+
Package xcolor Info: Model `HTML' substituted by `rgb' on input line 1367.
|
| 251 |
+
Package xcolor Info: Model `Hsb' substituted by `hsb' on input line 1368.
|
| 252 |
+
Package xcolor Info: Model `tHsb' substituted by `hsb' on input line 1369.
|
| 253 |
+
Package xcolor Info: Model `HSB' substituted by `hsb' on input line 1370.
|
| 254 |
+
Package xcolor Info: Model `Gray' substituted by `gray' on input line 1371.
|
| 255 |
+
Package xcolor Info: Model `wave' substituted by `hsb' on input line 1372.
|
| 256 |
+
)
|
| 257 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcore.code.tex
|
| 258 |
+
Package: pgfcore 2023-01-15 v3.1.10 (3.1.10)
|
| 259 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfint.code.tex)
|
| 260 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorepoints.co
|
| 261 |
+
de.tex
|
| 262 |
+
File: pgfcorepoints.code.tex 2023-01-15 v3.1.10 (3.1.10)
|
| 263 |
+
\pgf@picminx=\dimen188
|
| 264 |
+
\pgf@picmaxx=\dimen189
|
| 265 |
+
\pgf@picminy=\dimen190
|
| 266 |
+
\pgf@picmaxy=\dimen191
|
| 267 |
+
\pgf@pathminx=\dimen192
|
| 268 |
+
\pgf@pathmaxx=\dimen193
|
| 269 |
+
\pgf@pathminy=\dimen194
|
| 270 |
+
\pgf@pathmaxy=\dimen195
|
| 271 |
+
\pgf@xx=\dimen196
|
| 272 |
+
\pgf@xy=\dimen197
|
| 273 |
+
\pgf@yx=\dimen198
|
| 274 |
+
\pgf@yy=\dimen199
|
| 275 |
+
\pgf@zx=\dimen256
|
| 276 |
+
\pgf@zy=\dimen257
|
| 277 |
+
)
|
| 278 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorepathconst
|
| 279 |
+
ruct.code.tex
|
| 280 |
+
File: pgfcorepathconstruct.code.tex 2023-01-15 v3.1.10 (3.1.10)
|
| 281 |
+
\pgf@path@lastx=\dimen258
|
| 282 |
+
\pgf@path@lasty=\dimen259
|
| 283 |
+
)
|
| 284 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorepathusage
|
| 285 |
+
.code.tex
|
| 286 |
+
File: pgfcorepathusage.code.tex 2023-01-15 v3.1.10 (3.1.10)
|
| 287 |
+
\pgf@shorten@end@additional=\dimen260
|
| 288 |
+
\pgf@shorten@start@additional=\dimen261
|
| 289 |
+
)
|
| 290 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorescopes.co
|
| 291 |
+
de.tex
|
| 292 |
+
File: pgfcorescopes.code.tex 2023-01-15 v3.1.10 (3.1.10)
|
| 293 |
+
\pgfpic=\box63
|
| 294 |
+
\pgf@hbox=\box64
|
| 295 |
+
\pgf@layerbox@main=\box65
|
| 296 |
+
\pgf@picture@serial@count=\count304
|
| 297 |
+
)
|
| 298 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcoregraphicst
|
| 299 |
+
ate.code.tex
|
| 300 |
+
File: pgfcoregraphicstate.code.tex 2023-01-15 v3.1.10 (3.1.10)
|
| 301 |
+
\pgflinewidth=\dimen262
|
| 302 |
+
)
|
| 303 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcoretransform
|
| 304 |
+
ations.code.tex
|
| 305 |
+
File: pgfcoretransformations.code.tex 2023-01-15 v3.1.10 (3.1.10)
|
| 306 |
+
\pgf@pt@x=\dimen263
|
| 307 |
+
\pgf@pt@y=\dimen264
|
| 308 |
+
\pgf@pt@temp=\dimen265
|
| 309 |
+
)
|
| 310 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorequick.cod
|
| 311 |
+
e.tex
|
| 312 |
+
File: pgfcorequick.code.tex 2023-01-15 v3.1.10 (3.1.10)
|
| 313 |
+
)
|
| 314 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcoreobjects.c
|
| 315 |
+
ode.tex
|
| 316 |
+
File: pgfcoreobjects.code.tex 2023-01-15 v3.1.10 (3.1.10)
|
| 317 |
+
)
|
| 318 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorepathproce
|
| 319 |
+
ssing.code.tex
|
| 320 |
+
File: pgfcorepathprocessing.code.tex 2023-01-15 v3.1.10 (3.1.10)
|
| 321 |
+
)
|
| 322 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorearrows.co
|
| 323 |
+
de.tex
|
| 324 |
+
File: pgfcorearrows.code.tex 2023-01-15 v3.1.10 (3.1.10)
|
| 325 |
+
\pgfarrowsep=\dimen266
|
| 326 |
+
)
|
| 327 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcoreshade.cod
|
| 328 |
+
e.tex
|
| 329 |
+
File: pgfcoreshade.code.tex 2023-01-15 v3.1.10 (3.1.10)
|
| 330 |
+
\pgf@max=\dimen267
|
| 331 |
+
\pgf@sys@shading@range@num=\count305
|
| 332 |
+
\pgf@shadingcount=\count306
|
| 333 |
+
)
|
| 334 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcoreimage.cod
|
| 335 |
+
e.tex
|
| 336 |
+
File: pgfcoreimage.code.tex 2023-01-15 v3.1.10 (3.1.10)
|
| 337 |
+
)
|
| 338 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcoreexternal.
|
| 339 |
+
code.tex
|
| 340 |
+
File: pgfcoreexternal.code.tex 2023-01-15 v3.1.10 (3.1.10)
|
| 341 |
+
\pgfexternal@startupbox=\box66
|
| 342 |
+
)
|
| 343 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorelayers.co
|
| 344 |
+
de.tex
|
| 345 |
+
File: pgfcorelayers.code.tex 2023-01-15 v3.1.10 (3.1.10)
|
| 346 |
+
)
|
| 347 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcoretranspare
|
| 348 |
+
ncy.code.tex
|
| 349 |
+
File: pgfcoretransparency.code.tex 2023-01-15 v3.1.10 (3.1.10)
|
| 350 |
+
)
|
| 351 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorepatterns.
|
| 352 |
+
code.tex
|
| 353 |
+
File: pgfcorepatterns.code.tex 2023-01-15 v3.1.10 (3.1.10)
|
| 354 |
+
)
|
| 355 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorerdf.code.
|
| 356 |
+
tex
|
| 357 |
+
File: pgfcorerdf.code.tex 2023-01-15 v3.1.10 (3.1.10)
|
| 358 |
+
))) (/usr/local/texlive/2025/texmf-dist/tex/latex/pgf/utilities/xxcolor.sty
|
| 359 |
+
Package: xxcolor 2003/10/24 ver 0.1
|
| 360 |
+
\XC@nummixins=\count307
|
| 361 |
+
\XC@countmixins=\count308
|
| 362 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/latex/base/atbegshi-ltx.sty
|
| 363 |
+
Package: atbegshi-ltx 2021/01/10 v1.0c Emulation of the original atbegshi
|
| 364 |
+
package with kernel methods
|
| 365 |
+
)
|
| 366 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/hyperref/hyperref.sty
|
| 367 |
+
Package: hyperref 2024-11-05 v7.01l Hypertext links for LaTeX
|
| 368 |
+
|
| 369 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/kvsetkeys/kvsetkeys.sty
|
| 370 |
+
Package: kvsetkeys 2022-10-05 v1.19 Key value parser (HO)
|
| 371 |
+
)
|
| 372 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/kvdefinekeys/kvdefinekeys.sty
|
| 373 |
+
Package: kvdefinekeys 2019-12-19 v1.6 Define keys (HO)
|
| 374 |
+
)
|
| 375 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pdfescape/pdfescape.sty
|
| 376 |
+
Package: pdfescape 2019/12/09 v1.15 Implements pdfTeX's escape features (HO)
|
| 377 |
+
|
| 378 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/ltxcmds/ltxcmds.sty
|
| 379 |
+
Package: ltxcmds 2023-12-04 v1.26 LaTeX kernel commands for general use (HO)
|
| 380 |
+
)
|
| 381 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/pdftexcmds/pdftexcmds.sty
|
| 382 |
+
Package: pdftexcmds 2020-06-27 v0.33 Utility functions of pdfTeX for LuaTeX (HO
|
| 383 |
+
)
|
| 384 |
+
|
| 385 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/infwarerr/infwarerr.sty
|
| 386 |
+
Package: infwarerr 2019/12/03 v1.5 Providing info/warning/error messages (HO)
|
| 387 |
+
)
|
| 388 |
+
Package pdftexcmds Info: \pdf@primitive is available.
|
| 389 |
+
Package pdftexcmds Info: \pdf@ifprimitive is available.
|
| 390 |
+
Package pdftexcmds Info: \pdfdraftmode found.
|
| 391 |
+
))
|
| 392 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/hycolor/hycolor.sty
|
| 393 |
+
Package: hycolor 2020-01-27 v1.10 Color options for hyperref/bookmark (HO)
|
| 394 |
+
)
|
| 395 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/hyperref/nameref.sty
|
| 396 |
+
Package: nameref 2023-11-26 v2.56 Cross-referencing by name of section
|
| 397 |
+
|
| 398 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/refcount/refcount.sty
|
| 399 |
+
Package: refcount 2019/12/15 v3.6 Data extraction from label references (HO)
|
| 400 |
+
)
|
| 401 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/gettitlestring/gettitlestring.s
|
| 402 |
+
ty
|
| 403 |
+
Package: gettitlestring 2019/12/15 v1.6 Cleanup title references (HO)
|
| 404 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/kvoptions/kvoptions.sty
|
| 405 |
+
Package: kvoptions 2022-06-15 v3.15 Key value format for package options (HO)
|
| 406 |
+
))
|
| 407 |
+
\c@section@level=\count309
|
| 408 |
+
)
|
| 409 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/stringenc/stringenc.sty
|
| 410 |
+
Package: stringenc 2019/11/29 v1.12 Convert strings between diff. encodings (HO
|
| 411 |
+
)
|
| 412 |
+
)
|
| 413 |
+
\@linkdim=\dimen268
|
| 414 |
+
\Hy@linkcounter=\count310
|
| 415 |
+
\Hy@pagecounter=\count311
|
| 416 |
+
|
| 417 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/hyperref/pd1enc.def
|
| 418 |
+
File: pd1enc.def 2024-11-05 v7.01l Hyperref: PDFDocEncoding definition (HO)
|
| 419 |
+
Now handling font encoding PD1 ...
|
| 420 |
+
... no UTF-8 mapping file for font encoding PD1
|
| 421 |
+
)
|
| 422 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/intcalc/intcalc.sty
|
| 423 |
+
Package: intcalc 2019/12/15 v1.3 Expandable calculations with integers (HO)
|
| 424 |
+
)
|
| 425 |
+
\Hy@SavedSpaceFactor=\count312
|
| 426 |
+
|
| 427 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/hyperref/puenc.def
|
| 428 |
+
File: puenc.def 2024-11-05 v7.01l Hyperref: PDF Unicode definition (HO)
|
| 429 |
+
Now handling font encoding PU ...
|
| 430 |
+
... no UTF-8 mapping file for font encoding PU
|
| 431 |
+
)
|
| 432 |
+
Package hyperref Info: Option `bookmarks' set `true' on input line 4040.
|
| 433 |
+
Package hyperref Info: Option `bookmarksopen' set `true' on input line 4040.
|
| 434 |
+
Package hyperref Info: Option `implicit' set `false' on input line 4040.
|
| 435 |
+
Package hyperref Info: Hyper figures OFF on input line 4157.
|
| 436 |
+
Package hyperref Info: Link nesting OFF on input line 4162.
|
| 437 |
+
Package hyperref Info: Hyper index ON on input line 4165.
|
| 438 |
+
Package hyperref Info: Plain pages OFF on input line 4172.
|
| 439 |
+
Package hyperref Info: Backreferencing OFF on input line 4177.
|
| 440 |
+
Package hyperref Info: Implicit mode OFF; no redefinition of LaTeX internals.
|
| 441 |
+
Package hyperref Info: Bookmarks ON on input line 4424.
|
| 442 |
+
\c@Hy@tempcnt=\count313
|
| 443 |
+
|
| 444 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/url/url.sty
|
| 445 |
+
\Urlmuskip=\muskip17
|
| 446 |
+
Package: url 2013/09/16 ver 3.4 Verb mode for urls, etc.
|
| 447 |
+
)
|
| 448 |
+
LaTeX Info: Redefining \url on input line 4763.
|
| 449 |
+
\XeTeXLinkMargin=\dimen269
|
| 450 |
+
|
| 451 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/bitset/bitset.sty
|
| 452 |
+
Package: bitset 2019/12/09 v1.3 Handle bit-vector datatype (HO)
|
| 453 |
+
|
| 454 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/bigintcalc/bigintcalc.sty
|
| 455 |
+
Package: bigintcalc 2019/12/15 v1.5 Expandable calculations on big integers (HO
|
| 456 |
+
)
|
| 457 |
+
))
|
| 458 |
+
\Fld@menulength=\count314
|
| 459 |
+
\Field@Width=\dimen270
|
| 460 |
+
\Fld@charsize=\dimen271
|
| 461 |
+
Package hyperref Info: Hyper figures OFF on input line 6042.
|
| 462 |
+
Package hyperref Info: Link nesting OFF on input line 6047.
|
| 463 |
+
Package hyperref Info: Hyper index ON on input line 6050.
|
| 464 |
+
Package hyperref Info: backreferencing OFF on input line 6057.
|
| 465 |
+
Package hyperref Info: Link coloring OFF on input line 6062.
|
| 466 |
+
Package hyperref Info: Link coloring with OCG OFF on input line 6067.
|
| 467 |
+
Package hyperref Info: PDF/A mode OFF on input line 6072.
|
| 468 |
+
\Hy@abspage=\count315
|
| 469 |
+
|
| 470 |
+
|
| 471 |
+
Package hyperref Message: Stopped early.
|
| 472 |
+
|
| 473 |
+
)
|
| 474 |
+
Package hyperref Info: Driver (autodetected): hpdftex.
|
| 475 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/hyperref/hpdftex.def
|
| 476 |
+
File: hpdftex.def 2024-11-05 v7.01l Hyperref driver for pdfTeX
|
| 477 |
+
|
| 478 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/base/atveryend-ltx.sty
|
| 479 |
+
Package: atveryend-ltx 2020/08/19 v1.0a Emulation of the original atveryend pac
|
| 480 |
+
kage
|
| 481 |
+
with kernel methods
|
| 482 |
+
)
|
| 483 |
+
\Fld@listcount=\count316
|
| 484 |
+
\c@bookmark@seq@number=\count317
|
| 485 |
+
|
| 486 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/rerunfilecheck/rerunfilecheck.sty
|
| 487 |
+
Package: rerunfilecheck 2022-07-10 v1.10 Rerun checks for auxiliary files (HO)
|
| 488 |
+
|
| 489 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/uniquecounter/uniquecounter.sty
|
| 490 |
+
Package: uniquecounter 2019/12/15 v1.4 Provide unlimited unique counter (HO)
|
| 491 |
+
)
|
| 492 |
+
Package uniquecounter Info: New unique counter `rerunfilecheck' on input line 2
|
| 493 |
+
85.
|
| 494 |
+
)) (/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbaserequires.sty
|
| 495 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasecompatibility.st
|
| 496 |
+
y) (/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasefont.sty
|
| 497 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/amsfonts/amssymb.sty
|
| 498 |
+
Package: amssymb 2013/01/14 v3.01 AMS font symbols
|
| 499 |
+
|
| 500 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/amsfonts/amsfonts.sty
|
| 501 |
+
Package: amsfonts 2013/01/14 v3.01 Basic AMSFonts support
|
| 502 |
+
\@emptytoks=\toks29
|
| 503 |
+
\symAMSa=\mathgroup4
|
| 504 |
+
\symAMSb=\mathgroup5
|
| 505 |
+
LaTeX Font Info: Redeclaring math symbol \hbar on input line 98.
|
| 506 |
+
LaTeX Font Info: Overwriting math alphabet `\mathfrak' in version `bold'
|
| 507 |
+
(Font) U/euf/m/n --> U/euf/b/n on input line 106.
|
| 508 |
+
))
|
| 509 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/sansmathaccent/sansmathaccent.sty
|
| 510 |
+
Package: sansmathaccent 2020/01/31
|
| 511 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/koma-script/scrlfile.sty
|
| 512 |
+
Package: scrlfile 2024/10/24 v3.43 KOMA-Script package (file load hooks)
|
| 513 |
+
|
| 514 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/koma-script/scrlfile-hook.sty
|
| 515 |
+
Package: scrlfile-hook 2024/10/24 v3.43 KOMA-Script package (using LaTeX hooks)
|
| 516 |
+
|
| 517 |
+
|
| 518 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/koma-script/scrlogo.sty
|
| 519 |
+
Package: scrlogo 2024/10/24 v3.43 KOMA-Script package (logo)
|
| 520 |
+
)))))
|
| 521 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasetranslator.sty
|
| 522 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/translator/translator.sty
|
| 523 |
+
Package: translator 2021-05-31 v1.12d Easy translation of strings in LaTeX
|
| 524 |
+
))
|
| 525 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasemisc.sty)
|
| 526 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasetwoscreens.sty)
|
| 527 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbaseoverlay.sty
|
| 528 |
+
\beamer@argscount=\count318
|
| 529 |
+
\beamer@lastskipcover=\skip52
|
| 530 |
+
\beamer@trivlistdepth=\count319
|
| 531 |
+
)
|
| 532 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasetitle.sty)
|
| 533 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasesection.sty
|
| 534 |
+
\c@lecture=\count320
|
| 535 |
+
\c@part=\count321
|
| 536 |
+
\c@section=\count322
|
| 537 |
+
\c@subsection=\count323
|
| 538 |
+
\c@subsubsection=\count324
|
| 539 |
+
)
|
| 540 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbaseframe.sty
|
| 541 |
+
\beamer@framebox=\box67
|
| 542 |
+
\beamer@frametitlebox=\box68
|
| 543 |
+
\beamer@zoombox=\box69
|
| 544 |
+
\beamer@zoomcount=\count325
|
| 545 |
+
\beamer@zoomframecount=\count326
|
| 546 |
+
\beamer@frametextheight=\dimen272
|
| 547 |
+
\c@subsectionslide=\count327
|
| 548 |
+
\beamer@frametopskip=\skip53
|
| 549 |
+
\beamer@framebottomskip=\skip54
|
| 550 |
+
\beamer@frametopskipautobreak=\skip55
|
| 551 |
+
\beamer@framebottomskipautobreak=\skip56
|
| 552 |
+
\beamer@envbody=\toks30
|
| 553 |
+
\framewidth=\dimen273
|
| 554 |
+
\c@framenumber=\count328
|
| 555 |
+
)
|
| 556 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbaseverbatim.sty
|
| 557 |
+
\beamer@verbatimfileout=\write4
|
| 558 |
+
)
|
| 559 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbaseframesize.sty
|
| 560 |
+
\beamer@splitbox=\box70
|
| 561 |
+
\beamer@autobreakcount=\count329
|
| 562 |
+
\beamer@autobreaklastheight=\dimen274
|
| 563 |
+
\beamer@frametitletoks=\toks31
|
| 564 |
+
\beamer@framesubtitletoks=\toks32
|
| 565 |
+
)
|
| 566 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbaseframecomponents.
|
| 567 |
+
sty
|
| 568 |
+
\beamer@footins=\box71
|
| 569 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasecolor.sty)
|
| 570 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasenotes.sty
|
| 571 |
+
\beamer@frameboxcopy=\box72
|
| 572 |
+
)
|
| 573 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasetoc.sty)
|
| 574 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasetemplates.sty
|
| 575 |
+
\beamer@sbttoks=\toks33
|
| 576 |
+
|
| 577 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbaseauxtemplates.sty
|
| 578 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbaseboxes.sty
|
| 579 |
+
\bmb@box=\box73
|
| 580 |
+
\bmb@colorbox=\box74
|
| 581 |
+
\bmb@boxwidth=\dimen275
|
| 582 |
+
\bmb@boxheight=\dimen276
|
| 583 |
+
\bmb@prevheight=\dimen277
|
| 584 |
+
\bmb@temp=\dimen278
|
| 585 |
+
\bmb@dima=\dimen279
|
| 586 |
+
\bmb@dimb=\dimen280
|
| 587 |
+
\bmb@prevheight=\dimen281
|
| 588 |
+
)
|
| 589 |
+
\beamer@blockheadheight=\dimen282
|
| 590 |
+
))
|
| 591 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbaselocalstructure.s
|
| 592 |
+
ty (/usr/local/texlive/2025/texmf-dist/tex/latex/tools/enumerate.sty
|
| 593 |
+
Package: enumerate 2023/07/04 v3.00 enumerate extensions (DPC)
|
| 594 |
+
\@enLab=\toks34
|
| 595 |
+
)
|
| 596 |
+
\beamer@bibiconwidth=\skip57
|
| 597 |
+
\c@figure=\count330
|
| 598 |
+
\c@table=\count331
|
| 599 |
+
\abovecaptionskip=\skip58
|
| 600 |
+
\belowcaptionskip=\skip59
|
| 601 |
+
)
|
| 602 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasenavigation.sty
|
| 603 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasenavigationsymbol
|
| 604 |
+
s.tex)
|
| 605 |
+
\beamer@section@min@dim=\dimen283
|
| 606 |
+
)
|
| 607 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasetheorems.sty
|
| 608 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/amsmath/amsmath.sty
|
| 609 |
+
Package: amsmath 2024/11/05 v2.17t AMS math features
|
| 610 |
+
\@mathmargin=\skip60
|
| 611 |
+
|
| 612 |
+
For additional information on amsmath, use the `?' option.
|
| 613 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/amsmath/amstext.sty
|
| 614 |
+Package: amstext 2021/08/26 v2.01 AMS text
+
+(/usr/local/texlive/2025/texmf-dist/tex/latex/amsmath/amsgen.sty
+File: amsgen.sty 1999/11/30 v2.0 generic functions
+\@emptytoks=\toks35
+\ex@=\dimen284
+))
+(/usr/local/texlive/2025/texmf-dist/tex/latex/amsmath/amsbsy.sty
+Package: amsbsy 1999/11/29 v1.2d Bold Symbols
+\pmbraise@=\dimen285
+)
+(/usr/local/texlive/2025/texmf-dist/tex/latex/amsmath/amsopn.sty
+Package: amsopn 2022/04/08 v2.04 operator names
+)
+\inf@bad=\count332
+LaTeX Info: Redefining \frac on input line 233.
+\uproot@=\count333
+\leftroot@=\count334
+LaTeX Info: Redefining \overline on input line 398.
+LaTeX Info: Redefining \colon on input line 409.
+\classnum@=\count335
+\DOTSCASE@=\count336
+LaTeX Info: Redefining \ldots on input line 495.
+LaTeX Info: Redefining \dots on input line 498.
+LaTeX Info: Redefining \cdots on input line 619.
+\Mathstrutbox@=\box75
+\strutbox@=\box76
+LaTeX Info: Redefining \big on input line 721.
+LaTeX Info: Redefining \Big on input line 722.
+LaTeX Info: Redefining \bigg on input line 723.
+LaTeX Info: Redefining \Bigg on input line 724.
+\big@size=\dimen286
+LaTeX Font Info: Redeclaring font encoding OML on input line 742.
+LaTeX Font Info: Redeclaring font encoding OMS on input line 743.
+\macc@depth=\count337
+LaTeX Info: Redefining \bmod on input line 904.
+LaTeX Info: Redefining \pmod on input line 909.
+LaTeX Info: Redefining \smash on input line 939.
+LaTeX Info: Redefining \relbar on input line 969.
+LaTeX Info: Redefining \Relbar on input line 970.
+\c@MaxMatrixCols=\count338
+\dotsspace@=\muskip18
+\c@parentequation=\count339
+\dspbrk@lvl=\count340
+\tag@help=\toks36
+\row@=\count341
+\column@=\count342
+\maxfields@=\count343
+\andhelp@=\toks37
+\eqnshift@=\dimen287
+\alignsep@=\dimen288
+\tagshift@=\dimen289
+\tagwidth@=\dimen290
+\totwidth@=\dimen291
+\lineht@=\dimen292
+\@envbody=\toks38
+\multlinegap=\skip61
+\multlinetaggap=\skip62
+\mathdisplay@stack=\toks39
+LaTeX Info: Redefining \[ on input line 2953.
+LaTeX Info: Redefining \] on input line 2954.
+)
+(/usr/local/texlive/2025/texmf-dist/tex/latex/amscls/amsthm.sty
+Package: amsthm 2020/05/29 v2.20.6
+\thm@style=\toks40
+\thm@bodyfont=\toks41
+\thm@headfont=\toks42
+\thm@notefont=\toks43
+\thm@headpunct=\toks44
+\thm@preskip=\skip63
+\thm@postskip=\skip64
+\thm@headsep=\skip65
+\dth@everypar=\toks45
+)
+\c@theorem=\count344
+)
+(/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasethemes.sty))
+(/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerthemedefault.sty
+(/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerfontthemedefault.sty
+)
+(/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamercolorthemedefault.st
+y)
+(/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerinnerthemedefault.st
+y
+\beamer@dima=\dimen293
+\beamer@dimb=\dimen294
+)
+(/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerouterthemedefault.st
+y))) (./preamble.tex
+\savewidth=\skip66
+\thinwidth=\skip67
+)
+(/usr/local/texlive/2025/texmf-dist/tex/latex/l3backend/l3backend-pdftex.def
+File: l3backend-pdftex.def 2024-05-08 L3 backend support: PDF output (pdfTeX)
+\l__color_backend_stack_int=\count345
+\l__pdf_internal_box=\box77
+)
+(./presentation.aux)
+\openout1 = `presentation.aux'.
+
+LaTeX Font Info: Checking defaults for OML/cmm/m/it on input line 7.
+LaTeX Font Info: ... okay on input line 7.
+LaTeX Font Info: Checking defaults for OMS/cmsy/m/n on input line 7.
+LaTeX Font Info: ... okay on input line 7.
+LaTeX Font Info: Checking defaults for OT1/cmr/m/n on input line 7.
+LaTeX Font Info: ... okay on input line 7.
+LaTeX Font Info: Checking defaults for T1/cmr/m/n on input line 7.
+LaTeX Font Info: ... okay on input line 7.
+LaTeX Font Info: Checking defaults for TS1/cmr/m/n on input line 7.
+LaTeX Font Info: ... okay on input line 7.
+LaTeX Font Info: Checking defaults for OMX/cmex/m/n on input line 7.
+LaTeX Font Info: ... okay on input line 7.
+LaTeX Font Info: Checking defaults for U/cmr/m/n on input line 7.
+LaTeX Font Info: ... okay on input line 7.
+LaTeX Font Info: Checking defaults for PD1/pdf/m/n on input line 7.
+LaTeX Font Info: ... okay on input line 7.
+LaTeX Font Info: Checking defaults for PU/pdf/m/n on input line 7.
+LaTeX Font Info: ... okay on input line 7.
+
+*geometry* driver: auto-detecting
+*geometry* detected driver: pdftex
+*geometry* verbose mode - [ preamble ] result:
+* driver: pdftex
+* paper: custom
+* layout: <same size as paper>
+* layoutoffset:(h,v)=(0.0pt,0.0pt)
+* modes: includehead includefoot
+* h-part:(L,W,R)=(28.45274pt, 307.28987pt, 28.45274pt)
+* v-part:(T,H,B)=(0.0pt, 273.14662pt, 0.0pt)
+* \paperwidth=364.19536pt
+* \paperheight=273.14662pt
+* \textwidth=307.28987pt
+* \textheight=244.6939pt
+* \oddsidemargin=-43.81725pt
+* \evensidemargin=-43.81725pt
+* \topmargin=-72.26999pt
+* \headheight=14.22636pt
+* \headsep=0.0pt
+* \topskip=11.0pt
+* \footskip=14.22636pt
+* \marginparwidth=4.0pt
+* \marginparsep=10.0pt
+* \columnsep=10.0pt
+* \skip\footins=10.0pt plus 4.0pt minus 2.0pt
+* \hoffset=0.0pt
+* \voffset=0.0pt
+* \mag=1000
+* \@twocolumnfalse
+* \@twosidefalse
+* \@mparswitchfalse
+* \@reversemarginfalse
+* (1in=72.27pt=25.4mm, 1cm=28.453pt)
+
+(/usr/local/texlive/2025/texmf-dist/tex/context/base/mkii/supp-pdf.mkii
+[Loading MPS to PDF converter (version 2006.09.02).]
+\scratchcounter=\count346
+\scratchdimen=\dimen295
+\scratchbox=\box78
+\nofMPsegments=\count347
+\nofMParguments=\count348
+\everyMPshowfont=\toks46
+\MPscratchCnt=\count349
+\MPscratchDim=\dimen296
+\MPnumerator=\count350
+\makeMPintoPDFobject=\count351
+\everyMPtoPDFconversion=\toks47
+) (/usr/local/texlive/2025/texmf-dist/tex/latex/epstopdf-pkg/epstopdf-base.sty
+Package: epstopdf-base 2020-01-24 v2.11 Base part for package epstopdf
+Package epstopdf-base Info: Redefining graphics rule for `.eps' on input line 4
+85.
+
+(/usr/local/texlive/2025/texmf-dist/tex/latex/latexconfig/epstopdf-sys.cfg
+File: epstopdf-sys.cfg 2010/07/13 v1.3 Configuration of (r)epstopdf for TeX Liv
+e
+))
+Package hyperref Info: Link coloring OFF on input line 7.
+
+(./presentation.out) (./presentation.out)
+\@outlinefile=\write5
+\openout5 = `presentation.out'.
+
+LaTeX Font Info: Overwriting symbol font `operators' in version `normal'
+(Font)              OT1/cmr/m/n --> OT1/cmss/m/n on input line 7.
+LaTeX Font Info: Overwriting symbol font `operators' in version `bold'
+(Font)              OT1/cmr/bx/n --> OT1/cmss/b/n on input line 7.
+\symnumbers=\mathgroup6
+\sympureletters=\mathgroup7
+LaTeX Font Info: Overwriting math alphabet `\mathrm' in version `normal'
+(Font)              OT1/cmss/m/n --> OT1/cmr/m/n on input line 7.
+LaTeX Font Info: Redeclaring math alphabet \mathbf on input line 7.
+LaTeX Font Info: Overwriting math alphabet `\mathbf' in version `normal'
+(Font)              OT1/cmr/bx/n --> OT1/cmss/b/n on input line 7.
+LaTeX Font Info: Overwriting math alphabet `\mathbf' in version `bold'
+(Font)              OT1/cmr/bx/n --> OT1/cmss/b/n on input line 7.
+LaTeX Font Info: Redeclaring math alphabet \mathsf on input line 7.
+LaTeX Font Info: Overwriting math alphabet `\mathsf' in version `normal'
+(Font)              OT1/cmss/m/n --> OT1/cmss/m/n on input line 7.
+LaTeX Font Info: Overwriting math alphabet `\mathsf' in version `bold'
+(Font)              OT1/cmss/bx/n --> OT1/cmss/m/n on input line 7.
+LaTeX Font Info: Redeclaring math alphabet \mathit on input line 7.
+LaTeX Font Info: Overwriting math alphabet `\mathit' in version `normal'
+(Font)              OT1/cmr/m/it --> OT1/cmss/m/it on input line 7.
+LaTeX Font Info: Overwriting math alphabet `\mathit' in version `bold'
+(Font)              OT1/cmr/bx/it --> OT1/cmss/m/it on input line 7.
+LaTeX Font Info: Redeclaring math alphabet \mathtt on input line 7.
+LaTeX Font Info: Overwriting math alphabet `\mathtt' in version `normal'
+(Font)              OT1/cmtt/m/n --> OT1/cmtt/m/n on input line 7.
+LaTeX Font Info: Overwriting math alphabet `\mathtt' in version `bold'
+(Font)              OT1/cmtt/m/n --> OT1/cmtt/m/n on input line 7.
+LaTeX Font Info: Overwriting symbol font `numbers' in version `bold'
+(Font)              OT1/cmss/m/n --> OT1/cmss/b/n on input line 7.
+LaTeX Font Info: Overwriting symbol font `pureletters' in version `bold'
+(Font)              OT1/cmss/m/it --> OT1/cmss/b/it on input line 7.
+LaTeX Font Info: Overwriting math alphabet `\mathrm' in version `bold'
+(Font)              OT1/cmss/b/n --> OT1/cmr/b/n on input line 7.
+LaTeX Font Info: Overwriting math alphabet `\mathbf' in version `bold'
+(Font)              OT1/cmss/b/n --> OT1/cmss/b/n on input line 7.
+LaTeX Font Info: Overwriting math alphabet `\mathsf' in version `bold'
+(Font)              OT1/cmss/m/n --> OT1/cmss/b/n on input line 7.
+LaTeX Font Info: Overwriting math alphabet `\mathit' in version `bold'
+(Font)              OT1/cmss/m/it --> OT1/cmss/b/it on input line 7.
+LaTeX Font Info: Overwriting math alphabet `\mathtt' in version `bold'
+(Font)              OT1/cmtt/m/n --> OT1/cmtt/b/n on input line 7.
+LaTeX Font Info: Redeclaring symbol font `pureletters' on input line 7.
+LaTeX Font Info: Overwriting symbol font `pureletters' in version `normal'
+(Font)              OT1/cmss/m/it --> OT1/mathkerncmss/m/sl on input line 7
+.
+LaTeX Font Info: Overwriting symbol font `pureletters' in version `bold'
+(Font)              OT1/cmss/b/it --> OT1/mathkerncmss/m/sl on input line 7
+.
+LaTeX Font Info: Overwriting symbol font `pureletters' in version `bold'
+(Font)              OT1/mathkerncmss/m/sl --> OT1/mathkerncmss/bx/sl on inp
+ut line 7.
+
+(/usr/local/texlive/2025/texmf-dist/tex/latex/translator/translator-basic-dicti
+onary-English.dict
+Dictionary: translator-basic-dictionary, Language: English
+)
+(/usr/local/texlive/2025/texmf-dist/tex/latex/translator/translator-bibliograph
+y-dictionary-English.dict
+Dictionary: translator-bibliography-dictionary, Language: English
+)
+(/usr/local/texlive/2025/texmf-dist/tex/latex/translator/translator-environment
+-dictionary-English.dict
+Dictionary: translator-environment-dictionary, Language: English
+)
+(/usr/local/texlive/2025/texmf-dist/tex/latex/translator/translator-months-dict
+ionary-English.dict
+Dictionary: translator-months-dictionary, Language: English
+)
+(/usr/local/texlive/2025/texmf-dist/tex/latex/translator/translator-numbers-dic
+tionary-English.dict
+Dictionary: translator-numbers-dictionary, Language: English
+)
+(/usr/local/texlive/2025/texmf-dist/tex/latex/translator/translator-theorem-dic
+tionary-English.dict
+Dictionary: translator-theorem-dictionary, Language: English
+) (./presentation.nav)
+
+! LaTeX Error: File `01.tex' not found.
+
+Type X to quit or <RETURN> to proceed,
+or enter new name. (Default extension: tex)
+
+Enter file name:
+! Emergency stop.
+<read *>
+
+l.11 \input{01.tex}
+^^M
+End of file on the terminal!
+
+
+Here is how much of TeX's memory you used:
+ 21121 strings out of 473190
+ 404630 string characters out of 5715800
+ 774397 words of memory out of 5000000
+ 43966 multiletter control sequences out of 15000+600000
+ 559742 words of font info for 39 fonts, out of 8000000 for 9000
+ 1141 hyphenation exceptions out of 8191
+ 128i,4n,123p,394b,450s stack positions out of 10000i,1000n,20000p,200000b,200000s
+!  ==> Fatal error occurred, no output PDF file produced!
app/scripts/latex-to-mdx/input/presentation.nav
ADDED
@@ -0,0 +1,5 @@
+\headcommand {\beamer@partpages {1}{0}}
+\headcommand {\beamer@subsectionpages {1}{0}}
+\headcommand {\beamer@sectionpages {1}{0}}
+\headcommand {\beamer@documentpages {0}}
+\headcommand {\gdef \inserttotalframenumber {0}}
app/scripts/latex-to-mdx/input/presentation.out
ADDED
File without changes
app/scripts/latex-to-mdx/input/presentation.snm
ADDED
File without changes
app/scripts/latex-to-mdx/input/presentation.toc
ADDED
File without changes
app/scripts/latex-to-mdx/input/sections/00_abstract.tex
CHANGED
@@ -1,7 +1,7 @@
 Robot learning is at an inflection point, driven by rapid advancements in machine learning and the growing availability of large-scale robotics data.
 This shift from classical, model-based methods to data-driven, learning-based paradigms is unlocking unprecedented capabilities in autonomous systems.
 This tutorial navigates the landscape of modern robot learning, charting a course from the foundational principles of Reinforcement Learning and Behavioral Cloning to generalist, language-conditioned models capable of operating across diverse tasks and even robot embodiments.
-This work is intended as a guide for researchers and practitioners, and our goal is to equip the reader with the conceptual understanding and
+This work is intended as a guide for researchers and practitioners, and our goal is to equip the reader with the conceptual understanding and practical tools necessary to contribute to developments in robot learning, with ready-to-use examples implemented in~\lerobot.
 \newline

 Code: \textbf{\url{https://github.com/huggingface/lerobot}}
app/scripts/latex-to-mdx/input/sections/01_introduction.tex
CHANGED
@@ -32,7 +32,7 @@ This tutorial is structured as follows:
 \begin{itemize}
 \item Section~\ref{sec:classical} reviews classical robotics foundations, introducing the limitations of dynamics-based approaches to robotics.
 \item Section~\ref{sec:learning-rl} elaborates on the limitations of dynamics-based methods, and introduce RL as a practical approach to solve robotics problems, considering its upsides and potential limitations.
-\item Section~\ref{sec:
+\item Section~\ref{sec:learning-imitation} further describes robot learning techniques that aim at solving single-tasks learning, leveraging BC techniques to autonomously reproduce specific expert demonstrations.
 \item Section~\ref{sec:learning-foundation} presents recent contributions on developing generalist models for robotics applications, by learning from large corpora of multi-task \& multi-robot data (\emph{robotics foundation models}).
 % \item Lastly, Section~\ref{sec:extensions} covers emerging directions in robot learning research, introducing recent works in post-training techniques for robotics foundation models, as well as recent works in world models for robotics.
 \end{itemize}
@@ -42,7 +42,8 @@ We complement our presentation of the most common and recent approaches in robot

 \subsection{\lerobotdataset}

-\lerobotdataset~is
+\lerobotdataset~is one of the most impactful features of \lerobot, developed in keeping with the observation that robotics data is increasingly central in robot learning.
+Thus, \lerobot~defines a standardized dataset format designed to address the specific needs of robot learning research, providing a unified and convenient access to robotics data across modalities, including sensorimotor readings, multiple camera feeds and teleoperation status.
 \lerobotdataset~also accommodates for storing general information regarding the data being collected, including textual descriptions of the task being performed by the teleoperator, the kind of robot used, and relevant measurement specifics like the frames per second at which the recording of both image and robot state's streams are proceeding.

 In this, \lerobotdataset~provides a unified interface for handling multi-modal, time-series data, and it is designed to seamlessly integrate with the PyTorch and Hugging Face ecosystems.
@@ -92,8 +93,13 @@ Users can stream data of a large dataset hosted on the Hugging Face Hub, with a
 Streaming datasets supports high-performance batch processing (ca. 80-100 it/s, varying on connectivity) and high levels of frames randomization, key features for practical BC algorithms which otherwise may be slow or operating on highly non-i.i.d. data.
 This feature is designed to improve on accessibility so that large datasets can be processed by users without requiring large amounts of memory and storage.

-\begin{pbox}[label={ex:dataset-batching}]{Batching a (Streaming) Dataset
-
-}
-
-\
+\begin{pbox}[label={ex:dataset-batching}]{Batching a (Streaming) Dataset \\ \url{https://github.com/fracapuano/robot-learning-tutorial/blob/main/snippets/ch1/01_datasets.py}}
+\lstinputlisting[language=python]{snippets/ch1/01_datasets.py}
+\end{pbox}
+
+\subsection{Code Example: Collecting Data}
+\label{paragraph:collecting-data}
+
+\begin{pbox}[label={ex:record-dataset}]{Record a Dataset \\ \url{https://github.com/fracapuano/robot-learning-tutorial/blob/main/snippets/ch1/02_record_data.py}}
+\lstinputlisting[language=python]{snippets/ch1/02_record_data.py}
+\end{pbox}
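The `01_datasets.py` snippet referenced by the hunk above is not included in this diff, so the exact LeRobot code cannot be reproduced here. As a rough, self-contained sketch of the idea the surrounding text describes (batching a *streamed* dataset while approximating i.i.d. sampling with a bounded shuffle buffer, so no full download is needed), here is a pure-Python stand-in; all names (`stream_frames`, `shuffled_batches`) and the toy frame layout are hypothetical, not the LeRobot API:

```python
import random
from itertools import islice

def stream_frames(n_frames):
    """Hypothetical stand-in for a dataset streamed from a remote hub: yields frames lazily."""
    for i in range(n_frames):
        yield {"index": i, "state": [float(i)] * 6, "task": "pick cube"}

def shuffled_batches(stream, batch_size, buffer_size, seed=0):
    """Approximate i.i.d. sampling over a stream using a bounded shuffle buffer.

    Each incoming frame displaces a randomly chosen buffered frame, which is
    emitted instead; the buffer is drained (shuffled) once the stream ends.
    A trailing partial batch, if any, is dropped.
    """
    rng = random.Random(seed)
    buffer = list(islice(stream, buffer_size))  # fill the buffer first
    batch = []
    for frame in stream:
        j = rng.randrange(len(buffer))
        batch.append(buffer[j])   # emit a random buffered frame...
        buffer[j] = frame         # ...and replace it with the incoming one
        if len(batch) == batch_size:
            yield batch
            batch = []
    rng.shuffle(buffer)           # drain what is left in the buffer
    for frame in buffer:
        batch.append(frame)
        if len(batch) == batch_size:
            yield batch
            batch = []

batches = list(shuffled_batches(stream_frames(100), batch_size=10, buffer_size=32))
```

The trade-off is the usual one for streaming loaders: a larger `buffer_size` gives better frame randomization (closer to i.i.d.) at the cost of more memory, which is exactly the knob the text's "high levels of frames randomization" refers to.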
app/scripts/latex-to-mdx/input/sections/02_classic_robotics.tex
CHANGED
@@ -221,7 +221,7 @@ Rigid-body approximations are often insufficient in the presence of deformable o
 In the case of complex, time-dependent and/or non-linear dynamics, even moderate mismatches in parameters, unmodeled evolutions, or grasp-induced couplings can qualitatively affect the observed dynamics.

 Lastly, dynamics-based methods (naturally) overlook the rather recent \highlight{increase in availability of openly-available robotics datasets}.
-The curation of academic datasets by large centralized groups of human experts in robotics~\citep{
+The curation of academic datasets by large centralized groups of human experts in robotics~\citep{oneillOpenXEmbodimentRobotic2025, khazatskyDROIDLargeScaleInTheWild2025} is now increasingly complemented by a \highlight{growing number of robotics datasets contributed in a decentralized fashion} by individuals with varied expertise.
 If not tangentially, dynamics-based approaches are not posed to maximally benefit from this trend, which holds the premise of allowing generalization in the space of tasks and embodiments, like data was the cornerstone for advancements in vision~\citep{alayracFlamingoVisualLanguage2022} and natural-language understanding~\citep{brownLanguageModelsAre2020}.

 Taken together, these limitations (Figure~\ref{fig:classical-limitations}) motivate the exploration of learning-based approaches that can (1) integrate perception and control more tightly, (2) adapt across tasks and embodiments with reduced expert modeling interventions and (3) scale gracefully in performance as more robotics data becomes available.
app/scripts/latex-to-mdx/input/sections/03_reinforcement_learning.tex
CHANGED
@@ -4,7 +4,7 @@
 \epigraph{\textit{Approximate the solution, not the problem} [...]}{Richard Sutton}

 \begin{tldr}
-The need for expensive high-fidelity simulators can be obviated
 \end{tldr}

 \begin{figure}
@@ -16,28 +16,28 @@ The need for expensive high-fidelity simulators can be obviated by learning from
 \label{fig:robot-learning-upsides}
 \end{figure}

-Learning-based techniques for robotics naturally address the limitations presented in~\ref{sec:classical} (Figure~\ref{fig:robot-learning-upsides}).
-
-Mapping
-
-Lastly, learning for robotics (\emph{robot learning}) is naturally well posed to leverage the growing amount of robotics data openly available, just as computer vision

-Being a field at its relative nascent stages, no prevalent technique(s)
-Still, two major classes of methods gained prominence: \highlight{
-In this section, we provide a conceptual overview of applications of
-We then introduce the major limitations RL suffers from, to introduce BC techniques in

-\begin{
 \centering
-\includegraphics[width
-\caption{Overview of the robot learning methods implemented in \lerobot.}
 \label{fig:robot-learning-atlas}
-\end{

-In Figure~\ref{fig:robot-learning-atlas} we
-While
 Thus, we argue generalist policies can indeed be grouped alongside other task-specific BC methods, as they both leverage similar training data and schemas.
-
 Figure~\ref{fig:robot-learning-atlas} illustrates this categorization graphically, explicitly listing all the robot learning policies currently available in \lerobot: Action Chunking with Transformers (ACT)~\citep{zhaoLearningFineGrainedBimanual2023}, Diffusion Policy~\citep{chiDiffusionPolicyVisuomotor2024}, Vector-Quantized Behavior Transformer (VQ-BeT)~\citep{leeBehaviorGenerationLatent2024}, \( \pi_0 \)~\citep{black$p_0$VisionLanguageActionFlow2024}, SmolVLA~\citep{shukorSmolVLAVisionLanguageActionModel2025}, Human-in-the-loop Sample-efficient RL (HIL-SERL)~\citep{luoPreciseDexterousRobotic2024} and TD-MPC~\citep{hansenTemporalDifferenceLearning2022}.


@@ -48,18 +48,17 @@ Figure~\ref{fig:robot-learning-atlas} illustrates this categorization graphicall
 \label{fig:robotics-with-rl-examples}
 \end{figure}

-Applications of RL to robotics have been
-Indeed, due to their interactive and sequential nature,
-Figure~\ref{fig:robotics-with-rl-examples}
-Reaching for an object to move somewhere else in the scene is
-Figure~\ref{fig:robotics-with-rl-examples} also shows an example of a locomotion problem, where sequentiality is inherent in the problem formulation.
-While sliding to the side, the controller has to constantly keep adjusting to the robot's propioperception to avoid failure (falling).

 \subsection{A (Concise) Introduction to RL}
-The RL framework~\citep{suttonReinforcementLearningIntroduction2018}, which we briefly introduce here, has often been used to
-RL is a subfield within ML fundamentally concerned with the development of autonomous systems (\emph{agents})
-Crucially for robotics, RL agents
-In RL, this feedback loop (Figure~\ref{fig:rl-most-famous-pic})

 \begin{figure}
 \centering
@@ -69,38 +68,41 @@ In RL, this feedback loop (Figure~\ref{fig:rl-most-famous-pic}) between actions
 \end{figure}

 Formally, interactions between an agent and its environment are typically modeled via a Markov Decision Process (MDP)~\citep{bellmanMarkovianDecisionProcess1957}.
-Representing robotics problems via MDPs offers several advantages, including (1) incorporating uncertainty through MDP's inherently stochastic formulation and (2) providing a theoretically
-While accommodating
-MDPs allowing for an unbounded number of interactions (
-Unless diversely specified, we will only be referring to discrete-time finite-horizon (\emph{episodic}) MDPs

 Formally, a lenght-\(T\) Markov Decision Process (MDP) is a tuple \( \mathcal M = \langle \statespace, \actionspace, \dynamics, r, \gamma, \rho, T \rangle \), where:
 \begin{itemize}
-\item \(\statespace\) is the \emph{state space}; \(\state \in \statespace\) denotes the (possibly non-directly observable) environment state at time \(t\). In robotics, states often comprise robot configuration and velocities (\(q_t, \dot q_t\)), and can accomodate sensor readings such as camera or audio streams.
-
-\item \(\
-
 \end{itemize}
-Lastly, \(\gamma \in [0,1

-
 \begin{equation}\label{eq:trajectory_definition}
 \tau = \trajectory,
 \end{equation}
-with per-step rewards defined as \(r_t = r \transition \) for ease of notation.
 %
 \begin{align}
 \mathbb P(\stateplusone \vert s_t, a_t, s_{t-1}, a_{t-1}, \dots s_0, a_0 ) &= \mathbb P \transitiongiven \label{eq:dynamics_markovian} \\
-\mathbb P(\action \vert \state, a_{t-1}, s_{t-1}, s_0, a_0) &= \mathbb P(\action \vert \state) \label{eq:policy_markovian}
 \end{align}
 %
-
 \begin{equation}\label{eq:traj_prob}
 \mathbb P(\tau) = \mathbb P (s_0) \prod_{t=0}^{T-1} \mathbb P \transitiongiven \ \mathbb P(\action \vert \state).
 \end{equation}

-Policies \( \mathbb P(\action \vert \state) \) are typically indicated as \( \pi(\action \vert \state) \),
-Policies are trained optimizing the (discounted) \emph{return} associated to a given \( \tau \), i.e. the (random) sum of measured rewards over trajectory:
 \[
 G(\tau) = \sum_{t=0}^{T-1} \gamma^{t} r_t.
 \]
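The discounted return \( G(\tau) = \sum_{t=0}^{T-1} \gamma^{t} r_t \) in the hunk above is simple enough to sanity-check numerically; a minimal sketch (reward values chosen arbitrarily for illustration), evaluating the sum backwards in the usual Horner form:

```python
def discounted_return(rewards, gamma):
    """G(tau) = sum_t gamma^t * r_t, accumulated backwards (Horner's scheme)."""
    g = 0.0
    for r in reversed(rewards):
        g = r + gamma * g
    return g

# gamma = 0.5 with rewards [1, 1, 1] gives 1 + 0.5 + 0.25 = 1.75
print(discounted_return([1.0, 1.0, 1.0], 0.5))  # → 1.75
```

Backwards accumulation makes the role of \( \gamma \) explicit: each step's contribution is the immediate reward plus a \( \gamma \)-shrunk copy of everything that follows, which is the same recursion the value functions later in the section are built on.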
@@ -111,26 +113,26 @@ For a given dynamics \( \mathcal D \)---i.e., for a given problem---taking the e
 \mathbb P_{\theta; \mathcal D} (\tau) &= \rho \prod_{t=0}^{T-1} \mathcal D \transition \ \pi_\theta (\action \vert \state).\label{eq:traj-probabilities-for-policies}
 \end{align}

-
 In turn, MDPs naturally provide a framework to optimize over the space of the possible behaviors an agent might enact (\( \pi \in \Pi \)), searching for the \emph{optimal policy} \( \pi^* = \arg \max_{\theta} J(\pi_\theta) \), where \( \theta \) is the parametrization adopted by the policy set \( \Pi: \pi_\theta \in \Pi, \ \forall \theta \).
-
-Given any state \( s \in \statespace \)---e.g., a
 \[
 V_\pi(s) = \mathbb E_{\tau \sim \pi} \left[ G(\tau) \big \vert s_0 = s \right]
 \]
 can be used to discriminate between desirable and undesirable state in terms of long-term (discounted) reward maximization, under a given policy \(\pi\).
-Similarily, the \emph{state-action} value function also conditions the cumulative discounted reward on selecting action \( a \) when in \( s \), and thereafter act according to \( \pi \)
 \[
-Q_\pi(s,a) = \mathbb E_{\tau \sim \pi} \left[ G (\tau) \big \vert s_0 = s, a_0=a \right]
 \]
-
 \begin{align}
 Q_\pi(s_t, a_t) &= \mathbb{E}_{\stateplusone \sim \mathbb P(\bullet \vert \state, \action)} \left[ r_t + \gamma V_\pi(\stateplusone) \right] \label{eq:q-as-v} \\
-V_\pi(\state) &= \mathbb E_{\action \sim \pi(\bullet \vert \state)} \left[ Q_\pi (\state, \action) \right]
 \label{eq:v-as-q}
 \end{align}
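The two consistency equations in the hunk above, \( Q_\pi(s,a) = \mathbb E[r + \gamma V_\pi(s')] \) and \( V_\pi(s) = \mathbb E_{a \sim \pi}[Q_\pi(s,a)] \), can be checked on a toy problem. A sketch on an arbitrary two-state, two-action MDP (all numbers made up for illustration; this is plain fixed-point policy evaluation, not any LeRobot code):

```python
# Toy 2-state, 2-action MDP used to check the Q/V consistency equations.
gamma = 0.9
P = {  # P[s][a] = list of (probability, next_state, reward) triples
    0: {0: [(1.0, 0, 0.0)], 1: [(0.8, 1, 1.0), (0.2, 0, 0.0)]},
    1: {0: [(1.0, 0, 0.0)], 1: [(1.0, 1, 2.0)]},
}
pi = {0: {0: 0.5, 1: 0.5}, 1: {0: 0.0, 1: 1.0}}  # pi[s][a] = action probability

# Policy evaluation: iterate the Bellman expectation backup until convergence.
V = {0: 0.0, 1: 0.0}
for _ in range(2000):
    V = {
        s: sum(
            pi[s][a] * sum(p * (r + gamma * V[s2]) for p, s2, r in P[s][a])
            for a in P[s]
        )
        for s in P
    }

# Q from V via eq. (q-as-v): Q(s,a) = E[r + gamma * V(s')].
Q = {
    (s, a): sum(p * (r + gamma * V[s2]) for p, s2, r in P[s][a])
    for s in P for a in P[s]
}

# eq. (v-as-q): V(s) must equal the pi-weighted average of Q(s, .).
for s in P:
    assert abs(V[s] - sum(pi[s][a] * Q[(s, a)] for a in P[s])) < 1e-9
```

For this choice of numbers the fixed point can also be solved by hand: state 1 always takes action 1, so \( V(1) = 2 + 0.9\,V(1) = 20 \), and substituting into the backup for state 0 gives \( V(0) = 7.6 / 0.46 \approx 16.52 \), which the iteration reproduces.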
-
-A variety of

 \begin{figure}
 \centering
@@ -139,22 +141,21 @@ A variety of methods have been developed in RL as standalone attemps to find (ap
 \label{fig:rl-algos-atlas}
 \end{figure}

-Popular approaches to continuous state and action space---such as those studied within robotics---include~\citet{schulmanTrustRegionPolicy2017
-Across manipulation~\citep{akkayaSolvingRubiksCube2019} and locomotion~\citep{leeLearningQuadrupedalLocomotion2020}
-For a more complete survey of applications of RL to robotics, we refer the reader to~\citet{koberReinforcementLearningRobotics,

 \subsection{Real-world RL for Robotics}
 Streamlined end-to-end control pipelines, data-driven feature extraction and a disregard for explicit modeling in favor of interaction data are all features of RL for robotics.
-However,

-First, especially early in training, \highlight{actions are typically explorative, and thus erractic}.
 On physical systems, untrained policies may command high velocities, self-collisiding configurations, or torques exceeding joint limits, leading to wear and potential hardware damage.
 Mitigating these risks requires external safeguards (e.g., watchdogs, safety monitors, emergency stops), often incuring in a high degree of human supervision.
-Further, in the typical episodic setting considered in most robotics problems, experimentation is substantially slowed down by the need to manually reset the environment over the course of training, a time-consuming and
-
-Second, learning with a limited number of samples remains problematic in RL, \highlight{limiting the applicability of RL in real-world robotics due to consequently prohibitive timescales of training}.
 Even strong algorithms such as SAC~\citep{haarnojaSoftActorCriticOffPolicy2018} typically require a large numbers of transitions \( \{ \sars \}_{t=1}^N \).
-On hardware, generating

 \begin{figure}
 \centering
@@ -163,13 +164,13 @@ On hardware, generating these data is time-consuming and can even be prohibitive
 \label{fig:synthetic-vs-real-duck}
 \end{figure}

-Training RL policies in simulation~\citep{tobinDomainRandomizationTransferring2017} addresses both issues
-Yet, simulators require significant modeling effort, and rely on assumptions (simplified physical modeling, instantaneous actuation, static environmental conditions, etc.) limiting
-\emph{Domain randomization} (DR) is a popular technique to overcome the reality gap,
In DR, the environment's dynamics are parametrized by a vector of dynamics parameters \( \xi \), distributed according to a distribution \( \Xi \).
In practice, DR is performed by repeatedly sampling \( \xi \sim \Xi \).
Over the course of training---typically at each episode's reset---a new \( \xi \) is drawn, and used to specify the environment's dynamics for that episode.
For instance, one could decide to randomize the friction coefficient of the surface in a locomotion task (Figure~\ref{fig:ducks-on-terrains}), or the center of mass of an object for a manipulation task.
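Concretely, a randomization distribution \( \Xi \) can be as simple as independent uniform ranges over a handful of dynamics parameters; a minimal sketch in Python (the parameter names and ranges are illustrative, not tied to any specific simulator):

```python
import random

# Illustrative randomization distribution Xi: independent uniform ranges
# over a few dynamics parameters (the support is chosen by hand, as in basic DR).
XI_SUPPORT = {
    "friction":   (0.5, 1.5),   # surface friction coefficient
    "mass":       (0.8, 1.2),   # payload mass, kg
    "motor_gain": (0.9, 1.1),   # actuator gain multiplier
}

def sample_dynamics(rng: random.Random) -> dict:
    """Draw one dynamics vector xi ~ Xi, e.g. at each episode reset."""
    return {name: rng.uniform(lo, hi) for name, (lo, hi) in XI_SUPPORT.items()}

rng = random.Random(0)
episodes = [sample_dynamics(rng) for _ in range(3)]  # a fresh xi per episode
```

Each sampled dictionary would then be used to configure the simulator before an episode starts.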
\begin{figure}
\centering
% Figure body elided.
\label{fig:ducks-on-terrains}
\end{figure}
While effective in transferring policies across the reality gap in real-world robotics~\citep{tobinDomainRandomizationTransferring2017,akkayaSolvingRubiksCube2019,jiDribbleBotDynamicLegged2023,tiboniDomainRandomizationEntropy2024}, DR often requires extensive manual engineering.
First, identifying which parameters to randomize---i.e., the \emph{support} \( \text{supp} (\Xi) \) of \( \Xi \)---is an inherently task-specific process.
When locomoting over different terrains, randomizing the friction coefficient is a reasonable choice, yet not completely conclusive, as other factors (lighting conditions, external temperature, joints' fatigue, etc.) may prove just as important, making the selection of these parameters yet another source of brittleness.
Selecting the dynamics distribution \( \Xi \) is also non-trivial.
On the one hand, distributions with low entropy risk causing failure at transfer time, due to the limited robustness induced over the course of training.
On the other hand, excessive randomization may cause over-regularization and hinder performance.
Consequently, the research community has investigated approaches to automatically select the randomization distribution \( \Xi \), using signals from the training process or tuning it to reproduce observed real-world trajectories.
A notable example is Automatic Domain Randomization (AutoDR)~\citep{akkayaSolvingRubiksCube2019}, which progressively widens the bounds of \( \Xi \) as training performance improves.
While effective, AutoDR requires significant tuning---the bounds are widened by a fixed, pre-specified amount \( \Delta \)---and may disregard data when performance \emph{does not} improve after a distribution update~\citep{tiboniDomainRandomizationEntropy2024}.
Other approaches instead leverage real-world data directly to tune \( \Xi \).
For instance,~\citet{chebotar2019closing} interleave in-simulation policy training with repeated real-world policy rollouts used to adjust \( \Xi \) based on real-world data, while~\citet{tiboniDROPOSimtoRealTransfer2023} leverage a single, pre-collected set of real-world trajectories and tune \( \Xi \) under a simple likelihood objective.
While DR has shown promise, it does not address the core issue: even under the assumption that an ideal distribution \( \Xi \) is available, the phenomena of interest still need to be simulated with sufficient fidelity.
Simulating contact-rich manipulation of possibly deformable or soft materials---e.g., \emph{folding a piece of clothing}---can be prohibitively complex for current simulators.
A perhaps more fundamental limitation of RL for robotics is the general unavailability of \emph{dense} reward functions for complicated tasks, the design of which is essentially based on human expertise and trial-and-error.
In practice, \emph{sparse} reward functions can be used to conclude whether one specific goal has been attained---\emph{has this t-shirt been correctly folded?}---but unfortunately result in more challenging learning problems.
As a result, despite notable successes, deploying RL directly on real-world robots at scale remains challenging.
To make the most of (1) the growing number of openly available datasets and (2) relatively inexpensive robots like the SO-100, RL could (1) be anchored in already-collected trajectories---limiting erratic and dangerous exploration---and (2) train in the real-world directly---bypassing the aforementioned issues with low-fidelity simulations.
In such a context, sample-efficient learning is also paramount, as training on the real-world is inherently time-bottlenecked.
Off-policy algorithms like Soft Actor-Critic (SAC)~\citep{haarnojaSoftActorCriticOffPolicy2018} tend to be more sample efficient than their on-policy counterparts~\citep{schulmanProximalPolicyOptimization2017}, due to the presence of a \emph{replay buffer} used over the course of training.
Other than allowing to re-use transitions \( \sars \) collected under previous policies, the replay buffer also provides a natural entry point for data collected offline.
Using expert demonstrations to guide learning together with learned rewards, RL can thus be anchored in previously collected, real-world data.
% DQN to DDPG to SAC
\paragraph{Sample-efficient RL}
In an MDP, the optimal policy \( \pi^* \) can be derived from its associated \qfunction, \( Q_{\pi^*} \): in particular, the optimal action(s) \(\mu(\state)\) can be selected by maximizing the optimal \qfunction \ over the action space,
\[
\mu(\state) = \arg \max_{\action \in \mathcal A} Q_{\pi^*}(\state, \action).
\]
Interestingly, the \qopt-function satisfies a recursive relationship (\emph{Bellman equation}) based on a very natural intuition%
\footnote{Adapted from~\citet{mnihPlayingAtariDeep2013}. The notation used has been slightly adapted for consistency with the rest of this tutorial.}: if the optimal values \( Q^*(s_{t+1}, a_{t+1}) \) at the next timestep are known for all actions, then the optimal strategy is to select the action maximizing the expected value of \( r_t + \gamma Q^*(s_{t+1}, a_{t+1}) \),
\[
Q^*(s_t, a_t) = \mathbb E_{s_{t+1} \sim \mathbb P(\bullet \vert s_t, a_t)} \left[ r_t + \gamma \max_{a_{t+1} \in \mathcal A} Q^* (s_{t+1}, a_{t+1}) \big\vert s_t, a_t \right],
\]
which is guaranteed to be self-consistent by definition.
In practice, this identity can be used as an iterative update rule (\emph{value iteration}),
\[
Q_{i+1}(s_t, a_t) \leftarrow \mathbb E_{s_{t+1} \sim \mathbb P(\bullet \vert s_t, a_t)} \left[ r_t + \gamma \max_{a_{t+1} \in \mathcal A} Q_i (s_{t+1}, a_{t+1}) \big\vert s_t, a_t \right], \quad i=0,1,2,\dots,K
\]
Then, one can derive the (ideally, near-optimal) policy by explicitly maximizing over the action space the final (ideally, near-optimal) estimate \( Q_K \approx Q^* \) at each timestep.
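To make the iteration concrete, here is a minimal sketch of tabular Q-value iteration on an invented two-state MDP (states, actions, rewards and the discount are all illustrative):

```python
# Tabular Q-value iteration on a tiny deterministic 2-state MDP (illustrative).
# States: 0, 1; actions: 0 ("stay-ish"), 1 ("move"). Moving from 0 to 1 pays 1.
GAMMA = 0.9
P = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}     # deterministic s' = P[s, a]
R = {(0, 0): 0.0, (0, 1): 1.0, (1, 0): 0.0, (1, 1): 0.0}

Q = {sa: 0.0 for sa in P}
for _ in range(200):  # Q_{i+1}(s, a) <- r + gamma * max_a' Q_i(s', a')
    Q = {(s, a): R[s, a] + GAMMA * max(Q[P[s, a], b] for b in (0, 1))
         for (s, a) in P}

# Greedy policy derived from the (near-)converged Q_K.
greedy = {s: max((0, 1), key=lambda a: Q[s, a]) for s in (0, 1)}
```

After convergence the greedy policy cycles between the two states to collect the reward; analytically, \( Q^*(0, 1) = 1/(1-\gamma^2) \) here.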
Effective in its early applications to small-scale discrete problems, this tabular approach quickly becomes intractable as the size of the state-action space grows.
Also, vanilla Q-learning is not directly usable for \emph{continuous}, unstructured state-action space MDPs, such as those considered in robotics.
In their seminal work on \emph{Deep Q-Learning} (DQN),~\citet{mnihPlayingAtariDeep2013} propose learning Q-values using deep convolutional neural networks, thereby accommodating large and even unstructured \emph{state} spaces.
DQN parametrizes the Q-function using a neural network with parameters \( \theta \), updating the parameters by sequentially minimizing the expected squared temporal-difference error (TD-error, \( \delta_i \)):
\begin{align}
L_i(\theta_i) &= \mathbb E_{(s_t, a_t) \sim \chi} \big[ \left( y_i - Q_{\theta_i}(s_t, a_t) \right)^2 \big], \label{eq:dqn-loss} \\
y_i &= \mathbb E_{s_{t+1} \sim \mathbb P(\bullet \vert s_t, a_t)} \big[ r_t + \gamma \max_{a_{t+1} \in \mathcal A} Q_{\theta_{i-1}} (\stateplusone, a_{t+1}) \big], \label{eq:TD-target}
\end{align}
Crucially, \( \chi \) can in principle be different from the policy being followed, effectively allowing the reuse of prior data stored in a \emph{replay buffer} in the form of \( \sars \) transitions, used to form the TD-target \( y_i \), TD-error \( \delta_i \) and loss function \( L_i \) (eq.~\ref{eq:dqn-loss}).
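The replay buffer itself is a simple data structure; a minimal sketch (capacity and the dummy transitions are placeholders):

```python
import random
from collections import deque

class ReplayBuffer:
    """Fixed-capacity store of (s, a, r, s') transitions, sampled uniformly."""
    def __init__(self, capacity: int, seed: int = 0):
        self.buffer = deque(maxlen=capacity)   # oldest transitions are evicted
        self.rng = random.Random(seed)

    def add(self, s, a, r, s_next):
        self.buffer.append((s, a, r, s_next))

    def sample(self, batch_size: int):
        # Uniform sampling without replacement within the batch.
        return self.rng.sample(list(self.buffer), batch_size)

buf = ReplayBuffer(capacity=100)
for t in range(5):
    buf.add(t, 0, float(t), t + 1)   # dummy transitions
batch = buf.sample(3)
```

Sampled batches are then used to form the TD-targets and the loss above.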
While effective in handling large, unstructured state spaces for discrete action-space problems, DQN does not extend straightforwardly to continuous action spaces.
Indeed, in the case of high-capacity function approximators such as neural networks, solving \( \max_{a_t \in \mathcal A} Q_\theta(s_t, a_t) \) at each timestep is simply unfeasible due to the (1) continuous nature of the action space (\( \actionspace \subset \mathbb R^n \) for some \( n \)) and (2) impossibility to express the maximization in closed form.
A common workaround is to maintain an explicit deterministic policy \( \mu_\phi \), trained via gradient ascent on \( Q \) to approximate the maximization, with updates
\begin{equation}\label{eq:deterministic-pg}
d_\phi = \mathbb E_{s_t \sim \mathbb P (\bullet)} \left[ \nabla_\phi Q(s_t, a_t)\vert_{a_t = \mu_\phi(s_t)} \right] = \mathbb E_{s_t \sim \mathbb P(\bullet)} \left[ \nabla_{a_t} Q(s_t, a_t) \vert_{a_t = \mu_\phi(s_t)} \cdot \nabla_\phi \mu_\phi(s_t) \right]
\end{equation}
Provably, \( d_\phi \) follows the gradient of the expected return with respect to the policy parameters \( \phi \).
Deep Deterministic Policy Gradient (DDPG) combines this deterministic policy update with DQN-style Q-learning, extending the latter to continuous action spaces.
DDPG adopts a modified TD-target compared to eq.~\ref{eq:TD-target}, using \( \mu_\phi \) in place of the explicit maximization:
\begin{equation}\label{eq:TD-target-ddpg}
y_i = \mathbb E_{s_{t+1} \sim \mathbb P(\bullet \vert s_t, a_t)} \big[ r_t + \gamma Q_{\theta_{i-1}} (\stateplusone, \mu_\phi(\stateplusone)) \big] .
\end{equation}
Similarly to DQN, DDPG also employs the same replay buffer mechanism, reusing stored transitions to improve sample efficiency.

Soft Actor-Critic (SAC)~\citep{haarnojaSoftActorCriticOffPolicy2018} is a derivation of DDPG in the max-entropy (MaxEnt) RL framework, in which RL agents are tasked with \highlight{maximizing the discounted cumulative reward, while acting as randomly as possible}.
MaxEnt RL promotes exploration and robustness by explicitly rewarding policies for their randomness.
In that, MaxEnt revisits the RL objective \( J (\pi) \) to specifically account for the policy entropy,
\begin{align}
J(\pi) &= \sum_{t=0}^T \mathbb{E}_{(s_t, a_t) \sim \chi} \left[ r_t + \alpha \mathcal H(\pi (\bullet \vert s_t)) \right]
\end{align}
This modified objective results in the \emph{soft} TD-target:
\begin{equation}\label{eq:soft-td-target}
y_i = \mathbb E_{s_{t+1} \sim \mathbb P( \bullet \vert s_t, a_t)} \left[ r_t + \gamma \left( Q_{\theta_{i-1}} (\stateplusone, a_{t+1}) - \alpha \log \pi_\phi(a_{t+1} \vert \stateplusone) \right) \right], \quad a_{t+1} \sim \pi_\phi(\bullet \vert \stateplusone)
\end{equation}
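Numerically, the soft target differs from the standard bootstrapped target only by the entropy bonus; a small sketch with made-up scalar values:

```python
def soft_td_target(r, gamma, q_next, log_pi_next, alpha):
    """Soft TD-target: r + gamma * (Q(s', a') - alpha * log pi(a'|s'))."""
    return r + gamma * (q_next - alpha * log_pi_next)

# Made-up numbers: a likely next action (log-prob near 0) earns a small entropy
# bonus; an unlikely one (very negative log-prob) increases the target.
y_likely = soft_td_target(r=1.0, gamma=0.99, q_next=5.0, log_pi_next=-0.1, alpha=0.2)
y_unlikely = soft_td_target(r=1.0, gamma=0.99, q_next=5.0, log_pi_next=-2.0, alpha=0.2)
```

The temperature `alpha` trades off reward maximization against acting randomly.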
Similarly to DDPG, SAC also maintains an explicit policy, trained under the same MaxEnt framework via the update rule
\begin{equation}\label{eq:sac-policy-update}
\pi_{k+1} \leftarrow \arg\min_{\pi^\prime \in \Pi} \DKL \left(\pi^\prime (\bullet \vert \state) \bigg\Vert \frac{\exp(Q_{\pi_k}(s_t, \bullet))}{Z_{\pi_k}(s_t)} \right)
\end{equation}
The update rule provided in eq.~\ref{eq:sac-policy-update} is guaranteed to improve the policy in terms of its soft value~\citep{haarnojaSoftActorCriticOffPolicy2018}.
|
|
|
Furthermore, it also naturally provides an entry point to inject offline trajectories recorded ahead of training, e.g., human-collected demonstrations.
Reinforcement Learning with Prior Data (RLPD)~\citep{ballEfficientOnlineReinforcement2023} is an Offline-to-Online RL algorithm leveraging prior data to effectively accelerate the training of a SAC agent.
Unlike previous works on Offline-to-Online RL, RLPD avoids any pre-training and instead uses the available offline data \( D_\text{offline} \) to improve online learning from scratch.
During each training step, transitions from both the offline and online replay buffers are sampled in equal proportions (\emph{symmetric sampling}).
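The symmetric sampling scheme can be sketched as drawing half of every batch from each buffer (buffer contents below are dummy placeholders):

```python
import random

def symmetric_batch(offline, online, batch_size, rng):
    """Draw batch_size/2 transitions from each buffer (RLPD-style 50/50 mix)."""
    half = batch_size // 2
    return rng.sample(offline, half) + rng.sample(online, half)

rng = random.Random(0)
offline = [("off", i) for i in range(50)]   # e.g. pre-collected demonstrations
online  = [("on", i) for i in range(50)]    # transitions gathered during training
batch = symmetric_batch(offline, online, batch_size=8, rng=rng)
n_off = sum(1 for tag, _ in batch if tag == "off")
```

Every gradient step thus sees a fixed share of prior data, regardless of how large the online buffer grows.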
% RLPD + reward classifier: SERL
\paragraph{Sample-efficient, data-driven, real-world RL}
Despite the possibility of leveraging offline data for learning, the effectiveness of real-world RL training is still limited by the need to define a task-specific, hard-to-define reward function.
Further, even assuming to have access to a well-defined reward function, typical robotics pipelines rely on dedicated instrumentation to actually measure it, which is often impractical outside of controlled lab settings.
In SERL~\citep{luoSERLSoftwareSuite2025}, rewards are instead obtained from a learned \emph{reward classifier} \( c \), predicting task success directly from states.
Reward classifiers are particularly useful in treating complex tasks---e.g., folding a t-shirt---for which a precise reward formulation is arbitrarily complex to obtain, or would require significant shaping, and are more easily learned directly from demonstrations of success (\(e^+\)) or failure (\(e^-\)) states \( s \in \statespace \), with a natural choice for the state-conditioned reward function \( r: \statespace \mapsto \mathbb R \) being \( r(s) = \log c(e^+ \vert s) \).
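With a success classifier in hand, the reward is simply the log-probability assigned to success; a sketch with a stand-in classifier (the sigmoid scoring function below is invented purely for illustration):

```python
import math

def classifier_reward(success_prob: float) -> float:
    """r(s) = log c(e+ | s), with c the classifier's success probability."""
    return math.log(max(success_prob, 1e-8))   # clamp to avoid log(0)

# Stand-in classifier: success probability grows with a scalar "progress"
# feature of the state (purely illustrative, not a trained model).
def c_success(progress: float) -> float:
    return 1.0 / (1.0 + math.exp(-10.0 * (progress - 0.5)))

r_far = classifier_reward(c_success(0.1))    # far from success: very negative
r_near = classifier_reward(c_success(0.95))  # near success: close to 0
```

In practice \( c \) would be a small network trained on labeled success/failure states.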
Further,~\citet{luoSERLSoftwareSuite2025} demonstrate the benefits of learning \emph{forward} (executing the task from initial state to completion) and \emph{backward} (resetting the environment to the initial state from completion) controllers, parametrized by separate policies.
\begin{figure}
\centering
\includegraphics[width=0.8\linewidth]{figures/ch3/ch3-hil-serl-examples.png}
\caption{(A) HIL-SERL allows for real-world training of high-performance RL agents by building on top of the advancements introduced by SAC, RLPD and SERL. (B) Example of human intervention during a HIL-SERL training process on a SO-100.}
\label{fig:hil-serl-blocks}
\end{figure}
% SERL + Human in the loop: HIL-SERL
Building on off-policy deep Q-learning with replay buffers, entropy regularization for better exploration, and offline data, Human-in-the-loop Sample-efficient RL (HIL-SERL)~\citep{luoPreciseDexterousRobotic2024} further brings a human supervisor directly into the training loop.
Human input enters training in two forms: offline demonstrations and online corrective interventions.
While demonstrations provide the initial dataset seeding learning and constraining early exploration, interactive corrections allow a human supervisor to intervene on failure modes and supply targeted interventions during training.
-
Crucially, human
|
| 314 |
-
|
| 315 |
-
Empirically, HIL-SERL attains near-perfect success rates on diverse manipulation tasks within 1-2 hours of training~\citep{luoPreciseDexterousRobotic2024}, underscoring how offline datasets with online RL can markedly improve stability and data efficiency, and ultimately even allow real-world RL-training.
|
| 316 |
| 318 |
-
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| 319 |
|
| 320 |
\subsubsection{Limitations of RL in Real-World Robotics: Simulators and Reward Design}
Despite the advancements in real-world RL training, several challenges remain:
\begin{itemize}
\item In those instances where real-world training experience is prohibitively expensive to gather~\citep{degraveMagneticControlTokamak2022, bellemareAutonomousNavigationStratospheric2020}, training in simulation remains the only practical option, with all its associated modeling burden.
\item Reward design remains largely task-specific, relying on human expertise and trial-and-error.
\end{itemize}
Advances in simulation fidelity and in learning reward functions directly from data---e.g., via reward classifiers---represent promising directions to alleviate these limitations.
\epigraph{\textit{Approximate the solution, not the problem} [...]}{Richard Sutton}
\begin{tldr}
The need for expensive, high-fidelity simulators can be obviated by learning from real-world data, using sample-efficient algorithms that can safely train directly on hardware.
\end{tldr}
|
\label{fig:robot-learning-upsides}
\end{figure}
Learning-based techniques for robotics naturally address the limitations presented in Section~\ref{sec:classical} (Figure~\ref{fig:robot-learning-upsides}).
In particular, learning-based techniques typically rely on monolithic prediction-to-action pipelines (\emph{visuomotor policies}) which directly map sensorimotor inputs to predicted actions, streamlining control by removing the need to interface multiple components.
Mapping sensory inputs to actions also makes it possible to incorporate diverse input modalities, leveraging the automatic feature extraction capabilities of modern learning systems.
Moreover, learning-based approaches can, in principle, bypass explicit modeling altogether and instead rely solely on interaction data---an advantage that proves transformative when dynamics are difficult to model or entirely unknown.
Lastly, learning for robotics (\emph{robot learning}) is naturally well positioned to leverage the growing amount of robotics data openly available, just as computer vision and natural language processing historically benefited from large-scale corpora of data, in great part overlooked by dynamics-based approaches.

Robot learning being a field at a relatively nascent stage, no single technique has yet proven distinctly better than the others.
Still, two major classes of methods have gained prominence: \highlight{Reinforcement Learning (RL)} and \highlight{Behavioral Cloning (BC)} (Figure~\ref{fig:robot-learning-atlas}).
In this section, we provide a conceptual overview of applications of RL to robotics, and introduce practical examples of how to use RL within \lerobot.
We then discuss the major limitations RL suffers from, motivating the BC techniques presented in Section~\ref{sec:learning-imitation} and Section~\ref{sec:learning-foundation}.
\begin{wrapfigure}[23]{r}{0.3\textwidth}
\vspace{-\intextsep}
\centering
\includegraphics[width=\linewidth]{figures/ch3/ch3-learning-atlas.png}
\caption{Overview of the robot learning methods implemented in \lerobot. All algorithms are implemented in Pytorch. References:~\citet{zhaoLearningFineGrainedBimanual2023,chiDiffusionPolicyVisuomotor2024,leeBehaviorGenerationLatent2024,black$p_0$VisionLanguageActionFlow2024,shukorSmolVLAVisionLanguageActionModel2025,luoPreciseDexterousRobotic2024,hansenTemporalDifferenceLearning2022} (top-to-bottom, left-to-right).}
\label{fig:robot-learning-atlas}
\end{wrapfigure}
In Figure~\ref{fig:robot-learning-atlas} we deliberately include generalist robot models~\citep{black$p_0$VisionLanguageActionFlow2024,shukorSmolVLAVisionLanguageActionModel2025} alongside task-specific BC methods.
While significantly different in spirit---\emph{generalist} models are language-conditioned and use instructions to generate motion valid across many tasks, whereas \emph{task-specific} models are typically not language-conditioned and used to perform a single task---\emph{foundation} models are still largely trained to reproduce trajectories contained in a (large) training set of input demonstrations.
Thus, we argue generalist policies can indeed be grouped alongside other task-specific BC methods, as they both leverage similar training data and schemas.
Figure~\ref{fig:robot-learning-atlas} illustrates this categorization graphically, explicitly listing all the robot learning policies currently available in \lerobot: Action Chunking with Transformers (ACT)~\citep{zhaoLearningFineGrainedBimanual2023}, Diffusion Policy~\citep{chiDiffusionPolicyVisuomotor2024}, Vector-Quantized Behavior Transformer (VQ-BeT)~\citep{leeBehaviorGenerationLatent2024}, \( \pi_0 \)~\citep{black$p_0$VisionLanguageActionFlow2024}, SmolVLA~\citep{shukorSmolVLAVisionLanguageActionModel2025}, Human-in-the-loop Sample-efficient RL (HIL-SERL)~\citep{luoPreciseDexterousRobotic2024} and TD-MPC~\citep{hansenTemporalDifferenceLearning2022}.
\begin{figure}
\centering
% Figure body elided.
\label{fig:robotics-with-rl-examples}
\end{figure}
Applications of RL to robotics have been studied long enough that the relationship between these two disciplines has been compared to that of physics and mathematics~\citep{koberReinforcementLearningRobotics}.
Indeed, due to their inherently interactive and sequential nature, robotics control problems can be directly cast as RL problems.
Figure~\ref{fig:robotics-with-rl-examples} presents two such cases.
Reaching for an object to then move it somewhere else in the scene is a sequential problem, where over time the controller needs to adjust the position of the robot arm based on the current configuration and the (possibly varying) position of the object.
Figure~\ref{fig:robotics-with-rl-examples} also shows an example of a locomotion problem, where sequentiality is inherent in the problem formulation: while sliding to the side, the controller needs to keep adjusting the robot's configuration to avoid failure (falling).
\subsection{A (Concise) Introduction to RL}
The RL framework~\citep{suttonReinforcementLearningIntroduction2018}, which we briefly introduce here, has often been used to tackle robotics problems~\citep{koberReinforcementLearningRobotics}.
RL is a subfield within ML fundamentally concerned with the development of autonomous systems (\emph{agents}) capable of \emph{continuously behaving} in an evolving environment, developing (ideally, well-performing) control strategies (\emph{policies}).
Crucially for robotics, RL agents improve through trial and error, bypassing explicit models of the problem dynamics in favor of interaction data.
In RL, this feedback loop between actions and outcomes (Figure~\ref{fig:rl-most-famous-pic}) is established through the agent sensing a scalar quantity (\emph{reward}) measuring how desirable a given \emph{transition} is for the accomplishment of its goal.
\begin{figure}
\centering
% Figure body elided.
\label{fig:rl-most-famous-pic}
\end{figure}
Formally, interactions between an agent and its environment are typically modeled via a Markov Decision Process (MDP)~\citep{bellmanMarkovianDecisionProcess1957}.
Representing robotics problems via MDPs offers several advantages, including (1) incorporating uncertainty through the MDP's inherently stochastic formulation and (2) providing a theoretically-sound framework for learning \emph{without} an explicit model of the environment dynamics.
While accommodating a continuous-time formulation too, MDPs are typically considered in discrete time in RL, assuming interactions to atomically take place at discrete \emph{timesteps} \( t=0,1,2,3, \dots, T \).
MDPs allowing for an unbounded number of interactions (\( T \to + \infty \)) are termed \emph{infinite-horizon}, as opposed to \emph{finite-horizon} MDPs in which \( T \) is finite.
Unless otherwise specified, we will only be referring to discrete-time finite-horizon (\emph{episodic}) MDPs.

Formally, a length-\(T\) Markov Decision Process (MDP) is a tuple \( \mathcal M = \langle \statespace, \actionspace, \dynamics, r, \gamma, \rho, T \rangle \), where:
\begin{itemize}
\item \(\statespace\) is the \emph{state space}; \(\state \in \statespace\) denotes the (possibly non-directly observable) environment state at time \(t\). In robotics, states often comprise robot configuration and velocities (\(q_t, \dot q_t\)), and can also accommodate sensor readings such as camera or audio streams.
%
\item \(\actionspace\) is the \emph{action space}; \(\action \in \actionspace\) may represent joint torques, joint velocities, or even end-effector commands at timestep \( t \). In general, actions correspond to commands intervening on the configuration of the robot.
%
\item \(\dynamics\) represents the (possibly non-deterministic) environment dynamics, with \(\dynamics: \statespace \times \actionspace \times \statespace \mapsto [0, 1] \), \( \dynamics \, \transition = \transitionprob \). For instance, for a planar manipulator, dynamics could be considered deterministic when the environment is fully described (Figure~\ref{fig:planar-manipulation-simple}), and stochastic when unmodeled disturbances depending on non-observable parameters intervene (Figure~\ref{fig:planar-manipulator-box-velocity}).
%
\item \(r: \statespace \times \actionspace \times \statespace \to \mathbb R\) is the \emph{reward function}, weighing the transition \( \transition \) in the context of the achievement of an arbitrary goal. For instance, a simple reward function for quickly moving along the \( x \) axis (Figure~\ref{fig:robotics-with-rl-examples}) could be based on the absolute position of the robot along the \( x \) axis~(\(p_{x_t}\)), present negative penalties for falling over (measured from \( p_{z_t} \)) and introduce bonuses for speed~(\( \dot p_{x_t} \)), e.g., \(r \transition \equiv r(\state) = p_{x_t} \cdot \dot p_{x_t} - \tfrac{1}{p_{z_t}} \).
\end{itemize}
Lastly, \(\gamma \in [0,1) \) represents the discount factor regulating preference for immediate versus long-term reward (with an effective horizon equal to \( \tfrac{1}{1-\gamma} \)), and \( \rho \) is the distribution over \(\statespace \) of the MDP's \emph{initial state}, \( s_0 \sim \rho \).

Therefore, a length-\(T\) \emph{trajectory} is the (random) sequence
\begin{equation}\label{eq:trajectory_definition}
\tau = \trajectory,
\end{equation}
with per-step rewards defined as \(r_t = r \transition \) for ease of notation.
Interestingly, assuming both the environment dynamics and the conditional distribution over actions given states---i.e., the \emph{policy}---to be \emph{Markovian}:
%
\begin{align}
\mathbb P(\stateplusone \vert s_t, a_t, s_{t-1}, a_{t-1}, \dots s_0, a_0 ) &= \mathbb P \transitiongiven \label{eq:dynamics_markovian} \\
\mathbb P(\action \vert \state, a_{t-1}, s_{t-1}, \dots s_0, a_0) &= \mathbb P(\action \vert \state), \label{eq:policy_markovian}
\end{align}
%
the probability of observing a given trajectory \( \tau \) factorizes into:
\begin{equation}\label{eq:traj_prob}
\mathbb P(\tau) = \mathbb P (s_0) \prod_{t=0}^{T-1} \mathbb P \transitiongiven \ \mathbb P(\action \vert \state).
\end{equation}
Policies \( \mathbb P(\action \vert \state) \) are typically indicated as \( \pi(\action \vert \state) \), often parametrized via \( \theta \), yielding \( \pi_\theta (\action \vert \state )\), and are trained by optimizing the (discounted) \emph{return} associated with a given \( \tau \), i.e. the (random) sum of measured rewards over an arbitrary trajectory,
\[
G(\tau) = \sum_{t=0}^{T-1} \gamma^{t} r_t.
\]
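For a concrete feel, computing \( G(\tau) \) is a one-liner; a quick sketch (the reward sequence is arbitrary):

```python
def discounted_return(rewards, gamma):
    """G(tau) = sum_t gamma^t * r_t for a length-T reward sequence."""
    return sum((gamma ** t) * r for t, r in enumerate(rewards))

# Reward only at the final step: G = gamma^2 * 1.0.
G = discounted_return([0.0, 0.0, 1.0], gamma=0.9)
```

Note how discounting shrinks the contribution of rewards obtained later in the trajectory.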
The goal of RL is then maximizing the \emph{expected} return under the trajectory distribution induced by the policy and the dynamics \( \mathcal D \),
\begin{align}
J(\pi_\theta) &= \mathbb E_{\tau \sim \mathbb P_{\theta; \mathcal D}} \left[ G(\tau) \right], \label{eq:RL-j-function} \\
\mathbb P_{\theta; \mathcal D} (\tau) &= \rho \prod_{t=0}^{T-1} \mathcal D \transition \ \pi_\theta (\action \vert \state).\label{eq:traj-probabilities-for-policies}
\end{align}
Crucially, in the RL framework the agent is assumed to only \emph{observe} the environment dynamics and not to intervene on them, and thus eq.~\ref{eq:RL-j-function} varies exclusively with the policy followed.
In turn, MDPs naturally provide a framework to optimize over the space of the possible behaviors an agent might enact (\( \pi \in \Pi \)), searching for the \emph{optimal policy} \( \pi^* = \arg \max_{\theta} J(\pi_\theta) \), where \( \theta \) is the parametrization adopted by the policy set \( \Pi: \pi_\theta \in \Pi, \ \forall \theta \).
Besides providing a target for policy search, \( G(\tau) \) can also be used to discriminate between states \( s_t \) and \( (\state, \action) \) pairs.
Given any state \( s \in \statespace \)---e.g., given a configuration \( q \) of a robot---the \emph{state-value} function
\[
V_\pi(s) = \mathbb E_{\tau \sim \pi} \left[ G(\tau) \big \vert s_0 = s \right]
\]
can be used to discriminate between desirable and undesirable states in terms of long-term (discounted) reward maximization, under a given policy \(\pi\).
Similarly, the \emph{state-action} value function also conditions the cumulative discounted reward on selecting action \( a \) when in \( s \), and thereafter acting according to \( \pi \),
\[
Q_\pi(s,a) = \mathbb E_{\tau \sim \pi} \left[ G (\tau) \big \vert s_0 = s, a_0=a \right].
\]
Importantly, value functions are interrelated:
\begin{align}
Q_\pi(s_t, a_t) &= \mathbb{E}_{\stateplusone \sim \mathbb P(\bullet \vert \state, \action)} \left[ r_t + \gamma V_\pi(\stateplusone) \right] \label{eq:q-as-v} \\
V_\pi(\state) &= \mathbb E_{\action \sim \pi(\bullet \vert \state)} \left[ Q_\pi (\state, \action) \right],
\label{eq:v-as-q}
\end{align}
inducing an ordering over states and state-action pairs under \( \pi \); value functions are thus central to most RL algorithms.
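Equation~\ref{eq:v-as-q} is just an expectation over the policy's action distribution; a toy numerical check (all numbers invented):

```python
# V_pi(s) as the policy-weighted average of Q_pi(s, a), for a fixed state s.
pi = {"left": 0.25, "right": 0.75}   # pi(a|s): action probabilities
q = {"left": 1.0, "right": 3.0}      # Q_pi(s, a): action values

v = sum(pi[a] * q[a] for a in pi)    # V_pi(s) = E_{a ~ pi}[Q_pi(s, a)]
```

Here the policy's preference for the higher-valued action pulls \( V_\pi(s) \) towards \( Q_\pi(s, \text{right}) \).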
A variety of algorithms have been developed in RL attempting to find (approximate) solutions to the problem of maximizing cumulative reward (we report some in Figure~\ref{fig:rl-algos-atlas}).
\begin{figure}
\centering
% Figure body elided.
\label{fig:rl-algos-atlas}
\end{figure}
Popular approaches for continuous state and action spaces---such as those studied within robotics---include TRPO~\citep{schulmanTrustRegionPolicy2017}, PPO~\citep{schulmanProximalPolicyOptimization2017} and SAC~\citep{haarnojaSoftActorCriticOffPolicy2018}.
Across manipulation~\citep{akkayaSolvingRubiksCube2019} and locomotion problems~\citep{leeLearningQuadrupedalLocomotion2020}, RL has proved extremely effective in providing a platform to (1) leverage a unified, streamlined perception-to-action pipeline, (2) natively integrate proprioception with multi-modal high-dimensional sensory streams, (3) disregard a description of the environment dynamics, by focusing on observed interaction data rather than modeling, and (4) anchor policies in the experience collected and stored in datasets.
For a more complete survey of applications of RL to robotics, we refer the reader to~\citet{koberReinforcementLearningRobotics,tangDeepReinforcementLearning2025}.
\subsection{Real-world RL for Robotics}
Streamlined end-to-end control pipelines, data-driven feature extraction and a disregard for explicit modeling in favor of interaction data are all features of RL for robotics.
However, RL still suffers from limitations concerning safety and learning efficiency, which are particularly pressing in real-world robotics applications.

First, especially early in training, \highlight{actions are typically explorative, and thus may be erratic}.
On physical systems, untrained policies may command high velocities, self-colliding configurations, or torques exceeding joint limits, leading to wear and potential hardware damage.
Mitigating these risks requires external safeguards (e.g., watchdogs, safety monitors, emergency stops), often incurring a high degree of human supervision.
Further, in the typical episodic setting considered in most robotics problems, experimentation is substantially slowed down by the need to manually reset the environment over the course of training, a time-consuming and error-prone process.
|
| 156 |
+
Second, learning efficiently remains problematic in RL, \highlight{limiting the applicability of RL in real-world robotics due to consequently prohibitive timescales of training}.
|
|
|
|
| 157 |
Even strong algorithms such as SAC~\citep{haarnojaSoftActorCriticOffPolicy2018} typically require a large numbers of transitions \( \{ \sars \}_{t=1}^N \).
|
| 158 |
+
On real-world hardware, generating this data is time-consuming.
|
| 159 |
|
| 160 |
\begin{figure}
\centering
\label{fig:synthetic-vs-real-duck}
\end{figure}
Training RL policies in simulation~\citep{tobinDomainRandomizationTransferring2017} addresses both issues, eliminating physical risk and dramatically increasing throughput.
Yet, simulators require significant modeling effort, and rely on assumptions (simplified physical modeling, instantaneous actuation, static environmental conditions, etc.) limiting the possibility of transferring the policies learned in simulation, due to the discrepancy between real and simulated environments (\emph{reality gap}, Figure~\ref{fig:synthetic-vs-real-duck}).
\emph{Domain randomization}~\citep{tobinDomainRandomizationTransferring2017} (DR) is a popular technique to overcome the reality gap, and consists of randomizing the parameters of the simulated environment during training, aiming to induce robustness to specific disturbances.
In this way, DR is typically employed to increase the diversity of scenarios over the course of training, improving the performance of sim-to-real transferred policies~\citep{akkayaSolvingRubiksCube2019,antonovaReinforcementLearningPivoting2017,jiDribbleBotDynamicLegged2023}.
In practice, DR is performed by training in simulation on simulated dynamics \( \mathcal D \), further parametrized as \( \mathcal D \equiv \mathcal D_\xi \), with a \emph{dynamics} (random) vector \( \xi \) drawn from an arbitrary distribution, \( \xi \sim \Xi \).
For instance, one could decide to randomize the friction coefficient of the surface in a locomotion task (Figure~\ref{fig:ducks-on-terrains}), or the center of mass of an object for a manipulation task.
Over the course of training---typically at each episode's reset---a new \( \xi \) is drawn, and used to specify the environment's dynamics for that episode.
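As a concrete sketch, the per-episode sampling described above can be written as follows (the parameter names and ranges below are hypothetical, chosen only for illustration):

```python
import random

# Hypothetical randomization ranges defining supp(Xi): each entry maps a
# dynamics parameter to the (low, high) bounds of a uniform distribution.
RANDOMIZATION_RANGES = {
    "friction": (0.5, 1.5),
    "com_offset_cm": (-1.0, 1.0),
}

def sample_dynamics(rng: random.Random) -> dict:
    """Draw a dynamics vector xi ~ Xi."""
    return {name: rng.uniform(lo, hi) for name, (lo, hi) in RANDOMIZATION_RANGES.items()}

def run_episode(rng: random.Random) -> dict:
    # At each episode reset, a freshly drawn xi specifies the episode's dynamics;
    # a real pipeline would reconfigure the simulator with xi here.
    xi = sample_dynamics(rng)
    return xi

rng = random.Random(0)
episodes = [run_episode(rng) for _ in range(3)]
```

Each episode thus experiences a different dynamics realization, which is what induces robustness at transfer time.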
\begin{figure}
\centering
\label{fig:ducks-on-terrains}
\end{figure}

While effective in transferring policies across the reality gap in real-world robotics~\citep{tobinDomainRandomizationTransferring2017,akkayaSolvingRubiksCube2019,jiDribbleBotDynamicLegged2023,tiboniDomainRandomizationEntropy2024}, DR often requires extensive manual engineering.
First, identifying which parameters to randomize---i.e., the \emph{support} \( \text{supp} (\Xi) \) of \( \Xi \)---is an inherently task-specific process.
When locomoting over different terrains, randomizing the friction coefficient is a reasonable choice, yet not a complete solution, as other factors (lighting conditions, external temperature, joint fatigue, etc.) may prove just as important in practice, making the selection of these parameters yet another source of brittleness.

Selecting the dynamics distribution \( \Xi \) is also non-trivial.
On the one hand, distributions with low entropy risk causing failure at transfer time, due to the limited robustness induced over the course of training.
On the other hand, excessive randomization may cause over-regularization and hinder performance~\citep{margolisRapidLocomotionReinforcement2022}.
Consequently, the research community has investigated approaches to automatically select the randomization distribution \( \Xi \), using signals from the training process or tuning it to reproduce observed real-world trajectories.
\citet{akkayaSolvingRubiksCube2019} use a parametric uniform distribution \( \mathcal U(a, b) \) as \( \Xi \), widening the bounds \( a, b \) as training progresses and the agent's performance improves (AutoDR).
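A minimal sketch of AutoDR's bound-widening rule follows; the thresholds and step size are illustrative only (the original method evaluates performance on episodes sampled at the boundary of the current distribution):

```python
DELTA = 0.05              # fixed, pre-specified widening step
T_HIGH, T_LOW = 0.8, 0.2  # illustrative performance thresholds

def update_bounds(a: float, b: float, boundary_success_rate: float) -> tuple:
    """Widen U(a, b) when the agent performs well at the boundary, narrow it
    when it performs poorly, and leave it unchanged otherwise."""
    if boundary_success_rate >= T_HIGH:
        return a - DELTA, b + DELTA  # the curriculum gets harder
    if boundary_success_rate <= T_LOW:
        return a + DELTA, b - DELTA  # back off
    return a, b                      # bounds left unchanged

a, b = 0.9, 1.1
a, b = update_bounds(a, b, boundary_success_rate=0.95)  # bounds widen
```

The fixed \( \Delta \) appearing here is precisely the hyperparameter DORAEMON replaces with learned updates.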
While effective, AutoDR requires significant tuning---the bounds are widened by a fixed, pre-specified amount \( \Delta \)---and may disregard data when performance \emph{does not} improve after a distribution update~\citep{tiboniDomainRandomizationEntropy2024}. \citet{tiboniDomainRandomizationEntropy2024} propose a method similar to AutoDR (DORAEMON) to evolve \( \Xi \) based on the training signal, but with the key difference of explicitly maximizing the entropy of a parametric Beta distribution---inherently more flexible than uniform distributions---with learned updates instead of a fixed \( \Delta \).
In this, DORAEMON proves particularly effective at dynamically increasing the entropy levels of the training distribution by employing an outer-loop max-entropy objective, tackled under performance constraints in the inner-loop RL problem.
Other approaches to automatically perform DR consist of specifically tuning \( \Xi \) to align the simulation and real-world domains as closely as possible.
For instance,~\citet{chebotarClosingSimtorealLoop2019} interleave in-simulation policy training with repeated real-world policy rollouts used to adjust \( \Xi \) based on real-world data, while~\citet{tiboniDROPOSimtoRealTransfer2023} leverage a single, pre-collected set of real-world trajectories and tune \( \Xi \) under a simple likelihood objective.
While DR has shown promise, it does not address the main limitation that, even assuming an ideal distribution \( \Xi \) were available, many robotics problems \highlight{cannot be simulated with high-enough fidelity under practical computational constraints}.
Simulating contact-rich manipulation of possibly deformable or soft materials---e.g., \emph{folding a piece of clothing}---can prove time-intensive, limiting the benefits of in-simulation training.

A perhaps more fundamental limitation of RL for robotics is the general unavailability of \emph{dense} reward functions for complicated tasks, the design of which is essentially based on human expertise, ingenuity and trial-and-error.
In practice, \emph{sparse} reward functions can be used to conclude whether one specific goal has been attained---\emph{has this t-shirt been correctly folded?}---but unfortunately result in more challenging learning.
As a result, despite notable successes, deploying RL directly on real-world robots at scale remains challenging.

To make the most of (1) the growing number of openly available datasets and (2) relatively inexpensive robots like the SO-100, RL could (1) be anchored in already-collected trajectories---limiting erratic and dangerous exploration---and (2) train in the real world directly---bypassing the aforementioned issues with low-fidelity simulations.
In such a context, sample-efficient learning is also paramount, as training in the real world is inherently time-bottlenecked.

Off-policy algorithms like Soft Actor-Critic (SAC)~\citep{haarnojaSoftActorCriticOffPolicy2018} tend to be more sample-efficient than their on-policy counterparts~\citep{schulmanProximalPolicyOptimization2017}, due to the presence of a \emph{replay buffer} used over the course of training.
Other than allowing the re-use of past transitions \( \sars \), the replay buffer can also accommodate the injection of previously-collected data into the training process~\citep{ballEfficientOnlineReinforcement2023}.
Using expert demonstrations to guide learning together with learned rewards, RL can be effectively carried out in the real world~\citep{luoSERLSoftwareSuite2025}.
Interestingly, when complemented with in-training human interventions, real-world RL agents have been shown to learn policies with near-perfect success rates on challenging manipulation tasks in 1-2 hours~\citep{luoPreciseDexterousRobotic2024}.

% DQN to DDPG to SAC
\paragraph{Sample-efficient RL}
In an MDP, the optimal policy \( \pi^* \) can be derived from its associated \qfunction, \( Q^* \equiv Q_{\pi^*} \); in particular, the optimal action(s) \(\mu(\state)\) can be selected by maximizing the optimal \qfunction \ over the action space,
\[
\mu(\state) = \arg\max_{\action \in \mathcal A} Q^*(\state, \action).
\]
Interestingly, the \qopt-function satisfies a recursive relationship (\emph{Bellman equation}) based on a very natural intuition%
\footnote{Quote from~\citet{mnihPlayingAtariDeep2013}. The notation used has been slightly adapted for consistency with the rest of this tutorial.}:
\[
Q_{i+1}(s_t, a_t) \leftarrow \mathbb E_{s_{t+1} \sim \mathbb P(\bullet \vert s_t, a_t)} \left[ r_t + \gamma \max_{a_{t+1} \in \mathcal A} Q_i (s_{t+1}, a_{t+1}) \big\vert s_t, a_t \right], \quad i=0,1,2,\dots,K
\]
Then, one can derive the (ideally, near-optimal) policy by explicitly maximizing the final (ideally, near-optimal) estimate \( Q_K \approx Q^* \) over the action space at each timestep.
Indeed, one can show that under certain assumptions on the MDP considered, \( Q_K \to Q^* \, \text{as } K \to \infty \).
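On a toy tabular MDP, this convergence is easy to verify numerically; the two-state, two-action MDP below is purely illustrative:

```python
import numpy as np

# A toy deterministic MDP (2 states, 2 actions) to illustrate that the
# value-iteration updates Q_{i+1} <- r + gamma * max_a' Q_i converge to Q*.
gamma = 0.9
R = np.array([[0.0, 1.0],   # R[s, a]: reward for taking a in s
              [1.0, 0.0]])
NEXT = np.array([[0, 1],    # deterministic successor state s' = NEXT[s, a]
                 [1, 0]])

Q = np.zeros((2, 2))
for _ in range(200):  # K iterations of the Bellman optimality backup
    Q = R + gamma * Q[NEXT].max(axis=-1)

# Greedy policy: argmax over actions of the (near-)optimal Q
policy = Q.argmax(axis=1)
```

Here the rewarding self-loop in state 1 yields \( V^*(1) = 1/(1-\gamma) = 10 \), and after 200 backups the iterates are within \( \gamma^{200} \) of the fixed point.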
Effective in its early applications to small-scale discrete problems, vanilla Q-learning proved difficult to scale to large \( \statespace \times \actionspace \) problems, in which storing \( Q : \statespace \times \actionspace \mapsto \mathbb R \) alone might prove prohibitive.
Also, vanilla Q-learning is not directly usable for \emph{continuous}, unstructured state-action space MDPs, such as those considered in robotics.
In their seminal work on \emph{Deep Q-Learning} (DQN),~\citet{mnihPlayingAtariDeep2013} propose learning Q-values using deep convolutional neural networks, thereby accommodating large and even unstructured \emph{state} spaces.
DQN parametrizes the Q-function using a neural network with parameters \( \theta \), updating the parameters by sequentially minimizing the expected squared temporal-difference error (TD-error, \( \delta_i \)):
\begin{align}
L_i(\theta_i) &= \mathbb E_{(s_t, a_t) \sim \chi} \big[ \left( y_i - Q_{\theta_i} (s_t, a_t) \right)^2 \big], \label{eq:dqn-loss} \\
y_i &= \mathbb E_{s_{t+1} \sim \mathbb P(\bullet \vert s_t, a_t)} \big[ r_t + \gamma \max_{a_{t+1} \in \mathcal A} Q_{\theta_{i-1}} (\stateplusone, a_{t+1}) \big], \label{eq:TD-target}
\end{align}
where \( \chi \) represents a behavior distribution over state-action pairs.
Crucially, \( \chi \) can in principle be different from the policy being followed, effectively allowing the reuse of prior data stored in a \emph{replay buffer} \( D \) in the form of \( \sars \) transitions, used to form the TD-target \( y_i \), the TD-error \( \delta_i \) and the loss function in eq.~\ref{eq:dqn-loss} via Monte-Carlo (MC) estimates.
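The MC estimate of TD-target and loss over a replay minibatch can be sketched as follows (the fixed table below is a stand-in for the frozen network \( Q_{\theta_{i-1}} \); all values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
gamma = 0.99

# Stand-in Q-network: a fixed table over 4 states x 3 actions
# (a real implementation would evaluate the frozen parameters theta_{i-1}).
Q_prev = rng.normal(size=(4, 3))

# A minibatch of (s, a, r, s') transitions sampled from the replay buffer D
s  = np.array([0, 1, 2])
a  = np.array([1, 0, 2])
r  = np.array([0.0, 1.0, 0.5])
s2 = np.array([1, 2, 3])

# MC estimate of the TD-target: y = r + gamma * max_a' Q(s', a')
y = r + gamma * Q_prev[s2].max(axis=1)

# Squared TD-error, averaged over the minibatch, approximates the DQN loss
td_error = y - Q_prev[s, a]
loss = float(np.mean(td_error ** 2))
```

Only the TD-target/loss estimate is shown; the gradient step on \( \theta \) would follow with any autodiff framework.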
While effective in handling large, unstructured state spaces for discrete action-space problems, DQN's application to continuous control problems proved challenging.
Indeed, in the case of high-capacity function approximators such as neural networks, solving \( \max_{a_t \in \mathcal A} Q_\theta(s_t, a_t) \) at each timestep is simply infeasible, due to (1) the continuous nature of the action space (\( \actionspace \subset \mathbb R^n \) for some \( n \)) and (2) the impossibility of expressing the policy with a cheap (ideally, even closed-form) formulation, so that \( \max Q_\theta \) could be solved analytically.
\citet{pmlr-v32-silver14} tackle these fundamental challenges by using a \emph{deterministic} function of the state \( s_t \) as policy, \( \mu_\phi(s_t) = a_t \), parametrized by \( \phi \). Thus, policies can be iteratively refined by updating \( \phi \) along the direction:
\begin{equation}\label{eq:deterministic-pg}
d_\phi = \mathbb E_{s_t \sim \mathbb P (\bullet)} \left[ \nabla_\phi Q(s_t, a_t)\vert_{a_t = \mu_\phi(s_t)} \right] = \mathbb E_{s_t \sim \mathbb P(\bullet)} \left[ \nabla_{a_t} Q(s_t, a_t) \vert_{a_t = \mu_\phi(s_t)} \cdot \nabla_\phi \mu_\phi(s_t) \right]
\end{equation}
Provably, eq.~\ref{eq:deterministic-pg} is the \emph{deterministic policy gradient} (DPG) of the policy \(\mu_\phi \)~\citep{pmlr-v32-silver14}, so that updates \( \phi_{k+1}\leftarrow \phi_k + \alpha d_\phi \) are guaranteed to increase the (deterministic) cumulative discounted reward, \( J(\mu_\phi) \).
\citet{lillicrapContinuousControlDeep2019a} extended DPG to the case of (1) high-dimensional unstructured observations and (2) continuous action spaces, introducing Deep Deterministic Policy Gradient (DDPG), an important algorithm in RL and its applications to robotics.
DDPG adopts a modified TD-target compared to eq.~\ref{eq:TD-target}, maintaining a policy network used to select actions, yielding
\begin{equation}\label{eq:TD-target-ddpg}
y_i = \mathbb E_{s_{t+1} \sim \mathbb P(\bullet \vert s_t, a_t)} \big[ r_t + \gamma Q_{\theta_{i-1}} (\stateplusone, \mu_\phi(\stateplusone)) \big] .
\end{equation}
Similarly to DQN, DDPG also employs the same replay buffer mechanism, reusing past transitions over training for increased sample efficiency and estimating the loss function via MC estimates.
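Compared to DQN's target, the only change is in the bootstrapped term, where the actor replaces the maximization over actions. A sketch with hypothetical stand-in networks:

```python
import numpy as np

rng = np.random.default_rng(1)
gamma = 0.99

def q_net(s, a):
    # Stand-in critic Q_theta(s, a): a simple closed form, for illustration only
    return -((a - 0.5 * s) ** 2).sum(axis=-1)

def policy_net(s):
    # Stand-in deterministic actor mu_phi(s)
    return 0.4 * s

# Minibatch of next states and rewards from the replay buffer
s2 = rng.normal(size=(5, 2))  # s_{t+1}
r = rng.normal(size=5)        # r_t

# DDPG TD-target: the actor's action mu_phi(s') replaces max over actions
y = r + gamma * q_net(s2, policy_net(s2))
```

This avoids the intractable per-step maximization over a continuous action space.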

Soft Actor-Critic (SAC)~\citep{haarnojaSoftActorCriticOffPolicy2018} is a derivation of DDPG in the max-entropy (MaxEnt) RL framework, in which RL agents are tasked with \highlight{maximizing the discounted cumulative reward, while acting as randomly as possible}.
MaxEnt RL~\citep{haarnojaReinforcementLearningDeep2017b} has proven particularly robust thanks to the development of diverse behaviors, incentivized by its entropy-regularized formulation.
In particular, MaxEnt RL revisits the RL objective \( J (\pi) \) to explicitly account for the policy entropy \( \mathcal H(\pi (\bullet \vert s_t)) \),
\begin{align}
J(\pi) &= \sum_{t=0}^T \mathbb{E}_{(s_t, a_t) \sim \chi} \left[ r_t + \alpha \mathcal H(\pi (\bullet \vert s_t)) \right].
\label{eq:J-soft}
\end{align}
This modified objective results in the \emph{soft} TD-target:
\begin{equation}\label{eq:soft-td-target}
y_i = \mathbb E_{s_{t+1} \sim \mathbb P( \bullet \vert s_t, a_t)} \left[ r_t + \gamma \left( Q_{\theta_{i-1}} (\stateplusone, a_{t+1}) - \alpha \log \pi_\phi(a_{t+1} \vert \stateplusone) \right) \right], \quad a_{t+1} \sim \pi_\phi(\bullet \vert \stateplusone)
\end{equation}
Similarly to DDPG, SAC also maintains an explicit policy, trained under the same MaxEnt framework to maximize eq.~\ref{eq:J-soft}, and updated using:
\begin{equation}\label{eq:sac-policy-update}
\pi_{k+1} \leftarrow \arg\min_{\pi^\prime \in \Pi} \DKL \left(\pi^\prime (\bullet \vert \state) \bigg\Vert \frac{\exp(Q_{\pi_k}(s_t, \bullet))}{Z_{\pi_k}(s_t)} \right)
\end{equation}
The update rule provided in eq.~\ref{eq:sac-policy-update} optimizes the policy while projecting it on a set \( \Pi \) of tractable distributions (e.g., Gaussians,~\citet{haarnojaReinforcementLearningDeep2017b}).
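The soft TD-target can be sketched as follows, with hypothetical stand-in networks and a Gaussian policy (illustrative only; a real implementation would use learned critics and a squashed Gaussian actor):

```python
import numpy as np

rng = np.random.default_rng(2)
gamma, alpha = 0.99, 0.2  # alpha is the entropy temperature

def q_net(s, a):
    return -(s * a).sum(axis=-1)  # stand-in critic

def sample_action_and_logp(s):
    # Stand-in Gaussian policy pi_phi(.|s): mean 0.1*s, unit variance
    mean = 0.1 * s
    a = rng.normal(loc=mean)
    logp = (-0.5 * ((a - mean) ** 2) - 0.5 * np.log(2 * np.pi)).sum(axis=-1)
    return a, logp

s2 = rng.normal(size=(4, 3))  # next states s_{t+1}
r = rng.normal(size=4)        # rewards r_t

# Soft TD-target: the entropy bonus subtracts alpha * log pi(a'|s')
# from the bootstrapped value, with a' ~ pi(.|s')
a2, logp2 = sample_action_and_logp(s2)
y = r + gamma * (q_net(s2, a2) - alpha * logp2)
```

The \( -\alpha \log \pi \) term is what distinguishes this target from the DDPG one.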

% SAC + prior data: RLPD
\paragraph{Sample-efficient, data-driven RL}
Sampling \( \sars \) transitions from the replay buffer \( D \) conveniently allows approximating the expectations in the TD-target and TD-error through Monte-Carlo (MC) estimates.
The replay buffer \( D \) also proves extremely useful in maintaining a history of previous transitions and using it for training, improving sample efficiency.
Furthermore, it also naturally provides an entry point to inject offline trajectories recorded by a human demonstrator into the training process.

Reinforcement Learning with Prior Data (RLPD)~\citep{ballEfficientOnlineReinforcement2023} is an Offline-to-Online RL algorithm leveraging prior data to effectively accelerate the training of a SAC agent.
Unlike previous works on Offline-to-Online RL, RLPD avoids any pre-training and instead only uses the available offline data \( D_\text{offline} \) to improve online learning from scratch.
During each training step, transitions from both the offline and online replay buffers are sampled in equal proportions, and used in the underlying SAC routine.
Together with other implementation details (using LayerNorm layers to prevent value overestimation, and ensemble techniques to form the TD-target), RLPD proves a particularly simple yet effective approach to using \( D_\text{offline} \) for Offline-to-Online RL.
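The symmetric sampling scheme can be sketched in a few lines (buffer contents are placeholders):

```python
import random

rng = random.Random(0)

# Hypothetical buffers: offline demonstrations, and transitions collected online
offline_buffer = [("demo", i) for i in range(100)]
online_buffer = [("online", i) for i in range(100)]

def sample_symmetric(batch_size: int) -> list:
    """RLPD-style 50/50 sampling: half the batch from each buffer."""
    half = batch_size // 2
    return rng.sample(offline_buffer, half) + rng.sample(online_buffer, half)

batch = sample_symmetric(64)
```

Keeping the offline fraction fixed at one half means prior data keeps shaping updates even as the online buffer grows much larger.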

% RLPD + reward classifier: SERL
\paragraph{Sample-efficient, data-driven, real-world RL}
Despite the possibility of leveraging offline data for learning, the effectiveness of real-world RL training is still limited by the need for a task-specific reward function, which is typically hard to define.
Further, even assuming access to a well-defined reward function, typical robotics pipelines rely on augmenting proprioceptive inputs with camera streams, and thus even well-defined rewards would need to be computed starting from unstructured observations---a challenging requirement in practice.
In their technical report,~\citet{luoSERLSoftwareSuite2025} empirically address the needs (1) to define a reward function and (2) to use it starting from unstructured image observations.
In particular,~\citet[SERL]{luoSERLSoftwareSuite2025} introduce a suite of tools streamlining the training of \emph{reward classifiers} \( c \), as well as jointly learning forward-backward controllers to speed up real-world RL.
Reward classifiers are particularly useful for complex, dynamic tasks---e.g., folding a t-shirt---for which a precise reward formulation is arbitrarily complex to obtain, or would require significant shaping, and which are more easily learned directly from demonstrations of success (\(e^+\)) and failure (\(e^-\)) states rather than from a precise formulation of \( r_t \), with a natural target for the reward classifier being \( r(s) = \log c(e^+ \vert s) \).
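As a sketch of the idea, a logistic classifier can be fit on (hypothetical) embeddings of success/failure states, and its log-probability of success used as the reward (the synthetic features below are illustrative only):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for image embeddings of success (e+) and failure (e-) states
succ = rng.normal(loc=+1.0, size=(64, 8))
fail = rng.normal(loc=-1.0, size=(64, 8))
X = np.vstack([succ, fail])
t = np.concatenate([np.ones(64), np.zeros(64)])  # 1 = success

# Train a logistic-regression reward classifier c(e+ | s) by gradient descent
w, b = np.zeros(8), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # sigmoid
    g = p - t                               # gradient of the log-loss
    w -= 0.1 * (X.T @ g) / len(t)
    b -= 0.1 * g.mean()

def reward(s: np.ndarray) -> float:
    """Classifier-derived reward r(s) = log c(e+ | s)."""
    p = 1.0 / (1.0 + np.exp(-(s @ w + b)))
    return float(np.log(p))
```

A state resembling the success examples then receives a reward close to \( \log 1 = 0 \), while failure-like states receive large negative rewards.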
Furthermore,~\citet{luoSERLSoftwareSuite2025} demonstrate the benefits of learning separate (1) \emph{forward} and (2) \emph{backward} controllers---parametrized by separate policies---where (1) the former learns to execute a task to completion and (2) the latter learns to reset the environment to its initial state from terminal states, thereby aiding training in real-world episodic settings.

Lastly, in order to improve the robustness of their approach to different goals while maintaining practical scalability,~\citet{luoSERLSoftwareSuite2025} introduced a modified state and action space, expressing proprioceptive configurations \( q \) and actions \( \dot q \) in the frame of the end-effector pose at \( t=0 \).
By randomizing the initial pose of the end-effector (\( s_0 \)),~\citet{luoSERLSoftwareSuite2025} achieved a similar result to manually randomizing the environment at every timestep, with the benefit of maintaining the environment in the same condition across multiple training episodes, making their method markedly more practical and scalable.
\begin{figure}
\centering
\includegraphics[width=0.8\linewidth]{figures/ch3/ch3-hil-serl-examples.png}
\caption{(A) HIL-SERL allows for real-world training of high-performance RL agents by building on top of the advancements introduced by SAC, RLPD and SERL. (B) Example of human intervention during a HIL-SERL training process on a real-world SO-100.}
\label{fig:hil-serl-blocks}
\end{figure}

% SERL + Human in the loop: HIL-SERL
Building on off-policy deep Q-learning with replay buffers, entropy regularization for better exploration, expert demonstrations to guide learning, and a series of tools and recommendations for real-world training using reward classifiers (Figure~\ref{fig:hil-serl-blocks}),~\citet{luoPreciseDexterousRobotic2024} introduce human interactions during training, learning near-optimal policies in challenging real-world manipulation tasks in 1-2 hours.
Human-in-the-Loop, Sample Efficient Robot reinforcement Learning (HIL-SERL)~\citep{luoPreciseDexterousRobotic2024} augments offline-to-online RL with targeted human corrections during training, and employs prior data to (1) train a reward classifier and (2) bootstrap RL training on expert trajectories.
While offline demonstrations provide the initial dataset seeding learning and constraining early exploration, interactive, online corrections allow a human supervisor to intervene on failure modes and supply targeted interventions, greatly aiding the learning process~\citep{luoPreciseDexterousRobotic2024}.
Crucially, human intervention data is stored in \emph{both} the offline and online replay buffers, unlike the autonomous transitions generated at training time, which are stored in the online buffer only.
In turn, given an intervention timestep \( k \in (0, T) \), length-\(K\) human intervention data \( \{ s^{\text{human}}_k, a^{\text{human}}_k, r^{\text{human}}_k, s^{\text{human}}_{k+1} \}_{k=1}^K \) is more likely to be sampled than the data generated online during training, providing stronger supervision to the agent while still allowing for autonomous learning.
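The storage rule, and the resulting over-representation of interventions under symmetric sampling, can be sketched as follows (counts are hypothetical):

```python
offline_buffer, online_buffer = [], []

def store_transition(transition: tuple, is_intervention: bool) -> None:
    """Storage rule sketched from HIL-SERL: human interventions go to BOTH
    buffers, while autonomous transitions go to the online buffer only."""
    online_buffer.append(transition)
    if is_intervention:
        offline_buffer.append(transition)

# A hypothetical stretch of training: 90 autonomous steps, 10 intervention steps
for i in range(90):
    store_transition(("auto", i), is_intervention=False)
for k in range(10):
    store_transition(("human", k), is_intervention=True)

# With 50/50 sampling across buffers, interventions are over-represented:
# here they make up all of the offline half but only 10% of the online half.
offline_frac = sum(t[0] == "human" for t in offline_buffer) / len(offline_buffer)
```

(In practice the offline buffer would also contain the initial demonstrations; this sketch isolates the intervention-routing mechanism only.)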
Empirically, HIL-SERL attains near-perfect success rates (99\%+) on diverse manipulation tasks within 1-2 hours of training~\citep{luoPreciseDexterousRobotic2024}, underscoring how combining offline datasets with online RL can markedly improve stability and data efficiency, and ultimately even allow real-world RL training.

\subsubsection{Code Example: Real-world RL}
\begin{figure}
\centering
\includegraphics[width=0.9\textwidth]{figures/ch3/ch3-hil-serl-architecture.png}
\caption{HIL-SERL is a SOTA RL algorithm for training control policies directly in the real world. Its implementation in \lerobot~relies on a decoupled actor-learner architecture, communicating across processes (and possibly networks) with queues used to share (1) transitions \( \sars \) and (2) parameters \( \theta \).}
\label{fig:ch3-hil-serl-architecture}
\end{figure}

This example shows how to use the HIL-SERL implementation supported by \lerobot.
This code example is organized into four parts: we first show how to train a reward classifier from a custom set of demonstrations, then define the \texttt{Actor} and \texttt{Learner} components, and finally bring them together in a complete script showing how to use HIL-SERL in practice.

At a high level, the HIL-SERL architecture (Figure~\ref{fig:ch3-hil-serl-architecture}) relies on two main components:
\begin{itemize}
\item An \texttt{Actor}, running a frozen policy network used to interact with the environment and obtain observations. Observations are used both to condition the frozen actor in selecting the action to enact, and to form \( \sars \) transitions that are shared with the \texttt{Learner}. Rewards are inferred using a custom, learned reward classifier trained on a dataset of offline demonstrations.
%
\item A \texttt{Learner}, used to optimize the policy's parameters \( \theta \) for maximum expected return. The learner samples batches of transitions from the online and offline buffers in equal proportion~\citep{ballEfficientOnlineReinforcement2023}, and shares updated parameters with the \texttt{Actor}.
\end{itemize}
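The two components above can be sketched, single-machine and in-process, with two threads exchanging data over queues (a deliberately minimal sketch; the \lerobot~implementation communicates across processes, and all names here are illustrative):

```python
import queue
import threading

# Transitions flow Actor -> Learner; parameters flow Learner -> Actor
transitions: queue.Queue = queue.Queue()
parameters: queue.Queue = queue.Queue()

def actor(n_steps: int) -> None:
    theta = 0
    for _ in range(n_steps):
        while not parameters.empty():
            theta = parameters.get()               # refresh frozen policy weights
        transitions.put(("s", "a", "r", "s_next")) # act, then ship the transition

def learner(n_updates: int) -> None:
    theta = 0
    for _ in range(n_updates):
        transitions.get()                          # consume a transition
        theta += 1                                 # stand-in for a gradient step
        parameters.put(theta)                      # share updated parameters

actor_thread = threading.Thread(target=actor, args=(20,))
learner_thread = threading.Thread(target=learner, args=(20,))
actor_thread.start(); learner_thread.start()
actor_thread.join(); learner_thread.join()
```

Decoupling acting from learning this way keeps environment interaction running at control frequency while gradient updates proceed at their own pace.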

The HIL-SERL architecture presented in this example can be run entirely locally, but the implementation in \lerobot~also allows the \texttt{Actor} and \texttt{Learner} to run on two separate machines connected over a network.

% \paragraph{Learning a Reward Classifier}
\begin{pbox}[label={ex:train_reward_classifier}]{Training a Reward Classifier \\ \url{https://github.com/fracapuano/robot-learning-tutorial/blob/main/snippets/ch3/01_reward_classifier.py}}
\lstinputlisting[language=python]{snippets/ch3/01_reward_classifier.py}
\end{pbox}

% \paragraph{Defining the \texttt{Actor}}
\begin{pbox}[label={ex:hil_serl_defining_actor}]{Defining the \texttt{Actor} \\ \url{https://github.com/fracapuano/robot-learning-tutorial/blob/main/snippets/ch3/02_actor.py}}
\lstinputlisting[language=python]{snippets/ch3/02_actor.py}
\end{pbox}

% \paragraph{Defining the \texttt{Learner}}
\begin{pbox}[label={ex:hil_serl_defining_learner}]{Defining the \texttt{Learner} \\ \url{https://github.com/fracapuano/robot-learning-tutorial/blob/main/snippets/ch3/03_learner.py}}
\lstinputlisting[language=python]{snippets/ch3/03_learner.py}
\end{pbox}

% \paragraph{Using HIL-SERL}
\begin{pbox}[label={ex:hil_serl_full}]{Using HIL-SERL \\ \url{https://github.com/fracapuano/robot-learning-tutorial/blob/main/snippets/ch3/04_hil_serl.py}}
\lstinputlisting[language=python]{snippets/ch3/04_hil_serl.py}
\end{pbox}

\subsubsection{Limitations of RL in Real-World Robotics: Simulators and Reward Design}

Despite these advancements, training RL agents for real-world tasks still suffers from the following limitations:
\begin{itemize}
\item In those instances where real-world training experience is prohibitively expensive to gather (e.g., Tokamak control~\citep{degraveMagneticControlTokamak2022}, autonomous stratospheric navigation~\citep{bellemareAutonomousNavigationStratospheric2020}), in-simulation training is often the only viable option.
However, high-fidelity simulators for real-world problems can be difficult to build and maintain, especially for contact-rich manipulation and tasks involving deformable or soft materials.
%
\item Reward design is a fundamental source of brittleness in real-world RL pipelines. While shaping dense rewards is often necessary to guide exploration in long-horizon tasks, the process is error-prone and heavily reliant on human expertise and intuition. Poorly tuned terms can lead to specification gaming or convergence to local optima, making reward shaping a critical challenge for applying RL in practice. Sparse rewards that only signal successful trajectories can avoid these pitfalls but typically result in much slower learning due to reduced supervision.
\end{itemize}

Advances in learning to act from potentially large corpora of human demonstrations via Behavioral Cloning (BC) address both of these concerns.
Although suffering from an inherent suboptimality---imitation learning can at most match the performance level of the demonstrator---learning to reproduce expert demonstrations via BC has proven increasingly competitive and practical, bypassing the need for simulated environments and hard-to-define reward functions.
|
app/scripts/latex-to-mdx/input/sections/04_imitation_learning.tex
CHANGED
|
The diff for this file is too large to render.
See raw diff
|
|
|
app/scripts/latex-to-mdx/input/sections/05_foundation_models.tex
CHANGED
|
@@ -4,14 +4,14 @@
|
|
| 4 |
\epigraph{\textit{Specialization is for insects}}{Robert A. Heinlein}
|
| 5 |
|
| 6 |
\begin{tldr}
|
| 7 |
-
Openly available large
|
| 8 |
\end{tldr}
|
| 9 |
|
| 10 |
-
The advent of large models trained on internet-scale datasets has drastically influenced fields like Computer Vision (CV) and Natural Language Processing (NLP), shifting the paradigm towards combining (1) an initial, task-agnostic large-scale pre-training stage and a (2) task-specific, adjustment phase.
|
| 11 |
-
|
| 12 |
-
Factors including (1) the advancements in generalist models learned with self-supervision for perception~\citep{oquabDINOv2LearningRobust2024} or semantic understanding~\citep{devlinBERTPretrainingDeep2019} and (2) the popularization collective efforts to aggregate large-scale openly available datasets~\citep{
|
| 13 |
This shift taps into the long-standing challenge of developing generalist robot policies, and holds the premise to surpass traditionally siloed approaches to robotics problems and develop a \emph{foundation robotics model}.
|
| 14 |
-
While Section~\ref{sec:learning-
|
| 15 |
|
| 16 |
\begin{figure}
|
| 17 |
\centering
|
|
@@ -21,112 +21,110 @@ While Section~\ref{sec:learning-bc-single} introduced methods for learning \emph
|
|
| 21 |
\end{figure}
|
| 22 |
|
| 23 |
\subsection{Preliminaries: Models and Data}
|
| 24 |
-
The remarkable success of foundation models in NLP and CV
|
| 25 |
-
|
| 26 |
-
|
| 27 |
-
|
| 28 |
-
|
| 29 |
-
In particular, since each expert trajectory is tied to a specific robot platform and the operating conditions of its environment and task, data heterogeneity has long posed a \emph{methodological} challenge for scaling robotics datasets via aggregation.
|
|
|
|
| 30 |
Beyond this, heterogeneity also raises \emph{conceptual} issues: naively mixing data across embodiments can induce negative transfer, as control strategies developed in isolation for different robot systems in different environments may even conflict when combined.
|
| 31 |
-
Thus, the high degree of fragmentation of robotics datasets and tasks has traditionally led to the development of \emph{specialist} policies, trained on small, task-specific datasets,
|
| 32 |
|
| 33 |
\begin{figure}
|
| 34 |
\centering
|
| 35 |
\includegraphics[width=0.8\textwidth]{figures/ch5/ch5-generalist-policies-timeline.png}
|
| 36 |
-
\caption{Early efforts in the development of generalist models for robotics include BC-Zero~\citep{jangBCZZeroShotTask2022}, RT-1~\citep{brohanRT1RoboticsTransformer2023}, and RT-2~\citep{brohanRT2VisionLanguageActionModels2023}: large scale models trained on thousands of demonstrations. The open release of the Open-X~\citep{
|
| 37 |
\label{fig:ch5-generalist-policies-timeline}
|
| 38 |
\end{figure}
|
| 39 |
|
| 40 |
-
|
| 41 |
Figure~\ref{fig:ch5-generalist-policies-timeline} shows a timeline of some of the most popular contributions attempting at developing generalist policies.
|
| 42 |
-
Starting from BC-Zero, a latent variable model trained on
|
| 43 |
-
|
| 44 |
-
In particular, RT-1 uses a transformer architecture, and is trained on as many as 130k human-recorded trajectories collected over 13 robots
|
| 45 |
-
RT-1 learns to process a history of camera images and a natural language instruction, and feeds the resulting sequence of high-dimensional tokens to a transformer, trained using a \emph{classification loss on a discretized actions space} consisting of

In a follow-up work, the same group of authors propose a modified method to learn generalist models, leveraging (1) a more powerful architecture and (2) scaling up the dataset used~\citep[RT-2]{brohanRT2VisionLanguageActionModels2023}.
In RT-2,~\citet{brohanRT2VisionLanguageActionModels2023} propose inheriting internet-scale semantic knowledge from large-scale multi-modal datasets to learn a single, \emph{unified model} for robotics control.
Such a model, termed \emph{Vision-Language-Action} (VLA) in the original RT-2 paper, effectively casts robot control as a language-modeling problem, and in particular as a Visual Question-Answering (VQ\&A) task, in which the output token space used to represent \emph{textual tokens} is shared with the \emph{8-bit tokens} used to represent the 256 (\( 2^8 \)) actuation levels of a 6-DoF robot.
In their work,~\citet{brohanRT2VisionLanguageActionModels2023} propose co-fine-tuning large-scale VLMs such as PaLI-X~\citep{chenPaLIXScalingMultilingual2023} or PaLM-E~\citep{driessPaLMEEmbodiedMultimodal2023} on a mix of (1) web and (2) robotics data, complementing VQ\&A training with robotics-specific signal, and learning to directly output robot actions in a shared token space for visual and language inputs.
The authors claim that using large models trained on internet-scale data as backbones for VLAs allows models to tap into the rich semantic knowledge embedded in the VLM's parameters, interpreting instructions and unseen objects by connecting them to concepts acquired during pre-training.
For instance,~\citet{brohanRT2VisionLanguageActionModels2023} show that while RT-2 has never been explicitly trained to repurpose tools for a \emph{hammering} task, it can still leverage its semantic understanding of images, so that when asked which object between (1) a piece of paper, (2) a pair of headphones or (3) a rock may be used instead of a hammer, it correctly answers (3).

Traditionally, research efforts revolved around not only training models, but also proposing datasets for the community, a costly and time-consuming process.
Due to the aforementioned embodiment gap, the data used in robot learning research have traditionally proved rather fragmented, tailored to the specific task considered by the specific group of researchers who collected it, which ultimately hindered integration.
The Open X-Embodiment project~\citep{oneillOpenXEmbodimentRobotic2025} was a landmark collaborative effort to address data fragmentation, curating the aggregation of 60 \emph{existing} robotics datasets from 22 different robot embodiments and 21 institutions across the world, and resulting in a total of 1.4M cross-embodiment, cross-task, openly-available trajectories.
Besides the contribution of an aggregate, large scale dataset,~\citet{oneillOpenXEmbodimentRobotic2025} also demonstrated significant positive transfer \emph{across tasks and embodiments}, showing that \highlight{a single model trained on multi-embodiment data can outperform specialist models} trained on their respective single-embodiment datasets.
The Distributed Robot Interaction Dataset (DROID)~\citep{khazatskyDROIDLargeScaleInTheWild2025} represents another significant step towards addressing the problem of scarce and disaggregated data in robot learning, providing a unique dataset consisting of 75k+ human demonstrations collected in realistic (\emph{in-the-wild}) manipulation settings, another cornerstone for building general-purpose robot policies.
Recently, foundational datasets curated through large, centralized efforts are increasingly complemented by decentralized, community-driven contributions of robotics data.
Software libraries like \lerobot~have been instrumental in enabling decentralized collection of large amounts of data, providing the infrastructure for researchers and practitioners to easily contribute trajectories from a wide range of embodiments, democratizing data access via distributed collection.

Despite these advancements, the success of large, proprietary models like RT-1 and RT-2 highlighted a growing accessibility gap in robotics research, as training and deploying large-scale robotics foundation models requires computational resources simply unattainable for most research institutions.
The OpenVLA project~\citep{kimOpenVLAOpenSourceVisionLanguageAction2024} emerged in direct contrast to traditionally closed-source efforts to develop VLAs.
In particular,~\citet{kimOpenVLAOpenSourceVisionLanguageAction2024} trained OpenVLA by exclusively leveraging openly available data (970k+ trajectories from the Open-X dataset), and openly shared their training recipes alongside the model weights.
Architecturally, OpenVLA integrates a pre-trained vision encoder to project visual tokens into the embedding space of the Llama2-7B~\citep{touvronLlama2Open2023} language-model backbone.
The language model backbone is then used to predict \emph{discrete action tokens} over 256 activation levels.
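This kind of per-dimension uniform binning can be sketched as follows; the normalized action range and the number of dimensions are illustrative assumptions, not the exact values used by RT-1 or OpenVLA.

```python
import numpy as np

# Sketch of RT-1/OpenVLA-style action discretization: each continuous action
# dimension is uniformly binned into 256 levels, so an action becomes a short
# sequence of discrete tokens a language-model head can classify over.
N_BINS = 256
LOW, HIGH = -1.0, 1.0  # assumed normalized action range (illustrative)

def discretize(action: np.ndarray) -> np.ndarray:
    """Map continuous actions in [LOW, HIGH] to integer tokens in [0, 255]."""
    clipped = np.clip(action, LOW, HIGH)
    bins = (clipped - LOW) / (HIGH - LOW) * (N_BINS - 1)
    return np.round(bins).astype(np.int64)

def undiscretize(tokens: np.ndarray) -> np.ndarray:
    """Map tokens back to the center of their bin."""
    return tokens.astype(np.float64) / (N_BINS - 1) * (HIGH - LOW) + LOW

action = np.array([0.0, -1.0, 1.0, 0.25])  # e.g. four joint commands
tokens = discretize(action)                # integers in [0, 255]
recovered = undiscretize(tokens)           # quantization error <= half a bin
```

Round-tripping through the tokens loses at most half a bin width of precision, which is the price paid for casting continuous control as token classification.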

\begin{figure}
\centering
\includegraphics[width=0.9\textwidth]{figures/ch5/ch5-trends.png}
\caption{Robot learning is undergoing a paradigmatic shift: centralized data collections (A, left) are increasingly larger, often comprising millions of demonstrations, while (A, right) decentralized data collection efforts are becoming an alternative for large scale data collection. (B) Generalist models are also becoming increasingly smaller and easier to run on limited hardware.}
\label{fig:ch5-trends}
\end{figure}

Figure~\ref{fig:ch5-trends} shows the current trends in robot learning in terms of the size and nature of the robotics datasets contributed, together with the size and accessibility of the available models.
As increasingly large datasets collected via centralized, cross-institution cooperation are made available to the research community, decentralized datasets collected by individual researchers and practitioners have also gained traction, closing the gap with academic benchmarks thanks to community-contributed data.
Further, models used across tasks and embodiments are becoming much more compute-efficient, and as a result model size has been consistently decreasing over time, with consequent gains for autonomous robots in real-world, resource-constrained environments.

\subsection{VLAs}
Modern recipes to train large scale VLAs extend early efforts to learn foundation models from large amounts of data via BC, introducing significant advancements concerning both architectural and procedural aspects.
From an architectural perspective, modern VLAs such as \pizero~\citep{black$p_0$VisionLanguageActionFlow2024} leverage a \emph{unified transformer model} for efficiency of computation, while maintaining specialized sub-components within the model for visual perception and action prediction, enabling cross-task performance via language conditioning.
Crucially, modern VLAs including \pizero~\citep{black$p_0$VisionLanguageActionFlow2024} and SmolVLA~\citep{shukorSmolVLAVisionLanguageActionModel2025} adopt \emph{unified} transformer models employing disjoint sets of weights (\emph{experts}) for both compute-efficient visual-semantic understanding and control.
Procedurally, VLAs complement advanced Vision-Language Model (VLM) backbones with action-specific modules, (1) adopting mid-sized \emph{action experts} to model continuous action distributions \( p (a_{t:t+H_a} \vert o_t) \)---avoiding discrete action tokens entirely---and (2) relying on~\emph{action chunking}~\citep[Section~\ref{sec:learning-imitation}]{zhaoLearningFineGrainedBimanual2023} as a strategy to reduce error compounding when predicting multiple actions from inherently non-i.i.d. data, such as demonstration data.

These architectural and procedural innovations present three benefits over task-specific methods.
First, developing architectures that exploit internet-scale pre-trained backbones allows models to fully capitalize on the vast world knowledge and skills state-of-the-art VLMs exhibit, preventing models from needing to learn visual, linguistic and semantic concepts from scratch.
Second, using generative models for continuous action distributions allows learning rich, multimodal data distributions, a much more likely scenario in the big-data regime typically tackled while developing generalist policies.
Further, introducing separate components for perception and action planning enables using Mixture of Experts (MoE) architectures~\citep{fedusReviewSparseExpert2022}, which are often more efficient to run---a key feature for models deployed in real-world scenarios.
This new paradigm has been at the core of some of the most capable generalist policies developed to date, capable of few-shot adapting to novel tasks and performing highly dexterous manipulation tasks ranging from end-to-end laundry folding to bussing tables~\citep{black$p_0$VisionLanguageActionFlow2024}.

\subsubsection{VLMs for VLAs}
VLMs are designed to handle both visual and textual modalities, most commonly by taking both images and text as inputs and generating text conditioned on the visual context.
Recent advances in VLMs have been driven by the success of LLMs, with many approaches building upon pretrained LLMs and adopting similar training paradigms to the ones used in language modeling.
Typically, VLMs~\citep{alayracFlamingoVisualLanguage2022,laurenconWhatMattersWhen2024,linVILAPretrainingVisual2024} are constructed by integrating a pretrained vision encoder~\citep{radfordLearningTransferableVisual2021,zhaiSigmoidLossLanguage2023,finiMultimodalAutoregressivePretraining2024} with a pretrained LLM~\citep{grattafioriLlama3Herd2024,jiangMistral7B2023}.
Training then proceeds in multiple multimodal stages, beginning with a large-scale pretraining on datasets containing image-text pairs~\citep{LAION-COCO,kakaobrain2022coyo700m} and interleaved vision-language corpora~\citep{OBELICS,MMC4}, followed by a supervised fine-tuning stage on instruction-tuning datasets~\citep{LLaVA-1.5,tong2024cambrian,laurenconWhatMattersWhen2024}.
The inherent multimodal nature of VLMs enables them to jointly reason over vision and language.
Pre-training on vast internet-scale datasets allows these models to associate visual patterns with textual descriptions, thereby acquiring a rich semantic understanding of the world---knowledge about objects, their properties, and relationships---without explicit supervision for each concept.
In turn, integrating VLMs into robot policies provides a direct route to transfer this semantic understanding to control.
In principle, this allows the robot to ground high-level natural language instructions in its visual context, and possibly recognize objects and concepts never encountered in the robotics training data.

Recently, compute efficiency has also become a central focus in the development of VLMs.
Several works aim to reduce training costs by using smaller, more diverse datasets~\citep{LLaVA-1.5,InstructBLIP,bai2025qwen25vl,zhu2024minigpt,tong2024cambrian}, training smaller-scale models~\citep{marafiotiSmolVLMRedefiningSmall2025,moondream,minicmpv2024}, or by adapting pretrained unimodal models by tuning only a small subset of parameters~\citep{shukor2023epalm,vallaeys2024improveddepalm,MAPL,FROMAGe,tsimpoukelli2021multimodalfrozen,BLIP-2}.
While the majority of VLM research focuses on image and text modalities, recent work has demonstrated that similar techniques can be extended to integrate additional modalities, such as video and audio~\citep{wang2025internvideo2,liu2024kangaroo,zhang2025videollama,kong2024audioflam}---a particularly promising direction of research for robotics applications, where multiple sensor modalities can be integrated effectively.
This trend towards efficiency is paramount for robotics applications, where policies must operate under the stringent constraints of real-world deployment.
Indeed, robots often possess limited on-board computational resources and must react in real-time to dynamic environments.
Smaller and faster VLMs have thus become quintessential for developing responsive autonomous systems, enabling high-frequency control loops by reducing the latency between perception and action.

\subsection{\( \pi_0 \)}

\pizero~\citep{black$p_0$VisionLanguageActionFlow2024} is a VLA built on a MoE architecture consisting of (1) a pre-trained VLM backbone (Gemma 2.6B~\citep{teamGemma2Improving2024}) and (2) a dedicated action expert used to generate continuous actions via flow matching.
Images and language instructions are embedded following a late-fusion scheme, with visual features mapped into the same token space as language and processed jointly by the VLM backbone.
The two separate experts communicate via self-attention layers, but maintain disjoint weights to obtain the query, key and value matrices at each layer, maintaining specialization while efficiently allocating computation.
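The disjoint-weights design can be sketched as follows: each expert projects its own tokens with private query, key and value matrices, while attention itself runs over the concatenated sequence. The dimensions and the single-head, maskless setting are simplifying assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 16                # shared attention dimension (illustrative)
n_vlm, n_act = 6, 4   # token counts for the two experts (illustrative)

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Each expert owns private projection weights (disjoint parameters)...
W = {name: {m: rng.standard_normal((d, d)) / np.sqrt(d) for m in "qkv"}
     for name in ("vlm", "act")}

x_vlm = rng.standard_normal((n_vlm, d))   # image-language tokens
x_act = rng.standard_normal((n_act, d))   # proprioception / action tokens

# ...but attention runs over the *joint* sequence, so the two experts
# exchange information while keeping their weights specialized.
Q = np.vstack([x_vlm @ W["vlm"]["q"], x_act @ W["act"]["q"]])
K = np.vstack([x_vlm @ W["vlm"]["k"], x_act @ W["act"]["k"]])
V = np.vstack([x_vlm @ W["vlm"]["v"], x_act @ W["act"]["v"]])

attn = softmax(Q @ K.T / np.sqrt(d))      # (n_vlm + n_act) x (n_vlm + n_act)
out = attn @ V                            # one output row per input token
```

Each token is projected by its own expert's weights, yet every output row mixes information from both experts through the shared attention matrix.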

\begin{figure}
\centering
\includegraphics[width=0.9\textwidth]{figures/ch5/ch5-pi0.png}
\caption{The \pizero architecture, as in~\citet{black$p_0$VisionLanguageActionFlow2024}. A pre-trained VLM backbone processes image and language tokens, while a dedicated action expert denoises the action chunk via flow matching.}
\label{fig:ch5-pi0}
\end{figure}
The different expert networks operate separately in processing their respective inputs and turning them into query, key and value matrices, and only share information with each other via self-attention layers.
The outputs from the VLM backbone are disregarded, while the vector field regressed by the action expert is used to iteratively refine the noisy action chunk.
In particular, \pizero organizes the input sequence into three blocks---image-language tokens, proprioceptive tokens, and action tokens---and adopts a \emph{blockwise causal} attention mask over them.
Notably, \emph{within} each block the attention operations are bidirectional, while across blocks, future blocks are masked out.
Formally, this corresponds to using the attention mask
\begin{equation*}
\mathbf{A} =
\bordermatrix{
 & \text{VLM} & \text{proprio} & \text{action} \cr
\text{VLM} & 1 & 0 & 0 \cr
\text{proprio} & 1 & 1 & 0 \cr
\text{action} & 1 & 1 & 1
}
\end{equation*}
Note how \emph{intra}-block bidirectional attention allows tokens to communicate freely, while \emph{inter}-block communication is mediated by the attention mask \(\mathbf{A} \).
\emph{Blockwise causal masking} effectively prevents the pre-trained perception-language tokens from attending to robotics tokens, likely out of distribution for VLM backbones traditionally trained on large corpora of internet, non-robotics, data.
Crucially, because the image-language tokens never attend to the proprioceptive and action tokens, their keys and values can be cached across denoising steps at runtime, incurring a reduced computational footprint and faster inference.
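A minimal sketch of such a blockwise causal mask, with illustrative token counts per block:

```python
import numpy as np

# Blockwise causal mask over three token blocks: tokens attend
# bidirectionally within their own block and only to strictly earlier
# blocks otherwise. Token counts per block are illustrative.
sizes = {"vlm": 5, "proprio": 1, "action": 4}
block_id = np.concatenate(
    [np.full(n, i) for i, n in enumerate(sizes.values())]
)

# Entry (i, j) is 1 iff token i may attend to token j.
mask = (block_id[:, None] >= block_id[None, :]).astype(int)

# Because the VLM rows never attend to later blocks, their keys and values
# are unaffected by the denoising iterations and can be cached.
```

Reading the rows: VLM tokens see only VLM tokens, proprioceptive tokens see VLM and themselves, and action tokens see the full sequence.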

In \pizero, both the VLM backbone and action expert are updated using a \emph{flow matching} loss, and in particular are trained by minimizing:
\begin{align}
\mathcal{L}(\theta) = \mathbb{E} \left[ \left\lVert v_\theta \left( a_{t:t+H_a}^{\tau}, o_t \right) - \left( a_{t:t+H_a} - \epsilon \right) \right\rVert^2 \right], \quad
a_{t:t+H_a}^{\tau} = \tau \, a_{t:t+H_a} + (1 - \tau) \, \epsilon, \\
\epsilon \sim \mathcal{N}(\mathbf{0}, \mathbf{I}), \quad
o_t, a_{t:t+H_a} \sim \mathcal D \notag
\end{align}
Importantly,~\citet{black$p_0$VisionLanguageActionFlow2024} minimize this objective by backpropagating the flow matching gradients through both the action expert and the VLM backbone.
In contrast,~\citet{driessKnowledgeInsulatingVisionLanguageAction2025} later show that failing to insulate the VLM knowledge from the flow matching gradients actually harms performance.
At inference time, actions are generated by initializing the chunk with Gaussian noise and forward-integrating the learned vector field:
\begin{equation}
a_{t:t+H_a}^{\tau + \delta} = a_{t:t+H_a}^{\tau } + \delta v_\theta(a_{t:t+H_a}^{\tau }, o_t)
\end{equation}

Flow matching~\citep[Section~\ref{sec:ch4-flow-matching}]{lipmanFlowMatchingGenerative2023} can be seen as a continuous-time, deterministic counterpart to diffusion-based generative modeling.
In turn, \pizero leverages it to model the rich, continuous and possibly multimodal distribution of action chunks conditioned on the current observation.
In particular, the action expert is the component in charge of denoising the action chunk.
Each action token embeds a noisy action \(a_i^{\tau} \in a^\tau_{t:t+H_a}\), alongside a sinusoidal encoding of the \emph{flow process} timestep \(\tau\).
The action expert then leverages full bidirectional attention across the \(H_a\) action tokens provided, regressing the vector field used to iteratively refine the noisy chunk.
Interestingly, differently from a standard flow matching pipeline, \pizero does not sample the flow timestep \(\tau\) uniformly during training, favoring instead a distribution skewed towards noisier (lower) timesteps (Figure~\ref{fig:ch5-pi0-sampling-timesteps}).

\begin{wrapfigure}{r}{0.4\textwidth}
\vspace{-10pt}
\centering
\includegraphics[width=\linewidth]{figures/ch5/ch5-pi0-sampling-timesteps.png}
\caption{Unlike more traditional flow-matching algorithms, \pizero does not sample training timesteps \(\tau\) uniformly, favoring a distribution skewed towards noisier levels.}
\label{fig:ch5-pi0-sampling-timesteps}
\end{wrapfigure}
To further optimize performance and reduce inference time,~\citet{black$p_0$VisionLanguageActionFlow2024} propose reducing the support of the timestep distribution to \([0,s], \ s < 1 \), as for any forward-integration step size \( \delta = 1-s \) timesteps above \(s \) are never sampled at inference time.
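The effect is easy to verify with a toy forward-Euler sampler. The constant vector field below is a stand-in for the learned \(v_\theta\): it is the exact conditional field of a linear interpolation path, so integration lands exactly on the target chunk, and with \( \delta = 0.1 \) the largest timestep ever queried is \( s = 0.9 \).

```python
import numpy as np

rng = np.random.default_rng(0)
H_a, act_dim = 4, 3                            # illustrative chunk sizes
target = rng.standard_normal((H_a, act_dim))   # stand-in "true" action chunk
noise = rng.standard_normal((H_a, act_dim))    # a^0 ~ N(0, I)

# For linear interpolation paths a^tau = tau * a + (1 - tau) * eps, the
# conditional target field (a - eps) is constant in tau, so this toy
# v_theta integrates the noise exactly onto the target.
def v_theta(a_tau, tau):
    return target - noise

s = 0.9
delta = 1.0 - s                                # forward-integration step size
n_steps = round(1.0 / delta)                   # 10 Euler steps

a = noise.copy()
taus_visited = []
for k in range(n_steps):
    tau = k * delta                            # 0.0, 0.1, ..., 0.9
    taus_visited.append(round(tau, 10))
    a = a + delta * v_theta(a, tau)

# The largest timestep ever queried is s = 1 - delta, which is why the
# training-time timestep distribution can be restricted to [0, s].
```

Since no forward-integration step ever evaluates \(v_\theta\) at \(\tau > s\), training on timesteps above \(s\) would spend capacity on inputs never seen at inference.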

Besides adopting a MoE architecture with a VLM backbone initialized from a pre-trained model and trained jointly with an action expert via flow matching, \pizero also introduces a carefully curated training corpus.
The dataset used to train \pizero---referred to as the \( \pi \) dataset---comprises a private, undisclosed portion obtained via teleoperation of multiple robot platforms, complemented with openly available cross-embodiment data.
In particular,~\citet{black$p_0$VisionLanguageActionFlow2024} report that, across a variety of benchmarks, \pizero consistently outperforms prior generalist policies such as OpenVLA~\citep{kimOpenVLAOpenSourceVisionLanguageAction2024}.
The authors also discuss the interplay between the quality and the diversity of the training data.
In turn, robots trained exclusively with BC on narrow, high-quality data may be incapable of recovering from failure.
Conversely, large scale collections of human demonstrations are typically much more diverse (if anything, for their sheer scale), and thus expose policies to a broader range of states, including mistakes and the corrections that follow them.

Lastly,~\citet{black$p_0$VisionLanguageActionFlow2024} present cross-embodiment experiments where they demonstrate \pizero's ability to control both mobile and static manipulator robots with varying arm embodiments.
The emergence of cross-embodiment capabilities is largely to be attributed to the presence of large scale cross-embodiment data in the \( \pi \) dataset.
\pizero also relies on three camera views, and uses masked image slots for training and deployment scenarios with fewer cameras.

\subsubsection{Code Example: Using \pizero}
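The original listing is not reproduced in this extract; below is a minimal, self-contained sketch of the call pattern instead. The class, observation keys and shapes are illustrative assumptions: a real deployment would load pretrained \pizero weights (e.g. through a library such as \lerobot) rather than the stub used here.

```python
import numpy as np

# Hypothetical stand-in for a pretrained pi0 policy: the stub only mirrors
# the observation -> action-chunk interface sketched in this section.
class StubPi0Policy:
    def __init__(self, action_dim: int = 6, chunk_size: int = 50):
        self.action_dim = action_dim
        self.chunk_size = chunk_size

    def predict_action_chunk(self, observation: dict) -> np.ndarray:
        # The real model embeds images + instruction with the VLM backbone
        # and denoises an action chunk with the action expert via flow
        # matching; the stub returns a zero chunk of the right shape.
        return np.zeros((self.chunk_size, self.action_dim))

policy = StubPi0Policy()
observation = {
    # three camera views, as used by pi0 (names are illustrative)
    "images": {
        "base": np.zeros((224, 224, 3), dtype=np.uint8),
        "left_wrist": np.zeros((224, 224, 3), dtype=np.uint8),
        "right_wrist": np.zeros((224, 224, 3), dtype=np.uint8),
    },
    "state": np.zeros(6),        # proprioception
    "prompt": "fold the shirt",  # natural-language task
}
chunk = policy.predict_action_chunk(observation)  # (H_a, action_dim)
```

Each call produces a full action chunk; the controller then executes its actions sequentially before querying the policy again.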

\subsection{SmolVLA}
VLAs hold great promise for general-purpose robot control, yet they typically remain expensive to train and deploy, and often rely on data inaccessible to the broader community.
SmolVLA~\citep{shukorSmolVLAVisionLanguageActionModel2025} is an entirely open-source research effort, aiming to democratize the development of robotics foundation models by open-sourcing the model, training recipes and data used.

\begin{figure}
\centering
\includegraphics[width=0.9\textwidth]{figures/ch5/ch5-smolvla.png}
\caption{The SmolVLA architecture, as in~\citet{shukorSmolVLAVisionLanguageActionModel2025}. SmolVLA is a compact MoE model trained with flow matching to denoise action chunks. Vision and language tokens are fed to a VLM backbone, and share information with the proprioceptive and action tokens via the attention mechanism. The action expert interleaves SA and CA layers for further conditioning on the visual features from the VLM backbone. SmolVLA skips computations and reduces the visual tokens, resulting in a particularly compact and efficient model.}
\label{fig:ch5-smolvla}
\end{figure}

While encouraging efforts like \pizero~\citep{black$p_0$VisionLanguageActionFlow2024} demonstrate the feasibility of open VLA systems, they remain (1) large and compute-intensive and (2) dependent on closed datasets collected via centralized efforts on costly robotic platforms, ultimately limiting their accessibility.
SmolVLA mitigates both these limitations.
Similarly to \pizero, SmolVLA (Figure~\ref{fig:ch5-smolvla}) employs a MoE architecture combining a pretrained VLM backbone with a dedicated action expert, and trains with flow matching.
To ensure efficiency and accessibility, SmolVLA adopts SmolVLM-2~\citep{marafiotiSmolVLMRedefiningSmall2025} as its VLM backbone, considering SmolVLM-2's reduced size and capability to process multiple image inputs alongside text.
SmolVLM-2 uses SigLIP~\citep{zhaiSigmoidLossLanguage2023} as vision encoder, producing visual features for a SmolLM2 language decoder~\citep{allalSmolLM2WhenSmol2025}.
Further, SmolVLA adopts a smaller action expert consisting of \(\sim\)100M parameters and an interleaved stack of self- and cross-attention layers.
To improve efficiency, the action expert adopts a reduced embedding dimension compared to the VLM backbone, resulting in \( d_{v_\theta} = 0.75\, d_{\text{VLM}} \).
Similarly to \pizero, SmolVLA adopts separate experts communicating exclusively through attention layers with disjoint weights.

In contrast with \pizero, the action expert interleaves \emph{cross-attention} (CA) and \emph{self-attention} (SA) layers, a choice shown to yield higher success and smoother action chunks in practice.
While in the expert SA layers the action tokens attend only to one another, in CA layers they attend to keys and values computed from the VLM features.
Notably, these keys and values can be cached as well, resulting in performance gains at inference time.

SmolVLA trims both token and layer compute.
First, it \emph{reduces visual tokens} via pixel shuffling, so that each camera frame is mapped to a small, fixed budget of visual tokens.
Second, it \emph{skips upper VLM layers}, reading out features from an intermediate layer of the backbone rather than the last one.
Beyond model compactness, SmolVLA also contributes an inference stack that decouples action prediction from execution for responsiveness on modest hardware (Section~\ref{sec:ch4-async-inference}).
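The token-reduction step above can be sketched as a pixel-shuffle-style regrouping of the visual token grid; the grid size and reduction ratio below are illustrative, not SmolVLA's actual values.

```python
import numpy as np

def pixel_shuffle_tokens(tokens: np.ndarray, r: int) -> np.ndarray:
    """Regroup an (h, w, c) grid of visual tokens into (h//r, w//r, c*r*r),
    trading token count for per-token dimensionality (no information lost)."""
    h, w, c = tokens.shape
    assert h % r == 0 and w % r == 0
    t = tokens.reshape(h // r, r, w // r, r, c)
    t = t.transpose(0, 2, 1, 3, 4)          # group r x r neighborhoods
    return t.reshape(h // r, w // r, c * r * r)

grid = np.arange(8 * 8 * 4, dtype=np.float32).reshape(8, 8, 4)  # 64 tokens
reduced = pixel_shuffle_tokens(grid, r=2)   # 16 tokens, 4x fewer
```

The sequence fed to the backbone shrinks by a factor of \(r^2\), while each remaining token carries the concatenated features of its \(r \times r\) neighborhood.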

Departing from reliance on proprietary datasets, SmolVLA pretrains exclusively on 450+ \emph{community datasets} collected on affordable robotic platforms.
Because instructions in community-contributed datasets can be noisy or missing, the authors re-annotate tasks with a small off-the-shelf VLM using frames sampled from the dataset, and standardize camera viewpoints by mapping sources to a consistent top/wrist/side ordering.
SmolVLA proves effective across a range of both real-world and simulated environments, rivaling \pizero while being significantly smaller.

\subsubsection{Code Example: Using SmolVLA}
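As for \pizero, the original listing is not reproduced in this extract. The sketch below is a hypothetical stub mirroring a SmolVLA-style policy interface (a real setup would load pretrained weights, e.g. through \lerobot); it illustrates the chunk-then-consume pattern that the decoupled inference stack builds on.

```python
import numpy as np
from collections import deque

# Hypothetical stub mirroring a SmolVLA-style policy interface; class name,
# observation keys and shapes are illustrative assumptions.
class StubSmolVLAPolicy:
    def __init__(self, action_dim: int = 6, chunk_size: int = 50):
        self.action_dim, self.chunk_size = action_dim, chunk_size
        self._queue = deque()

    def _predict_chunk(self, observation: dict) -> np.ndarray:
        # Real model: the VLM backbone encodes images + instruction, and the
        # action expert denoises a chunk of future actions via flow matching.
        return np.zeros((self.chunk_size, self.action_dim))

    def select_action(self, observation: dict) -> np.ndarray:
        # Predict a new chunk only when the previous one is exhausted; the
        # asynchronous inference stack overlaps prediction with execution.
        if not self._queue:
            self._queue.extend(self._predict_chunk(observation))
        return self._queue.popleft()

policy = StubSmolVLAPolicy()
obs = {
    "image": np.zeros((256, 256, 3), dtype=np.uint8),
    "state": np.zeros(6),
    "task": "stack the cubes",
}
first = policy.select_action(obs)  # triggers one chunk prediction
```

Subsequent calls pop actions from the cached chunk, so the expensive forward pass runs once every `chunk_size` control steps rather than at every step.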
| 4 |
\epigraph{\textit{Specialization is for insects}}{Robert A. Heinlein}
|
| 5 |
|
| 6 |
\begin{tldr}
|
| 7 |
+
Openly available, large-scale datasets and the development of stable-to-train, expressive and efficient architectures fostered research on the development of generalist robot policies that can operate across embodiment and tasks.
|
| 8 |
\end{tldr}
|
| 9 |
|
| 10 |
+
The advent of large models trained on internet-scale datasets has drastically influenced fields like Computer Vision (CV) and Natural Language Processing (NLP), shifting the previously task-specific paradigm towards combining (1) an initial, task-agnostic large-scale pre-training stage and a (2) task-specific, adjustment phase.
|
| 11 |
+
This \emph{pre-train-and-adaptat} paradigm has now largely replaced more classic approaches consisting of task-specific data collection, curation and model training in many subdomains within CV and NLP, and it is motivated by the main drawback of limited scalability for \emph{task-specific approaches}, which have been traditionally more labor intensive.
|
| 12 |
+
Factors including (1) the advancements in generalist models learned with self-supervision for perception~\citep{oquabDINOv2LearningRobust2024} or semantic understanding~\citep{devlinBERTPretrainingDeep2019} and (2) the popularization of collective efforts to aggregate large-scale openly available datasets~\citep{oneillOpenXEmbodimentRobotic2025,khazatskyDROIDLargeScaleInTheWild2025} are increasingly pushing the field of robot learning towards the pre-train-and-adapt paradigm.
|
| 13 |
This shift taps into the long-standing challenge of developing generalist robot policies, and holds the premise to surpass traditionally siloed approaches to robotics problems and develop a \emph{foundation robotics model}.
|
| 14 |
+
While Section~\ref{sec:learning-imitation} introduced methods for learning \emph{single-task policies} such as ACT or Diffusion Policy, in this section we present advancements in developing \emph{generalist, multi-task, policies}, capable of performing a wide range of tasks across different environments and embodiments, and guided by unstructured instructions typically given in plain, natural language.
|
| 15 |
|
| 16 |
\begin{figure}
|
| 17 |
\centering
|
|
|
|
| 21 |
\end{figure}
|
| 22 |
|
| 23 |
\subsection{Preliminaries: Models and Data}
|
| 24 |
+
The remarkable success of foundation models in NLP and CV seems to be increasingly predicated on two core principles: architectural innovation and (joint) data-compute scaling.
|
| 25 |
+
Indeed, the transformer architecture proved very effective in capturing long-range dependencies in a variety of data formats, and its stability and expressivity made it the \emph{de facto} standard for modern large-scale models trained on internet-scale datasets.
|
| 26 |
+
However, in stark contrast with large-scale NLP and CV datasets~\citep{raffelExploringLimitsTransfer2023,ImageNet_VSS09}, robotics has historically developed around small, task-specific datasets.
|
| 27 |
+
In turn, this traditionally hindered scalability across problems as well as results, posing concrete challenges to developing general-purpose robot learning algorithms.
|
| 28 |
+
Indeed, differently from the wealth of relatively readily-available task-agnostic text and images datasets on the internet, robotics data is \emph{intrinsically embodied} and thus task-specific: datasets collected for \emph{manipulation} differ significantly from \emph{locomotion}.
|
| 29 |
+
In particular, since each expert trajectory is tied to a specific robot platform and the operating conditions of its environment and task, data heterogeneity has long posed a \emph{methodological} challenge for scaling robotics datasets via aggregation.
|
| 30 |
+
Further, datasets consisting of expert demonstrations are (1) intrinsically more expensive to collect and (2) notoriously heterogeneous---different human experts may perform the same task in very different.
|
| 31 |
Beyond this, heterogeneity also raises \emph{conceptual} issues: naively mixing data across embodiments can induce negative transfer, as control strategies developed in isolation for different robot systems in different environments may even conflict when combined.
|
| 32 |
+
Thus, the high degree of fragmentation of robotics datasets and tasks has traditionally led to the development of \emph{specialist} policies, trained on small, task-specific datasets, developed to perform well at their designated task but that fail to generalize to new deployment scenarios (Figure~\ref{fig:ch5-ml-vs-robotics-foundation}).
|
| 33 |
|
| 34 |
\begin{figure}
|
| 35 |
\centering
|
| 36 |
\includegraphics[width=0.8\textwidth]{figures/ch5/ch5-generalist-policies-timeline.png}
|
| 37 |
+
\caption{Early efforts in the development of generalist models for robotics include BC-Zero~\citep{jangBCZZeroShotTask2022}, RT-1~\citep{brohanRT1RoboticsTransformer2023}, and RT-2~\citep{brohanRT2VisionLanguageActionModels2023}: large scale models trained on thousands of demonstrations. The open release of the Open-X~\citep{oneillOpenXEmbodimentRobotic2025} and DROID datasets~\citep{khazatskyDROIDLargeScaleInTheWild2025} fostered the development of open source models: OpenVLA~\citep{kimOpenVLAOpenSourceVisionLanguageAction2024}, \pizero~\citep{black$p_0$VisionLanguageActionFlow2024} and SmolVLA~\citep{shukorSmolVLAVisionLanguageActionModel2025}.}
|
| 38 |
\label{fig:ch5-generalist-policies-timeline}
|
| 39 |
\end{figure}
|
| 40 |
|
| 41 |
+
Driven by the goal of developing generalist robot policies, the research community has increasingly explored how insights and techniques from other areas of ML can be integrated into robotics.
|
| 42 |
Figure~\ref{fig:ch5-generalist-policies-timeline} shows a timeline of some of the most popular contributions attempting at developing generalist policies.
|
| 43 |
+
Starting from BC-Zero, a latent variable model trained on 25k+ demonstrations, the field has now evolved into \( \pi_0 \), a transformer-based model trained on 10M+ demonstrations and exhibiting strong few-shot capabilities across tasks and embodiments.
|
| 44 |
+
In between, Robotics Transformer 1 (RT-1)~\citep{brohanRT1RoboticsTransformer2023} represented a significant step in the direction of developing a generalist robot policies over prior work including (1) BC-Zero~\citep{jangBCZZeroShotTask2022} and (2) Gato~\citep{reedGeneralistAgent2022}, in that~\citet{brohanRT1RoboticsTransformer2023} use a much larger and diverse set of training tasks compared to both BC-Zero and Gato.
|
| 45 |
+
In particular, RT-1 uses a transformer architecture, and is trained on as many as 130k human-recorded trajectories collected over 13 robots and over 17 months.
|
| 46 |
+
RT-1 learns to process a history of camera images and a natural language instruction, and feeds the resulting sequence of high-dimensional tokens to a transformer, trained using a \emph{classification loss on a discretized actions space} consisting of six different 256-bins, one for each joint of a 6-dof robotic arm.
|
| 47 |
|
| 48 |
+
In a follow-up work, the same group of authors propose a modified method to learn generalist models, leveraging (1) a more powerful architecture and (2) scaling up the dataset used~\citep[RT-2]{brohanRT2VisionLanguageActionModels2023}.
|
| 49 |
In RT-2,~\citet{brohanRT2VisionLanguageActionModels2023} propose inheriting internet-scale semantic knowledge from large-scale multi-modal datasets to learn a single, \emph{unified model} for robotics control.
|
| 50 |
+
Such a model, termed \emph{Vision-Language-Action} (VLA) in the original RT-2 paper, effectively casts robot control as a language-modeling problem, and in particular as a Visual Question-Answering (VQ\&A) task, in which the output token space used to represent \emph{textual tokens} is shared with the \emph{8-bits tokens} used to represent the 256 (\( 2^8 \)) actuation levels of a 6-dof robot.
|
| 51 |
+
In their work,~\citet{brohanRT2VisionLanguageActionModels2023} propose co-fine-tuning large-scale VLMs such as PaLIX~\citep{chenPaLIXScalingMultilingual2023} or PaLM-E~\citep{driessPaLMEEmbodiedMultimodal2023} on a mix of (1) web and (2) robotics data, complementing VQ\&A training with robotics-specific signal, and learning to directly output robot actions in a shared token space for visual and language inputs.
|
| 52 |
+
In their work, the authors claim using large models trained on internet-scale data as backbones for VLAs allows models to tap into the rich semantic knowledge embedded in the VLM's parameters, interpreting instructions and unseen objects by connecting them to concepts acquired while pre-training.
|
| 53 |
+
For instance,~\citet{brohanRT2VisionLanguageActionModels2023} show that while RT-2 has never been explicitly trained to repurpose tools for a \emph{hammering} task, it can still combine its semantic understanding of images, so that when asked which object between (1) a piece of paper, (2) a pair of headphones or (3) a rock may be used instead of a hammer, it correctly answers (3).
|
| 54 |
+
|
| 55 |
+
Traditionally, research efforts revolved around not only training models, but also proposing datasets for the community, a costly and time-consuming process.
|
| 56 |
+
Due to the aforementioned embodiment gap, the data used in research efforts in robot learning have traditionally proved rather fragmented, tailored to the specific task considered by the specific group of researchers who collected it, which ultimately hindered integration.
|
| 57 |
+
The Open X-Embodiment project~\citep{oneillOpenXEmbodimentRobotic2025} was a landmark collaboration effort to address data fragmentation, by curating the aggregation of 60 \emph{existing} robotics datasets from 22 different robot embodiments and 21 institutions across the world, and resulted in a total 1.4M of cross-embodiments, cross-tasks, openly-available trajectories.
|
| 58 |
+
Besides the contribution of an aggregate, large scale dataset,~\citet{oneillOpenXEmbodimentRobotic2025} also demonstrated significant positive transfer \emph{across tasks and embodiments}, showing that \highlight{a single model trained on multi-embodiment data can outperform specialist models} trained on their respective single-embodiment datasets.
|
| 59 |
+
The Distributed Robot Interaction Dataset (DROID)~\citep{khazatskyDROIDLargeScaleInTheWild2025} represents another significant step towards addressing the problem of scarse and disaggregated data in robot learning, providing a unique dataset consisting of 75k+ human demonstrations collected in realistic (\emph{in-the-wild}) manipulation settings, providing another cornerstone for building general-purpose robot policies.
|
| 60 |
+
Recently, foundational datasets curated through large, centralized efforts, are increasingly complemented by decentralized, community-driven contributions of robotics data.
|
| 61 |
+
Software libraries like \lerobot~have been instrumental in enabling decentralized collection of large amounts of data, providing the infrastructure for researchers and practitioners to easily contribute trajectories from a wide range of embodiments, democratizing data access via distributed collection.
|
| 62 |
+
|
| 63 |
+
Despite these advancements, the success of large, proprietary models like RT-1 and RT-2, highlighted a growing accessibility gap in robotics research, as training and deploying large-scale robotics foundation models requires computational resources simply unattainable for most research institutions.
|
| 64 |
+
The OpenVLA project~\citep{kimOpenVLAOpenSourceVisionLanguageAction2024} emerged in direct contrast to traditionally closed-source efforts to develop VLAs.
|
| 65 |
+
In particular,~\citet{kimOpenVLAOpenSourceVisionLanguageAction2024} trained OpenVLA by exclusively leveraging openly available data (970k+ trajectories from the Open-X dataset), and openly shared their training recipes alongside the model weights.
|
| 66 |
+
Architecturally, OpenVLA integrates a pre-trained vision encoder to project visual tokens into the embedding space of the Llama2-7B~\citep{touvronLlama2Open2023} language-model backbone.
|
| 67 |
The language model backbone is then used to predict \emph{discrete action tokens} over 256 activation levels.
|
| 68 |
|
| 69 |
\begin{figure}
|
| 70 |
\centering
|
| 71 |
\includegraphics[width=0.9\textwidth]{figures/ch5/ch5-trends.png}
|
| 72 |
+
\caption{Robot learning is undergoing a paradigmatic shift: centralized data collections (A, left) are increasingly larger, often comprising millions of demonstrations, while (A, right) decentralized data collection efforts are becoming an alternative for large scale data collection. (B) Generalist models are also becoming increasingly smaller and easier to run on limited hardware.}
|
| 73 |
\label{fig:ch5-trends}
|
| 74 |
\end{figure}
|
| 75 |
|
| 76 |
+
Figure~\ref{fig:ch5-trends} shows the current trends in robot learning in terms of size and nature of the robotics datasets contributed, together with the size and accessibility of the available models.
|
| 77 |
+
As datasets collected via centralized, cross-institutions cooperation of increasing size are made available for the research community, decentralized datasets collected by individual researchers and practitioners also gained traction, closing the gap with academic benchmarks thanks to community-contributed datasets.
|
| 78 |
+
Further, models used across tasks and embodiments are increasingly becoming much more compute-efficient, and as a result the models' size has been consistently reducing over time, with consequent gains for autonomous robots in real-world, resource-constrained environments.
|
| 79 |
|
| 80 |
+
\subsection{VLAs}
|
| 81 |
Modern recipes to train large scale VLAs extend early efforts to learn foundation models from large amounts of data via BC, introducing significant advancements concerning both architectural and procedural aspects.
|
| 82 |
From an architectural perspective, modern VLAs such as \pizero~\citep{black$p_0$VisionLanguageActionFlow2024} leverage a \emph{unified transformer model} for efficiency of computation, while maintaining specialized sub-components within the model for visual perception and action prediction, enabling cross-task performance via language conditioning.
|
| 83 |
+
Crucially, modern VLAs including\pizero~\citep{black$p_0$VisionLanguageActionFlow2024} and SmolVLA~\citep{shukorSmolVLAVisionLanguageActionModel2025} adopt \emph{unified} transformer models employing disjoint set of weights (\emph{experts}) for both compute-efficient visual-semantic understanding as well as control.
|
| 84 |
+
Procedurally, VLAs complement advanced Vision-Language Model (VLM) backbones with action-specific modules (1) adopting mid-sized \emph{action experts} to model continuous actions distributions \( p (a_{t:t+H_a} \vert o_t) \)---avoiding discrete action tokens entirely---and (2) relying on~\emph{action chunking}~\citep[Section~\ref{sec:learning-imitation}]{zhaoLearningFineGrainedBimanual2023} as a strategy to reduce error compounding when predicting multiple actions learning from inherently non-i.i.d. data, such as demonstration data.
These architectural and procedural innovations present three benefits over task-specific methods.

First, developing architectures that exploit internet-scale pre-trained backbones makes it possible to fully capitalize on the vast world knowledge and skills state-of-the-art VLMs exhibit, preventing models from having to learn visual, linguistic and semantic concepts from scratch.

Second, using generative models for continuous action distributions makes it possible to learn rich, multimodal data distributions---a far more likely scenario in the big-data regime typically encountered when developing generalist policies.

Third, introducing separate components for perception and action planning enables Mixture of Experts (MoE) architectures~\citep{fedusReviewSparseExpert2022}, which are often more efficient to run---a key feature for models deployed in real-world scenarios.

This new paradigm has been at the core of some of the most capable generalist policies developed to date, capable of few-shot adaptation to novel tasks and of performing highly dexterous manipulation tasks ranging from folding laundry end-to-end to bussing tables~\citep{black$p_0$VisionLanguageActionFlow2024}.

\subsubsection{VLMs for VLAs}
VLMs are designed to handle both visual and textual modalities, most commonly by taking both images and text as inputs, generating text conditioned on the visual context.
Recent advances in VLMs have been driven by the success of LLMs, with many approaches building upon pretrained LLMs and adopting training paradigms similar to those used in language modeling.
Typically, VLMs~\citep{alayracFlamingoVisualLanguage2022,laurenconWhatMattersWhen2024,linVILAPretrainingVisual2024} are constructed by integrating a pretrained vision encoder~\citep{radfordLearningTransferableVisual2021,zhaiSigmoidLossLanguage2023,finiMultimodalAutoregressivePretraining2024} with a pretrained LLM~\citep{grattafioriLlama3Herd2024,jiangMistral7B2023}.
Training then proceeds in multiple multimodal stages, beginning with a large-scale pretraining on datasets containing image-text pairs~\citep{LAION-COCO,kakaobrain2022coyo700m} and interleaved vision-language corpora~\citep{OBELICS,MMC4}, all followed by a supervised fine-tuning stage on instruction-tuning datasets~\citep{LLaVA-1.5,tong2024cambrian,laurenconWhatMattersWhen2024}.
The inherent multimodal nature of VLMs enables them to jointly reason over vision and language.
Pre-training on vast internet-scale datasets allows these models to associate visual patterns with textual descriptions, thereby acquiring a rich semantic understanding of the world---knowledge about objects, their properties, and relationships---without explicit supervision for each concept.
In turn, integrating VLMs as the perceptual backbone for VLAs allows the latter to inherit rich, contextual world knowledge from the VLM, sidestepping the need to re-learn visual and semantic representations.
In principle, this also allows the robot to ground high-level natural language instructions in its visual context, and to recognize objects by connecting them to concepts absorbed during pre-training, improving its ability to generalize to novel scenarios.
Recently, compute efficiency has also become a central focus in multi-modal research.
Several works aim to reduce training costs by using smaller, more diverse datasets~\citep{LLaVA-1.5,InstructBLIP,bai2025qwen25vl,zhu2024minigpt,tong2024cambrian}, training smaller-scale models~\citep{marafiotiSmolVLMRedefiningSmall2025, moondream,minicmpv2024}, or by adapting pretrained unimodal models by tuning only a small subset of parameters~\citep{shukor2023epalm,vallaeys2024improveddepalm,MAPL,FROMAGe,tsimpoukelli2021multimodalfrozen,BLIP-2}.
While the majority of VLM research focuses on image and text modalities, recent work has also demonstrated that similar techniques can be extended to integrate additional modalities, such as video and audio~\citep{wang2025internvideo2,liu2024kangaroo,zhang2025videollama,kong2024audioflam}---a particularly promising direction of research for robotics applications, where multiple sensor modalities can be integrated effectively.
This trend towards efficiency is paramount for robotics applications, where policies must operate under the stringent constraints of real-world deployment.
\subsection{\( \pi_0 \)}
\pizero~\citep{black$p_0$VisionLanguageActionFlow2024} introduces a VLA built on a MoE architecture comprising (1) a pre-trained VLM backbone (Gemma 2.6B~\citep{teamGemma2Improving2024}) and (2) a dedicated action expert used to generate continuous actions via flow matching.
Images and language are embedded with PaliGemma, a VLM merging independently encoded visual and textual features deep in the network (\emph{late-fusion}), while the proprioceptive state and action chunks are routed to a smaller \emph{action expert}, initialized from scratch.
The two separate experts communicate via self-attention layers, but maintain disjoint weights for the query, key and value matrices at each layer, preserving specialization while efficiently allocating computation.
\begin{figure}
\centering
\includegraphics[width=0.9\textwidth]{figures/ch5/ch5-pi0.png}
\caption{The \pizero~architecture, as in~\citet{black$p_0$VisionLanguageActionFlow2024}. Vision and language tokens are routed to a VLM backbone which is prevented from attending to robot proprioceptive states and action tokens; these are instead routed to a smaller subset of weights within the architecture, referred to as the ``action expert''. The architecture is trained with Flow Matching on 10M+ trajectories from a mixture of closed and openly available datasets.}
\label{fig:ch5-pi0}
\end{figure}
Concretely, \( \pi_0 \) is a single, unified transformer with two disjoint sets of weights \( \phi, \theta\).
A larger VLM backbone \( f_\phi \), initialized from Gemma 2.6B, processes multiple image frames obtained from multiple camera viewpoints \( [\{ I_t \}_{t=1}^n] \), as well as a language instruction \([\ell_t]\) describing the task at hand.
Concurrently, a 300M-parameter \emph{action expert} based on a similar transformer architecture processes both the robot proprioceptive state \(q_t\) and an action chunk \(a_{t:t+H_a}\) (Figure~\ref{fig:ch5-pi0}).
The two expert networks process their respective inputs separately, each producing its own query, key and value matrices, and share information with each other only via self-attention layers.
The outputs from the VLM backbone are disregarded, while the vector field regressed by the action expert is used to iteratively refine the action chunk.
In particular, \pizero~uses a \emph{blockwise causal attention mask} over tokens belonging to three separate blocks: (1) image and language tokens \(\mathcal T_i \) obtained from \([\{ I_t \}_{t=1}^n, \ell_t]\), (2) proprioceptive tokens \(\mathcal T_q \) obtained from \(q_t\), and (3) the action tokens \( \mathcal T_a \) for items in the chunk \(a^{\tau}_{t:t+H_a}\) at time \( \tau \) of the flow-matching process.
Notably, \emph{within} each block the attention operations are bidirectional, while \emph{across} blocks, future blocks are masked out.
Formally, this corresponds to using an attention mask like:
\begin{equation*}
\mathbf{A} =
\bordermatrix{
 & \mathcal T_i & \mathcal T_q & \mathcal T_a \cr
\mathcal T_i & \mathbf{1} & \mathbf{0} & \mathbf{0} \cr
\mathcal T_q & \mathbf{1} & \mathbf{1} & \mathbf{0} \cr
\mathcal T_a & \mathbf{1} & \mathbf{1} & \mathbf{1} \cr
}
\end{equation*}
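A minimal sketch of this blockwise structure (block sizes are illustrative, not \pizero's actual token counts) builds the boolean mask directly from per-block token counts:

```python
import torch

def blockwise_causal_mask(block_sizes: list[int]) -> torch.Tensor:
    """Boolean attention mask where True means attention is allowed.

    Tokens attend bidirectionally within their own block, and only to
    strictly earlier blocks across blocks (future blocks are masked out).
    """
    # block id of every token, e.g. [0, 0, 0, 0, 1, 2, 2, 2]
    block_ids = torch.repeat_interleave(
        torch.arange(len(block_sizes)), torch.tensor(block_sizes)
    )
    # query token i may attend key token j iff j's block is not after i's block
    return block_ids[:, None] >= block_ids[None, :]

# e.g. 4 image-language tokens, 1 proprioceptive token, 3 action tokens
mask = blockwise_causal_mask([4, 1, 3])
print(mask.int())
```

With real token counts, the same mask is what allows keys and values of earlier blocks to be reused across denoising steps, since they never attend to action tokens.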
Note how \emph{intra}-block bidirectional attention allows tokens to communicate freely, while \emph{inter}-block communication is mediated by the attention mask \(\mathbf{A} \).
\emph{Blockwise causal masking} effectively prevents the pre-trained perception-language tokens from attending to robotics tokens, which are likely out of distribution for VLM backbones traditionally trained on large corpora of internet, non-robotics data.
Crucially, because image-language and proprioceptive tokens never attend to action tokens, their keys and values are unchanged across denoising steps and can be cached at runtime, incurring a reduced computational footprint and faster inference.
In \pizero, both the VLM backbone and the action expert are updated using a \emph{flow matching} loss, minimizing:
\begin{align}
\mathcal{L}(\phi, \theta) = \mathbb{E} \left[ \left\lVert v_\theta \left( a_{t:t+H_a}^{\tau}, o_t \right) - \left( a_{t:t+H_a} - \epsilon \right) \right\rVert^2 \right], \label{eq:pi0-loss} \\
a_{t:t+H_a}^{\tau} = \tau a_{t:t+H_a} + (1 - \tau) \epsilon, \quad \tau \sim \textrm{Beta}(1.5, 1), \notag \\
\epsilon \sim \mathcal{N}(\mathbf{0}, \mathbf{I}), \quad
o_t, a_{t:t+H_a} \sim \mathcal D \notag
\end{align}
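Under the conventions above (noise at \( \tau = 0 \), straight-line interpolation, constant target velocity \( a_{t:t+H_a} - \epsilon \)), one training step of the objective can be sketched as follows; the small MLP is a stand-in for the actual transformer action expert, and all dimensions are illustrative:

```python
import torch

torch.manual_seed(0)
batch, horizon, action_dim, obs_dim = 32, 8, 6, 10

# stand-in for the action expert v_theta (the real model is a transformer expert)
v_theta = torch.nn.Sequential(
    torch.nn.Linear(horizon * action_dim + obs_dim + 1, 128),
    torch.nn.ReLU(),
    torch.nn.Linear(128, horizon * action_dim),
)

def flow_matching_loss(actions, obs):
    """Noise the chunk along the straight path a^tau = tau*a + (1-tau)*eps,
    then regress the constant target velocity (a - eps)."""
    eps = torch.randn_like(actions)                        # eps ~ N(0, I)
    tau = torch.distributions.Beta(1.5, 1.0).sample((actions.shape[0], 1, 1))
    noised = tau * actions + (1 - tau) * eps               # a^tau
    target = actions - eps                                 # d a^tau / d tau
    inp = torch.cat([noised.flatten(1), obs, tau.flatten(1)], dim=-1)
    pred = v_theta(inp).view_as(actions)
    return torch.mean((pred - target) ** 2)

loss = flow_matching_loss(
    torch.randn(batch, horizon, action_dim), torch.randn(batch, obs_dim)
)
print(loss.item())
```

Calling \texttt{loss.backward()} would then propagate gradients through both the (stand-in) expert and, in the full model, the VLM backbone.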
where the two experts, parametrized by the separate weights \( \phi, \theta \), interact with each other via self-attention layers only, so that the action expert \( v_\theta \)'s internal computations also depend on the VLM backbone's parameters \( \phi \).
Importantly,~\citet{black$p_0$VisionLanguageActionFlow2024} minimize eq.~\ref{eq:pi0-loss} over both the multimodal backbone and action expert parameters, thus updating both the internal representations of the VLM and action-expert weights using BC-specific gradients.
In contrast,~\citet{driessKnowledgeInsulatingVisionLanguageAction2025} later show that failing to insulate the VLM knowledge from the flow matching gradients actually harms performance.
At runtime, inference is performed by iteratively refining action chunks, numerically forward-integrating the vector field predicted by the action expert,
\begin{equation}
a_{t:t+H_a}^{\tau + \delta} = a_{t:t+H_a}^{\tau } + \delta v_\theta(a_{t:t+H_a}^{\tau }, o_t)
\end{equation}
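This Euler integration scheme can be sketched as follows; the toy vector field below is a stand-in for the learned action expert, chosen so that its flow provably transports any noise sample onto a fixed target chunk:

```python
import torch

def integrate_flow(v_theta, obs, horizon, action_dim, n_steps=10):
    """Euler forward integration a^{tau+delta} = a^tau + delta * v_theta(a^tau, obs, tau),
    starting from pure noise at tau = 0."""
    delta = 1.0 / n_steps
    a = torch.randn(horizon, action_dim)   # a^0 ~ N(0, I)
    for step in range(n_steps):
        tau = step * delta
        a = a + delta * v_theta(a, obs, tau)
    return a

# toy vector field whose exact flow reaches `target` at tau = 1:
# da/dtau = (target - a) / (1 - tau)  =>  a(1) = target for any a(0)
target = torch.ones(8, 6)
toy_v = lambda a, obs, tau: (target - a) / max(1.0 - tau, 1e-6)

chunk = integrate_flow(toy_v, obs=None, horizon=8, action_dim=6)
print((chunk - target).abs().max().item())
```

With 10 integration steps, as in \pizero, a single chunk requires only 10 forward passes of the action expert.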
Flow matching~\citep[Section~\ref{sec:ch4-flow-matching}]{lipmanFlowMatchingGenerative2023} can be seen as a continuous-time, deterministic generalization of diffusion processes, and has proven effective in modeling highly complex multi-modal distributions, including those over images and video.
In turn, applying flow matching to large-scale datasets of human behaviors across tasks and embodiments appears a natural next step, particularly considering that it enables fast inference via a limited number of denoising steps at test time---as few as 10, in \pizero.
In particular, the action expert is implemented as a conditional flow matching model.
Each action token embeds a noisy action \(a_i^{\tau} \in a^\tau_{t:t+H_a}\), alongside a sinusoidal encoding of the \emph{flow process} timestep \(\tau\).
The action expert then leverages full bidirectional attention across the \(H_a\) action tokens provided, and also attends to the preceding proprioceptive and image-language tokens.
Interestingly, unlike in a standard flow matching pipeline~\citep{lipmanFlowMatchingGenerative2023}, \(\tau\) is \emph{not} sampled from a uniform distribution \(\tau \sim \mathcal U([0,1]) \), but rather obtained from \(\tau \sim \textrm{Beta}(1.5,1) \) defined on the support \( [0,s], s<1 \) (Figure~\ref{fig:ch5-pi0-sampling-timesteps}).
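The effect of this shifted sampling scheme can be sketched empirically; here the \(\textrm{Beta}(1.5,1)\) draw is flipped onto the truncated support \([0,s]\) so that mass concentrates on early, noisier timesteps, matching the behavior described above (the value of \(s\) and the flipped parametrization are illustrative assumptions, not \pizero's exact implementation):

```python
import torch

s = 0.9          # illustrative truncation of the support [0, s], s < 1
n = 200_000

torch.manual_seed(0)
u = torch.distributions.Beta(1.5, 1.0).sample((n,))
tau = s * (1.0 - u)   # flip onto [0, s]: mass concentrates near tau = 0 (noisier chunks)

print(tau.mean().item())         # mean well below s/2: early timesteps dominate
print(float(tau.max()) <= s)     # support never exceeds s
```

Since \(\mathbb E[u] = 1.5/2.5 = 0.6\) for a \(\textrm{Beta}(1.5,1)\) draw, the flipped sample has mean \(0.4s\), confirming the bias towards noisy timesteps.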
\begin{wrapfigure}{r}{0.4\textwidth}
\vspace{-10pt}
\centering
\includegraphics[width=\linewidth]{figures/ch5/ch5-pi0-sampling-timesteps.png}
\caption{Unlike more traditional flow-matching algorithms, \pizero~uses a modified distribution to sample the timestep \( \tau \) from during training and inference, favouring earlier timesteps, which correspond to noisier chunks.}
\label{fig:ch5-pi0-sampling-timesteps}
\end{wrapfigure}
Using such a Beta distribution emphasizes higher noise levels during training, a choice~\citet{black$p_0$VisionLanguageActionFlow2024} argue allows \pizero~to focus on learning to reconstruct the mean of the data distribution \( \mathbb E[a_{t:t+H_a} \vert o_t] \) rather than an identity map, in keeping with~\citet{esserScalingRectifiedFlow2024}.
To further optimize performance and reduce inference time,~\citet{black$p_0$VisionLanguageActionFlow2024} propose reducing the support of the timestep distribution to \([0,s], \ s < 1 \), since, with a forward-integration step size \( \delta = 1-s \), timesteps above \(s \) are never sampled at inference time.
Besides adopting a MoE architecture with a VLM backbone initialized from a pre-trained model and trained jointly with an action expert via flow matching, \pizero~also relies on a unique pre-training corpus comprising a mix of proprietary and open data totaling 10M+ trajectories, which~\citet{black$p_0$VisionLanguageActionFlow2024} claim to be the largest dataset used to develop a foundational robotics model to date.
The dataset used to train \pizero---referred to as ``the \( \pi \) dataset''---comprises a private, undisclosed portion obtained via expert teleoperation as well as openly available datasets including Open-X and DROID, with only \(\approx 9.1\%\) of the \( \pi \) dataset being openly available.
In the \( \pi \) dataset, open datasets such as DROID and Open-X are complemented with expert trajectories consisting of dexterous demonstrations spanning 7 robot configurations and 68 different tasks.
Crucially, \citet{black$p_0$VisionLanguageActionFlow2024} show that pre-training on the \( \pi \) dataset yields a broadly capable base model, which can then be adapted via fine-tuning on narrower, higher-quality task data, inducing fluent multi-stage behavior while retaining robustness.
In particular,~\citet{black$p_0$VisionLanguageActionFlow2024} report that, across a variety of benchmarks, the version of \pizero~pretrained on the \( \pi \) dataset and fine-tuned on extra high-quality demonstrations \emph{consistently outperforms} a \( \pi_0^{\text{scratch}} \)~baseline trained entirely from scratch on a given specific task, further underscoring the relevance of pretraining on the \( \pi \) dataset.
\citet{black$p_0$VisionLanguageActionFlow2024} also offer an intuition behind this finding: high-quality demonstrations of a given task tend to omit failure data, which inherently prevents an autonomous agent from learning how to recover from near-failure states.
In turn, robots trained with BC exclusively on high-quality data may well be entirely incapable of recovering from failure.
Conversely, large-scale collections of human demonstrations are typically much more diverse (if anything, by sheer scale), containing rich and varied behaviors which may prove suboptimal for any given task in isolation, but which prove invaluable when coupled with a small, narrower set of demonstrations.
Lastly,~\citet{black$p_0$VisionLanguageActionFlow2024} present cross-embodiment experiments where they demonstrate \pizero's ability to control both mobile and static manipulator robots with varying arm embodiments.
The emergence of cross-embodiment capabilities is largely attributable to the presence of large-scale cross-embodiment data in the \( \pi \) data mixture, which is in practice handled by having \pizero~output actions of the maximal configuration size found across the whole \( \pi \) dataset, zero-padding the action vectors of robots with fewer degrees of freedom.
\pizero~also relies on exactly three camera views at both training and test time, and uses masked image slots for training and deployment scenarios with fewer cameras.
\subsubsection{Code Example: Using \pizero}
\begin{pbox}[label={ex:using-pizero}]{Using \pizero \\ \url{https://github.com/fracapuano/robot-learning-tutorial/blob/main/snippets/ch5/01_using_pi0.py}}
\lstinputlisting[language=python]{snippets/ch5/01_using_pi0.py}
\end{pbox}
\subsection{SmolVLA}
With VLAs at an early stage of development compared to the more mature LLMs and VLMs, much of the progress made on VLAs remains proprietary, with many releases sharing only the weights while withholding the data used, full experimental details, and essential methodological components of training.
In contrast with this closed approach, SmolVLA~\citep{shukorSmolVLAVisionLanguageActionModel2025} is an entirely open-source research effort, which aims at democratizing the development of robotics foundation models by open-sourcing the model alongside the data used as well as the training recipes.
\begin{figure}
\centering
\includegraphics[width=0.9\textwidth]{figures/ch5/ch5-smolvla.png}
\caption{The SmolVLA architecture, as in~\citet{shukorSmolVLAVisionLanguageActionModel2025}. SmolVLA is a compact MoE model trained with flow matching to denoise action chunks. Vision and language tokens are fed to a VLM backbone, and share information with the proprioceptive and action tokens via the attention mechanism. The action expert interleaves self-attention (SA) and cross-attention (CA) layers for further conditioning on the visual features from the VLM backbone. SmolVLA skips computations and reduces the visual tokens, resulting in 7x less memory usage than \pizero~(450M parameters vs. \pizero's 3.3B).}
\label{fig:ch5-smolvla}
\end{figure}
While encouraging efforts like \pizero~\citep{black$p_0$VisionLanguageActionFlow2024} demonstrate the feasibility of open VLA systems, they remain (1) large and compute-intensive and (2) dependent on closed datasets collected via centralized efforts on costly robotic platforms, which ultimately hinders the accessibility of the method altogether.
SmolVLA mitigates both these issues by (1) prioritizing a compact, compute-efficient VLA design and (2) targeting community-contributed datasets on accessible robotic platforms such as the SO-100 and SO-101 arms.
Similarly to \pizero, SmolVLA (Figure~\ref{fig:ch5-smolvla}) employs a MoE architecture combining a pretrained VLM backbone with a dedicated action expert, and trains with flow matching.
To ensure efficiency and accessibility, SmolVLA adopts SmolVLM-2~\citep{marafiotiSmolVLMRedefiningSmall2025} as its VLM backbone, given SmolVLM-2's reduced size and its ability to process multiple image inputs alongside text.
SmolVLM-2 uses SigLIP~\citep{zhaiSigmoidLossLanguage2023} as vision encoder, producing visual features for a SmolLM2 language decoder~\citep{allalSmolLM2WhenSmol2025}.
Further, SmolVLA adopts a smaller action expert consisting of \(\sim\)100M parameters and an interleaved stack of self- and cross-attention layers.
To improve efficiency, the action expert adopts a reduced embedding dimension compared to the VLM backbone, resulting in \( d_{v_\theta} = 0.75 d_{\text{VLM}} \).
\citet{shukorSmolVLAVisionLanguageActionModel2025}'s design choices thus result in a much smaller model compared to \pizero, totaling ca. 450M parameters versus \pizero's 3.3B.
In practice, SmolVLA consumes multi-view RGB images, a natural-language instruction, and a projected sensorimotor state token as inputs, together with the noised \emph{action chunk} \( \tilde{a}_{t:t+H_a} \) that the action expert \( v_\theta \) is trained to denoise.
The robot proprioceptive states are first projected to a token space shared with the VLM to match \( d_{\text{VLM}} \), and subsequently projected into the expert's token space.
Similarly to \pizero, SmolVLA adopts separate experts communicating exclusively through self-attention layers, which however employ simple causal masking rather than blockwise causal attention masking.
In contrast with \pizero, the action expert interleaves \emph{cross-attention} (CA) and \emph{self-attention} (SA) layers, a choice shown to yield higher success and smoother action chunks in practice.
While in the expert's SA layers all tokens are used to obtain queries, keys and values, the CA layers use action tokens only as queries, and instead project the visual, language and proprioceptive tokens from the VLM backbone into a shared embedding space to obtain keys and values.
Notably, keys and values can be cached here as well, resulting in performance gains at inference time.
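A minimal sketch of this cross-attention pattern looks as follows; the dimensions are illustrative and a generic attention layer stands in for SmolVLA's actual implementation:

```python
import torch

d_vlm, d_expert, n_heads = 64, 48, 4   # illustrative dims (expert dim < VLM dim)

# projection from the VLM feature space into the expert's embedding space
to_kv = torch.nn.Linear(d_vlm, d_expert)
cross_attn = torch.nn.MultiheadAttention(d_expert, n_heads, batch_first=True)

vlm_tokens = torch.randn(1, 50, d_vlm)       # visual + language + state features
action_tokens = torch.randn(1, 8, d_expert)  # one token per action in the chunk

kv = to_kv(vlm_tokens)                       # keys/values from projected VLM features
out, _ = cross_attn(query=action_tokens, key=kv, value=kv)
print(out.shape)  # one refined embedding per action token
```

Because the keys and values depend only on the VLM features, and not on the (changing) noisy action tokens, they can be computed once and reused across all denoising steps.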
SmolVLA also trims down both token and layer compute.
First, it \emph{reduces visual tokens} via pixel shuffling to a fixed budget of 64 tokens per frame, foregoing the tiling used during VLM pretraining for the sake of runtime efficiency.
Second, it \emph{skips upper VLM layers}: only features from the first \(N\) decoder layers, with \(N=L/2\), are consumed, which provides a good speed-performance trade-off and effectively halves the compute needed for the largest part of SmolVLA.
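The token-reduction step can be sketched with the pixel-unshuffle operator, which folds spatial patches into the channel dimension so the token count drops while no information is discarded; the grid sizes below are illustrative, not SmolVLA's actual configuration:

```python
import torch

# a 16x16 grid of visual features (256 tokens) with 32 channels each
features = torch.randn(1, 32, 16, 16)

# pixel unshuffle folds each 2x2 patch of tokens into the channel dimension:
# 256 tokens -> 64 tokens, 32 channels -> 128 channels (information-preserving)
shuffled = torch.nn.functional.pixel_unshuffle(features, downscale_factor=2)
tokens = shuffled.flatten(2).transpose(1, 2)  # (batch, n_tokens, dim)

print(features.flatten(2).shape[-1])  # 256 tokens before
print(tokens.shape[1])                # 64 tokens after, matching the budget above
```

Trading spatial resolution for channel width in this way is what lets the fixed 64-token budget per frame keep attention cost low without simply discarding pixels.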
Beyond model compactness, SmolVLA also contributes an inference stack that decouples action prediction from execution for responsiveness on modest hardware (Section~\ref{sec:ch4-async-inference}).
Departing from reliance on proprietary datasets, SmolVLA pretrains exclusively on 450+ \emph{community datasets}, totaling 20k+ trajectories.
Because instructions in community-contributed datasets can be noisy or missing, the authors re-annotate tasks with a small off-the-shelf VLM using frames sampled from the dataset, and standardize camera viewpoints by mapping sources to a consistent top/wrist/side ordering.
At test time, similarly to \pizero, SmolVLA forward-integrates the flow over 10 steps, resulting in fast inference.
SmolVLA proves effective across a range of both real-world and simulated environments, rivaling \pizero~while being close to 40\% faster and consuming 6x less memory~\citep{shukorSmolVLAVisionLanguageActionModel2025}.
\subsubsection{Code Example: Using SmolVLA}
\begin{pbox}[label={ex:using-smolvla}]{Using SmolVLA \\ \url{https://github.com/fracapuano/robot-learning-tutorial/blob/main/snippets/ch5/02_using_smolvla.py}}
\lstinputlisting[language=python]{snippets/ch5/02_using_smolvla.py}
\end{pbox}
app/scripts/latex-to-mdx/input/sections/07_conclusions.tex
CHANGED
\section{Conclusions}
\label{sec:conclusions}
This tutorial has charted the paradigmatic shift transforming robotics, tracing the \highlight{evolution from structured, model-based methods to the dynamic, data-driven approaches that define modern robot learning}. We began by examining the limitations of traditional dynamics-based control, namely its brittleness and significant engineering overhead, which motivate the adoption of more flexible, learning-based alternatives. Unlike scalable, data-driven techniques, conventional explicit models demand extensive human expertise, hindering the wider accessibility and scalability of robotics.
Our exploration traced a clear trajectory of progress, beginning with Reinforcement Learning (RL). While RL offers a powerful paradigm for learning through interaction, its application in robotics is complicated by challenges such as sample inefficiency, safety concerns in real-world training, and the complexities of reward design. We saw how modern approaches like HIL-SERL make real-world RL more feasible by incorporating training-time human guidance, previously collected datasets, as well as learned reward classifiers.
Nonetheless, the inherent difficulties of RL increasingly motivate approaches based on imitation learning, which learn safely from limited numbers of real-world, reward-free expert demonstrations. In turn, the wider adoption of imitation learning led to the development of single-task policies, where advanced Behavioral Cloning techniques---implemented as state-conditioned generative models like Action Chunking with Transformers and Diffusion Policy---have demonstrated the ability to learn complex, multimodal behaviors from human demonstrations. These advancements laid the groundwork for the current frontier: generalist, language-conditioned Vision-Language-Action models capable of performing a variety of different real-world tasks few- and zero-shot. By leveraging powerful pre-trained backbones and sophisticated generative methods like flow matching, models such as \pizero~and SmolVLA represent a significant leap towards foundational models for robotics capable of generalizing across diverse tasks, and even robot embodiments.
A central theme of this work is the critical role of openness in accelerating this progress. The recent explosion in capability is inseparable from the advent of large-scale, openly available datasets, the standardization of stable and efficient model architectures, and accessible, open-source software like \lerobot. We argue this convergence on open-source robotics is not a mere trend but a fundamental enabler, democratizing access to research and unlocking the potential of large, decentralized efforts to advance the field.
The journey detailed in this tutorial, from first principles to the state-of-the-art, aims to equip researchers and practitioners with the context and tools to begin their own explorations in open-source robot learning.
app/scripts/latex-to-mdx/input/sections/A_foreword.tex
CHANGED
\section*{Foreword}
Robotics is an inherently multidisciplinary field, which has witnessed unprecedented advancement since its inception in the 1960s.
Yet, more than sixty years after the debut of Unimate, robots have still not fully integrated into the rich, unstructured, and dynamic world we humans inhabit.
Over the decades, numerous disciplines have shown immense promise in tackling the challenges of creating autonomous robotic systems.
This tutorial takes a clear stance in the debate on whether modern Machine
Learning can play a pivotal role in the development of
autonomous robots: we believe this to be the case.
Nonetheless, we also hold that the wealth of research from both academia and industry in classical robotics over the past six decades is, simply put, too valuable to be cast aside in favor of purely learning-based methods.
However, the interplay between classical robotics and modern machine learning is still in its nascent stages, and the path to integration yet to be clearly defined.
app/scripts/latex-to-mdx/input/slides/.DS_Store
ADDED
Binary file (6.15 kB).
app/scripts/latex-to-mdx/input/slides/_minted/A95BA625987D2B89E91E7BD2313DE693.highlight.minted
ADDED
@@ -0,0 +1,52 @@
\begin{MintedVerbatim}[commandchars=\\\{\}]
import torch
from lerobot.datasets.lerobot_dataset import LeRobotDataset
from lerobot.datasets.streaming_dataset import StreamingLeRobotDataset

delta_timestamps = {
    "observation.images.wrist_camera": [-0.2, -0.1, 0.0]  # 0.2 and 0.1 seconds *before* each frame, plus the frame itself
}

# Optionally, use StreamingLeRobotDataset to avoid downloading the dataset
dataset = LeRobotDataset(
    "lerobot/svla_so101_pickplace",
    delta_timestamps=delta_timestamps
)

# Streams frames from the Hugging Face Hub without loading into memory
streaming_dataset = StreamingLeRobotDataset(
    "lerobot/svla_so101_pickplace",
    delta_timestamps=delta_timestamps
)

# Get the frame at index 100 by indexing into the dataset
sample = dataset[100]
print(sample)
# {
#   'observation.state': tensor([...]),
#   'action': tensor([...]),
#   'observation.images.wrist_camera': tensor([3, C, H, W]), for delta timesteps
#   ...
# }

batch_size = 16
# Wrap the dataset in a DataLoader to process it in batches for training
data_loader = torch.utils.data.DataLoader(
    dataset,
    batch_size=batch_size
)

# Iterate over the DataLoader in a training loop
num_epochs = 1
device = "cuda" if torch.cuda.is_available() else "cpu"

for epoch in range(num_epochs):
    for batch in data_loader:
        # Move data to the appropriate device (e.g., GPU)
        observations = batch["observation.state"].to(device)
        actions = batch["action"].to(device)
        images = batch["observation.images.wrist_camera"].to(device)

        # Next, you can do amazing_model.forward(batch)
        ...
\end{MintedVerbatim}
app/scripts/latex-to-mdx/input/slides/_minted/_2486923A98E77FD0740381D01ACD1782.index.minted
ADDED
@@ -0,0 +1,11 @@
{
  "jobname": "presentation",
  "md5": "2486923A98E77FD0740381D01ACD1782",
  "timestamp": "20250912235219",
  "cachefiles": [
    "A95BA625987D2B89E91E7BD2313DE693.highlight.minted",
    "_2486923A98E77FD0740381D01ACD1782.index.minted",
    "colorful.style.minted",
    "default.style.minted"
  ]
}
app/scripts/latex-to-mdx/input/slides/_minted/colorful.style.minted
ADDED
@@ -0,0 +1,100 @@
(Pygments-generated LaTeX style definitions for the "colorful" highlighting theme: \PYG token macros, \@namedef color assignments, and \PYGZ character escapes.)
app/scripts/latex-to-mdx/input/slides/_minted/default.style.minted
ADDED
@@ -0,0 +1,100 @@
(Pygments-generated LaTeX style definitions for the "default" highlighting theme: \PYG token macros, \@namedef color assignments, and \PYGZ character escapes.)
app/scripts/latex-to-mdx/input/slides/presentation.aux
ADDED
@@ -0,0 +1,18 @@
\relax
\providecommand\zref@newlabel[2]{}
\providecommand\hyper@newdestlabel[2]{}
\providecommand\HyField@AuxAddToFields[1]{}
\providecommand\HyField@AuxAddToCoFields[2]{}
\providecommand\pbs@newkey[2]{}
\providecommand\pbs@seq@push@cx[2]{}
\providecommand\@anim@newkey[2]{}
\providecommand\mix@newkey[2]{}
\providecommand \oddpage@label [2]{}
\@writefile{nav}{\headcommand {\slideentry {0}{0}{1}{1/1}{}{0}}}
\@writefile{nav}{\headcommand {\beamer@framepages {1}{1}}}
\@writefile{nav}{\headcommand {\beamer@partpages {1}{1}}}
\@writefile{nav}{\headcommand {\beamer@subsectionpages {1}{1}}}
\@writefile{nav}{\headcommand {\beamer@sectionpages {1}{1}}}
\@writefile{nav}{\headcommand {\beamer@documentpages {1}}}
\@writefile{nav}{\headcommand {\gdef \inserttotalframenumber {1}}}
\gdef \@abspage@last{1}
app/scripts/latex-to-mdx/input/slides/presentation.fdb_latexmk
ADDED
|
@@ -0,0 +1,263 @@
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| 1 |
+
# Fdb version 4
|
| 2 |
+
["pdflatex"] 1758794168.39845 "/Users/fracapuano/Desktop/robots-tutorial/robot-learning-tutorial/slides/presentation.tex" "presentation.pdf" "presentation" 1758794170.37639 0
|
| 3 |
+
"../logos/hf.pdf" 1757444600.08131 24570 821623074dcfbebaa29bc6a5c197dcdf ""
|
| 4 |
+
"../logos/lerobot.png" 1758030276.97623 281437 0accc40bd89be94ac1a003439f01b806 ""
|
| 5 |
+
"/Users/fracapuano/Desktop/robots-tutorial/robot-learning-tutorial/slides/presentation.tex" 1758030276.98118 473 a8999eb3f52e1fbbc4f9e94dc7e675fc ""
|
| 6 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/map/fontname/texfonts.map" 1577235249 3524 cb3e574dea2d1052e39280babc910dc8 ""
|
| 7 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/adobe/symbol/psyr.tfm" 1136768653 1408 5937f58aa508ea2cea4901c07d10f5fe ""
|
| 8 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/adobe/zapfding/pzdr.tfm" 1136768653 1528 f853c4d1b4e0550255e02831fdc8496f ""
|
| 9 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/cmextra/cmex7.tfm" 1246382020 1004 54797486969f23fa377b128694d548df ""
|
| 10 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/cmextra/cmex8.tfm" 1246382020 988 bdf658c3bfc2d96d3c8b02cfc1c94c20 ""
|
| 11 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/symbols/msam10.tfm" 1246382020 916 f87d7c45f9c908e672703b83b72241a3 ""
|
| 12 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/symbols/msam5.tfm" 1246382020 924 9904cf1d39e9767e7a3622f2a125a565 ""
|
| 13 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/symbols/msam7.tfm" 1246382020 928 2dc8d444221b7a635bb58038579b861a ""
|
| 14 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/symbols/msbm10.tfm" 1246382020 908 2921f8a10601f252058503cc6570e581 ""
|
| 15 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/symbols/msbm5.tfm" 1246382020 940 75ac932a52f80982a9f8ea75d03a34cf ""
|
| 16 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/symbols/msbm7.tfm" 1246382020 940 228d6584342e91276bf566bcf9716b83 ""
|
| 17 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmex10.tfm" 1136768653 992 662f679a0b3d2d53c1b94050fdaa3f50 ""
|
| 18 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmmi10.tfm" 1136768653 1528 abec98dbc43e172678c11b3b9031252a ""
|
| 19 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmmi6.tfm" 1136768653 1512 f21f83efb36853c0b70002322c1ab3ad ""
|
| 20 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmmi8.tfm" 1136768653 1520 eccf95517727cb11801f4f1aee3a21b4 ""
|
| 21 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmr10.tfm" 1136768653 1296 45809c5a464d5f32c8f98ba97c1bb47f ""
|
| 22 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmss10.tfm" 1136768653 1316 b636689f1933f24d1294acdf6041daaa ""
|
| 23 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmss12.tfm" 1136768653 1324 37b971caf729d7edd9cbb9f9b0ea76eb ""
|
| 24 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmss8.tfm" 1136768653 1296 d77f431d10d47c8ea2cc18cf45346274 ""
|
| 25 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmsy10.tfm" 1136768653 1124 6c73e740cf17375f03eec0ee63599741 ""
|
| 26 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmsy6.tfm" 1136768653 1116 933a60c408fc0a863a92debe84b2d294 ""
|
| 27 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmsy8.tfm" 1136768653 1120 8b7d695260f3cff42e636090a8002094 ""
|
| 28 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmtt12.tfm" 1136768653 772 9a936b7f5e2ff0557fce0f62822f0bbf ""
|
| 29 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmtt8.tfm" 1136768653 768 d7b9a2629a0c353102ad947dc9221d49 ""
|
| 30 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/latex-fonts/lasy6.tfm" 1136768653 520 4889cce2180234b97cad636b6039c722 ""
|
| 31 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/sansmathaccent/mathkerncmssi10.tfm" 1336178347 1696 aaa5bbd1f47f001247d42218ce371101 ""
|
| 32 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/tfm/public/sansmathaccent/mathkerncmssi8.tfm" 1336178347 1676 fb6c6a335484692abff897d6e8965829 ""
|
| 33 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmss10.pfb" 1248133631 24457 5cbb7bdf209d5d1ce9892a9b80a307cc ""
|
| 34 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmss12.pfb" 1248133631 24393 3b7eb51a67a0a62aec5849271bdb9c2e ""
|
| 35 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmss8.pfb" 1248133631 24420 52dbb8e8aa0069a1b987309557f8d303 ""
|
| 36 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmtt12.pfb" 1248133631 24252 1e4e051947e12dfb50fee0b7f4e26e3a ""
|
| 37 |
+
"/usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmtt8.pfb" 1248133631 24287 6b803fa9eb1ddff9112e00519b09dd9e ""
|
| 38 |
+
"/usr/local/texlive/2025/texmf-dist/tex/context/base/mkii/supp-pdf.mkii" 1461363279 71627 94eb9990bed73c364d7f53f960cc8c5b ""
|
| 39 |
+
"/usr/local/texlive/2025/texmf-dist/tex/generic/atbegshi/atbegshi.sty" 1575674566 24708 5584a51a7101caf7e6bbf1fc27d8f7b1 ""
|
| 40 |
+
"/usr/local/texlive/2025/texmf-dist/tex/generic/bigintcalc/bigintcalc.sty" 1576625341 40635 c40361e206be584d448876bba8a64a3b ""
|
| 41 |
+
"/usr/local/texlive/2025/texmf-dist/tex/generic/bitset/bitset.sty" 1576016050 33961 6b5c75130e435b2bfdb9f480a09a39f9 ""
|
| 42 |
+
"/usr/local/texlive/2025/texmf-dist/tex/generic/etexcmds/etexcmds.sty" 1576625273 7734 b98cbb34c81f667027c1e3ebdbfce34b ""
|
| 43 |
+
"/usr/local/texlive/2025/texmf-dist/tex/generic/gettitlestring/gettitlestring.sty" 1576625223 8371 9d55b8bd010bc717624922fb3477d92e ""
|
| 44 |
+
"/usr/local/texlive/2025/texmf-dist/tex/generic/iftex/iftex.sty" 1734129479 7984 7dbb9280f03c0a315425f1b4f35d43ee ""
|
| 45 |
+
"/usr/local/texlive/2025/texmf-dist/tex/generic/iftex/ifvtex.sty" 1572645307 1057 525c2192b5febbd8c1f662c9468335bb ""
|
| 46 |
+
"/usr/local/texlive/2025/texmf-dist/tex/generic/infwarerr/infwarerr.sty" 1575499628 8356 7bbb2c2373aa810be568c29e333da8ed ""
|
| 47 |
+
"/usr/local/texlive/2025/texmf-dist/tex/generic/intcalc/intcalc.sty" 1576625065 31769 002a487f55041f8e805cfbf6385ffd97 ""
|
| 48 |
+
"/usr/local/texlive/2025/texmf-dist/tex/generic/kvdefinekeys/kvdefinekeys.sty" 1576878844 5412 d5a2436094cd7be85769db90f29250a6 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/ltxcmds/ltxcmds.sty" 1701727651 17865 1a9bd36b4f98178fa551aca822290953 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pdfescape/pdfescape.sty" 1576015897 19007 15924f7228aca6c6d184b115f4baa231 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pdftexcmds/pdftexcmds.sty" 1593379760 20089 80423eac55aa175305d35b49e04fe23b ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcore.code.tex" 1673816307 1016 1c2b89187d12a2768764b83b4945667c ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorearrows.code.tex" 1601326656 43820 1fef971b75380574ab35a0d37fd92608 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcoreexternal.code.tex" 1601326656 19324 f4e4c6403dd0f1605fd20ed22fa79dea ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcoregraphicstate.code.tex" 1601326656 6038 ccb406740cc3f03bbfb58ad504fe8c27 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcoreimage.code.tex" 1673816307 6911 f6d4cf5a3fef5cc879d668b810e82868 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorelayers.code.tex" 1601326656 4883 42daaf41e27c3735286e23e48d2d7af9 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcoreobjects.code.tex" 1601326656 2544 8c06d2a7f0f469616ac9e13db6d2f842 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorepathconstruct.code.tex" 1601326656 44195 5e390c414de027626ca5e2df888fa68d ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorepathprocessing.code.tex" 1601326656 17311 2ef6b2e29e2fc6a2fc8d6d652176e257 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorepathusage.code.tex" 1601326656 21302 788a79944eb22192a4929e46963a3067 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorepatterns.code.tex" 1673816307 9691 3d42d89522f4650c2f3dc616ca2b925e ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorepoints.code.tex" 1601326656 33335 dd1fa4814d4e51f18be97d88bf0da60c ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorequick.code.tex" 1601326656 2965 4c2b1f4e0826925746439038172e5d6f ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorerdf.code.tex" 1601326656 5196 2cc249e0ee7e03da5f5f6589257b1e5b ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorescopes.code.tex" 1673816307 20821 7579108c1e9363e61a0b1584778804aa ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcoreshade.code.tex" 1601326656 35249 abd4adf948f960299a4b3d27c5dddf46 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcoretransformations.code.tex" 1673816307 22012 81b34a0aa8fa1a6158cc6220b00e4f10 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcoretransparency.code.tex" 1601326656 8893 e851de2175338fdf7c17f3e091d94618 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/frontendlayer/tikz/libraries/tikzlibrarytopaths.code.tex" 1608933718 11518 738408f795261b70ce8dd47459171309 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/frontendlayer/tikz/tikz.code.tex" 1673816307 186782 af500404a9edec4d362912fe762ded92 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/libraries/pgflibraryplothandlers.code.tex" 1601326656 32995 ac577023e12c0e4bd8aa420b2e852d1a ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfint.code.tex" 1557692582 3063 8c415c68a0f3394e45cfeca0b65f6ee6 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmath.code.tex" 1673816307 949 cea70942e7b7eddabfb3186befada2e6 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathcalc.code.tex" 1673816307 13270 2e54f2ce7622437bf37e013d399743e3 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfloat.code.tex" 1673816307 104717 9b2393fbf004a0ce7fa688dbce423848 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.base.code.tex" 1601326656 10165 cec5fa73d49da442e56efc2d605ef154 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.basic.code.tex" 1601326656 28178 41c17713108e0795aac6fef3d275fbca ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.code.tex" 1673816307 9649 85779d3d8d573bfd2cd4137ba8202e60 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.comparison.code.tex" 1601326656 3865 ac538ab80c5cf82b345016e474786549 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.integerarithmetics.code.tex" 1557692582 3177 27d85c44fbfe09ff3b2cf2879e3ea434 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.misc.code.tex" 1621110968 11024 0179538121bc2dba172013a3ef89519f ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.random.code.tex" 1673816307 7890 0a86dbf4edfd88d022e0d889ec78cc03 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.round.code.tex" 1601326656 3379 781797a101f647bab82741a99944a229 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.trigonometric.code.tex" 1601326656 92405 f515f31275db273f97b9d8f52e1b0736 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathparser.code.tex" 1673816307 37466 97b0a1ba732e306a1a2034f5a73e239f ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathutil.code.tex" 1601326656 8471 c2883569d03f69e8e1cabfef4999cfd7 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/modules/pgfmodulematrix.code.tex" 1673816307 21211 1e73ec76bd73964d84197cc3d2685b01 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/modules/pgfmoduleplot.code.tex" 1601326656 16121 346f9013d34804439f7436ff6786cef7 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/modules/pgfmoduleshapes.code.tex" 1673816307 44792 271e2e1934f34c759f4dedb1e14a5015 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/pgf.revision.tex" 1673816307 114 e6d443369d0673933b38834bf99e422d ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgf.cfg" 1601326656 926 2963ea0dcf6cc6c0a770b69ec46a477b ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgfsys-common-pdf.def" 1673816307 5542 32f75a31ea6c3a7e1148cd6d5e93dbb7 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgfsys-pdftex.def" 1673816307 12612 7774ba67bfd72e593c4436c2de6201e3 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgfsys.code.tex" 1673816307 61351 bc5f86e0355834391e736e97a61abced ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgfsysprotocol.code.tex" 1601326656 1896 b8e0ca0ac371d74c0ca05583f6313c91 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgfsyssoftpath.code.tex" 1601326656 7778 53c8b5623d80238f6a20aa1df1868e63 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgffor.code.tex" 1673816307 24033 d8893a1ec4d1bfa101b172754743d340 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgfkeys.code.tex" 1673816307 39784 414c54e866ebab4b801e2ad81d9b21d8 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgfkeyslibraryfiltered.code.tex" 1673816307 37433 940bc6d409f1ffd298adfdcaf125dd86 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgfrcs.code.tex" 1673816307 4385 510565c2f07998c8a0e14f0ec07ff23c ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgfutil-common.tex" 1673816307 29239 22e8c7516012992a49873eff0d868fed ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgfutil-latex.def" 1673816307 6950 8524a062d82b7afdc4a88a57cb377784 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/stringenc/stringenc.sty" 1575152242 21514 b7557edcee22835ef6b03ede1802dad4 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/ulem/ulem.sty" 1578692523 15682 94f55b803e160cf7fb6e4d77d07cfe1d ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/uniquecounter/uniquecounter.sty" 1576624663 7008 f92eaa0a3872ed622bbf538217cd2ab7 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/xkeyval/xkeyval.tex" 1655411236 19231 27205ee17aaa2902aea3e0c07a3cfc65 ""
"/usr/local/texlive/2025/texmf-dist/tex/generic/xkeyval/xkvutils.tex" 1655411236 7677 9cb1a74d945bc9331f2181c0a59ff34a ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/adjustbox/adjcalc.sty" 1666037967 5598 c49b91713cbe5e50a1fabefb733eda0d ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/adjustbox/adjustbox.sty" 1740604409 56907 b74d2bd6fed8dc761953edb2fbea781b ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/adjustbox/tc-pdftex.def" 1740604409 4304 461724faa0dfbdec2d80de16c11f407c ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/adjustbox/trimclip.sty" 1740176375 7245 2bf1779563af51e666da8f26ea1f8455 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/amscls/amsthm.sty" 1591045760 12594 0d51ac3a545aaaa555021326ff22a6cc ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/amsfonts/amsfonts.sty" 1359763108 5949 3f3fd50a8cc94c3d4cbf4fc66cd3df1c ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/amsfonts/amssymb.sty" 1359763108 13829 94730e64147574077f8ecfea9bb69af4 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/amsfonts/umsa.fd" 1359763108 961 6518c6525a34feb5e8250ffa91731cff ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/amsfonts/umsb.fd" 1359763108 961 d02606146ba5601b5645f987c92e6193 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/amsmath/amsbsy.sty" 1717359999 2222 2166a1f7827be30ddc30434e5efcee1b ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/amsmath/amsgen.sty" 1717359999 4173 d22509bc0c91281d991b2de7c88720dd ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/amsmath/amsmath.sty" 1730928152 88370 c780f23aea0ece6add91e09b44dca2cd ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/amsmath/amsopn.sty" 1717359999 4474 23ca1d3a79a57b405388059456d0a8df ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/amsmath/amstext.sty" 1717359999 2444 71618ea5f2377e33b04fb97afdd0eac2 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/animate/animate.sty" 1728933111 138944 e44d31c0c9cfc077489d61b6146ebcbc ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/atveryend/atveryend.sty" 1728505250 1695 be6b4d13b33db697fd3fd30b24716c1a ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/auxhook/auxhook.sty" 1576625391 3935 57aa3c3e203a5c2effb4d2bd2efbc323 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/base/atbegshi-ltx.sty" 1738182759 2963 d8ec5a1b4e0a106c5c737900202763e4 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/base/atveryend-ltx.sty" 1738182759 2378 14b657ee5031da98cf91648f19642694 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/base/ifthen.sty" 1738182759 5525 9dced5929f36b19fa837947f5175b331 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/base/size11.clo" 1738182759 8464 e73911cdcc738e82d6adccd28e654bb1 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamer.cls" 1738788133 12512 d70d58b808fe77e0093f8c1fa95fea1a ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbaseauxtemplates.sty" 1738788133 24485 3d4d9814062dfbb67c51a9ccf7540b9f ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbaseboxes.sty" 1738788133 8738 2483757f2c8ab2672a47eb68e7a56653 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasecolor.sty" 1738788133 13197 ce933773f9b5f347768c819ffb36fae1 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasecompatibility.sty" 1738788133 27676 4e430116ea2a9b9dbbbcfb5ad5cfc3f8 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasedecode.sty" 1684185199 9397 90105d8818f445af9ed5a33927eeaf84 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasefont.sty" 1738788133 13685 c75872434f714c86134e8d57338b7409 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbaseframe.sty" 1738788133 25314 d2c512b7583539b7bb53dd42152a85db ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbaseframecomponents.sty" 1704576813 12211 e8a8c93c0e907b9b5ed419333c269485 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbaseframesize.sty" 1704576813 9014 50b422b9f379c19ffa1e9a50b4cea3d0 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbaselocalstructure.sty" 1738788133 18145 3cbdbd9b3c941e7390134e8146487a3c ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasemisc.sty" 1684185199 8303 3459317cb46f83a5b3ddc3bb5d011a4a ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasemodes.sty" 1738788133 7952 ce765494e1ab84181455c708276147c6 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasenavigation.sty" 1738788133 21616 d28ad1a22082bd3669b342f9da359882 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasenavigationsymbols.tex" 1704576813 8137 32c2718131d54d3e6ff6150b81026fa7 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasenotes.sty" 1704576813 5752 a814a0d1bc4946fe3bc3e616446c7d36 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbaseoptions.sty" 1684185199 1743 5acd9fac8c2fc5a96f2f36385ae738b3 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbaseoverlay.sty" 1738788133 28593 f551c58bf566fd73265ee6038ff91caa ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbaserequires.sty" 1684185199 1583 12314c3bb8ab13b289cdcb9f2bb13580 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasesection.sty" 1704576813 13842 6aadb3bc34d2950caeb4f29bbea57a21 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasetemplates.sty" 1684185199 5743 fc0d51414dd291b72b11cad049170b85 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasethemes.sty" 1684185199 1130 844d3db83413a2cb0d2619d67ae2df4e ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasetheorems.sty" 1738788133 4539 c1b0e0b38fa0c8a327495cc0e25ae3c9 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasetitle.sty" 1738788133 5334 80b533be8409f2601cd51ff3a1948a2e ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasetoc.sty" 1704576813 7795 dd70e26ab078785a98e459b9ccd65649 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasetranslator.sty" 1684185199 627 47d7193c3a1da10f5aa663a70b6d149b ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasetwoscreens.sty" 1704576813 1848 a36eaf6bee3ae23c7df106497df8f842 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbaseverbatim.sty" 1684185199 4016 c25a9e117ac7f79cd712d692979a9ed5 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamercolorthemedefault.sty" 1704576813 7202 1f79be9366ab4084ba924d7ab3a08756 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamercolorthemeorchid.sty" 1684185199 825 00d1909bb78322ff02c6a717d43a1eca ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamercolorthemewhale.sty" 1684185199 1003 64c64faa8cc4168b6b274c7b9adbbf8b ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerfontthemedefault.sty" 1684185199 4226 4a3a91ecbea18e5e04836d585ff0e257 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamericonarticle.20.pdf" 1513642141 2958 4e0c4a6e994e5c4d9da11c477e927f0f ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamericonarticle.pdf" 1513642141 2936 6cc3ef0682cbb62be8aa1b19f0a84ed6 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamericonbook.20.pdf" 1513642141 2734 0bcf939051dd2a936cdfe5982f7c233b ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamericonbook.pdf" 1513642141 2667 7624351b441ffe4bd2d14e08fbcf063d ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamericononline.20.pdf" 1513642141 24451 195d2c060e84f339954bc6d9b52131d7 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamericononline.pdf" 1513642141 24611 df07010540266b2b205b492a4d02e7e1 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerinnerthemedefault.sty" 1738788133 13780 94616ca1c0702ca67eb52c3bcb9bac94 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerinnerthemerounded.sty" 1684185199 1002 063bf9e9a79365fd3185a6b9a8e96af9 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerouterthemedefault.sty" 1738788133 6863 1d55acbe91655dfe8cdddb3dad95ae7c ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerouterthemeinfolines.sty" 1738788133 2362 480b1768526a0d8bf6a6978690fa9cfd ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerthemeMadrid.sty" 1684185199 547 fddb0cdcc0d6e52398f228cd29a39f8a ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerthemedefault.sty" 1684185199 345 b9f1afd5eccd808064d49a802f119443 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/booktabs/booktabs.sty" 1579038678 6078 f1cb470c9199e7110a27851508ed7a5c ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/caption/caption-beamer.sto" 1645391520 4350 a9295a4610cd29113396b45a37d92606 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/caption/caption.sty" 1696191071 56128 c2ccf1a29d78c33bc553880402e4fb9a ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/caption/caption3.sty" 1696191071 72619 ee90b6612147680fd73c3b1406a74245 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/caption/subcaption.sty" 1690576852 12494 0c0cdb824278a4d51cefeb2e79901315 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/collectbox/collectbox.sty" 1666037909 9124 59c3b56f1a073de66e3eea35f9c173c8 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/epstopdf-pkg/epstopdf-base.sty" 1579991033 13886 d1306dcf79a944f6988e688c1785f9ce ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/etoolbox/etoolbox.sty" 1739306980 46850 d87daedc2abdc653769a6f1067849fe0 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/geometry/geometry.sty" 1578002852 41601 9cf6c5257b1bc7af01a58859749dd37a ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/graphics-cfg/color.cfg" 1459978653 1213 620bba36b25224fa9b7e1ccb4ecb76fd ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/graphics-cfg/graphics.cfg" 1465944070 1224 978390e9c2234eab29404bc21b268d1e ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/graphics-def/pdftex.def" 1713382759 19440 9da9dcbb27470349a580fca7372d454b ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/graphics/graphics.sty" 1730496337 18363 dee506cb8d56825d8a4d020f5d5f8704 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/graphics/graphicx.sty" 1717359999 8010 6f2ad8c2b2ffbd607af6475441c7b5e4 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/graphics/keyval.sty" 1717359999 2671 70891d50dac933918b827d326687c6e8 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/graphics/mathcolor.ltx" 1667332637 2885 9c645d672ae17285bba324998918efd8 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/graphics/trig.sty" 1717359999 4023 2c9f39712cf7b43d3eb93a8bbd5c8f67 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/hycolor/hycolor.sty" 1580250785 17914 4c28a13fc3d975e6e81c9bea1d697276 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/hyperref/hpdftex.def" 1730838014 48154 82da9991b9f0390b3a9d3af6c8618af4 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/hyperref/hyperref.sty" 1730838014 222112 c22dbd2288f89f7ba942ac22f7d00f11 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/hyperref/nameref.sty" 1705871765 11026 182c63f139a71afd30a28e5f1ed2cd1c ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/hyperref/pd1enc.def" 1730838014 14249 ff700eb13ce975a424b2dd99b1a83044 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/hyperref/puenc.def" 1730838014 117112 7533bff456301d32e6d6356fad15f543 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/ifoddpage/ifoddpage.sty" 1666126449 2142 eae42205b97b7a3ad0e58db5fe99e3e6 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/koma-script/scrlfile-hook.sty" 1729800159 11185 08107e8d26d093ccd4c424c2b74809f6 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/koma-script/scrlfile.sty" 1729800159 3328 17a5a2d4f4e9d388803c10ac9fffe9d3 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/koma-script/scrlogo.sty" 1729800159 2162 e219c1ddf641a7cd0ee0103af3ac7f3d ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/kvoptions/kvoptions.sty" 1655478651 22555 6d8e155cfef6d82c3d5c742fea7c992e ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/kvsetkeys/kvsetkeys.sty" 1665067230 13815 760b0c02f691ea230f5359c4e1de23a7 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/l3backend/l3backend-pdftex.def" 1716410060 29785 9f93ab201fe5dd053afcc6c1bcf7d266 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/latexconfig/epstopdf-sys.cfg" 1279039959 678 4792914a8f45be57bb98413425e4c7af ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/media9/media9.sty" 1726517605 162092 c5c8362c448944ef2e20ea6c420defea ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/media9/pdfbase.sty" 1726517605 101085 7a695eb850a3ff917185daa25fd56ee2 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/multirow/multirow.sty" 1731446765 6696 886c9f3087d0b973ed2c19aa79cb3023 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/natbib/natbib.sty" 1291685959 45456 1c8843383c0bd05870c45fa0ebea6cc2 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/oberdiek/ifdraft.sty" 1575152444 1922 5bdcc31b0573e5e7f31c36f1b88b6a7d ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/ocgx2/ocgbase.sty" 1726430178 21269 3639b280a82cb97074ba74beb2720ea2 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/pgf/basiclayer/pgf.sty" 1601326656 1090 bae35ef70b3168089ef166db3e66f5b2 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/pgf/basiclayer/pgfcore.sty" 1673816307 373 00b204b1d7d095b892ad31a7494b0373 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/pgf/compatibility/pgfcomp-version-0-65.sty" 1601326656 21013 f4ff83d25bb56552493b030f27c075ae ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/pgf/compatibility/pgfcomp-version-1-18.sty" 1601326656 989 c49c8ae06d96f8b15869da7428047b1e ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/pgf/frontendlayer/tikz.sty" 1601326656 339 c2e180022e3afdb99c7d0ea5ce469b7d ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/pgf/math/pgfmath.sty" 1601326656 306 c56a323ca5bf9242f54474ced10fca71 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/pgf/systemlayer/pgfsys.sty" 1601326656 443 8c872229db56122037e86bcda49e14f3 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/pgf/utilities/pgffor.sty" 1601326656 348 ee405e64380c11319f0e249fed57e6c5 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/pgf/utilities/pgfkeys.sty" 1601326656 274 5ae372b7df79135d240456a1c6f2cf9a ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/pgf/utilities/pgfrcs.sty" 1601326656 325 f9f16d12354225b7dd52a3321f085955 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/pgf/utilities/xxcolor.sty" 1601326656 2232 b9a67bccba736ed334b4b1a860a85c6f ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/psnfss/pifont.sty" 1586716065 2283 62e73848f29fd8cd37fb7974c7cf2221 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/psnfss/upsy.fd" 1137110629 148 2da0acd77cba348f34823f44cabf0058 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/psnfss/upzd.fd" 1137110629 148 b2a94082cb802f90d3daf6dd0c7188a0 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/refcount/refcount.sty" 1576624809 9878 9e94e8fa600d95f9c7731bb21dfb67a4 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/rerunfilecheck/rerunfilecheck.sty" 1657483315 9714 ba3194bd52c8499b3f1e3eb91d409670 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/sansmathaccent/ot1mathkerncmss.fd" 1580595219 1299 5a2b7aad8540e4f7415f2af0eb91bc10 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/sansmathaccent/sansmathaccent.sty" 1580595219 4282 5d27280ace1239baaa4a225df16125ff ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/tools/calc.sty" 1717359999 10214 61188260d324e94bc2f66825d7d3fdf4 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/tools/enumerate.sty" 1717359999 3468 ad69b54642e68f9fdf39ec1a16dd7341 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/translator/translator-basic-dictionary-English.dict" 1596662134 3535 7dc96051305a7e943219126c49c44cd6 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/translator/translator-bibliography-dictionary-English.dict" 1512078926 903 c6d17f0656e9e1abb172b4faebabd617 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/translator/translator-environment-dictionary-English.dict" 1512078926 433 bfb8d1c2c020defd2de8e5c276710094 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/translator/translator-months-dictionary-English.dict" 1512078926 1337 9a6c05e8f0c8b3c5f27cbd0e455cf475 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/translator/translator-numbers-dictionary-English.dict" 1512078926 1638 2bf1a1dea98f8a4d28033fce76e9cc67 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/translator/translator-theorem-dictionary-English.dict" 1512078926 3523 1f9d9b91f7d78b73e74c7e97bca30fb0 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/translator/translator.sty" 1622492733 8765 56d370785f0143111ff9898b5adfe08e ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/url/url.sty" 1388531844 12796 8edb7d69a20b857904dd0ea757c14ec9 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/varwidth/varwidth.sty" 1238697683 10894 d359a13923460b2a73d4312d613554c8 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/xcolor/xcolor.sty" 1727642399 55384 b454dec21c2d9f45ec0b793f0995b992 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/xkeyval/xkeyval.sty" 1655411236 4937 4ce600ce9bd4ec84d0250eb6892fcf4f ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/zref/zref-abspage.sty" 1694723124 2269 4fe6fb3396214c91f164d45faf9bec45 ""
"/usr/local/texlive/2025/texmf-dist/tex/latex/zref/zref-base.sty" 1694723124 20785 f7d1133a287929598a897c93f8f4c44b ""
"/usr/local/texlive/2025/texmf-dist/web2c/texmf.cnf" 1739380943 42148 61becc7c670cd061bb319c643c27fdd4 ""
"/usr/local/texlive/2025/texmf-var/fonts/map/pdftex/updmap/pdftex.map" 1756208942 5467155 19efa205003f9ecad95fbbaa6ff24da1 ""
"/usr/local/texlive/2025/texmf-var/web2c/pdftex/pdflatex.fmt" 1756208917 3345738 bbbb93a25a0c937f0c0915ef8b1d5cd7 ""
"/usr/local/texlive/2025/texmf.cnf" 1741450484 577 418a7058ec8e006d8704f60ecd22c938 ""
"01.tex" 1758794165.95443 393 fd571c023b1e97d5e290e619b0c783fe ""
"helpers.tex" 1758030276.97871 920 4bbbbd32d2ac525e5ef151a77d7cd97f ""
"preamble.tex" 1758030363.57764 2005 c04fa087be37dce842ec4719319567d3 ""
"presentation.aux" 1758794170.30266 802 c00726caf8840ef93bf48f64bb7abda5 "pdflatex"
"presentation.nav" 1758794170.30334 307 9e520c26371a1250ff9fc1fe87f9c655 "pdflatex"
"presentation.out" 1758794169.1438 0 d41d8cd98f00b204e9800998ecf8427e "pdflatex"
"presentation.tex" 1758030276.98118 473 a8999eb3f52e1fbbc4f9e94dc7e675fc ""
(generated)
"presentation.aux"
"presentation.log"
"presentation.nav"
"presentation.out"
"presentation.pdf"
"presentation.snm"
"presentation.toc"
(rewritten before read)
app/scripts/latex-to-mdx/input/slides/presentation.fls
ADDED
@@ -0,0 +1,495 @@
PWD /Users/fracapuano/Desktop/robots-tutorial/robot-learning-tutorial/slides
INPUT /usr/local/texlive/2025/texmf.cnf
INPUT /usr/local/texlive/2025/texmf-dist/web2c/texmf.cnf
INPUT /usr/local/texlive/2025/texmf-var/web2c/pdftex/pdflatex.fmt
INPUT /Users/fracapuano/Desktop/robots-tutorial/robot-learning-tutorial/slides/presentation.tex
OUTPUT presentation.log
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamer.cls
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamer.cls
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasemodes.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasemodes.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/etoolbox/etoolbox.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/etoolbox/etoolbox.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasedecode.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasedecode.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/iftex/iftex.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/iftex/iftex.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbaseoptions.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbaseoptions.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/graphics/keyval.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/graphics/keyval.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/hyperref/hyperref.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/geometry/geometry.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/geometry/geometry.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/iftex/ifvtex.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/iftex/ifvtex.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/pgf/math/pgfmath.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/pgf/math/pgfmath.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/pgf/utilities/pgfrcs.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/pgf/utilities/pgfrcs.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgfutil-common.tex
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgfutil-latex.def
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgfrcs.code.tex
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgfrcs.code.tex
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgfrcs.code.tex
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/pgf.revision.tex
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/pgf.revision.tex
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/pgf/utilities/pgfkeys.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/pgf/utilities/pgfkeys.sty
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgfkeys.code.tex
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgfkeys.code.tex
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgfkeys.code.tex
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgfkeyslibraryfiltered.code.tex
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmath.code.tex
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmath.code.tex
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmath.code.tex
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathutil.code.tex
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathparser.code.tex
|
| 48 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.code.tex
|
| 49 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.basic.code.tex
|
| 50 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.trigonometric.code.tex
|
| 51 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.random.code.tex
|
| 52 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.comparison.code.tex
|
| 53 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.base.code.tex
|
| 54 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.round.code.tex
|
| 55 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.misc.code.tex
|
| 56 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.integerarithmetics.code.tex
|
| 57 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathcalc.code.tex
|
| 58 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfloat.code.tex
|
| 59 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/base/size11.clo
|
| 60 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/base/size11.clo
|
| 61 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/base/size11.clo
|
| 62 |
+
INPUT /usr/local/texlive/2025/texmf-dist/fonts/map/fontname/texfonts.map
|
| 63 |
+
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmr10.tfm
|
| 64 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/pgf/basiclayer/pgfcore.sty
|
| 65 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/pgf/basiclayer/pgfcore.sty
|
| 66 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/graphics/graphicx.sty
|
| 67 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/graphics/graphicx.sty
|
| 68 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/graphics/graphics.sty
|
| 69 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/graphics/graphics.sty
|
| 70 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/graphics/trig.sty
|
| 71 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/graphics/trig.sty
|
| 72 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/graphics-cfg/graphics.cfg
|
| 73 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/graphics-cfg/graphics.cfg
|
| 74 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/graphics-cfg/graphics.cfg
|
| 75 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/graphics-def/pdftex.def
|
| 76 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/graphics-def/pdftex.def
|
| 77 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/graphics-def/pdftex.def
|
| 78 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/pgf/systemlayer/pgfsys.sty
|
| 79 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/pgf/systemlayer/pgfsys.sty
|
| 80 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgfsys.code.tex
|
| 81 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgfsys.code.tex
|
| 82 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgfsys.code.tex
|
| 83 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgf.cfg
|
| 84 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgfsys-pdftex.def
|
| 85 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgfsys-pdftex.def
|
| 86 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgfsys-common-pdf.def
|
| 87 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgfsyssoftpath.code.tex
|
| 88 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgfsyssoftpath.code.tex
|
| 89 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgfsyssoftpath.code.tex
|
| 90 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgfsysprotocol.code.tex
|
| 91 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgfsysprotocol.code.tex
|
| 92 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgfsysprotocol.code.tex
|
| 93 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/xcolor/xcolor.sty
|
| 94 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/xcolor/xcolor.sty
|
| 95 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/graphics-cfg/color.cfg
|
| 96 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/graphics-cfg/color.cfg
|
| 97 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/graphics-cfg/color.cfg
|
| 98 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/graphics/mathcolor.ltx
|
| 99 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/graphics/mathcolor.ltx
|
| 100 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/graphics/mathcolor.ltx
|
| 101 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcore.code.tex
|
| 102 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcore.code.tex
|
| 103 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcore.code.tex
|
| 104 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfint.code.tex
|
| 105 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorepoints.code.tex
|
| 106 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorepathconstruct.code.tex
|
| 107 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorepathusage.code.tex
|
| 108 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorescopes.code.tex
|
| 109 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcoregraphicstate.code.tex
|
| 110 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcoretransformations.code.tex
|
| 111 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorequick.code.tex
|
| 112 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcoreobjects.code.tex
|
| 113 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorepathprocessing.code.tex
|
| 114 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorearrows.code.tex
|
| 115 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcoreshade.code.tex
|
| 116 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcoreimage.code.tex
|
| 117 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcoreexternal.code.tex
|
| 118 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorelayers.code.tex
|
| 119 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcoretransparency.code.tex
|
| 120 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorepatterns.code.tex
|
| 121 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorerdf.code.tex
|
| 122 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/pgf/utilities/xxcolor.sty
|
| 123 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/pgf/utilities/xxcolor.sty
|
| 124 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/atbegshi/atbegshi.sty
|
| 125 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/base/atbegshi-ltx.sty
|
| 126 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/base/atbegshi-ltx.sty
|
| 127 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/hyperref/hyperref.sty
|
| 128 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/kvsetkeys/kvsetkeys.sty
|
| 129 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/kvsetkeys/kvsetkeys.sty
|
| 130 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/kvdefinekeys/kvdefinekeys.sty
|
| 131 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/kvdefinekeys/kvdefinekeys.sty
|
| 132 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pdfescape/pdfescape.sty
|
| 133 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pdfescape/pdfescape.sty
|
| 134 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/ltxcmds/ltxcmds.sty
|
| 135 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/ltxcmds/ltxcmds.sty
|
| 136 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pdftexcmds/pdftexcmds.sty
|
| 137 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pdftexcmds/pdftexcmds.sty
|
| 138 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/infwarerr/infwarerr.sty
|
| 139 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/infwarerr/infwarerr.sty
|
| 140 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/hycolor/hycolor.sty
|
| 141 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/hycolor/hycolor.sty
|
| 142 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/hyperref/nameref.sty
|
| 143 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/hyperref/nameref.sty
|
| 144 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/refcount/refcount.sty
|
| 145 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/refcount/refcount.sty
|
| 146 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/gettitlestring/gettitlestring.sty
|
| 147 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/gettitlestring/gettitlestring.sty
|
| 148 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/kvoptions/kvoptions.sty
|
| 149 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/kvoptions/kvoptions.sty
|
| 150 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/stringenc/stringenc.sty
|
| 151 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/stringenc/stringenc.sty
|
| 152 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/hyperref/pd1enc.def
|
| 153 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/hyperref/pd1enc.def
|
| 154 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/hyperref/pd1enc.def
|
| 155 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/intcalc/intcalc.sty
|
| 156 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/intcalc/intcalc.sty
|
| 157 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/hyperref/puenc.def
|
| 158 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/hyperref/puenc.def
|
| 159 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/hyperref/puenc.def
|
| 160 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/url/url.sty
|
| 161 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/url/url.sty
|
| 162 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/bitset/bitset.sty
|
| 163 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/bitset/bitset.sty
|
| 164 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/bigintcalc/bigintcalc.sty
|
| 165 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/bigintcalc/bigintcalc.sty
|
| 166 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/hyperref/hpdftex.def
|
| 167 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/hyperref/hpdftex.def
|
| 168 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/hyperref/hpdftex.def
|
| 169 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/atveryend/atveryend.sty
|
| 170 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/base/atveryend-ltx.sty
|
| 171 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/base/atveryend-ltx.sty
|
| 172 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/rerunfilecheck/rerunfilecheck.sty
|
| 173 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/rerunfilecheck/rerunfilecheck.sty
|
| 174 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/uniquecounter/uniquecounter.sty
|
| 175 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/uniquecounter/uniquecounter.sty
|
| 176 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbaserequires.sty
|
| 177 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbaserequires.sty
|
| 178 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasecompatibility.sty
|
| 179 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasecompatibility.sty
|
| 180 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasefont.sty
|
| 181 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasefont.sty
|
| 182 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/amsfonts/amssymb.sty
|
| 183 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/amsfonts/amssymb.sty
|
| 184 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/amsfonts/amsfonts.sty
|
| 185 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/amsfonts/amsfonts.sty
|
| 186 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/sansmathaccent/sansmathaccent.sty
|
| 187 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/sansmathaccent/sansmathaccent.sty
|
| 188 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/sansmathaccent/sansmathaccent.sty
|
| 189 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/koma-script/scrlfile.sty
|
| 190 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/koma-script/scrlfile.sty
|
| 191 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/koma-script/scrlfile.sty
|
| 192 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/koma-script/scrlfile-hook.sty
|
| 193 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/koma-script/scrlfile-hook.sty
|
| 194 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/koma-script/scrlogo.sty
|
| 195 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/koma-script/scrlogo.sty
|
| 196 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasetranslator.sty
|
| 197 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasetranslator.sty
|
| 198 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/translator/translator.sty
|
| 199 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/translator/translator.sty
|
| 200 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasemisc.sty
|
| 201 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasemisc.sty
|
| 202 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasetwoscreens.sty
|
| 203 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasetwoscreens.sty
|
| 204 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbaseoverlay.sty
|
| 205 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbaseoverlay.sty
|
| 206 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasetitle.sty
|
| 207 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasetitle.sty
|
| 208 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasesection.sty
|
| 209 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasesection.sty
|
| 210 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbaseframe.sty
|
| 211 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbaseframe.sty
|
| 212 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbaseverbatim.sty
|
| 213 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbaseverbatim.sty
|
| 214 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbaseframesize.sty
|
| 215 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbaseframesize.sty
|
| 216 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbaseframecomponents.sty
|
| 217 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbaseframecomponents.sty
|
| 218 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasecolor.sty
|
| 219 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasecolor.sty
|
| 220 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasenotes.sty
|
| 221 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasenotes.sty
|
| 222 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasetoc.sty
|
| 223 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasetoc.sty
|
| 224 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasetemplates.sty
|
| 225 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasetemplates.sty
|
| 226 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbaseauxtemplates.sty
|
| 227 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbaseauxtemplates.sty
|
| 228 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbaseboxes.sty
|
| 229 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbaseboxes.sty
|
| 230 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbaselocalstructure.sty
|
| 231 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbaselocalstructure.sty
|
| 232 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/tools/enumerate.sty
|
| 233 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/tools/enumerate.sty
|
| 234 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasenavigation.sty
|
| 235 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasenavigation.sty
|
| 236 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasenavigationsymbols.tex
|
| 237 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasenavigationsymbols.tex
|
| 238 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasenavigationsymbols.tex
|
| 239 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasenavigationsymbols.tex
|
| 240 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasenavigationsymbols.tex
|
| 241 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasetheorems.sty
|
| 242 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasetheorems.sty
|
| 243 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/amsmath/amsmath.sty
|
| 244 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/amsmath/amsmath.sty
|
| 245 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/amsmath/amsopn.sty
|
| 246 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/amsmath/amstext.sty
|
| 247 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/amsmath/amstext.sty
|
| 248 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/amsmath/amsgen.sty
|
| 249 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/amsmath/amsgen.sty
|
| 250 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/amsmath/amsbsy.sty
|
| 251 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/amsmath/amsbsy.sty
|
| 252 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/amsmath/amsopn.sty
|
| 253 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/amscls/amsthm.sty
|
| 254 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/amscls/amsthm.sty
|
| 255 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasethemes.sty
|
| 256 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasethemes.sty
|
| 257 |
+
INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmss10.tfm
|
| 258 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerthemedefault.sty
|
| 259 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerthemedefault.sty
|
| 260 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerfontthemedefault.sty
|
| 261 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerfontthemedefault.sty
|
| 262 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamercolorthemedefault.sty
|
| 263 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamercolorthemedefault.sty
|
| 264 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerinnerthemedefault.sty
|
| 265 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerinnerthemedefault.sty
|
| 266 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamericonbook.pdf
|
| 267 |
+
OUTPUT presentation.pdf
|
| 268 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamericonbook.pdf
|
| 269 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamericonbook.20.pdf
|
| 270 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamericonbook.20.pdf
|
| 271 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamericonarticle.pdf
|
| 272 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamericonarticle.pdf
|
| 273 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamericonarticle.20.pdf
|
| 274 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamericonarticle.20.pdf
|
| 275 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamericononline.pdf
|
| 276 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamericononline.pdf
|
| 277 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamericononline.20.pdf
|
| 278 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamericononline.20.pdf
|
| 279 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerouterthemedefault.sty
|
| 280 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerouterthemedefault.sty
|
| 281 |
+
INPUT ./preamble.tex
|
| 282 |
+
INPUT ./preamble.tex
|
| 283 |
+
INPUT preamble.tex
|
| 284 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/pgf/frontendlayer/tikz.sty
|
| 285 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/pgf/frontendlayer/tikz.sty
|
| 286 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/pgf/basiclayer/pgf.sty
|
| 287 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/pgf/basiclayer/pgf.sty
|
| 288 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/modules/pgfmoduleshapes.code.tex
|
| 289 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/modules/pgfmoduleplot.code.tex
|
| 290 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/pgf/compatibility/pgfcomp-version-0-65.sty
|
| 291 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/pgf/compatibility/pgfcomp-version-0-65.sty
|
| 292 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/pgf/compatibility/pgfcomp-version-1-18.sty
|
| 293 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/pgf/compatibility/pgfcomp-version-1-18.sty
|
| 294 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/pgf/utilities/pgffor.sty
|
| 295 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/pgf/utilities/pgffor.sty
|
| 296 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgffor.code.tex
|
| 297 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgffor.code.tex
|
| 298 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgffor.code.tex
|
| 299 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/frontendlayer/tikz/tikz.code.tex
|
| 300 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/frontendlayer/tikz/tikz.code.tex
|
| 301 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/frontendlayer/tikz/tikz.code.tex
|
| 302 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/libraries/pgflibraryplothandlers.code.tex
|
| 303 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/libraries/pgflibraryplothandlers.code.tex
|
| 304 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/modules/pgfmodulematrix.code.tex
|
| 305 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/frontendlayer/tikz/libraries/tikzlibrarytopaths.code.tex
|
| 306 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/pgf/frontendlayer/tikz/libraries/tikzlibrarytopaths.code.tex
|
| 307 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/animate/animate.sty
|
| 308 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/animate/animate.sty
|
| 309 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/base/ifthen.sty
|
| 310 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/base/ifthen.sty
|
| 311 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/oberdiek/ifdraft.sty
|
| 312 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/oberdiek/ifdraft.sty
|
| 313 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/tools/calc.sty
|
| 314 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/tools/calc.sty
|
| 315 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/media9/pdfbase.sty
|
| 316 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/media9/pdfbase.sty
|
| 317 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/l3backend/l3backend-pdftex.def
|
| 318 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/l3backend/l3backend-pdftex.def
|
| 319 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/zref/zref-abspage.sty
|
| 320 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/zref/zref-abspage.sty
|
| 321 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/zref/zref-base.sty
|
| 322 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/zref/zref-base.sty
|
| 323 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/etexcmds/etexcmds.sty
|
| 324 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/etexcmds/etexcmds.sty
|
| 325 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/auxhook/auxhook.sty
|
| 326 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/auxhook/auxhook.sty
|
| 327 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/ocgx2/ocgbase.sty
|
| 328 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/ocgx2/ocgbase.sty
|
| 329 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/media9/media9.sty
|
| 330 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/media9/media9.sty
|
| 331 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/adjustbox/adjustbox.sty
|
| 332 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/adjustbox/adjustbox.sty
|
| 333 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/xkeyval/xkeyval.sty
|
| 334 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/xkeyval/xkeyval.sty
|
| 335 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/xkeyval/xkeyval.tex
|
| 336 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/xkeyval/xkvutils.tex
|
| 337 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/adjustbox/adjcalc.sty
|
| 338 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/adjustbox/adjcalc.sty
|
| 339 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/adjustbox/trimclip.sty
|
| 340 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/adjustbox/trimclip.sty
|
| 341 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/collectbox/collectbox.sty
|
| 342 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/collectbox/collectbox.sty
|
| 343 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/adjustbox/tc-pdftex.def
|
| 344 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/adjustbox/tc-pdftex.def
|
| 345 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/adjustbox/tc-pdftex.def
|
| 346 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/ifoddpage/ifoddpage.sty
|
| 347 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/ifoddpage/ifoddpage.sty
|
| 348 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/ifoddpage/ifoddpage.sty
|
| 349 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/varwidth/varwidth.sty
|
| 350 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/varwidth/varwidth.sty
|
| 351 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/varwidth/varwidth.sty
|
| 352 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/multirow/multirow.sty
|
| 353 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/multirow/multirow.sty
|
| 354 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/booktabs/booktabs.sty
|
| 355 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/booktabs/booktabs.sty
|
| 356 |
+
INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/natbib/natbib.sty
|
| 357 |
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/natbib/natbib.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/psnfss/pifont.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/psnfss/pifont.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/psnfss/upzd.fd
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/psnfss/upzd.fd
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/psnfss/upzd.fd
+INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/adobe/zapfding/pzdr.tfm
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/psnfss/upsy.fd
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/psnfss/upsy.fd
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/psnfss/upsy.fd
+INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/adobe/symbol/psyr.tfm
+INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/ulem/ulem.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/generic/ulem/ulem.sty
+INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/latex-fonts/lasy6.tfm
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/caption/subcaption.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/caption/subcaption.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/caption/caption.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/caption/caption.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/caption/caption3.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/caption/caption3.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/caption/caption-beamer.sto
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/caption/caption-beamer.sto
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/caption/caption-beamer.sto
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerthemeMadrid.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerthemeMadrid.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamercolorthemewhale.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamercolorthemewhale.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamercolorthemeorchid.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamercolorthemeorchid.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerinnerthemerounded.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerinnerthemerounded.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerouterthemeinfolines.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerouterthemeinfolines.sty
+INPUT ./helpers.tex
+INPUT ./helpers.tex
+INPUT helpers.tex
+INPUT ./presentation.aux
+INPUT ./presentation.aux
+INPUT presentation.aux
+OUTPUT presentation.aux
+INPUT /usr/local/texlive/2025/texmf-dist/tex/context/base/mkii/supp-pdf.mkii
+INPUT /usr/local/texlive/2025/texmf-dist/tex/context/base/mkii/supp-pdf.mkii
+INPUT /usr/local/texlive/2025/texmf-dist/tex/context/base/mkii/supp-pdf.mkii
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/epstopdf-pkg/epstopdf-base.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/epstopdf-pkg/epstopdf-base.sty
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/latexconfig/epstopdf-sys.cfg
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/latexconfig/epstopdf-sys.cfg
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/latexconfig/epstopdf-sys.cfg
+INPUT ./presentation.out
+INPUT ./presentation.out
+INPUT presentation.out
+INPUT presentation.out
+INPUT ./presentation.out
+INPUT ./presentation.out
+OUTPUT presentation.out
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/translator/translator-basic-dictionary-English.dict
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/translator/translator-basic-dictionary-English.dict
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/translator/translator-basic-dictionary-English.dict
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/translator/translator-bibliography-dictionary-English.dict
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/translator/translator-bibliography-dictionary-English.dict
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/translator/translator-bibliography-dictionary-English.dict
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/translator/translator-environment-dictionary-English.dict
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/translator/translator-environment-dictionary-English.dict
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/translator/translator-environment-dictionary-English.dict
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/translator/translator-months-dictionary-English.dict
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/translator/translator-months-dictionary-English.dict
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/translator/translator-months-dictionary-English.dict
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/translator/translator-numbers-dictionary-English.dict
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/translator/translator-numbers-dictionary-English.dict
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/translator/translator-numbers-dictionary-English.dict
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/translator/translator-theorem-dictionary-English.dict
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/translator/translator-theorem-dictionary-English.dict
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/translator/translator-theorem-dictionary-English.dict
+INPUT ./presentation.nav
+INPUT ./presentation.nav
+INPUT presentation.nav
+INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmss8.tfm
+INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmtt8.tfm
+INPUT ./01.tex
+INPUT ./01.tex
+INPUT 01.tex
+INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmss12.tfm
+INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmtt12.tfm
+INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmss8.tfm
+INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmmi10.tfm
+INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmmi8.tfm
+INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmmi6.tfm
+INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmsy10.tfm
+INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmsy8.tfm
+INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmsy6.tfm
+INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmex10.tfm
+INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/cmextra/cmex8.tfm
+INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/cmextra/cmex7.tfm
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/amsfonts/umsa.fd
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/amsfonts/umsa.fd
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/amsfonts/umsa.fd
+INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/symbols/msam10.tfm
+INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/symbols/msam10.tfm
+INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/symbols/msam7.tfm
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/amsfonts/umsb.fd
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/amsfonts/umsb.fd
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/amsfonts/umsb.fd
+INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/symbols/msbm10.tfm
+INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/symbols/msbm10.tfm
+INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/symbols/msbm7.tfm
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/sansmathaccent/ot1mathkerncmss.fd
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/sansmathaccent/ot1mathkerncmss.fd
+INPUT /usr/local/texlive/2025/texmf-dist/tex/latex/sansmathaccent/ot1mathkerncmss.fd
+INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/sansmathaccent/mathkerncmssi10.tfm
+INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/sansmathaccent/mathkerncmssi8.tfm
+INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/sansmathaccent/mathkerncmssi8.tfm
+INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmss8.tfm
+INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/cmextra/cmex7.tfm
+INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/symbols/msam5.tfm
+INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/amsfonts/symbols/msbm5.tfm
+INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/sansmathaccent/mathkerncmssi8.tfm
+INPUT ../logos/hf.pdf
+INPUT ../logos/hf.pdf
+INPUT ../logos/hf.pdf
+INPUT ../logos/hf.pdf
+INPUT ../logos/hf.pdf
+INPUT ../logos/lerobot.png
+INPUT ../logos/lerobot.png
+INPUT ../logos/lerobot.png
+INPUT ../logos/lerobot.png
+INPUT ../logos/lerobot.png
+INPUT /usr/local/texlive/2025/texmf-dist/fonts/tfm/public/cm/cmss8.tfm
+INPUT /usr/local/texlive/2025/texmf-var/fonts/map/pdftex/updmap/pdftex.map
+OUTPUT presentation.nav
+OUTPUT presentation.toc
+OUTPUT presentation.snm
+INPUT presentation.aux
+INPUT ./presentation.out
+INPUT ./presentation.out
+INPUT /usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmss10.pfb
+INPUT /usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmss12.pfb
+INPUT /usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmss8.pfb
+INPUT /usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmtt12.pfb
+INPUT /usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmtt8.pfb
app/scripts/latex-to-mdx/input/slides/presentation.log
ADDED
@@ -0,0 +1,1004 @@
+This is pdfTeX, Version 3.141592653-2.6-1.40.27 (TeX Live 2025) (preloaded format=pdflatex 2025.8.26) 25 SEP 2025 11:56
+entering extended mode
+ restricted \write18 enabled.
+ file:line:error style messages enabled.
+ %&-line parsing enabled.
+**/Users/fracapuano/Desktop/robots-tutorial/robot-learning-tutorial/slides/presentation.tex
+(/Users/fracapuano/Desktop/robots-tutorial/robot-learning-tutorial/slides/presentation.tex
+LaTeX2e <2024-11-01> patch level 2
+L3 programming layer <2025-01-18>
+(/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamer.cls
+Document Class: beamer 2025/02/04 v3.72 A class for typesetting presentations
+(/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasemodes.sty (/usr/local/texlive/2025/texmf-dist/tex/latex/etoolbox/etoolbox.sty
+Package: etoolbox 2025/02/11 v2.5l e-TeX tools for LaTeX (JAW)
+\etb@tempcnta=\count196
+)
+\beamer@tempbox=\box52
+\beamer@tempcount=\count197
+\c@beamerpauses=\count198
+(/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasedecode.sty
+\beamer@slideinframe=\count199
+\beamer@minimum=\count266
+\beamer@decode@box=\box53
+)
+\beamer@commentbox=\box54
+\beamer@modecount=\count267
+) (/usr/local/texlive/2025/texmf-dist/tex/generic/iftex/iftex.sty
+Package: iftex 2024/12/12 v1.0g TeX engine tests
+)
+\headdp=\dimen141
+\footheight=\dimen142
+\sidebarheight=\dimen143
+\beamer@tempdim=\dimen144
+\beamer@finalheight=\dimen145
+\beamer@animht=\dimen146
+\beamer@animdp=\dimen147
+\beamer@animwd=\dimen148
+\beamer@leftmargin=\dimen149
+\beamer@rightmargin=\dimen150
+\beamer@leftsidebar=\dimen151
+\beamer@rightsidebar=\dimen152
+\beamer@boxsize=\dimen153
+\beamer@vboxoffset=\dimen154
+\beamer@descdefault=\dimen155
+\beamer@descriptionwidth=\dimen156
+\beamer@lastskip=\skip49
+\beamer@areabox=\box55
+\beamer@animcurrent=\box56
+\beamer@animshowbox=\box57
+\beamer@sectionbox=\box58
+\beamer@logobox=\box59
+\beamer@linebox=\box60
+\beamer@sectioncount=\count268
+\beamer@subsubsectionmax=\count269
+\beamer@subsectionmax=\count270
+\beamer@sectionmax=\count271
+\beamer@totalheads=\count272
+\beamer@headcounter=\count273
+\beamer@partstartpage=\count274
+\beamer@sectionstartpage=\count275
+\beamer@subsectionstartpage=\count276
+\beamer@animationtempa=\count277
+\beamer@animationtempb=\count278
+\beamer@xpos=\count279
+\beamer@ypos=\count280
+\beamer@ypos@offset=\count281
+\beamer@showpartnumber=\count282
+\beamer@currentsubsection=\count283
+\beamer@coveringdepth=\count284
+\beamer@sectionadjust=\count285
+\beamer@toclastsection=\count286
+\beamer@tocsectionnumber=\count287
+(/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbaseoptions.sty (/usr/local/texlive/2025/texmf-dist/tex/latex/graphics/keyval.sty
+Package: keyval 2022/05/29 v1.15 key=value parser (DPC)
+\KV@toks@=\toks17
+))
+\beamer@paperwidth=\skip50
+\beamer@paperheight=\skip51
+(/usr/local/texlive/2025/texmf-dist/tex/latex/geometry/geometry.sty
+Package: geometry 2020/01/02 v5.9 Page Geometry
+(/usr/local/texlive/2025/texmf-dist/tex/generic/iftex/ifvtex.sty
+Package: ifvtex 2019/10/25 v1.7 ifvtex legacy package. Use iftex instead.
+)
+\Gm@cnth=\count288
+\Gm@cntv=\count289
+\c@Gm@tempcnt=\count290
+\Gm@bindingoffset=\dimen157
+\Gm@wd@mp=\dimen158
+\Gm@odd@mp=\dimen159
+\Gm@even@mp=\dimen160
+\Gm@layoutwidth=\dimen161
+\Gm@layoutheight=\dimen162
+\Gm@layouthoffset=\dimen163
+\Gm@layoutvoffset=\dimen164
+\Gm@dimlist=\toks18
+) (/usr/local/texlive/2025/texmf-dist/tex/latex/pgf/math/pgfmath.sty (/usr/local/texlive/2025/texmf-dist/tex/latex/pgf/utilities/pgfrcs.sty (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgfutil-common.tex
+\pgfutil@everybye=\toks19
+\pgfutil@tempdima=\dimen165
+\pgfutil@tempdimb=\dimen166
+) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgfutil-latex.def
+\pgfutil@abb=\box61
+) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgfrcs.code.tex (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/pgf.revision.tex)
+Package: pgfrcs 2023-01-15 v3.1.10 (3.1.10)
+)) (/usr/local/texlive/2025/texmf-dist/tex/latex/pgf/utilities/pgfkeys.sty (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgfkeys.code.tex
+\pgfkeys@pathtoks=\toks20
+\pgfkeys@temptoks=\toks21
+(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgfkeyslibraryfiltered.code.tex
+\pgfkeys@tmptoks=\toks22
+))) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmath.code.tex (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathutil.code.tex
+\pgf@x=\dimen167
+\pgf@xa=\dimen168
+\pgf@xb=\dimen169
+\pgf@xc=\dimen170
+\pgf@y=\dimen171
+\pgf@ya=\dimen172
+\pgf@yb=\dimen173
+\pgf@yc=\dimen174
+\c@pgf@counta=\count291
+\c@pgf@countb=\count292
+\c@pgf@countc=\count293
+\c@pgf@countd=\count294
+) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathparser.code.tex
+\pgfmath@dimen=\dimen175
+\pgfmath@count=\count295
+\pgfmath@box=\box62
+\pgfmath@toks=\toks23
+\pgfmath@stack@operand=\toks24
+\pgfmath@stack@operation=\toks25
+) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.code.tex) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.basic.code.tex) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.trigonometric.code.tex) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.random.code.tex) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.comparison.code.tex) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.base.code.tex) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.round.code.tex) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.misc.code.tex) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfunctions.integerarithmetics.code.tex) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathcalc.code.tex) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfmathfloat.code.tex
+\c@pgfmathroundto@lastzeros=\count296
+))) (/usr/local/texlive/2025/texmf-dist/tex/latex/base/size11.clo
+File: size11.clo 2024/06/29 v1.4n Standard LaTeX file (size option)
+) (/usr/local/texlive/2025/texmf-dist/tex/latex/pgf/basiclayer/pgfcore.sty (/usr/local/texlive/2025/texmf-dist/tex/latex/graphics/graphicx.sty
+Package: graphicx 2021/09/16 v1.2d Enhanced LaTeX Graphics (DPC,SPQR)
+(/usr/local/texlive/2025/texmf-dist/tex/latex/graphics/graphics.sty
+Package: graphics 2024/08/06 v1.4g Standard LaTeX Graphics (DPC,SPQR)
+(/usr/local/texlive/2025/texmf-dist/tex/latex/graphics/trig.sty
+Package: trig 2023/12/02 v1.11 sin cos tan (DPC)
+) (/usr/local/texlive/2025/texmf-dist/tex/latex/graphics-cfg/graphics.cfg
+File: graphics.cfg 2016/06/04 v1.11 sample graphics configuration
+)
+Package graphics Info: Driver file: pdftex.def on input line 106.
+(/usr/local/texlive/2025/texmf-dist/tex/latex/graphics-def/pdftex.def
+File: pdftex.def 2024/04/13 v1.2c Graphics/color driver for pdftex
+))
+\Gin@req@height=\dimen176
+\Gin@req@width=\dimen177
+) (/usr/local/texlive/2025/texmf-dist/tex/latex/pgf/systemlayer/pgfsys.sty (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgfsys.code.tex
+Package: pgfsys 2023-01-15 v3.1.10 (3.1.10)
+\pgf@x=\dimen178
+\pgf@y=\dimen179
+\pgf@xa=\dimen180
+\pgf@ya=\dimen181
+\pgf@xb=\dimen182
+\pgf@yb=\dimen183
+\pgf@xc=\dimen184
+\pgf@yc=\dimen185
+\pgf@xd=\dimen186
+\pgf@yd=\dimen187
+\w@pgf@writea=\write3
+\r@pgf@reada=\read2
+\c@pgf@counta=\count297
+\c@pgf@countb=\count298
+\c@pgf@countc=\count299
+\c@pgf@countd=\count300
+\t@pgf@toka=\toks26
+\t@pgf@tokb=\toks27
+\t@pgf@tokc=\toks28
+\pgf@sys@id@count=\count301
+(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgf.cfg
+File: pgf.cfg 2023-01-15 v3.1.10 (3.1.10)
+)
+Driver file for pgf: pgfsys-pdftex.def
+(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgfsys-pdftex.def
+File: pgfsys-pdftex.def 2023-01-15 v3.1.10 (3.1.10)
+(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgfsys-common-pdf.def
+File: pgfsys-common-pdf.def 2023-01-15 v3.1.10 (3.1.10)
+))) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgfsyssoftpath.code.tex
+File: pgfsyssoftpath.code.tex 2023-01-15 v3.1.10 (3.1.10)
+\pgfsyssoftpath@smallbuffer@items=\count302
+\pgfsyssoftpath@bigbuffer@items=\count303
+) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/systemlayer/pgfsysprotocol.code.tex
+File: pgfsysprotocol.code.tex 2023-01-15 v3.1.10 (3.1.10)
+)) (/usr/local/texlive/2025/texmf-dist/tex/latex/xcolor/xcolor.sty
+Package: xcolor 2024/09/29 v3.02 LaTeX color extensions (UK)
+(/usr/local/texlive/2025/texmf-dist/tex/latex/graphics-cfg/color.cfg
+File: color.cfg 2016/01/02 v1.6 sample color configuration
+)
+Package xcolor Info: Driver file: pdftex.def on input line 274.
+(/usr/local/texlive/2025/texmf-dist/tex/latex/graphics/mathcolor.ltx)
+Package xcolor Info: Model `cmy' substituted by `cmy0' on input line 1349.
+Package xcolor Info: Model `hsb' substituted by `rgb' on input line 1353.
+Package xcolor Info: Model `RGB' extended on input line 1365.
+Package xcolor Info: Model `HTML' substituted by `rgb' on input line 1367.
+Package xcolor Info: Model `Hsb' substituted by `hsb' on input line 1368.
+Package xcolor Info: Model `tHsb' substituted by `hsb' on input line 1369.
+Package xcolor Info: Model `HSB' substituted by `hsb' on input line 1370.
+Package xcolor Info: Model `Gray' substituted by `gray' on input line 1371.
+Package xcolor Info: Model `wave' substituted by `hsb' on input line 1372.
+) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcore.code.tex
+Package: pgfcore 2023-01-15 v3.1.10 (3.1.10)
+(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/math/pgfint.code.tex) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorepoints.code.tex
+File: pgfcorepoints.code.tex 2023-01-15 v3.1.10 (3.1.10)
+\pgf@picminx=\dimen188
+\pgf@picmaxx=\dimen189
+\pgf@picminy=\dimen190
+\pgf@picmaxy=\dimen191
+\pgf@pathminx=\dimen192
+\pgf@pathmaxx=\dimen193
+\pgf@pathminy=\dimen194
+\pgf@pathmaxy=\dimen195
+\pgf@xx=\dimen196
+\pgf@xy=\dimen197
+\pgf@yx=\dimen198
+\pgf@yy=\dimen199
+\pgf@zx=\dimen256
+\pgf@zy=\dimen257
+) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorepathconstruct.code.tex
+File: pgfcorepathconstruct.code.tex 2023-01-15 v3.1.10 (3.1.10)
+\pgf@path@lastx=\dimen258
+\pgf@path@lasty=\dimen259
+) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorepathusage.code.tex
+File: pgfcorepathusage.code.tex 2023-01-15 v3.1.10 (3.1.10)
+\pgf@shorten@end@additional=\dimen260
+\pgf@shorten@start@additional=\dimen261
+) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorescopes.code.tex
+File: pgfcorescopes.code.tex 2023-01-15 v3.1.10 (3.1.10)
+\pgfpic=\box63
+\pgf@hbox=\box64
+\pgf@layerbox@main=\box65
+\pgf@picture@serial@count=\count304
+) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcoregraphicstate.code.tex
+File: pgfcoregraphicstate.code.tex 2023-01-15 v3.1.10 (3.1.10)
+\pgflinewidth=\dimen262
+) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcoretransformations.code.tex
+File: pgfcoretransformations.code.tex 2023-01-15 v3.1.10 (3.1.10)
+\pgf@pt@x=\dimen263
+\pgf@pt@y=\dimen264
+\pgf@pt@temp=\dimen265
+) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorequick.code.tex
+File: pgfcorequick.code.tex 2023-01-15 v3.1.10 (3.1.10)
+) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcoreobjects.code.tex
+File: pgfcoreobjects.code.tex 2023-01-15 v3.1.10 (3.1.10)
+) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorepathprocessing.code.tex
+File: pgfcorepathprocessing.code.tex 2023-01-15 v3.1.10 (3.1.10)
+) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorearrows.code.tex
+File: pgfcorearrows.code.tex 2023-01-15 v3.1.10 (3.1.10)
+\pgfarrowsep=\dimen266
+) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcoreshade.code.tex
+File: pgfcoreshade.code.tex 2023-01-15 v3.1.10 (3.1.10)
+\pgf@max=\dimen267
+\pgf@sys@shading@range@num=\count305
+\pgf@shadingcount=\count306
+) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcoreimage.code.tex
+File: pgfcoreimage.code.tex 2023-01-15 v3.1.10 (3.1.10)
+) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcoreexternal.code.tex
+File: pgfcoreexternal.code.tex 2023-01-15 v3.1.10 (3.1.10)
+\pgfexternal@startupbox=\box66
+) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorelayers.code.tex
+File: pgfcorelayers.code.tex 2023-01-15 v3.1.10 (3.1.10)
+) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcoretransparency.code.tex
+File: pgfcoretransparency.code.tex 2023-01-15 v3.1.10 (3.1.10)
+) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorepatterns.code.tex
+File: pgfcorepatterns.code.tex 2023-01-15 v3.1.10 (3.1.10)
+) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/basiclayer/pgfcorerdf.code.tex
+File: pgfcorerdf.code.tex 2023-01-15 v3.1.10 (3.1.10)
+))) (/usr/local/texlive/2025/texmf-dist/tex/latex/pgf/utilities/xxcolor.sty
+Package: xxcolor 2003/10/24 ver 0.1
+\XC@nummixins=\count307
+\XC@countmixins=\count308
+) (/usr/local/texlive/2025/texmf-dist/tex/latex/base/atbegshi-ltx.sty
+Package: atbegshi-ltx 2021/01/10 v1.0c Emulation of the original atbegshi
+package with kernel methods
+) (/usr/local/texlive/2025/texmf-dist/tex/latex/hyperref/hyperref.sty
+Package: hyperref 2024-11-05 v7.01l Hypertext links for LaTeX
+(/usr/local/texlive/2025/texmf-dist/tex/latex/kvsetkeys/kvsetkeys.sty
+Package: kvsetkeys 2022-10-05 v1.19 Key value parser (HO)
+) (/usr/local/texlive/2025/texmf-dist/tex/generic/kvdefinekeys/kvdefinekeys.sty
+Package: kvdefinekeys 2019-12-19 v1.6 Define keys (HO)
+) (/usr/local/texlive/2025/texmf-dist/tex/generic/pdfescape/pdfescape.sty
+Package: pdfescape 2019/12/09 v1.15 Implements pdfTeX's escape features (HO)
+(/usr/local/texlive/2025/texmf-dist/tex/generic/ltxcmds/ltxcmds.sty
+Package: ltxcmds 2023-12-04 v1.26 LaTeX kernel commands for general use (HO)
+) (/usr/local/texlive/2025/texmf-dist/tex/generic/pdftexcmds/pdftexcmds.sty
+Package: pdftexcmds 2020-06-27 v0.33 Utility functions of pdfTeX for LuaTeX (HO)
+(/usr/local/texlive/2025/texmf-dist/tex/generic/infwarerr/infwarerr.sty
+Package: infwarerr 2019/12/03 v1.5 Providing info/warning/error messages (HO)
+)
+Package pdftexcmds Info: \pdf@primitive is available.
+Package pdftexcmds Info: \pdf@ifprimitive is available.
+Package pdftexcmds Info: \pdfdraftmode found.
+)) (/usr/local/texlive/2025/texmf-dist/tex/latex/hycolor/hycolor.sty
+Package: hycolor 2020-01-27 v1.10 Color options for hyperref/bookmark (HO)
+) (/usr/local/texlive/2025/texmf-dist/tex/latex/hyperref/nameref.sty
+Package: nameref 2023-11-26 v2.56 Cross-referencing by name of section
+(/usr/local/texlive/2025/texmf-dist/tex/latex/refcount/refcount.sty
+Package: refcount 2019/12/15 v3.6 Data extraction from label references (HO)
+) (/usr/local/texlive/2025/texmf-dist/tex/generic/gettitlestring/gettitlestring.sty
+Package: gettitlestring 2019/12/15 v1.6 Cleanup title references (HO)
+(/usr/local/texlive/2025/texmf-dist/tex/latex/kvoptions/kvoptions.sty
+Package: kvoptions 2022-06-15 v3.15 Key value format for package options (HO)
+))
+\c@section@level=\count309
+) (/usr/local/texlive/2025/texmf-dist/tex/generic/stringenc/stringenc.sty
+Package: stringenc 2019/11/29 v1.12 Convert strings between diff. encodings (HO)
+)
+\@linkdim=\dimen268
+\Hy@linkcounter=\count310
+\Hy@pagecounter=\count311
+(/usr/local/texlive/2025/texmf-dist/tex/latex/hyperref/pd1enc.def
|
| 310 |
+
File: pd1enc.def 2024-11-05 v7.01l Hyperref: PDFDocEncoding definition (HO)
|
| 311 |
+
Now handling font encoding PD1 ...
|
| 312 |
+
... no UTF-8 mapping file for font encoding PD1
|
| 313 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/generic/intcalc/intcalc.sty
|
| 314 |
+
Package: intcalc 2019/12/15 v1.3 Expandable calculations with integers (HO)
|
| 315 |
+
)
|
| 316 |
+
\Hy@SavedSpaceFactor=\count312
|
| 317 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/hyperref/puenc.def
|
| 318 |
+
File: puenc.def 2024-11-05 v7.01l Hyperref: PDF Unicode definition (HO)
|
| 319 |
+
Now handling font encoding PU ...
|
| 320 |
+
... no UTF-8 mapping file for font encoding PU
|
| 321 |
+
)
|
| 322 |
+
Package hyperref Info: Option `bookmarks' set `true' on input line 4040.
|
| 323 |
+
Package hyperref Info: Option `bookmarksopen' set `true' on input line 4040.
|
| 324 |
+
Package hyperref Info: Option `implicit' set `false' on input line 4040.
|
| 325 |
+
Package hyperref Info: Hyper figures OFF on input line 4157.
|
| 326 |
+
Package hyperref Info: Link nesting OFF on input line 4162.
|
| 327 |
+
Package hyperref Info: Hyper index ON on input line 4165.
|
| 328 |
+
Package hyperref Info: Plain pages OFF on input line 4172.
|
| 329 |
+
Package hyperref Info: Backreferencing OFF on input line 4177.
|
| 330 |
+
Package hyperref Info: Implicit mode OFF; no redefinition of LaTeX internals.
|
| 331 |
+
Package hyperref Info: Bookmarks ON on input line 4424.
|
| 332 |
+
\c@Hy@tempcnt=\count313
|
| 333 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/url/url.sty
|
| 334 |
+
\Urlmuskip=\muskip17
|
| 335 |
+
Package: url 2013/09/16 ver 3.4 Verb mode for urls, etc.
|
| 336 |
+
)
|
| 337 |
+
LaTeX Info: Redefining \url on input line 4763.
|
| 338 |
+
\XeTeXLinkMargin=\dimen269
|
| 339 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/bitset/bitset.sty
|
| 340 |
+
Package: bitset 2019/12/09 v1.3 Handle bit-vector datatype (HO)
|
| 341 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/bigintcalc/bigintcalc.sty
|
| 342 |
+
Package: bigintcalc 2019/12/15 v1.5 Expandable calculations on big integers (HO)
|
| 343 |
+
))
|
| 344 |
+
\Fld@menulength=\count314
|
| 345 |
+
\Field@Width=\dimen270
|
| 346 |
+
\Fld@charsize=\dimen271
|
| 347 |
+
Package hyperref Info: Hyper figures OFF on input line 6042.
|
| 348 |
+
Package hyperref Info: Link nesting OFF on input line 6047.
|
| 349 |
+
Package hyperref Info: Hyper index ON on input line 6050.
|
| 350 |
+
Package hyperref Info: backreferencing OFF on input line 6057.
|
| 351 |
+
Package hyperref Info: Link coloring OFF on input line 6062.
|
| 352 |
+
Package hyperref Info: Link coloring with OCG OFF on input line 6067.
|
| 353 |
+
Package hyperref Info: PDF/A mode OFF on input line 6072.
|
| 354 |
+
\Hy@abspage=\count315
|
| 355 |
+
|
| 356 |
+
|
| 357 |
+
Package hyperref Message: Stopped early.
|
| 358 |
+
|
| 359 |
+
)
|
| 360 |
+
Package hyperref Info: Driver (autodetected): hpdftex.
|
| 361 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/hyperref/hpdftex.def
|
| 362 |
+
File: hpdftex.def 2024-11-05 v7.01l Hyperref driver for pdfTeX
|
| 363 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/base/atveryend-ltx.sty
|
| 364 |
+
Package: atveryend-ltx 2020/08/19 v1.0a Emulation of the original atveryend package
|
| 365 |
+
with kernel methods
|
| 366 |
+
)
|
| 367 |
+
\Fld@listcount=\count316
|
| 368 |
+
\c@bookmark@seq@number=\count317
|
| 369 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/rerunfilecheck/rerunfilecheck.sty
|
| 370 |
+
Package: rerunfilecheck 2022-07-10 v1.10 Rerun checks for auxiliary files (HO)
|
| 371 |
+
(/usr/local/texlive/2025/texmf-dist/tex/generic/uniquecounter/uniquecounter.sty
|
| 372 |
+
Package: uniquecounter 2019/12/15 v1.4 Provide unlimited unique counter (HO)
|
| 373 |
+
)
|
| 374 |
+
Package uniquecounter Info: New unique counter `rerunfilecheck' on input line 285.
|
| 375 |
+
)) (/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbaserequires.sty (/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasecompatibility.sty) (/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasefont.sty (/usr/local/texlive/2025/texmf-dist/tex/latex/amsfonts/amssymb.sty
Package: amssymb 2013/01/14 v3.01 AMS font symbols
(/usr/local/texlive/2025/texmf-dist/tex/latex/amsfonts/amsfonts.sty
Package: amsfonts 2013/01/14 v3.01 Basic AMSFonts support
\@emptytoks=\toks29
\symAMSa=\mathgroup4
\symAMSb=\mathgroup5
LaTeX Font Info: Redeclaring math symbol \hbar on input line 98.
LaTeX Font Info: Overwriting math alphabet `\mathfrak' in version `bold'
(Font) U/euf/m/n --> U/euf/b/n on input line 106.
)) (/usr/local/texlive/2025/texmf-dist/tex/latex/sansmathaccent/sansmathaccent.sty
Package: sansmathaccent 2020/01/31
(/usr/local/texlive/2025/texmf-dist/tex/latex/koma-script/scrlfile.sty
Package: scrlfile 2024/10/24 v3.43 KOMA-Script package (file load hooks)
(/usr/local/texlive/2025/texmf-dist/tex/latex/koma-script/scrlfile-hook.sty
Package: scrlfile-hook 2024/10/24 v3.43 KOMA-Script package (using LaTeX hooks)
(/usr/local/texlive/2025/texmf-dist/tex/latex/koma-script/scrlogo.sty
Package: scrlogo 2024/10/24 v3.43 KOMA-Script package (logo)
))))) (/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasetranslator.sty (/usr/local/texlive/2025/texmf-dist/tex/latex/translator/translator.sty
Package: translator 2021-05-31 v1.12d Easy translation of strings in LaTeX
)) (/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasemisc.sty) (/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasetwoscreens.sty) (/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbaseoverlay.sty
\beamer@argscount=\count318
\beamer@lastskipcover=\skip52
\beamer@trivlistdepth=\count319
) (/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasetitle.sty) (/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasesection.sty
\c@lecture=\count320
\c@part=\count321
\c@section=\count322
\c@subsection=\count323
\c@subsubsection=\count324
) (/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbaseframe.sty
\beamer@framebox=\box67
\beamer@frametitlebox=\box68
\beamer@zoombox=\box69
\beamer@zoomcount=\count325
\beamer@zoomframecount=\count326
\beamer@frametextheight=\dimen272
\c@subsectionslide=\count327
\beamer@frametopskip=\skip53
\beamer@framebottomskip=\skip54
\beamer@frametopskipautobreak=\skip55
\beamer@framebottomskipautobreak=\skip56
\beamer@envbody=\toks30
\framewidth=\dimen273
\c@framenumber=\count328
) (/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbaseverbatim.sty
\beamer@verbatimfileout=\write4
) (/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbaseframesize.sty
\beamer@splitbox=\box70
\beamer@autobreakcount=\count329
\beamer@autobreaklastheight=\dimen274
\beamer@frametitletoks=\toks31
\beamer@framesubtitletoks=\toks32
) (/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbaseframecomponents.sty
\beamer@footins=\box71
) (/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasecolor.sty) (/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasenotes.sty
\beamer@frameboxcopy=\box72
) (/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasetoc.sty) (/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasetemplates.sty
\beamer@sbttoks=\toks33
(/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbaseauxtemplates.sty (/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbaseboxes.sty
\bmb@box=\box73
\bmb@colorbox=\box74
\bmb@boxwidth=\dimen275
\bmb@boxheight=\dimen276
\bmb@prevheight=\dimen277
\bmb@temp=\dimen278
\bmb@dima=\dimen279
\bmb@dimb=\dimen280
\bmb@prevheight=\dimen281
)
\beamer@blockheadheight=\dimen282
)) (/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbaselocalstructure.sty (/usr/local/texlive/2025/texmf-dist/tex/latex/tools/enumerate.sty
Package: enumerate 2023/07/04 v3.00 enumerate extensions (DPC)
\@enLab=\toks34
)
\beamer@bibiconwidth=\skip57
\c@figure=\count330
\c@table=\count331
\abovecaptionskip=\skip58
\belowcaptionskip=\skip59
) (/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasenavigation.sty (/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasenavigationsymbols.tex)
\beamer@section@min@dim=\dimen283
) (/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasetheorems.sty (/usr/local/texlive/2025/texmf-dist/tex/latex/amsmath/amsmath.sty
Package: amsmath 2024/11/05 v2.17t AMS math features
\@mathmargin=\skip60

For additional information on amsmath, use the `?' option.
(/usr/local/texlive/2025/texmf-dist/tex/latex/amsmath/amstext.sty
Package: amstext 2021/08/26 v2.01 AMS text
(/usr/local/texlive/2025/texmf-dist/tex/latex/amsmath/amsgen.sty
File: amsgen.sty 1999/11/30 v2.0 generic functions
\@emptytoks=\toks35
\ex@=\dimen284
)) (/usr/local/texlive/2025/texmf-dist/tex/latex/amsmath/amsbsy.sty
Package: amsbsy 1999/11/29 v1.2d Bold Symbols
\pmbraise@=\dimen285
) (/usr/local/texlive/2025/texmf-dist/tex/latex/amsmath/amsopn.sty
Package: amsopn 2022/04/08 v2.04 operator names
)
\inf@bad=\count332
LaTeX Info: Redefining \frac on input line 233.
\uproot@=\count333
\leftroot@=\count334
LaTeX Info: Redefining \overline on input line 398.
LaTeX Info: Redefining \colon on input line 409.
\classnum@=\count335
\DOTSCASE@=\count336
LaTeX Info: Redefining \ldots on input line 495.
LaTeX Info: Redefining \dots on input line 498.
LaTeX Info: Redefining \cdots on input line 619.
\Mathstrutbox@=\box75
\strutbox@=\box76
LaTeX Info: Redefining \big on input line 721.
LaTeX Info: Redefining \Big on input line 722.
LaTeX Info: Redefining \bigg on input line 723.
LaTeX Info: Redefining \Bigg on input line 724.
\big@size=\dimen286
LaTeX Font Info: Redeclaring font encoding OML on input line 742.
LaTeX Font Info: Redeclaring font encoding OMS on input line 743.
\macc@depth=\count337
LaTeX Info: Redefining \bmod on input line 904.
LaTeX Info: Redefining \pmod on input line 909.
LaTeX Info: Redefining \smash on input line 939.
LaTeX Info: Redefining \relbar on input line 969.
LaTeX Info: Redefining \Relbar on input line 970.
\c@MaxMatrixCols=\count338
\dotsspace@=\muskip18
\c@parentequation=\count339
\dspbrk@lvl=\count340
\tag@help=\toks36
\row@=\count341
\column@=\count342
\maxfields@=\count343
\andhelp@=\toks37
\eqnshift@=\dimen287
\alignsep@=\dimen288
\tagshift@=\dimen289
\tagwidth@=\dimen290
\totwidth@=\dimen291
\lineht@=\dimen292
\@envbody=\toks38
\multlinegap=\skip61
\multlinetaggap=\skip62
\mathdisplay@stack=\toks39
LaTeX Info: Redefining \[ on input line 2953.
LaTeX Info: Redefining \] on input line 2954.
) (/usr/local/texlive/2025/texmf-dist/tex/latex/amscls/amsthm.sty
Package: amsthm 2020/05/29 v2.20.6
\thm@style=\toks40
\thm@bodyfont=\toks41
\thm@headfont=\toks42
\thm@notefont=\toks43
\thm@headpunct=\toks44
\thm@preskip=\skip63
\thm@postskip=\skip64
\thm@headsep=\skip65
\dth@everypar=\toks45
)
\c@theorem=\count344
) (/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerbasethemes.sty)) (/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerthemedefault.sty (/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerfontthemedefault.sty) (/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamercolorthemedefault.sty) (/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerinnerthemedefault.sty
\beamer@dima=\dimen293
\beamer@dimb=\dimen294
) (/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerouterthemedefault.sty))) (./preamble.tex (/usr/local/texlive/2025/texmf-dist/tex/latex/pgf/frontendlayer/tikz.sty (/usr/local/texlive/2025/texmf-dist/tex/latex/pgf/basiclayer/pgf.sty
Package: pgf 2023-01-15 v3.1.10 (3.1.10)
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/modules/pgfmoduleshapes.code.tex
File: pgfmoduleshapes.code.tex 2023-01-15 v3.1.10 (3.1.10)
\pgfnodeparttextbox=\box77
) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/modules/pgfmoduleplot.code.tex
File: pgfmoduleplot.code.tex 2023-01-15 v3.1.10 (3.1.10)
) (/usr/local/texlive/2025/texmf-dist/tex/latex/pgf/compatibility/pgfcomp-version-0-65.sty
Package: pgfcomp-version-0-65 2023-01-15 v3.1.10 (3.1.10)
\pgf@nodesepstart=\dimen295
\pgf@nodesepend=\dimen296
) (/usr/local/texlive/2025/texmf-dist/tex/latex/pgf/compatibility/pgfcomp-version-1-18.sty
Package: pgfcomp-version-1-18 2023-01-15 v3.1.10 (3.1.10)
)) (/usr/local/texlive/2025/texmf-dist/tex/latex/pgf/utilities/pgffor.sty (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/utilities/pgffor.code.tex
Package: pgffor 2023-01-15 v3.1.10 (3.1.10)
\pgffor@iter=\dimen297
\pgffor@skip=\dimen298
\pgffor@stack=\toks46
\pgffor@toks=\toks47
)) (/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/frontendlayer/tikz/tikz.code.tex
Package: tikz 2023-01-15 v3.1.10 (3.1.10)
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/libraries/pgflibraryplothandlers.code.tex
File: pgflibraryplothandlers.code.tex 2023-01-15 v3.1.10 (3.1.10)
\pgf@plot@mark@count=\count345
\pgfplotmarksize=\dimen299
)
\tikz@lastx=\dimen300
\tikz@lasty=\dimen301
\tikz@lastxsaved=\dimen302
\tikz@lastysaved=\dimen303
\tikz@lastmovetox=\dimen304
\tikz@lastmovetoy=\dimen305
\tikzleveldistance=\dimen306
\tikzsiblingdistance=\dimen307
\tikz@figbox=\box78
\tikz@figbox@bg=\box79
\tikz@tempbox=\box80
\tikz@tempbox@bg=\box81
\tikztreelevel=\count346
\tikznumberofchildren=\count347
\tikznumberofcurrentchild=\count348
\tikz@fig@count=\count349
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/modules/pgfmodulematrix.code.tex
File: pgfmodulematrix.code.tex 2023-01-15 v3.1.10 (3.1.10)
\pgfmatrixcurrentrow=\count350
\pgfmatrixcurrentcolumn=\count351
\pgf@matrix@numberofcolumns=\count352
)
\tikz@expandcount=\count353
(/usr/local/texlive/2025/texmf-dist/tex/generic/pgf/frontendlayer/tikz/libraries/tikzlibrarytopaths.code.tex
File: tikzlibrarytopaths.code.tex 2023-01-15 v3.1.10 (3.1.10)
))) (/usr/local/texlive/2025/texmf-dist/tex/latex/animate/animate.sty
Package: animate 2024/10/14 PDF & SVG animations from files and inline graphics
(/usr/local/texlive/2025/texmf-dist/tex/latex/base/ifthen.sty
Package: ifthen 2024/03/16 v1.1e Standard LaTeX ifthen package (DPC)
) (/usr/local/texlive/2025/texmf-dist/tex/latex/oberdiek/ifdraft.sty
Package: ifdraft 2016/05/16 v1.4 Detect class options draft and final (HO)
) (/usr/local/texlive/2025/texmf-dist/tex/latex/tools/calc.sty
Package: calc 2023/07/08 v4.3 Infix arithmetic (KKT,FJ)
\calc@Acount=\count354
\calc@Bcount=\count355
\calc@Adimen=\dimen308
\calc@Bdimen=\dimen309
\calc@Askip=\skip66
\calc@Bskip=\skip67
LaTeX Info: Redefining \setlength on input line 80.
LaTeX Info: Redefining \addtolength on input line 81.
\calc@Ccount=\count356
\calc@Cskip=\skip68
) (/usr/local/texlive/2025/texmf-dist/tex/latex/media9/pdfbase.sty
Package: pdfbase 2024/09/16 v0.59 driver independent access to low-level PDF features
(/usr/local/texlive/2025/texmf-dist/tex/latex/l3backend/l3backend-pdftex.def
File: l3backend-pdftex.def 2024-05-08 L3 backend support: PDF output (pdfTeX)
\l__color_backend_stack_int=\count357
\l__pdf_internal_box=\box82
)
\g_pbs_page_int=\count358
\g_pbs_oc_int=\count359
) (/usr/local/texlive/2025/texmf-dist/tex/latex/zref/zref-abspage.sty
Package: zref-abspage 2023-09-14 v2.35 Module abspage for zref (HO)
(/usr/local/texlive/2025/texmf-dist/tex/latex/zref/zref-base.sty
Package: zref-base 2023-09-14 v2.35 Module base for zref (HO)
(/usr/local/texlive/2025/texmf-dist/tex/generic/etexcmds/etexcmds.sty
Package: etexcmds 2019/12/15 v1.7 Avoid name clashes with e-TeX commands (HO)
) (/usr/local/texlive/2025/texmf-dist/tex/latex/auxhook/auxhook.sty
Package: auxhook 2019-12-17 v1.6 Hooks for auxiliary files (HO)
)
Package zref Info: New property list: main on input line 767.
Package zref Info: New property: default on input line 768.
Package zref Info: New property: page on input line 769.
)
\c@abspage=\count360
Package zref Info: New property: abspage on input line 67.
) (/usr/local/texlive/2025/texmf-dist/tex/latex/ocgx2/ocgbase.sty
Package: ocgbase 2024/09/15 v0.24 support package for ocgx2.sty
)
\@anim@box=\box83
\@anim@measbox=\box84
\@anim@tmpdima=\dimen310
\@anim@tmpdimb=\dimen311
\@anim@num=\count361
\@anim@curframe=\count362
\@anim@curframe@zb=\count363
\@anim@resizeflags=\count364
\@anim@skipfram=\count365
\@anim@mulframecnt=\count366
\@anim@@tmlnfile=\read3
\@anim@tmpcnt=\count367
\c@@anim@ltxcnt=\count368
\@anim@curlayer=\count369
\@anim@lineno=\count370
\@anim@curfield=\count371
\@anim@@resizeflags=\count372
) (/usr/local/texlive/2025/texmf-dist/tex/latex/media9/media9.sty
Package: media9 2024/09/16 v1.29 acrobat-9/X compatible media
\g_mix_pkgresizeflag_int=\count373
\l_mix_poster_box=\box85
\g_mix_wd_dim=\dimen312
\g_mix_ht_dim=\dimen313
\g_mix_dp_dim=\dimen314
\g_mix_rmcnt_int=\count374
\l_mix_viewcnt_int=\count375
\g@mix@page@int=\count376
\g_mix_resizeflag_int=\count377
\l_mix_lineno_int=\count378
\l_mix_pbtn_box=\box86
\g_mix_mbtncnt_int=\count379
\mix@btn@dim=\dimen315
) (/usr/local/texlive/2025/texmf-dist/tex/latex/adjustbox/adjustbox.sty
Package: adjustbox 2025/02/26 v1.3c Adjusting TeX boxes (trim, clip, ...)
(/usr/local/texlive/2025/texmf-dist/tex/latex/xkeyval/xkeyval.sty
Package: xkeyval 2022/06/16 v2.9 package option processing (HA)
(/usr/local/texlive/2025/texmf-dist/tex/generic/xkeyval/xkeyval.tex (/usr/local/texlive/2025/texmf-dist/tex/generic/xkeyval/xkvutils.tex
\XKV@toks=\toks48
\XKV@tempa@toks=\toks49
)
\XKV@depth=\count380
File: xkeyval.tex 2014/12/03 v2.7a key=value parser (HA)
)) (/usr/local/texlive/2025/texmf-dist/tex/latex/adjustbox/adjcalc.sty
Package: adjcalc 2012/05/16 v1.1 Provides advanced setlength with multiple back-ends (calc, etex, pgfmath)
) (/usr/local/texlive/2025/texmf-dist/tex/latex/adjustbox/trimclip.sty
Package: trimclip 2025/02/21 v1.2a Trim and clip general TeX material
(/usr/local/texlive/2025/texmf-dist/tex/latex/collectbox/collectbox.sty
Package: collectbox 2022/10/17 v0.4c Collect macro arguments as boxes
\collectedbox=\box87
)
\tc@llx=\dimen316
\tc@lly=\dimen317
\tc@urx=\dimen318
\tc@ury=\dimen319
Package trimclip Info: Using driver 'tc-pdftex.def'.
(/usr/local/texlive/2025/texmf-dist/tex/latex/adjustbox/tc-pdftex.def
File: tc-pdftex.def 2025/02/26 v2.3 Clipping driver for pdftex
))
\adjbox@Width=\dimen320
\adjbox@Height=\dimen321
\adjbox@Depth=\dimen322
\adjbox@Totalheight=\dimen323
\adjbox@pwidth=\dimen324
\adjbox@pheight=\dimen325
\adjbox@pdepth=\dimen326
\adjbox@ptotalheight=\dimen327
(/usr/local/texlive/2025/texmf-dist/tex/latex/ifoddpage/ifoddpage.sty
Package: ifoddpage 2022/10/18 v1.2 Conditionals for odd/even page detection
\c@checkoddpage=\count381
) (/usr/local/texlive/2025/texmf-dist/tex/latex/varwidth/varwidth.sty
Package: varwidth 2009/03/30 ver 0.92; Variable-width minipages
\@vwid@box=\box88
\sift@deathcycles=\count382
\@vwid@loff=\dimen328
\@vwid@roff=\dimen329
)) (/usr/local/texlive/2025/texmf-dist/tex/latex/multirow/multirow.sty
Package: multirow 2024/11/12 v2.9 Span multiple rows of a table
\multirow@colwidth=\skip69
\multirow@cntb=\count383
\multirow@dima=\skip70
\bigstrutjot=\dimen330
) (/usr/local/texlive/2025/texmf-dist/tex/latex/booktabs/booktabs.sty
Package: booktabs 2020/01/12 v1.61803398 Publication quality tables
\heavyrulewidth=\dimen331
\lightrulewidth=\dimen332
\cmidrulewidth=\dimen333
\belowrulesep=\dimen334
\belowbottomsep=\dimen335
\aboverulesep=\dimen336
\abovetopsep=\dimen337
\cmidrulesep=\dimen338
\cmidrulekern=\dimen339
\defaultaddspace=\dimen340
\@cmidla=\count384
\@cmidlb=\count385
\@aboverulesep=\dimen341
\@belowrulesep=\dimen342
\@thisruleclass=\count386
\@lastruleclass=\count387
\@thisrulewidth=\dimen343
) (/usr/local/texlive/2025/texmf-dist/tex/latex/natbib/natbib.sty
Package: natbib 2010/09/13 8.31b (PWD, AO)
\bibhang=\skip71
\bibsep=\skip72
LaTeX Info: Redefining \cite on input line 694.
\c@NAT@ctr=\count388
) (/usr/local/texlive/2025/texmf-dist/tex/latex/psnfss/pifont.sty
Package: pifont 2020/03/25 PSNFSS-v9.3 Pi font support (SPQR)
LaTeX Font Info: Trying to load font information for U+pzd on input line 63.
(/usr/local/texlive/2025/texmf-dist/tex/latex/psnfss/upzd.fd
File: upzd.fd 2001/06/04 font definitions for U/pzd.
)
LaTeX Font Info: Trying to load font information for U+psy on input line 64.
(/usr/local/texlive/2025/texmf-dist/tex/latex/psnfss/upsy.fd
File: upsy.fd 2001/06/04 font definitions for U/psy.
)) (/usr/local/texlive/2025/texmf-dist/tex/generic/ulem/ulem.sty
\UL@box=\box89
\UL@hyphenbox=\box90
\UL@skip=\skip73
\UL@hook=\toks50
\UL@height=\dimen344
\UL@pe=\count389
\UL@pixel=\dimen345
\ULC@box=\box91
Package: ulem 2019/11/18
\ULdepth=\dimen346
) (/usr/local/texlive/2025/texmf-dist/tex/latex/caption/subcaption.sty
Package: subcaption 2023/07/28 v1.6b Sub-captions (AR)
(/usr/local/texlive/2025/texmf-dist/tex/latex/caption/caption.sty
Package: caption 2023/08/05 v3.6o Customizing captions (AR)
(/usr/local/texlive/2025/texmf-dist/tex/latex/caption/caption3.sty
Package: caption3 2023/07/31 v2.4d caption3 kernel (AR)
\caption@tempdima=\dimen347
\captionmargin=\dimen348
\caption@leftmargin=\dimen349
\caption@rightmargin=\dimen350
\caption@width=\dimen351
\caption@indent=\dimen352
\caption@parindent=\dimen353
\caption@hangindent=\dimen354
Package caption Info: beamer document class detected.
(/usr/local/texlive/2025/texmf-dist/tex/latex/caption/caption-beamer.sto
File: caption-beamer.sto 2022/01/06 v2.0c Adaption of the caption package to the beamer document classes (AR)
))
\c@caption@flags=\count390
\c@continuedfloat=\count391
Package caption Info: hyperref package is loaded.
Package caption Info: Hyperref support is turned off
(caption) because hyperref has stopped early.
)
Package caption Info: New subtype `subfigure' on input line 238.
\c@subfigure=\count392
Package caption Info: New subtype `subtable' on input line 238.
\c@subtable=\count393
) (/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerthemeMadrid.sty (/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamercolorthemewhale.sty) (/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamercolorthemeorchid.sty) (/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerinnerthemerounded.sty) (/usr/local/texlive/2025/texmf-dist/tex/latex/beamer/beamerouterthemeinfolines.sty)) (./helpers.tex)

Package hyperref Warning: Token not allowed in a PDF string (Unicode):
(hyperref) removing `math shift' on input line 66.


Package hyperref Warning: Token not allowed in a PDF string (Unicode):
(hyperref) removing `superscript' on input line 66.


Package hyperref Warning: Token not allowed in a PDF string (Unicode):
(hyperref) removing `math shift' on input line 66.

) (./presentation.aux)
\openout1 = `presentation.aux'.

LaTeX Font Info: Checking defaults for OML/cmm/m/it on input line 7.
LaTeX Font Info: ... okay on input line 7.
LaTeX Font Info: Checking defaults for OMS/cmsy/m/n on input line 7.
LaTeX Font Info: ... okay on input line 7.
LaTeX Font Info: Checking defaults for OT1/cmr/m/n on input line 7.
LaTeX Font Info: ... okay on input line 7.
LaTeX Font Info: Checking defaults for T1/cmr/m/n on input line 7.
LaTeX Font Info: ... okay on input line 7.
LaTeX Font Info: Checking defaults for TS1/cmr/m/n on input line 7.
LaTeX Font Info: ... okay on input line 7.
LaTeX Font Info: Checking defaults for OMX/cmex/m/n on input line 7.
LaTeX Font Info: ... okay on input line 7.
LaTeX Font Info: Checking defaults for U/cmr/m/n on input line 7.
LaTeX Font Info: ... okay on input line 7.
LaTeX Font Info: Checking defaults for PD1/pdf/m/n on input line 7.
LaTeX Font Info: ... okay on input line 7.
LaTeX Font Info: Checking defaults for PU/pdf/m/n on input line 7.
LaTeX Font Info: ... okay on input line 7.

*geometry* driver: auto-detecting
*geometry* detected driver: pdftex
*geometry* verbose mode - [ preamble ] result:
* driver: pdftex
* paper: custom
* layout: <same size as paper>
* layoutoffset:(h,v)=(0.0pt,0.0pt)
* modes: includehead includefoot
* h-part:(L,W,R)=(10.95003pt, 342.2953pt, 10.95003pt)
* v-part:(T,H,B)=(0.0pt, 273.14662pt, 0.0pt)
* \paperwidth=364.19536pt
* \paperheight=273.14662pt
* \textwidth=342.2953pt
* \textheight=244.6939pt
* \oddsidemargin=-61.31996pt
* \evensidemargin=-61.31996pt
* \topmargin=-72.26999pt
* \headheight=14.22636pt
* \headsep=0.0pt
* \topskip=11.0pt
* \footskip=14.22636pt
* \marginparwidth=4.0pt
* \marginparsep=10.0pt
* \columnsep=10.0pt
* \skip\footins=10.0pt plus 4.0pt minus 2.0pt
* \hoffset=0.0pt
* \voffset=0.0pt
* \mag=1000
* \@twocolumnfalse
* \@twosidefalse
* \@mparswitchfalse
* \@reversemarginfalse
* (1in=72.27pt=25.4mm, 1cm=28.453pt)

(/usr/local/texlive/2025/texmf-dist/tex/context/base/mkii/supp-pdf.mkii
[Loading MPS to PDF converter (version 2006.09.02).]
\scratchcounter=\count394
\scratchdimen=\dimen355
\scratchbox=\box92
\nofMPsegments=\count395
\nofMParguments=\count396
\everyMPshowfont=\toks51
\MPscratchCnt=\count397
\MPscratchDim=\dimen356
\MPnumerator=\count398
\makeMPintoPDFobject=\count399
\everyMPtoPDFconversion=\toks52
) (/usr/local/texlive/2025/texmf-dist/tex/latex/epstopdf-pkg/epstopdf-base.sty
Package: epstopdf-base 2020-01-24 v2.11 Base part for package epstopdf
Package epstopdf-base Info: Redefining graphics rule for `.eps' on input line 485.
(/usr/local/texlive/2025/texmf-dist/tex/latex/latexconfig/epstopdf-sys.cfg
File: epstopdf-sys.cfg 2010/07/13 v1.3 Configuration of (r)epstopdf for TeX Live
))
Package hyperref Info: Link coloring OFF on input line 7.
(./presentation.out) (./presentation.out)
\@outlinefile=\write5
\openout5 = `presentation.out'.

LaTeX Font Info: Overwriting symbol font `operators' in version `normal'
(Font) OT1/cmr/m/n --> OT1/cmss/m/n on input line 7.
LaTeX Font Info: Overwriting symbol font `operators' in version `bold'
(Font) OT1/cmr/bx/n --> OT1/cmss/b/n on input line 7.
\symnumbers=\mathgroup6
\sympureletters=\mathgroup7
LaTeX Font Info: Overwriting math alphabet `\mathrm' in version `normal'
(Font) OT1/cmss/m/n --> OT1/cmr/m/n on input line 7.
|
| 888 |
+
LaTeX Font Info: Redeclaring math alphabet \mathbf on input line 7.
|
| 889 |
+
LaTeX Font Info: Overwriting math alphabet `\mathbf' in version `normal'
|
| 890 |
+
(Font) OT1/cmr/bx/n --> OT1/cmss/b/n on input line 7.
|
| 891 |
+
LaTeX Font Info: Overwriting math alphabet `\mathbf' in version `bold'
|
| 892 |
+
(Font) OT1/cmr/bx/n --> OT1/cmss/b/n on input line 7.
|
| 893 |
+
LaTeX Font Info: Redeclaring math alphabet \mathsf on input line 7.
|
| 894 |
+
LaTeX Font Info: Overwriting math alphabet `\mathsf' in version `normal'
|
| 895 |
+
(Font) OT1/cmss/m/n --> OT1/cmss/m/n on input line 7.
|
| 896 |
+
LaTeX Font Info: Overwriting math alphabet `\mathsf' in version `bold'
|
| 897 |
+
(Font) OT1/cmss/bx/n --> OT1/cmss/m/n on input line 7.
|
| 898 |
+
LaTeX Font Info: Redeclaring math alphabet \mathit on input line 7.
|
| 899 |
+
LaTeX Font Info: Overwriting math alphabet `\mathit' in version `normal'
|
| 900 |
+
(Font) OT1/cmr/m/it --> OT1/cmss/m/it on input line 7.
|
| 901 |
+
LaTeX Font Info: Overwriting math alphabet `\mathit' in version `bold'
|
| 902 |
+
(Font) OT1/cmr/bx/it --> OT1/cmss/m/it on input line 7.
|
| 903 |
+
LaTeX Font Info: Redeclaring math alphabet \mathtt on input line 7.
|
| 904 |
+
LaTeX Font Info: Overwriting math alphabet `\mathtt' in version `normal'
|
| 905 |
+
(Font) OT1/cmtt/m/n --> OT1/cmtt/m/n on input line 7.
|
| 906 |
+
LaTeX Font Info: Overwriting math alphabet `\mathtt' in version `bold'
|
| 907 |
+
(Font) OT1/cmtt/m/n --> OT1/cmtt/m/n on input line 7.
|
| 908 |
+
LaTeX Font Info: Overwriting symbol font `numbers' in version `bold'
|
| 909 |
+
(Font) OT1/cmss/m/n --> OT1/cmss/b/n on input line 7.
|
| 910 |
+
LaTeX Font Info: Overwriting symbol font `pureletters' in version `bold'
|
| 911 |
+
(Font) OT1/cmss/m/it --> OT1/cmss/b/it on input line 7.
|
| 912 |
+
LaTeX Font Info: Overwriting math alphabet `\mathrm' in version `bold'
|
| 913 |
+
(Font) OT1/cmss/b/n --> OT1/cmr/b/n on input line 7.
|
| 914 |
+
LaTeX Font Info: Overwriting math alphabet `\mathbf' in version `bold'
|
| 915 |
+
(Font) OT1/cmss/b/n --> OT1/cmss/b/n on input line 7.
|
| 916 |
+
LaTeX Font Info: Overwriting math alphabet `\mathsf' in version `bold'
|
| 917 |
+
(Font) OT1/cmss/m/n --> OT1/cmss/b/n on input line 7.
|
| 918 |
+
LaTeX Font Info: Overwriting math alphabet `\mathit' in version `bold'
|
| 919 |
+
(Font) OT1/cmss/m/it --> OT1/cmss/b/it on input line 7.
|
| 920 |
+
LaTeX Font Info: Overwriting math alphabet `\mathtt' in version `bold'
|
| 921 |
+
(Font) OT1/cmtt/m/n --> OT1/cmtt/b/n on input line 7.
|
| 922 |
+
LaTeX Font Info: Redeclaring symbol font `pureletters' on input line 7.
|
| 923 |
+
LaTeX Font Info: Overwriting symbol font `pureletters' in version `normal'
|
| 924 |
+
(Font) OT1/cmss/m/it --> OT1/mathkerncmss/m/sl on input line 7.
|
| 925 |
+
LaTeX Font Info: Overwriting symbol font `pureletters' in version `bold'
|
| 926 |
+
(Font) OT1/cmss/b/it --> OT1/mathkerncmss/m/sl on input line 7.
|
| 927 |
+
LaTeX Font Info: Overwriting symbol font `pureletters' in version `bold'
|
| 928 |
+
(Font) OT1/mathkerncmss/m/sl --> OT1/mathkerncmss/bx/sl on input line 7.
|
| 929 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/translator/translator-basic-dictionary-English.dict
|
| 930 |
+
Dictionary: translator-basic-dictionary, Language: English
|
| 931 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/latex/translator/translator-bibliography-dictionary-English.dict
|
| 932 |
+
Dictionary: translator-bibliography-dictionary, Language: English
|
| 933 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/latex/translator/translator-environment-dictionary-English.dict
|
| 934 |
+
Dictionary: translator-environment-dictionary, Language: English
|
| 935 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/latex/translator/translator-months-dictionary-English.dict
|
| 936 |
+
Dictionary: translator-months-dictionary, Language: English
|
| 937 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/latex/translator/translator-numbers-dictionary-English.dict
|
| 938 |
+
Dictionary: translator-numbers-dictionary, Language: English
|
| 939 |
+
) (/usr/local/texlive/2025/texmf-dist/tex/latex/translator/translator-theorem-dictionary-English.dict
|
| 940 |
+
Dictionary: translator-theorem-dictionary, Language: English
|
| 941 |
+
)
|
| 942 |
+
Package caption Info: Begin \AtBeginDocument code.
|
| 943 |
+
Package caption Info: End \AtBeginDocument code.
|
| 944 |
+
(./presentation.nav) (./01.tex
|
| 945 |
+
LaTeX Font Info: Trying to load font information for U+msa on input line 6.
|
| 946 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/amsfonts/umsa.fd
|
| 947 |
+
File: umsa.fd 2013/01/14 v3.01 AMS symbols A
|
| 948 |
+
)
|
| 949 |
+
LaTeX Font Info: Trying to load font information for U+msb on input line 6.
|
| 950 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/amsfonts/umsb.fd
|
| 951 |
+
File: umsb.fd 2013/01/14 v3.01 AMS symbols B
|
| 952 |
+
)
|
| 953 |
+
LaTeX Font Info: Trying to load font information for OT1+mathkerncmss on input line 6.
|
| 954 |
+
(/usr/local/texlive/2025/texmf-dist/tex/latex/sansmathaccent/ot1mathkerncmss.fd
|
| 955 |
+
File: ot1mathkerncmss.fd 2020/01/31 Fontinst v1.933 font definitions for OT1/mathkerncmss.
|
| 956 |
+
)
|
| 957 |
+
<../logos/hf.pdf, id=27, 1027.84pt x 1027.84pt>
|
| 958 |
+
File: ../logos/hf.pdf Graphic file (type pdf)
|
| 959 |
+
<use ../logos/hf.pdf>
|
| 960 |
+
Package pdftex.def Info: ../logos/hf.pdf used on input line 6.
|
| 961 |
+
(pdftex.def) Requested size: 14.22498pt x 14.22636pt.
|
| 962 |
+
<../logos/lerobot.png, id=28, 2383.90625pt x 2213.77063pt>
|
| 963 |
+
File: ../logos/lerobot.png Graphic file (type png)
|
| 964 |
+
<use ../logos/lerobot.png>
|
| 965 |
+
Package pdftex.def Info: ../logos/lerobot.png used on input line 6.
|
| 966 |
+
(pdftex.def) Requested size: 85.35826pt x 79.24641pt.
|
| 967 |
+
|
| 968 |
+
|
| 969 |
+
[1
|
| 970 |
+
|
| 971 |
+
{/usr/local/texlive/2025/texmf-var/fonts/map/pdftex/updmap/pdftex.map} <../logos/hf.pdf> <../logos/lerobot.png>])
|
| 972 |
+
\tf@nav=\write6
|
| 973 |
+
\openout6 = `presentation.nav'.
|
| 974 |
+
|
| 975 |
+
\tf@toc=\write7
|
| 976 |
+
\openout7 = `presentation.toc'.
|
| 977 |
+
|
| 978 |
+
\tf@snm=\write8
|
| 979 |
+
\openout8 = `presentation.snm'.
|
| 980 |
+
|
| 981 |
+
(./presentation.aux)
|
| 982 |
+
***********
|
| 983 |
+
LaTeX2e <2024-11-01> patch level 2
|
| 984 |
+
L3 programming layer <2025-01-18>
|
| 985 |
+
***********
|
| 986 |
+
Package rerunfilecheck Info: File `presentation.out' has not changed.
|
| 987 |
+
(rerunfilecheck) Checksum: D41D8CD98F00B204E9800998ECF8427E;0.
|
| 988 |
+
)
|
| 989 |
+
Here is how much of TeX's memory you used:
|
| 990 |
+
30067 strings out of 473190
|
| 991 |
+
599521 string characters out of 5715800
|
| 992 |
+
975629 words of memory out of 5000000
|
| 993 |
+
52701 multiletter control sequences out of 15000+600000
|
| 994 |
+
568060 words of font info for 70 fonts, out of 8000000 for 9000
|
| 995 |
+
1141 hyphenation exceptions out of 8191
|
| 996 |
+
128i,15n,134p,460b,595s stack positions out of 10000i,1000n,20000p,200000b,200000s
|
| 997 |
+
</usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmss10.pfb></usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmss12.pfb></usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmss8.pfb></usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmtt12.pfb></usr/local/texlive/2025/texmf-dist/fonts/type1/public/amsfonts/cm/cmtt8.pfb>
|
| 998 |
+
Output written on presentation.pdf (1 page, 300003 bytes).
|
| 999 |
+
PDF statistics:
|
| 1000 |
+
93 PDF objects out of 1000 (max. 8388607)
|
| 1001 |
+
60 compressed objects within 1 object stream
|
| 1002 |
+
3 named destinations out of 1000 (max. 500000)
|
| 1003 |
+
95 words of extra memory for PDF output out of 10000 (max. 10000000)
|
| 1004 |
+
|
app/scripts/latex-to-mdx/input/slides/presentation.nav
ADDED
@@ -0,0 +1,7 @@
\headcommand {\slideentry {0}{0}{1}{1/1}{}{0}}
\headcommand {\beamer@framepages {1}{1}}
\headcommand {\beamer@partpages {1}{1}}
\headcommand {\beamer@subsectionpages {1}{1}}
\headcommand {\beamer@sectionpages {1}{1}}
\headcommand {\beamer@documentpages {1}}
\headcommand {\gdef \inserttotalframenumber {1}}
app/scripts/latex-to-mdx/input/slides/presentation.out
ADDED
File without changes