Add files using upload-large-folder tool
This view is limited to 50 files because it contains too many changes. See raw diff.
- 2004.10353/main_diagram/main_diagram.drawio +1 -0
- 2004.10353/main_diagram/main_diagram.pdf +0 -0
- 2004.10353/paper_text/intro_method.md +68 -0
- 2006.09740/record.json +32 -0
- 2007.05060/record.json +32 -0
- 2007.13040/record.json +32 -0
- 2008.09641/record.json +32 -0
- 2009.09439/record.json +32 -0
- 2010.11230/record.json +32 -0
- 2012.15788/record.json +32 -0
- 2102.04152/record.json +32 -0
- 2102.11938/record.json +32 -0
- 2103.17185/record.json +32 -0
- 2104.05981/record.json +32 -0
- 2104.06669/record.json +32 -0
- 2104.08701/record.json +32 -0
- 2106.04335/record.json +32 -0
- 2106.05266/record.json +32 -0
- 2108.01499/record.json +32 -0
- 2108.08265/record.json +32 -0
- 2108.12318/record.json +32 -0
- 2109.02614/record.json +32 -0
- 2109.11171/record.json +32 -0
- 2110.05291/record.json +32 -0
- 2110.05419/main_diagram/main_diagram.drawio +1 -0
- 2110.05419/main_diagram/main_diagram.pdf +0 -0
- 2110.05419/paper_text/intro_method.md +105 -0
- 2110.05877/record.json +32 -0
- 2110.06257/record.json +32 -0
- 2110.08350/record.json +32 -0
- 2111.15097/main_diagram/main_diagram.drawio +1 -0
- 2111.15097/main_diagram/main_diagram.pdf +0 -0
- 2111.15097/paper_text/intro_method.md +123 -0
- 2112.01036/record.json +32 -0
- 2112.03258/main_diagram/main_diagram.drawio +1 -0
- 2112.03258/main_diagram/main_diagram.pdf +0 -0
- 2112.03258/paper_text/intro_method.md +113 -0
- 2112.04728/record.json +32 -0
- 2112.05261/record.json +32 -0
- 2112.05883/main_diagram/main_diagram.drawio +0 -0
- 2112.05883/paper_text/intro_method.md +27 -0
- 2112.07917/record.json +32 -0
- 2202.06200/main_diagram/main_diagram.drawio +1 -0
- 2202.06200/main_diagram/main_diagram.pdf +0 -0
- 2202.06200/paper_text/intro_method.md +145 -0
- 2203.04723/record.json +32 -0
- 2203.07881/record.json +32 -0
- 2203.12000/record.json +32 -0
- 2203.16220/record.json +32 -0
- 2204.03840/record.json +32 -0
2004.10353/main_diagram/main_diagram.drawio
ADDED
@@ -0,0 +1 @@
<mxfile modified="2019-09-23T19:59:08.478Z" host="www.draw.io" agent="Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/76.0.3809.132 Safari/537.36" etag="LnPrVURCDklOCTidBwCJ" version="11.3.0" type="google" pages="1"><diagram id="CY1wFINZTNLj2MTf_vni" name="Page-1">7Vrfd6I4FP5reKyHJPx8VOvOnNnpnjnbh5ndlz0MpMAMEjZGxf71m0goBGxFBbXO+tCSL8kNufnu5buohqbz/AP1suiBBDjRoB7kGrrXIHQR5H8FsCkA07YLIKRxUECgAh7jZyxBXaLLOMALZSAjJGFxpoI+SVPsMwXzKCVrddgTSdRVMy/ELeDR95I2+jUOWFSgDrQr/COOw6hcGVhu0TP3ysFyJ4vIC8i6BqGZhqaUEFZczfMpToTvSr8U8357pfflxihOWZcJ6PtnO3m2zBSv9M8hCH9sfn66k1ZWXrKUG9aglXB7kyfCzfK7ZhvpCuvfJSk77hbbgxrzAdDI8qpTuNjz1QlTsqQxprzrD7yuD7VC+b++oE8SQpX5GkS67vuW1Z77vZzMt13MP8SmsNq26Z1g0/eF1bZNtsdmC4aK6yHDucAjNk84APjlglHyE0/lTaQkxeLW4iRpQF4Shylv+pwl/AzQZIUpizm/x7JjHgeBWGayjmKGH7Pi9NY8mDlGyTINsCCQ/nJbdcaV9OE2cV6DJAM/YDLHjG74ENlryGDYqM11LbQkFNWiqsQ8Gczhi92K7/xCUv4A+sPbpv8r8CLz0leWH3K/h4WhNkXa2Nq/wQtt5v+cMFROAGa3pACHSgpoR1Jo+B6nwViIi8qntWPgnqCbb8I/IwTNEvhrC1jIKYH7XLqwaG3qrS+Yxnwz4nDe9jLzaIjZ/gc8DhSZ0z6LmrPNHc4uMYoTj8UrVRztOgG5whcSbyNDHrVIYvWjNgzVwoKHm4/lpLqWadkZ6citfVSzuj7S6x+krlI4rbXKliwvLjmeP8ap/Mlj9q1kA78umGPKVkUb0djUGp1JU3i5gOgP9PBn+PV58TfQP80efrfvl6h8LF4JaWyXb16ljX4cbRw107jgrLww+8srlt1PXimpdqePgG0ofDPcczKuLPfeWT5zjX6I6RpqPlPTZNk8E0/Ls3hdFR8rkwqtdrguO10pazNdGxvaeLy9MDXHHqByrBbpv4KUtif6cYr4lxGTAF5XhQn6f8Oyk0znLDsz3HvdOVQVmWlTVxPy8IQysmOcHn4E/NYms5uoAoOYYp/FREzjzzwRaUMEM+gYzWiwaO5Rw6GmhKtqxSMlnG46FysZjKvSZcAwmzUDso+TZsCwG6achqWh5ZjVH+dMAPsmHYDmxUhnXRfpzFahejTpLF6AmbWCwLkoA+3+GNiuXE9PexZUGcgj5nwcdK6Lg7bREwNtdLa0t/uNwH7OFUqn/FoWNmRQ5GVi3DwPxRfTo6eErP3Io2zkpSlhnlAr/4CaDkrw01Yxig1hOltxJiykLSrH86bbU51iNd8cgJayMdosAT0Im7fe+N2qt22n4W29JPcef78RgCf5u8M74vfsb4Cs1qMQdXE4NAdyeAcR9a4dbrUF72Ud7ty4w51mWdA9qQyVxHe+uL0hn7vwTM9M3qx+l1UImurHbWj2Hw==</diagram></mxfile>
2004.10353/main_diagram/main_diagram.pdf
ADDED
Binary file (25.4 kB)
2004.10353/paper_text/intro_method.md
ADDED
@@ -0,0 +1,68 @@
# Introduction

Hindi is written in the Devanagari script, an abugida: an orthographic system whose basic unit is a consonant plus an optional vowel diacritic, or a single vowel. Devanagari is fairly regular, but a Hindi word's actual pronunciation can differ from what is literally written in the Devanagari script.[^1] For instance, the Hindi word $\langle$pep@R@$\rangle$ 'paper' consists of three units, $\langle$pe$\rangle$, $\langle$p@$\rangle$, and $\langle$R@$\rangle$, corresponding to the pronounced forms [pe], [p@], and [R]. The second unit's inherent schwa is retained in the pronounced form, but the third unit's inherent schwa is deleted.

Predicting whether a schwa will be deleted from a word's orthographic form is generally difficult. Some reliable rules can be stated, e.g. 'delete any schwa at the end of the word', but these alone do not perform well enough for use in an application that requires schwa deletion, like a text-to-speech synthesis system.

This work approaches the problem of predicting schwa deletion in Hindi with machine learning techniques, achieving high accuracy with minimal human intervention. We also successfully apply our Hindi schwa deletion model to a related language, Punjabi. Our scripts for obtaining machine-readable versions of the Hindi and Punjabi pronunciation datasets are published to facilitate future comparisons.[^2]

Previous approaches to schwa deletion in Hindi broadly fall into two classes.

The first class is characterized by its use of rules given in the formalism of *The Sound Pattern of English* [@spe]. Drawing on analyses of schwa deletion produced by linguists in this framework [e.g., @ohala_1983], others built schwa deletion systems by implementing their rules. For example, this is a rule used by @narasimhan_schwa-deletion_2004, describing schwa deletion for words like $\langle$\@Ng@li:$\rangle$:

::: center
  --- --- --- ------- --- ---- --------------- --- --- --- --- ----
  V   C   C   **a**   C   V    $\rightarrow$   V   C   C   C   V
  @   N   g   **@**   l   i:   $\rightarrow$   @   N   g   l   i:
  --- --- --- ------- --- ---- --------------- --- --- --- --- ----
:::

Paraphrasing, this rule could be read: "if a schwa occurs with a vowel and two consonants to its left, and a consonant and a vowel to its right, it should be deleted." A typical system of this class would apply many such rules to reach a word's output form, sometimes along with other information, like the set of allowable consonant clusters in Hindi. These systems were able to achieve fair accuracy (@narasimhan_schwa-deletion_2004 achieve 89%), but were ill-equipped to deal with cases that seem to rely on detailed facts about Hindi morphology and prosody.
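
To make the rule concrete, here is a minimal sketch of how such a context check could be tested over a list of phones. This is our illustration, not @narasimhan_schwa-deletion_2004's implementation; the vowel inventory is hypothetical and abbreviated.

```python
VOWELS = {"@", "a", "aa", "i", "i:", "u", "u:", "e", "o"}  # hypothetical, abbreviated

def rule_applies(phones, i):
    """True if the schwa at index i sits in the context V C C _ C V."""
    if phones[i] != "@" or i < 3 or i > len(phones) - 3:
        return False
    v = lambda p: p in VOWELS
    return (v(phones[i - 3]) and not v(phones[i - 2]) and not v(phones[i - 1])
            and not v(phones[i + 1]) and v(phones[i + 2]))

word = ["@", "N", "g", "@", "l", "i:"]
print([p for i, p in enumerate(word) if not rule_applies(word, i)])
# ['@', 'N', 'g', 'l', 'i:']  -- the medial schwa is deleted
```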

![A representative example of the linguistic representations used by @tyson_prosodic_2009 [-@tyson_prosodic_2009]. Proceeding from top to bottom, a prosodic word (PrWd) consists of feet, syllables (which have weights), and syllable templates.](pstructure.pdf){#fig:pstructure width="50%"}

Systems of the second class make use of linguistically richer representations of words. Typical of this class is the system of @tyson_prosodic_2009, which analyzes each word into a hierarchical phonological representation (see figure [1](#fig:pstructure){reference-type="ref" reference="fig:pstructure"}). These same representations had been used in linguistic analyses: @pandey, for instance, as noted by @tyson_prosodic_2009, "claimed that schwas in Hindi cannot appear between a strong and weak rhyme[^3] within a prosodic foot." Systems using prosodic representations perform fairly well, with reported accuracies ranging from 86% to 94%, but prosody proved not to be a silver bullet; @tyson_prosodic_2009 remark, "it appears that schwa deletion is a phenomenon governed by not only prosodic information but by the observance of the phonotactics of consonant clusters."

There are other approaches to subsets of the schwa-deletion problem. One is the diachronic analysis applied by @choudhury, which achieved 99.80% word-level accuracy on native Sanskrit-derived terms.

Machine learning had not been applied to schwa deletion in Hindi prior to our work. @johny_brahmic_2018 used neural networks to model schwa deletion in Bengali (where, unlike in Hindi, it is not a binary classification problem) and achieved large gains in accuracy. We employ a similar approach for Hindi, but go further by applying gradient-boosted decision trees to the problem, which are more easily interpreted in linguistic terms.

Similar research has been undertaken in other Indo-Aryan languages that undergo schwa deletion, albeit to a lesser extent than Hindi. @wasala-06, for example, proposed a rigorous rule-based G2P system for Sinhala.

# Method

We frame schwa deletion as a binary classification problem: orthographic schwas are either fully retained or fully deleted when spoken. Previous work has shown that even with rich linguistic representations of words, it is difficult to discover categorical rules that can predict schwa deletion. This led us to approach the problem with machine learning, which we felt would stand a better chance of attaining high performance.

We obtained training data from digitized dictionaries hosted by the University of Chicago [Digital Dictionaries of South Asia](https://dsalsrv04.uchicago.edu/dictionaries/) project. The Hindi data, comprising the original Devanagari orthography and the phonemic transcription, was parsed out of @mcgregor and @bahri and transcribed into an ASCII format. The Punjabi data was similarly processed from @singh. [1](#table:entry-example){reference-type="ref+Label" reference="table:entry-example"} gives an example entry from the @mcgregor Hindi dataset.

To find all instances of schwa retention and schwa deletion, we force-aligned the orthographic and phonemic representations of each dictionary entry using a linear-time algorithm. In cases where forced alignment failed due to idiosyncrasies in the source data (typos, OCR errors, etc.), we discarded the entire word. We provide statistics about our datasets in [2](#table:datasets){reference-type="ref+label" reference="table:datasets"}. We primarily used the dataset from @mcgregor in training our Hindi models due to its comprehensiveness and high quality.
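
The alignment procedure is not spelled out here; under the assumption that the phonemic form differs from the orthographic one only by deleted schwas, a single greedy left-to-right pass (a sketch with hypothetical names, not the paper's code) runs in linear time:

```python
def label_schwas(orth, phon, schwa="a"):
    """Align orthographic and phonemic transcriptions in one greedy pass,
    labeling each orthographic schwa True (retained) or False (deleted).
    Returns None if the pair cannot be aligned (typos, OCR errors, ...)."""
    labels, j = [], 0
    for ph in orth:
        if j < len(phon) and phon[j] == ph:
            if ph == schwa:
                labels.append(True)   # schwa survives in the phonemic form
            j += 1
        elif ph == schwa:
            labels.append(False)      # schwa deleted
        else:
            return None               # non-schwa mismatch: discard the word
    return labels if j == len(phon) else None

print(label_schwas("a ~ k a rr aa h a tt a".split(),
                   "a ~ k rr aa h a tt".split()))
# [True, False, True, False]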

:::: center
::: {#table:entry-example}
  ------------------ ---------------------------
  **Devanagari**     akwAhV
  **Orthographic**   `a ~ k a rr aa h a tt a`
  **Phonemic**       `a ~ k rr aa h a tt`
  ------------------ ---------------------------

  : An example entry from the Hindi training dataset.
:::
::::

:::: center
::: {#table:datasets}
  **Hindi Dict.**     **Entries**   **Schwas**   **Deletion Rate**
  ------------------- ------------- ------------ -------------------
  McGregor            34,952        36,183       52.94%
  Bahri               9,769         14,082       49.41%
  Google              847           1,098        56.28%
  **Punjabi Dict.**   **Entries**   **Schwas**   **Deletion Rate**
  Singh               28,324        34,576       52.25%

  : Statistics about the datasets used. The deletion rate is the percentage of schwas that are deleted in their phonemic representation. The Google dataset, taken from @johny_brahmic_2018, was not considered in our final results due to its small size and over-representation of proper nouns.
:::
::::

Each schwa instance was an input in our training set. The output was a boolean value indicating whether the schwa was retained. The input features were a one-hot encoding of a variable window of phones to the left ($c_{-n}, \dots, c_{-1}$) and right ($c_{+1}, \dots, c_{+m}$) of the schwa instance ($c_0$) under consideration. The length of the window on either side was treated as a hyperparameter and tuned. We also tested whether including phonological features (for vowels: height, backness, roundedness, and length; for consonants: voice, aspiration, and place of articulation) of the adjacent graphemes affected the accuracy of the model.
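
As a sketch of that encoding (our illustration; the `PAD` symbol and the window size $n = m = 2$ are hypothetical), each schwa instance becomes a row of surrounding phone identities, which scikit-learn can then one-hot encode:

```python
from sklearn.preprocessing import OneHotEncoder

PAD = "#"  # hypothetical padding symbol for word edges

def window(phones, i, n=2, m=2):
    """Identities of the n phones left and m phones right of position i."""
    padded = [PAD] * n + list(phones) + [PAD] * m
    return padded[i:i + n] + padded[i + n + 1:i + n + 1 + m]

rows = [window("@ N g @ l i:".split(), 3)]   # [['N', 'g', 'l', 'i:']]
X = OneHotEncoder(handle_unknown="ignore").fit_transform(rows)
```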

We trained three models on each dataset: logistic regression from scikit-learn, MLPClassifier (a multilayer perceptron neural network) from scikit-learn, and XGBClassifier (gradient-boosted decision trees) from XGBoost. We varied the size of the window of adjacent phonemes and trained with and without the phonological feature data.
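
A minimal version of that comparison might look like the following; the hyperparameters shown are placeholders, not the tuned values, and `X`, `y` are assumed to be the features and labels built as above:

```python
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

# X: one-hot window features as above; y: 1 if the schwa is retained, else 0
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)

for model in (LogisticRegression(max_iter=1000),
              MLPClassifier(hidden_layer_sizes=(64,)),
              XGBClassifier()):
    model.fit(X_train, y_train)
    print(type(model).__name__, model.score(X_test, y_test))
```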
2006.09740/record.json
ADDED
@@ -0,0 +1,32 @@
{
  "arxiv_id": "2006.09740",
  "month": "2020_06",
  "year": 2020,
  "conference": "NEURIPS",
  "title": "Probabilistic orientation estimation with matrix Fisher distributions",
  "arxiv_url": "https://arxiv.org/abs/2006.09740",
  "source": {
    "paper_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2020_06/main_diagram_database/2006.09740",
    "tex_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2020_06/tex_files_extracted/2006.09740",
    "paper_md": "/home/zling/lzl/ICML2026/Build_Dataset/data/2020_06/main_diagram_database/2006.09740/paper_text/paper.md",
    "metadata_json": "/home/zling/lzl/ICML2026/Build_Dataset/data/2020_06/main_diagram_database/2006.09740/metadata.json",
    "intro_method_from": "/home/zling/lzl/ICML2026/Build_Dataset/data/2020_06/main_diagram_database/2006.09740/paper_text/paper.md",
    "intro_method_from_kind": "markdown"
  },
  "files": {
    "main_drawio": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2006.09740/main_diagram/main_diagram.drawio",
    "main_png": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2006.09740/main_diagram/main_diagram.png",
    "main_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2006.09740/main_diagram/main_diagram.pdf",
    "intro_method_md": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2006.09740/paper_text/intro_method.md",
    "paper_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2006.09740/paper.pdf",
    "latex": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2006.09740/latex_source"
  },
  "status": {
    "copy_drawio": "exists",
    "copy_png": "exists",
    "diagram_pdf": "pdf_exists",
    "intro_method": "exists",
    "paper_pdf": "exists",
    "latex": "exists"
  }
}
2007.05060/record.json
ADDED
@@ -0,0 +1,32 @@
{
  "arxiv_id": "2007.05060",
  "month": "2020_07",
  "year": 2020,
  "conference": "NEURIPS",
  "title": "Program Synthesis with Pragmatic Communication",
  "arxiv_url": "https://arxiv.org/abs/2007.05060",
  "source": {
    "paper_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2020_07/main_diagram_database/2007.05060",
    "tex_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2020_07/tex_files_extracted/2007.05060",
    "paper_md": "/home/zling/lzl/ICML2026/Build_Dataset/data/2020_07/main_diagram_database/2007.05060/paper_text/paper.md",
    "metadata_json": "/home/zling/lzl/ICML2026/Build_Dataset/data/2020_07/main_diagram_database/2007.05060/metadata.json",
    "intro_method_from": "/home/zling/lzl/ICML2026/Build_Dataset/data/2020_07/main_diagram_database/2007.05060/paper_text/paper.md",
    "intro_method_from_kind": "markdown"
  },
  "files": {
    "main_drawio": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2007.05060/main_diagram/main_diagram.drawio",
    "main_png": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2007.05060/main_diagram/main_diagram.png",
    "main_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2007.05060/main_diagram/main_diagram.pdf",
    "intro_method_md": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2007.05060/paper_text/intro_method.md",
    "paper_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2007.05060/paper.pdf",
    "latex": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2007.05060/latex_source"
  },
  "status": {
    "copy_drawio": "exists",
    "copy_png": "exists",
    "diagram_pdf": "pdf_exists",
    "intro_method": "exists",
    "paper_pdf": "exists",
    "latex": "exists"
  }
}
2007.13040/record.json
ADDED
@@ -0,0 +1,32 @@
{
  "arxiv_id": "2007.13040",
  "month": "2020_07",
  "year": 2021,
  "conference": "ICML",
  "title": "Improving Generalization in Meta-learning via Task Augmentation",
  "arxiv_url": "https://arxiv.org/abs/2007.13040",
  "source": {
    "paper_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2020_07/main_diagram_database/2007.13040",
    "tex_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2020_07/tex_files_extracted/2007.13040",
    "paper_md": "/home/zling/lzl/ICML2026/Build_Dataset/data/2020_07/main_diagram_database/2007.13040/paper_text/paper.md",
    "metadata_json": "/home/zling/lzl/ICML2026/Build_Dataset/data/2020_07/main_diagram_database/2007.13040/metadata.json",
    "intro_method_from": "/home/zling/lzl/ICML2026/Build_Dataset/data/2020_07/main_diagram_database/2007.13040/paper_text/paper.md",
    "intro_method_from_kind": "markdown"
  },
  "files": {
    "main_drawio": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2007.13040/main_diagram/main_diagram.drawio",
    "main_png": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2007.13040/main_diagram/main_diagram.png",
    "main_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2007.13040/main_diagram/main_diagram.pdf",
    "intro_method_md": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2007.13040/paper_text/intro_method.md",
    "paper_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2007.13040/paper.pdf",
    "latex": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2007.13040/latex_source"
  },
  "status": {
    "copy_drawio": "exists",
    "copy_png": "exists",
    "diagram_pdf": "pdf_exists",
    "intro_method": "exists",
    "paper_pdf": "exists",
    "latex": "exists"
  }
}
2008.09641/record.json
ADDED
@@ -0,0 +1,32 @@
{
  "arxiv_id": "2008.09641",
  "month": "2020_08",
  "year": 2020,
  "conference": "ECCV",
  "title": "MPCC: Matching Priors and Conditionals for Clustering",
  "arxiv_url": "https://arxiv.org/abs/2008.09641",
  "source": {
    "paper_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2020_08/main_diagram_database/2008.09641",
    "tex_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2020_08/tex_files_extracted/2008.09641",
    "paper_md": "/home/zling/lzl/ICML2026/Build_Dataset/data/2020_08/main_diagram_database/2008.09641/paper_text/paper.md",
    "metadata_json": "/home/zling/lzl/ICML2026/Build_Dataset/data/2020_08/main_diagram_database/2008.09641/metadata.json",
    "intro_method_from": "/home/zling/lzl/ICML2026/Build_Dataset/data/2020_08/main_diagram_database/2008.09641/paper_text/paper.md",
    "intro_method_from_kind": "markdown"
  },
  "files": {
    "main_drawio": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2008.09641/main_diagram/main_diagram.drawio",
    "main_png": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2008.09641/main_diagram/main_diagram.png",
    "main_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2008.09641/main_diagram/main_diagram.pdf",
    "intro_method_md": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2008.09641/paper_text/intro_method.md",
    "paper_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2008.09641/paper.pdf",
    "latex": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2008.09641/latex_source"
  },
  "status": {
    "copy_drawio": "ok",
    "copy_png": "ok",
    "diagram_pdf": "ok",
    "intro_method": "ok",
    "paper_pdf": "ok",
    "latex": "ok"
  }
}
2009.09439/record.json
ADDED
@@ -0,0 +1,32 @@
{
  "arxiv_id": "2009.09439",
  "month": "2020_09",
  "year": 2022,
  "conference": "ICLR",
  "title": "Surgical Prediction with Interpretable Latent Representation",
  "arxiv_url": "https://arxiv.org/abs/2009.09439",
  "source": {
    "paper_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2020_09/main_diagram_database/2009.09439",
    "tex_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2020_09/tex_files_extracted/2009.09439",
    "paper_md": "/home/zling/lzl/ICML2026/Build_Dataset/data/2020_09/main_diagram_database/2009.09439/paper_text/paper.md",
    "metadata_json": "/home/zling/lzl/ICML2026/Build_Dataset/data/2020_09/main_diagram_database/2009.09439/metadata.json",
    "intro_method_from": "/home/zling/lzl/ICML2026/Build_Dataset/data/2020_09/main_diagram_database/2009.09439/paper_text/paper.md",
    "intro_method_from_kind": "markdown"
  },
  "files": {
    "main_drawio": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2009.09439/main_diagram/main_diagram.drawio",
    "main_png": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2009.09439/main_diagram/main_diagram.png",
    "main_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2009.09439/main_diagram/main_diagram.pdf",
    "intro_method_md": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2009.09439/paper_text/intro_method.md",
    "paper_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2009.09439/paper.pdf",
    "latex": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2009.09439/latex_source"
  },
  "status": {
    "copy_drawio": "exists",
    "copy_png": "exists",
    "diagram_pdf": "pdf_exists",
    "intro_method": "exists",
    "paper_pdf": "exists",
    "latex": "exists"
  }
}
2010.11230/record.json
ADDED
@@ -0,0 +1,32 @@
{
  "arxiv_id": "2010.11230",
  "month": "2020_10",
  "year": 2021,
  "conference": "NAACL",
  "title": "Self-Supervised Contrastive Learning for Efficient User Satisfaction Prediction in Conversational Agents",
  "arxiv_url": "https://arxiv.org/abs/2010.11230",
  "source": {
    "paper_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2020_10/main_diagram_database/2010.11230",
    "tex_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2020_10/tex_files_extracted/2010.11230",
    "paper_md": "/home/zling/lzl/ICML2026/Build_Dataset/data/2020_10/main_diagram_database/2010.11230/paper_text/paper.md",
    "metadata_json": "/home/zling/lzl/ICML2026/Build_Dataset/data/2020_10/main_diagram_database/2010.11230/metadata.json",
    "intro_method_from": "/home/zling/lzl/ICML2026/Build_Dataset/data/2020_10/main_diagram_database/2010.11230/paper_text/paper.md",
    "intro_method_from_kind": "markdown"
  },
  "files": {
    "main_drawio": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2010.11230/main_diagram/main_diagram.drawio",
    "main_png": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2010.11230/main_diagram/main_diagram.png",
    "main_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2010.11230/main_diagram/main_diagram.pdf",
    "intro_method_md": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2010.11230/paper_text/intro_method.md",
    "paper_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2010.11230/paper.pdf",
    "latex": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2010.11230/latex_source"
  },
  "status": {
    "copy_drawio": "exists",
    "copy_png": "exists",
    "diagram_pdf": "pdf_exists",
    "intro_method": "exists",
    "paper_pdf": "exists",
    "latex": "exists"
  }
}
2012.15788/record.json
ADDED
@@ -0,0 +1,32 @@
{
  "arxiv_id": "2012.15788",
  "month": "2020_12",
  "year": 2021,
  "conference": "ACL",
  "title": "Evidence-based Factual Error Correction",
  "arxiv_url": "https://arxiv.org/abs/2012.15788",
  "source": {
    "paper_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2020_12/main_diagram_database/2012.15788",
    "tex_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2020_12/tex_files_extracted/2012.15788",
    "paper_md": "/home/zling/lzl/ICML2026/Build_Dataset/data/2020_12/main_diagram_database/2012.15788/paper_text/paper.md",
    "metadata_json": "/home/zling/lzl/ICML2026/Build_Dataset/data/2020_12/main_diagram_database/2012.15788/metadata.json",
    "intro_method_from": "/home/zling/lzl/ICML2026/Build_Dataset/data/2020_12/main_diagram_database/2012.15788/paper_text/paper.md",
    "intro_method_from_kind": "markdown"
  },
  "files": {
    "main_drawio": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2012.15788/main_diagram/main_diagram.drawio",
    "main_png": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2012.15788/main_diagram/main_diagram.png",
    "main_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2012.15788/main_diagram/main_diagram.pdf",
    "intro_method_md": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2012.15788/paper_text/intro_method.md",
    "paper_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2012.15788/paper.pdf",
    "latex": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2012.15788/latex_source"
  },
  "status": {
    "copy_drawio": "exists",
    "copy_png": "exists",
    "diagram_pdf": "pdf_exists",
    "intro_method": "exists",
    "paper_pdf": "exists",
    "latex": "exists"
  }
}
2102.04152/record.json
ADDED
@@ -0,0 +1,32 @@
{
  "arxiv_id": "2102.04152",
  "month": "2021_02",
  "year": 2022,
  "conference": "ICLR",
  "title": "EigenGame Unloaded: When playing games is better than optimizing",
  "arxiv_url": "https://arxiv.org/abs/2102.04152",
  "source": {
    "paper_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_02/main_diagram_database/2102.04152",
    "tex_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_02/tex_files_extracted/2102.04152",
    "paper_md": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_02/main_diagram_database/2102.04152/paper_text/paper.md",
    "metadata_json": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_02/main_diagram_database/2102.04152/metadata.json",
    "intro_method_from": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_02/main_diagram_database/2102.04152/paper_text/paper.md",
    "intro_method_from_kind": "markdown"
  },
  "files": {
    "main_drawio": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2102.04152/main_diagram/main_diagram.drawio",
    "main_png": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2102.04152/main_diagram/main_diagram.png",
    "main_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2102.04152/main_diagram/main_diagram.pdf",
    "intro_method_md": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2102.04152/paper_text/intro_method.md",
    "paper_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2102.04152/paper.pdf",
    "latex": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2102.04152/latex_source"
  },
  "status": {
    "copy_drawio": "exists",
    "copy_png": "exists",
    "diagram_pdf": "pdf_exists",
    "intro_method": "exists",
    "paper_pdf": "exists",
    "latex": "exists"
  }
}
2102.11938/record.json
ADDED
@@ -0,0 +1,32 @@
{
  "arxiv_id": "2102.11938",
  "month": "2021_02",
  "year": 2021,
  "conference": "NEURIPS",
  "title": "Baby Intuitions Benchmark (BIB): Discerning the goals, preferences, and actions of others",
  "arxiv_url": "https://arxiv.org/abs/2102.11938",
  "source": {
    "paper_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_02/main_diagram_database/2102.11938",
    "tex_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_02/tex_files_extracted/2102.11938",
    "paper_md": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_02/main_diagram_database/2102.11938/paper_text/paper.md",
    "metadata_json": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_02/main_diagram_database/2102.11938/metadata.json",
    "intro_method_from": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_02/main_diagram_database/2102.11938/paper_text/paper.md",
    "intro_method_from_kind": "markdown"
  },
  "files": {
    "main_drawio": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2102.11938/main_diagram/main_diagram.drawio",
    "main_png": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2102.11938/main_diagram/main_diagram.png",
    "main_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2102.11938/main_diagram/main_diagram.pdf",
    "intro_method_md": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2102.11938/paper_text/intro_method.md",
    "paper_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2102.11938/paper.pdf",
    "latex": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2102.11938/latex_source"
  },
  "status": {
    "copy_drawio": "exists",
    "copy_png": "exists",
    "diagram_pdf": "pdf_exists",
    "intro_method": "exists",
    "paper_pdf": "exists",
    "latex": "exists"
  }
}
2103.17185/record.json
ADDED
@@ -0,0 +1,32 @@
{
  "arxiv_id": "2103.17185",
  "month": "2021_03",
  "year": 2021,
  "conference": "CVPR",
  "title": "Rethinking Style Transfer: From Pixels to Parameterized Brushstrokes",
  "arxiv_url": "https://arxiv.org/abs/2103.17185",
  "source": {
    "paper_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_03/main_diagram_database/2103.17185",
    "tex_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_03/tex_files_extracted/2103.17185",
    "paper_md": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_03/main_diagram_database/2103.17185/paper_text/paper.md",
    "metadata_json": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_03/main_diagram_database/2103.17185/metadata.json",
    "intro_method_from": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_03/main_diagram_database/2103.17185/paper_text/paper.md",
    "intro_method_from_kind": "markdown"
  },
  "files": {
    "main_drawio": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2103.17185/main_diagram/main_diagram.drawio",
    "main_png": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2103.17185/main_diagram/main_diagram.png",
    "main_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2103.17185/main_diagram/main_diagram.pdf",
    "intro_method_md": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2103.17185/paper_text/intro_method.md",
    "paper_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2103.17185/paper.pdf",
    "latex": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2103.17185/latex_source"
  },
  "status": {
    "copy_drawio": "exists",
    "copy_png": "exists",
    "diagram_pdf": "pdf_exists",
    "intro_method": "exists",
    "paper_pdf": "exists",
    "latex": "exists"
  }
}
2104.05981/record.json
ADDED
@@ -0,0 +1,32 @@
{
  "arxiv_id": "2104.05981",
  "month": "2021_04",
  "year": 2021,
  "conference": "NAACL",
  "title": "CLEVR_HYP: A Challenge Dataset and Baselines for Visual Question Answering with Hypothetical Actions over Images",
  "arxiv_url": "https://arxiv.org/abs/2104.05981",
  "source": {
    "paper_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_04/main_diagram_database/2104.05981",
    "tex_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_04/tex_files_extracted/2104.05981",
    "paper_md": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_04/main_diagram_database/2104.05981/paper_text/paper.md",
    "metadata_json": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_04/main_diagram_database/2104.05981/metadata.json",
    "intro_method_from": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_04/main_diagram_database/2104.05981/paper_text/paper.md",
    "intro_method_from_kind": "markdown"
  },
  "files": {
    "main_drawio": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2104.05981/main_diagram/main_diagram.drawio",
    "main_png": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2104.05981/main_diagram/main_diagram.png",
    "main_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2104.05981/main_diagram/main_diagram.pdf",
    "intro_method_md": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2104.05981/paper_text/intro_method.md",
    "paper_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2104.05981/paper.pdf",
    "latex": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2104.05981/latex_source"
  },
  "status": {
    "copy_drawio": "exists",
    "copy_png": "exists",
    "diagram_pdf": "pdf_exists",
    "intro_method": "exists",
    "paper_pdf": "exists",
    "latex": "exists"
  }
}
2104.06669/record.json
ADDED
@@ -0,0 +1,32 @@
{
  "arxiv_id": "2104.06669",
  "month": "2021_04",
  "year": 2022,
  "conference": "AAAI",
  "title": "NAREOR: The Narrative Reordering Problem",
  "arxiv_url": "https://arxiv.org/abs/2104.06669",
  "source": {
    "paper_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_04/main_diagram_database/2104.06669",
    "tex_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_04/tex_files_extracted/2104.06669",
    "paper_md": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_04/main_diagram_database/2104.06669/paper_text/paper.md",
    "metadata_json": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_04/main_diagram_database/2104.06669/metadata.json",
    "intro_method_from": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_04/main_diagram_database/2104.06669/paper_text/paper.md",
    "intro_method_from_kind": "markdown"
  },
  "files": {
    "main_drawio": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2104.06669/main_diagram/main_diagram.drawio",
    "main_png": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2104.06669/main_diagram/main_diagram.png",
    "main_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2104.06669/main_diagram/main_diagram.pdf",
    "intro_method_md": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2104.06669/paper_text/intro_method.md",
    "paper_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2104.06669/paper.pdf",
    "latex": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2104.06669/latex_source"
  },
  "status": {
    "copy_drawio": "exists",
    "copy_png": "exists",
    "diagram_pdf": "pdf_exists",
    "intro_method": "exists",
    "paper_pdf": "exists",
    "latex": "exists"
  }
}
2104.08701/record.json
ADDED
@@ -0,0 +1,32 @@
{
  "arxiv_id": "2104.08701",
  "month": "2021_04",
  "year": 2021,
  "conference": "NAACL",
  "title": "Intent Features for Rich Natural Language Understanding",
  "arxiv_url": "https://arxiv.org/abs/2104.08701",
  "source": {
    "paper_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_04/main_diagram_database/2104.08701",
    "tex_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_04/tex_files_extracted/2104.08701",
    "paper_md": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_04/main_diagram_database/2104.08701/paper_text/paper.md",
    "metadata_json": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_04/main_diagram_database/2104.08701/metadata.json",
    "intro_method_from": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_04/main_diagram_database/2104.08701/paper_text/paper.md",
    "intro_method_from_kind": "markdown"
  },
  "files": {
    "main_drawio": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2104.08701/main_diagram/main_diagram.drawio",
    "main_png": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2104.08701/main_diagram/main_diagram.png",
    "main_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2104.08701/main_diagram/main_diagram.pdf",
    "intro_method_md": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2104.08701/paper_text/intro_method.md",
    "paper_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2104.08701/paper.pdf",
    "latex": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2104.08701/latex_source"
  },
  "status": {
    "copy_drawio": "exists",
    "copy_png": "exists",
    "diagram_pdf": "pdf_exists",
    "intro_method": "exists",
    "paper_pdf": "exists",
    "latex": "exists"
  }
}
2106.04335/record.json
ADDED
@@ -0,0 +1,32 @@
{
  "arxiv_id": "2106.04335",
  "month": "2021_06",
  "year": 2021,
  "conference": "NEURIPS",
  "title": "Reinforced Few-Shot Acquisition Function Learning for Bayesian Optimization",
  "arxiv_url": "https://arxiv.org/abs/2106.04335",
  "source": {
    "paper_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_06/main_diagram_database/2106.04335",
    "tex_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_06/tex_files_extracted/2106.04335",
    "paper_md": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_06/main_diagram_database/2106.04335/paper_text/paper.md",
    "metadata_json": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_06/main_diagram_database/2106.04335/metadata.json",
    "intro_method_from": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_06/main_diagram_database/2106.04335/paper_text/paper.md",
    "intro_method_from_kind": "markdown"
  },
  "files": {
    "main_drawio": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2106.04335/main_diagram/main_diagram.drawio",
    "main_png": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2106.04335/main_diagram/main_diagram.png",
    "main_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2106.04335/main_diagram/main_diagram.pdf",
    "intro_method_md": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2106.04335/paper_text/intro_method.md",
    "paper_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2106.04335/paper.pdf",
    "latex": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2106.04335/latex_source"
  },
  "status": {
    "copy_drawio": "exists",
    "copy_png": "exists",
    "diagram_pdf": "pdf_exists",
    "intro_method": "exists",
    "paper_pdf": "exists",
    "latex": "exists"
  }
}
2106.05266/record.json
ADDED
@@ -0,0 +1,32 @@
{
  "arxiv_id": "2106.05266",
  "month": "2021_06",
  "year": 2021,
  "conference": "CVPR",
  "title": "Semi-Supervised 3D Hand-Object Poses Estimation With Interactions in Time",
  "arxiv_url": "https://arxiv.org/abs/2106.05266",
  "source": {
    "paper_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_06/main_diagram_database/2106.05266",
    "tex_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_06/tex_files_extracted/2106.05266",
    "paper_md": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_06/main_diagram_database/2106.05266/paper_text/paper.md",
    "metadata_json": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_06/main_diagram_database/2106.05266/metadata.json",
    "intro_method_from": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_06/main_diagram_database/2106.05266/paper_text/paper.md",
    "intro_method_from_kind": "markdown"
  },
  "files": {
    "main_drawio": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2106.05266/main_diagram/main_diagram.drawio",
    "main_png": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2106.05266/main_diagram/main_diagram.png",
    "main_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2106.05266/main_diagram/main_diagram.pdf",
    "intro_method_md": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2106.05266/paper_text/intro_method.md",
    "paper_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2106.05266/paper.pdf",
    "latex": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2106.05266/latex_source"
  },
  "status": {
    "copy_drawio": "exists",
    "copy_png": "exists",
    "diagram_pdf": "pdf_exists",
    "intro_method": "exists",
    "paper_pdf": "exists",
    "latex": "exists"
  }
}
2108.01499/record.json
ADDED
@@ -0,0 +1,32 @@
{
  "arxiv_id": "2108.01499",
  "month": "2021_08",
  "year": 2021,
  "conference": "ICCV",
  "title": "Boosting Weakly Supervised Object Detection via Learning Bounding Box Adjusters",
  "arxiv_url": "https://arxiv.org/abs/2108.01499",
  "source": {
    "paper_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_08/main_diagram_database/2108.01499",
    "tex_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_08/tex_files_extracted/2108.01499",
    "paper_md": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_08/main_diagram_database/2108.01499/paper_text/paper.md",
    "metadata_json": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_08/main_diagram_database/2108.01499/metadata.json",
    "intro_method_from": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_08/main_diagram_database/2108.01499/paper_text/paper.md",
    "intro_method_from_kind": "markdown"
  },
  "files": {
    "main_drawio": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2108.01499/main_diagram/main_diagram.drawio",
    "main_png": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2108.01499/main_diagram/main_diagram.png",
    "main_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2108.01499/main_diagram/main_diagram.pdf",
    "intro_method_md": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2108.01499/paper_text/intro_method.md",
    "paper_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2108.01499/paper.pdf",
    "latex": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2108.01499/latex_source"
  },
  "status": {
    "copy_drawio": "ok",
    "copy_png": "ok",
    "diagram_pdf": "ok",
    "intro_method": "ok",
    "paper_pdf": "ok",
    "latex": "ok"
  }
}
2108.08265/record.json
ADDED
@@ -0,0 +1,32 @@
{
  "arxiv_id": "2108.08265",
  "month": "2021_08",
  "year": 2021,
  "conference": "ICCV",
  "title": "End-to-End Urban Driving by Imitating a Reinforcement Learning Coach",
  "arxiv_url": "https://arxiv.org/abs/2108.08265",
  "source": {
    "paper_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_08/main_diagram_database/2108.08265",
    "tex_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_08/tex_files_extracted/2108.08265",
    "paper_md": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_08/main_diagram_database/2108.08265/paper_text/paper.md",
    "metadata_json": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_08/main_diagram_database/2108.08265/metadata.json",
    "intro_method_from": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_08/main_diagram_database/2108.08265/paper_text/paper.md",
    "intro_method_from_kind": "markdown"
  },
  "files": {
    "main_drawio": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2108.08265/main_diagram/main_diagram.drawio",
    "main_png": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2108.08265/main_diagram/main_diagram.png",
    "main_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2108.08265/main_diagram/main_diagram.pdf",
    "intro_method_md": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2108.08265/paper_text/intro_method.md",
    "paper_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2108.08265/paper.pdf",
    "latex": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2108.08265/latex_source"
  },
  "status": {
    "copy_drawio": "exists",
    "copy_png": "exists",
    "diagram_pdf": "pdf_exists",
    "intro_method": "exists",
    "paper_pdf": "exists",
    "latex": "exists"
  }
}
2108.12318/record.json
ADDED
@@ -0,0 +1,32 @@
{
  "arxiv_id": "2108.12318",
  "month": "2021_08",
  "year": 2021,
  "conference": "EMNLP",
  "title": "CAPE: Context-Aware Private Embeddings for Private Language Learning",
  "arxiv_url": "https://arxiv.org/abs/2108.12318",
  "source": {
    "paper_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_08/main_diagram_database/2108.12318",
    "tex_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_08/tex_files_extracted/2108.12318",
    "paper_md": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_08/main_diagram_database/2108.12318/paper_text/paper.md",
    "metadata_json": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_08/main_diagram_database/2108.12318/metadata.json",
    "intro_method_from": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_08/main_diagram_database/2108.12318/paper_text/paper.md",
    "intro_method_from_kind": "markdown"
  },
  "files": {
    "main_drawio": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2108.12318/main_diagram/main_diagram.drawio",
    "main_png": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2108.12318/main_diagram/main_diagram.png",
    "main_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2108.12318/main_diagram/main_diagram.pdf",
    "intro_method_md": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2108.12318/paper_text/intro_method.md",
    "paper_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2108.12318/paper.pdf",
    "latex": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2108.12318/latex_source"
  },
  "status": {
    "copy_drawio": "exists",
    "copy_png": "exists",
    "diagram_pdf": "pdf_exists",
    "intro_method": "exists",
    "paper_pdf": "exists",
    "latex": "exists"
  }
}
2109.02614/record.json
ADDED
@@ -0,0 +1,32 @@
{
  "arxiv_id": "2109.02614",
  "month": "2021_09",
  "year": 2021,
  "conference": "ICCV",
  "title": "The Animation Transformer: Visual Correspondence via Segment Matching",
  "arxiv_url": "https://arxiv.org/abs/2109.02614",
  "source": {
    "paper_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_09/main_diagram_database/2109.02614",
    "tex_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_09/tex_files_extracted/2109.02614",
    "paper_md": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_09/main_diagram_database/2109.02614/paper_text/paper.md",
    "metadata_json": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_09/main_diagram_database/2109.02614/metadata.json",
    "intro_method_from": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_09/main_diagram_database/2109.02614/paper_text/paper.md",
    "intro_method_from_kind": "markdown"
  },
  "files": {
    "main_drawio": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2109.02614/main_diagram/main_diagram.drawio",
    "main_png": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2109.02614/main_diagram/main_diagram.png",
    "main_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2109.02614/main_diagram/main_diagram.pdf",
    "intro_method_md": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2109.02614/paper_text/intro_method.md",
    "paper_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2109.02614/paper.pdf",
    "latex": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2109.02614/latex_source"
  },
  "status": {
    "copy_drawio": "exists",
    "copy_png": "exists",
    "diagram_pdf": "pdf_exists",
    "intro_method": "exists",
    "paper_pdf": "exists",
    "latex": "exists"
  }
}
2109.11171/record.json
ADDED
@@ -0,0 +1,32 @@
{
  "arxiv_id": "2109.11171",
  "month": "2021_09",
  "year": 2021,
  "conference": "EMNLP",
  "title": "Zero-Shot Information Extraction as a Unified Text-to-Triple Translation",
  "arxiv_url": "https://arxiv.org/abs/2109.11171",
  "source": {
    "paper_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_09/main_diagram_database/2109.11171",
    "tex_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_09/tex_files_extracted/2109.11171",
    "paper_md": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_09/main_diagram_database/2109.11171/paper_text/paper.md",
    "metadata_json": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_09/main_diagram_database/2109.11171/metadata.json",
    "intro_method_from": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_09/main_diagram_database/2109.11171/paper_text/paper.md",
    "intro_method_from_kind": "markdown"
  },
  "files": {
    "main_drawio": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2109.11171/main_diagram/main_diagram.drawio",
    "main_png": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2109.11171/main_diagram/main_diagram.png",
    "main_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2109.11171/main_diagram/main_diagram.pdf",
    "intro_method_md": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2109.11171/paper_text/intro_method.md",
    "paper_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2109.11171/paper.pdf",
    "latex": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2109.11171/latex_source"
  },
  "status": {
    "copy_drawio": "exists",
    "copy_png": "exists",
    "diagram_pdf": "pdf_exists",
    "intro_method": "exists",
    "paper_pdf": "exists",
    "latex": "exists"
  }
}
2110.05291/record.json
ADDED
@@ -0,0 +1,32 @@
{
    "arxiv_id": "2110.05291",
    "month": "2021_10",
    "year": 2022,
    "conference": "ICLR",
    "title": "Graph Neural Network Guided Local Search for the Traveling Salesperson Problem",
    "arxiv_url": "https://arxiv.org/abs/2110.05291",
    "source": {
        "paper_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_10/main_diagram_database/2110.05291",
        "tex_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_10/tex_files_extracted/2110.05291",
        "paper_md": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_10/main_diagram_database/2110.05291/paper_text/paper.md",
        "metadata_json": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_10/main_diagram_database/2110.05291/metadata.json",
        "intro_method_from": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_10/main_diagram_database/2110.05291/paper_text/paper.md",
        "intro_method_from_kind": "markdown"
    },
    "files": {
        "main_drawio": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2110.05291/main_diagram/main_diagram.drawio",
        "main_png": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2110.05291/main_diagram/main_diagram.png",
        "main_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2110.05291/main_diagram/main_diagram.pdf",
        "intro_method_md": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2110.05291/paper_text/intro_method.md",
        "paper_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2110.05291/paper.pdf",
        "latex": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2110.05291/latex_source"
    },
    "status": {
        "copy_drawio": "exists",
        "copy_png": "exists",
        "diagram_pdf": "pdf_exists",
        "intro_method": "exists",
        "paper_pdf": "exists",
        "latex": "exists"
    }
}
2110.05419/main_diagram/main_diagram.drawio
ADDED
@@ -0,0 +1 @@
<mxfile host="app.diagrams.net" modified="2022-01-16T07:39:09.026Z" agent="5.0 (Macintosh; Intel Mac OS X 10_15_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/97.0.4692.71 Safari/537.36" etag="U-KtUKInIySfCmdoS8i0" version="16.2.2" type="device"><diagram id="wverVtk432qQdxx5vKkb" name="Page-1">7X1Zd6LK9/anOWu978VhMQ+XgIKIOIvizW+hIKJMAgr66f9ViSZOSds5MbHTlXVOq8zUHmpXPbWf/Q8lh6Wa2snciB03+IfEnfIfqvIPSRK8IIAPuGX7vEUQyOcNXuo7+4NeN/T8nbvfiO+3rn3HzU4OzOM4yP3kdOM0jiJ3mp9ss9M0Lk4Pm8XB6V0T23MvNvSmdnC5deg7+fx5K09yr9trru/ND3cm2P0Lh/bh4P0lsrntxMXzpqeXo6r/UHIax/nzt7CU3QA23qFdnltAeWPvy4OlbpTfckLZoaRWaA3ZgTjryZlk6P8u/+X2zbyxg/X+jfdPm28PTZDlabx8eXvw3NI8DwPwlQBfwVsl8Liw9KACYLMgLqZzO80xO4ri3M79OPofPNAOfC8CBwbuDDyulMR+lLtpdQOePttfK90fD37+K+Bgw+Ur7t9646a5Wx5t2r+y6sahm6dbcMh+r4Bj3F4iBw1kMYrFX//2lyxepcthzPO2+ZFkeRrj9m1j73XKe7nba7ODL/uW/x0pEFekwAawnWYxeHWo3kGcPu1hV2uoMaBtqNnT3/Em1oOfLyeDh3k+/3n7hWTBowE7Aj+kYu7nbi+xp3BPASR5KmU7S56Na+aXrgMfyw8C+eWZqOeWhOrwpCtHewSWo2z2c2TJMxjLnMiSoFmMZ46ESV4IkySvSfN16+cLk/1UYRI/VZossKgzcZIMsE1cePnjqQcQp/C54vyh0qQpEiO4I69KnTpdDuOY7xcmf83R/qTujgO9m3Da3xE8gXGXhsRhLHGl7SnQOd6r8ckf3vg0KWAUcdL4/5ICgdH0bSEGxwgYyd+r9Zkf3vosDzqPM9VneOxKH3K18QkQKVJ3a/xrMcGPanyCxbjTvpynMI6/re0Z8sVuPr/puR/e9BwB4qhTvQeeCCNvdDoExWD0ndpeuFT73kXjg9fMz1r8JFCJ4sg9i2r2mw4NPnVhQ4MNsNF8MHYX9ztC33GCt0KoNF5HDgyYKlAQgT1xA8meLr2n7UdRkvL0tw/3FDv0A9jKfT90M/AiTbcA/3bj0I72h+znMED7P/++Fop9RmdDMJhwNPLBTwMu5lL8LH4pfQq/l+gvza7ZRrL/JJMHXdVpR0fg2LEu4Ow3S5+/kL6JpP9Zls8LGH42D8Lyv5oHgbHllYCfwu822BKujZzPe9595+qHT/OwL3JsQKG048zf95aTOM/j8F1p7Ufcl7qQx8m1YfLTLcXDVvywBXx37Nz+hxKff5JKEnn/kLJvSq1ugeuqF4vgr9kbzKsDD3zTCvCPxMqiBT9zmctY8KUjL4Nqx+zSoks6MieZQHqkUvPLTjCdG4Fte0l3rGs9k14Y+SjnRuZiI1uB2+/0dWs414fj3nocW+l2XhEkizIVOlkQXVfrk1Zg+OlIZbppW+B32Y7dkewsn1lAjBKzFYSA5tidsxuDn4TA5QlwxIow3oC2sTecCbYuCLPZqmk9WVYl26+t1qukJWaaVMZVLxA7U6uq17eESXq1qiXLA0li5h060s3aUJxKy3q9EKth0JhPx+DQOa10qpEsyr3B2hSWUhl+9TUWBngnhWMY8P5Ks832tDr4UkZiu73ZQAUmJb0GvKXCD0a73a7v+JyzIug422zmlRQYzijcjfPtBui7NAZuS3K1rGPt7GEbOCyFpXV5mVS7wtOVlFoVb+66RUDhLQPu5nnQzSo0q7fa3VrArCeLIAjyqj0ME7tJsnl/Ecdas5oPUzOv8bVa5HutVrhc8r0gIOgVl0ZRbVOTZXhzrb+DkjLbO4IgeLWsM8KWdGczczxICVoo4CPM5yW/bm+e7gxOKXVls932CyncMlZOr8MoKu3RiGKZTnMuFeKC21DgFeMwTdNE37Qbug7Oajy1C0/i4XA35vkh2HT4D74jGDTVqGi5dPU05egtn5b+JEnsorVzN+bGCBdmI6q1W61RVm/5hVfkHdhOK39hrNptARq9MqJ3hsDFQ01iikKs6c2NzE+cNcmZA3jsuD0QXRA8U3TP4t2WMiDtplLVW51ClFdEQxIEOAnkZ6m+XbWZSRb26rTSimm62dbT1ag/7thFReI6mkzY+agnB17kKi0ofpKbtvR2qjT1BLZIlRqtiKZKjt2K6Dm2JLmSXSkX+Nxb9uveHPcHGlvLKW/darWqDM45pVWRvKKTw7eQ1txit1tsFLapLspZZV7wo048aCtNx5lqu4TkZCBMcNy28BJO2kyepJKteanQB/zSaLUGfa5R64RBbTrZDlOxoTw3c7vS3Mmd1kDycNtv6NssLnoblXPWgazrflupQD3gtSbD8zheurP6EreJSRNO10i069Z3QsdxhGIOjtosFgseXnO8nihEM4Cb+mUUhJOKAmU5koEzLoScEAQ4IaeA/yUb+JdEZ4LuQoqJ3qqcrvuNbRPsbFEURfcXO461m5zYnTj2BvirQc4Z7loyCZp126O1KqaLrWYB76LkIl1ATeLYFfQqwGnmw5XCzCOqpqpWg/AJknPmy+GSX8RV4PeBXMfjmOIYoZ8zVp9UcX7AVaDM9KbEbNolt4B2wLJ63SIDNYv1Qt7JwALqjW6T28qmRM4cgWHoDXhQeCTJdUwHNBaIELLp1K3CF04JwSE36x7ToactK2upFVfnOI7ulyXDpQ0LHM3Z4Wrlr7SI8zaZZxXQkibNLbe0fXAFjZ2BfxleJ632QutW5l7RfZYbSUWlpYm6s6YmK8mq1saNaMzWfCdORcnoDkwlrjXqfc0dLyxio9ltfzh0HaXj2S23QW+JZrttmhnF0cJuQ6/BBYOwLtLWuOdUfKKr1Gp1dZx3+84gNqhONYhYIE5pN85I3rPaDps4s9mipgXjYFIvYivJCMknpnE+guqC43Ur7CsgnpOW0x6UdJiaK3yrbHQ8NzaNMamLll1L8arRJjqrUXdi0eOF4FqDbrlQ2X5vWwu7g2Va72iWV6tstz0nLUaNSqXvWLHjq5zZnWqz+ri6WDx7iXazT/a55bIz94YaR1WE9haq0CpNaaC3PSIrS6dXA4+TM1DiuUv2uHE70GieH2TzRTXlQycTs1Fj3tGFbqORRlU2NMsR9EZlM4fWPKOEfL3taht5yKp0zth20xKtljMPwrZJVbaqOBqtd7qtLBV/hjdmyzVVBJwJNMC2lSkNXNFusLDtUlzbug5cojyfNCbddNbMpC0wLUoXsuq0VlEKna1WzX6kbdLpFCpDwDiUxkvLiVXheL7ayq2mETblTm3YEEdg/2A6cC2OUE
Oe7XWNqB/0hxNnha8cn5mybjaO8LFMip00pAmlCQ2+2ayDpyEjSZTFwtyKUMv9EQ68aLvdNJj+CCj0s//N2dQsuarI5OZoRAx0OSLCkqYd08u10tlWKjs2noiCV5iaMJslODEcjnHJBmdmu9mIr4bkbuSMSEukk3xoNFau4yRJr1DmYKg47ifLspsvYQzgRzphO7LXmabphNuVZdDTio2neSCmleqKblNjMXLGGb81Xb67XddzD3ZONL7q1lfZBlpzTI7prFqrMSVRw23XUFnXBXcp5+2F11TzPr+dtvptdTyHjnElLXdjr8ca80p9CcNJx302KmFoidKsQrCCsaCUODINoNucHK8a0An2+wtvbroZ6GulVmWwUpQKOQ3wpk6PrA60Vgq3m6ofbQKmuxg2OF4tRkYBLVpJpY6+YV1lFG3yfCBT0rSYsmow9/tz4K6UrdvkyfVm059k6VDPJ0VRyJI6q3BCh68Rc6a6rMQjGPwpeRmbO7UEvZw036wrEc1M23pm4Lsu2RK18U5uFLvW0CWaIgfEGOU6Y/ZIdhqn6XbHZvBJOp2CT8az/mCAC2N9W39qn0m7MSSXLLRSvVLfcvAN9YbkTBILxLT9ZDiKMkpPdzURVxdMM1v15pNE3mxmirPcWiwe7qrjjhRHG4pL1FBsReaKcIdR2zNgJxBFEVmzFityXZ1WChBDMHhZM8e7JuyeZ6NRHNYq82Dp6KLS7qm7gZOA+ycDdrqK4AEsywoTvRcvPBD3W7HZk6udDrlMZzPQmFbV1voJDixqW2nY3XwMem7f6iwNOcijwdIWpzsadOumJytmqJG4O2uHIEZm+1IWjrfsSmu4k0ma1ruBOp46Ks+B6CTzKpVOMoTW4uUMB2fMoZfZeNlsPZrUllO6V5+Cb7q7I3kSRmTzEGqP0vCy6WhLLiyvYedrctK1N8vQFSVdMqpsqz7Trcyf6fjSKsQkinZMsJStZq/0Z4sqq+9iR4qdcjVdVQI991vNvkW2rL7bE7VO0skntlgw1Xwu47W6aUNnPOO2MNbbrbrRwspn4yXsGmoLL2raZa5UAm+r+mBst2sBaa29vpibEb6tT8GbLCd0MGn1nFlWg11XTu0Wi/7MV5bQazL4oKU266XbzfIFSVh0b9RZ6GVvCcZXpNKb9v0xAwL7oBkPzFlnM+xY7ZBcbEWloupPfdYOemVaM9rVEOeg+49gALMx+7wHe+pGpdcrenHd8gm1PmmBCJFm5JElR+WSsZrMvnsjwMih1q5XR7HZ9RrmmJInRc+sd6NN34jbylDYMVlZ74V9G57B7Ty1lsOgG4iPcEi5TlbXxSDpMtoy2UxXhbF1Wjunnif99mJeFGtxGKtAy5RqvQiLZTMfPZkU3NDPq6zK9eZ1+Mx+BvvfUp7W24MKdFW62Qj4iQtilNoi6G/FiKdHTrZlJGO8GjfG8XLYsBJnnHqO2s60YkuIxCxTZWkS0P6sD65l8bYxmGnSpFxwE21F1uvlsD4xCFsYV/XEiCdzi7GMeboiY74SmGXVjSpmZadRDOjJspAeGlHaGvA8b/V6FFHW402Ldac9U/Om0Nb05bZpdMxVdeQqwAjMYbggCynh2iRPZ91AK5fSQk2hGKZao7HrwyFl6Lpub9zgm2zLbcmSRO92A8OrmE/dhRHu7F7VEgsQgVAx55ONPiFMB6NRURjNxgY0d54FsGtV64SJS9baaCZat9AKEMTQDUOZz/pDsiYI9UajGKihh0+aWamPykL0eq1nw5JGPqXYolpUjaGgJ3NDXE2bvWpiOkFtPGvXdiWdNUyv07XYoRPErtadaXRFAjE5twNDjmTp1MWKGONy0ZS743r7WYPMKgjCFHyWTyTWpDRcnM/NYdA1zA0+CpLl0sj8uOHNe1B7Zllr26GXu2C3bVQ3ZDbpmQtRVst10lvh4zmjrAU3Mr1YWpaDYaBl1UU4FZe9Cb8ktw27Z0wlT96IeqfLLMDwSR22cUNedIEZSCGXTlvMJudXI2UXCs+PxsJR8CodRlJWqUA9cxWfsMcaCFTbTTmHQzuFWdUbS1r1djuOm6jZQtrgQmUFuh+8IG3Bztj6HB7Itfx2ZaqqdW9JjvnpcFIE9mQjV2PgKstyt2jDy1PtDY7j/Oh5BApb3BUWk3pgzYbMeFTfaHQfmG697lHyfGEIcREP5v62VLTpOHeokTRnYCDSJ2APVBdkQe914m5rvOmaY6dbbaR9c7zQdFVarbOk3aHdfrCjZ1pjFenxti4Vz3ftLWOllwfTUX0o7QhvUl9TNZGvBx21106m9W3uxfOutl10FooaDWp4zDaWIex2Z3ERsJ3qpuIu52ZK8kvakmRxF9eNLJw5a28rjReppIV9XJ317LxZAwNraLuSsa7XGSbOJ1rKdIuG1a5YBD/YrLidu9VI8IJFa1j3EmPY1FWhuo5cveX3xKYi5l5V7FfS0YwOpg7eXjKjZTJfj6KKJE1X7Mgw7IVGunYdRgWx0ogWBuOSjVW3uQbDIKk3KzbNTPSqdZuvmzKfWFtuHA3JvNEdDK0ZK7ipWcVDdVX2+51+vNDg3AjLMATlpwGRNmeqNJ/bXVubmzNtOjfg3InU6rX4olF36Z2zbVvTPF25JNdNlpTeXWwmQYvYDKj+Zk2RhlLRqD7X4wdBxRv7XKcZT2zFEYGpgRikFF2+uoH+OeSTkBwqu9ybqjsV7MRpvWG69ZReN43phPFEJ2iTi0Feqa11xeUs4F8qkpKEYhaD0+tCd871c9vuTulJSCZ+HzxcXhXFZr8rVatexx71F3a1J7fmdN0IRk0YYTVDyk4M2+cS2JNYsVgnVx29M2m6S2uSW+1CHtuEonQGXmG1FDIbVzszhmQNui92gJQVXez0hmpvNy57RmaXltVUi3pTHYFx1nyUzcZB6M/xAsTjxIDMHLed4mB0X6apA6JSMCByVokBXxRGfNWoGuDTPCM8Cc4YxHq/7bGW2Bd2894A74wD3Lf8vFi3qGp1Iw44p+lNhsq2USNFqTKMSW9aB2Nwelqr+6uBZWjetlP3cK9bkI620ZsDvt0wpJWIJ11b75nbrWh1kyY958bsjOlx4UJdZ8N4YBsyRdhrZkiEylLrDJtFdaDA2NZ23HU0sYQsIqpSA/ZLxYSplp5a2RVZUtgd+NDuMM8F1k6czno40ZYMs2nMBone8zqTKZMY7Jbm1MJIhj1VhtNHEj/ptY1kom8HAQvjB6arioPmVnOG63qgF0NVIIytuJ0qAkkOTTs2Am6g1iMt07rTVro2cb+Nu4pFdqaGqkt4czbaapownqj9ulYFSiHJ65pDVEaqaMCZiA7bKQOrgcN5gGhta/Z2pcGj+KbQjhZOgvu9Ps6y/KayXdgtbRHi8MBsOGhXgtCYLOa+RSyjZBzykx0dNFctfB1wS3knLjwjJLhSHsuONOJ9s7eqN7v60mvL66SjVOGAQk4TkuslEImUaOAnJWVXlFTHocrMN7KWDH1Sxneqqc9GWzGE47zJxMZHraG0BWpdtdamnO3mGdXwSqOi7vKyvWipXCfWqt4AN01zPITet5sxI6YqS
6tZrunDTm1A6otBumyMaWoCPA8JfOzKEJRAlPwBnYMo1TKp8URzdy1VxS1drSR9hrYswWLCviqULim2y5WkZKwvFsvtZvbir1ejke97SWPZ0dRkPdegJ1ebgpKFCcdMcpJrTmphjaYWHTqULL5ebrd+4Jea11VHrtGBPdFiMBoOGjO3XquIeEUzh8xmVAtga4Us52vjsmn2+h3Zj+rj4XjpGUkjaQ9WO2fXl/tGTcMnCcHSxcYYhB45WNtKQ0lkmm30OEYVxfZuU6/Scre6tstM0pdjGcai1QDEDQVtBqXJdiXRowvb9QKfmuNeQ40NoSH6XtNQo7SuzWnHknerRaVX0J7cnwfs1gfjYVXLWK0gs5YhL+PxNKPzqlSuGq2eaKSVhZSw8rDhtotZHMyY1GwrI7YK58PF3sBsdXVGtjQNzsd/Ds7F0xjNvq4HPFvdQQsYzr+3kJdiX9bsnmAe5J0QD+LwDAj1vAfuRTGYQBxpA3u24ITGLpcYfinsSRxwNwR5IcgLQV4I8kKQF4K8EOSFIC8EeSHIC0FeCPJCkBeCvBDkhSAvBHkhyAtBXgjyQpAXgrwQ5IUgr6NEWpy5yObkeIx/nxXj64EuGgFddwO6wFDlJW39oAMkjR3nd11inV8MdP10NgGaY7AzDpN/CYHBmEuI8WpaNehyMOZeedUEfgufAMIZEc6IcEaEMyKcEeGMCGdEOCPCGRHOiHBGhDMinBHhjAhnRDgjwhkRzohwRoQzIpwR4YwIZ/zLcEaapzD8LJ8Ov6Tr/Xpg8afT9dIMhx2Y0F+AJRKWsbgRWKIJ7G6tf6i682Nbn2IJjDjNG6QgdepRVqFwmxxomsfwuwF8xGUiaRvh659FVs7R5wSqT/j6N2PqB7j55xrf07qGi0UMbyxzEW50iGC0cWBDvoNMLte5/CyZMDiPsWdrTXgaO9Q3+2XjM/RL0YU7tP4ljfTPan2WJ15WlBy6I5LCWPK21ueIF5r/OzT+LfTNf3LjcwyBcWexAGznS+70NwIxDL9XvRCCvBaInVSdepXCocQU3PFv9tSFiuAAik3K4/pT+Oy5D3/ddq0jvihYBT7tEHb60SRLni90ZZNhP7WsPV2t/RREBiSez+PMffpMXfg5WWd+5GbZ0x396Oo14CUIAa6/uNiJ3Vo36zFiork9na9TV4W7KvQVpf4kHT4E7Hv9pUFfeBm/8sSVOIa8Wxxz8BM/1nFQAgHix6PVmKeVbygQyJA3Vh3iOOx+Y2nyWkD58dJ11E8tXcfh5/3AI9aVJMhra24/Lk7mh4qTYkmMfHuUcSW2+gZR/vR6YKAXvyxKxWH8jZMrBBiOH6ZA7tD6P36OkWSx01EF5EoTbqwIRoFoC79f439uPVX2h3oxQqAx5ri+1+kYnSJYjBTenSz5eq9G/XLQ8lui5X6oaGkSjDuPgwv2DxDtDYVz3cgR0/SpxP00sLPMn4L2cuxs/jQqIk5FcTpcOpvsfZk3Bo2cbkf7U55+WPsznn5UypNf28Ov0s9HR9+fzgFt8/zr9ST443BOlgP3fXj8/YDwaZviw6ba38XZ/3odVh73EG/qRhav0/3Y+/0i0+CGnpu/d+D+iq7jue/q2nHlryujv8O21A1A77NxTx73mvrs79CGPdirKsMinwJLCwxB8jxF86cDIQJ0JCTzuvessPBzq+yv+KqgFzeheCA8hsHB5QmCZQ5Tsi+lzwgwhHrdS53NBT836cVdnszgpb3+i2XcMNq9p2XgJ5bxrOU32Abx59gGcaNtvIx3H8Q4aAoM7t9mrxUYjDp282dzibdax1PpxyMjO9TcPUTdjIDh9Kt5fLFx3ACf3ME4PkNhr3UixLtm8rHO6uOGQbD0zZbBPJZl4DyGs0c++1RlaRwT+P/cbdDEJcp7WHjxZfp/Q5o00v+P6v9Bq2/Qf+Gh9J8CIwAcf9V/4mwCGWiuQB4ZAPfBwIm6YgFf3QPcMNN1j/Do0cKcB9E8kqAw9niRxdm0AsljJH4Rsvy26yU5jD6HtVmM579Y926Z50MsCYglAbEkIJYExJKAWBIQSwJiSUAsCYglAbEkIJaEfxBLAmJJQCwJiCUBsSQglgTEkvAPYklALAmIJQGxJPxVLAkET18rMfySsUpgxys4Lhc3fzl9AnUtY/LjSzD5H7oEk6KA5OijJZhnYNVBaN+65pL+3OW0wg+VJQlZ2Yk3ZQmxSP54mdVleuc3iPZ7ltP+JetCDuvQf70u5OAuHwSdpykaI44W65GnXgmmBOAXi21/e1UIe7EqhGa+eF0UfcOiWYTMI2QeIfMImUfIPELmETKPkHmEzCNkHiHzCJlHyDxC5hEyj5B5hMwjZB4h8wiZR8g8QuYRMv+3IfPnCO4pMg/3cseg3yX11ZdD8/TnsjFCerQfiec+V6Z4m/mKZTGWeG/RxTfAuddoLv6DaPEfKloKiJY6omY81E94geoFjHg00d5S6B4hlQipREglQioRUomQSoRUIqQSIZUIqURIJUIqEVKJkEqEVCKkEiGVCKlESCVCKhFSiZBKhFT+ZUglyVEYd4RnnSGVPIcR1HuYx9cDlbcUp0KQB4I8EOSBIA8EeSDIA0EeCPJAkAeCPBDkgSAPBHkgyANBHgjyQJAHgjwQ5IEgDwR5IMgDQR5/GeRBc7Do7BHkcZbC84jZWdeIU+/PyIjqFL5BUkjwGH5ZiPCQKETjGHdM6vnhyuIsxpxRIVICds6seHc2ROFbdO8z2UB/R3+vly5/IQf9fD7QQ3rer/lADzmaj2IFLI0J9Ft1kmmexrijyt/URwlBSRoj2TcJdBn2i8lBD0XKPyubkvqh2ZQszmAs/5aPhB6U5i72fmsy5YFm9rMkS/9UyQoCdh4zEdzjifOTM9qZnypOlsII9lScD5jFznxyFjv7Q8XJMUCczIU4GfrdEc03yPNa6vp/kCf3U+UpUBj15gI9Ej8t8sFxjyDaG5boPfiI4XHrB3D7xv31cIF5rEEzy/MYz79ZPoBgMeqofMDZ5W8dLbAMGJOwb0adHAHGJKdXvvt4gfsWY/iuqZvrw+f9nT9iEI+ivXBW8CxKIlnQ01545t9WWYHDCPJNlSUZjKO+WGW/d7bxVU1/7X0/VWW/qgLMYaB5gwt/rBkfOFRgyLdiETjjQ/53e+BI0E+czajyFMZfYWb6MoP4ninQh5t+v0G1H0thOZy75qJfwgHuBcf5bS1lLuoUUdwXz8yz10qw/WFx9omvx99V7dfQ4jWauPPc/GHS4QZPzTyW4rMMhp9G2DTDnMwCCGeB8K2az7M4RjH7OJ6hCO7UCFgO2MU+jqdZ/os9Nfu5s0Q/dQqXFiiMPI1myYO6fOvUAXsDoeGjurTrI5+POqdfu5yDrj+Iy4Hh31mPSHy0d4U9s0C92W/j1MkKjXNPdm8n82Jv76WgXujHkVIe8lPD0gMOYo7NgriYzoG6YXYUxbkN01P/9+Qt9smmgTuDbieBr+Sm1Q1QluxFwZ+PBz//Fd7VqN9gQsVZjDxDeGCe8GVxUu6ag+Bf+57P9xAHlUL5vyj/F+X/ovxflP+L8n9R
/i/K/0X5vyj/F+X/ovxflP+L8n9R/i/K/0X5vyj/F+X/ovxflP+L8n9R/i/K/313gRFFchh/tAaPvwA6vjzll7uWvoJwDoRzIJwD4RwI50A4B8I5EM6BcA6EcyCcA+EcCOdAOAfCORDOgXAOhHMgnAPhHAjnQDgHwjn+cpyDpWESzcPjHDdkfP3RyTS8gJHHaZenyBMLJEIyN+bV4HcTwjXGnv9AxvRD0yZhVprwtihhOvkj8C9x1yhnPi5N8odKkxBw7IgbAMdPfSVH0S9kQd8rzmt0LB8X508lHKUpCiMfVILX+ENQVvolZewrPddBhDyLEfSFx/1WafLXaDc+Ls0fyysqMJhAP740P5fP+afSisLghj+TJkU8gvyu5YN/XH4/lUaUZijsONbBT9k/HiZ25T+XtZn/oeLkBBaj3yTIeBxpfi4dj/BDpUmT9AmvHn7qaFn8QeJY/qfP1dCQPvb471QQFM1cJZ751cQNS78MTe4glBvYlv9ooXACaNTTsTnLEC8k579qfIrEyPs1/g3svn9044NOBhNOx2QkAazgtrbnDsyZd2j5G0hq/+iWJ8D4CT8ufHNmAiz5UgrqV2IgQP9CvEPo9R8lcQM76h8tCZqkQPNduv2XEkQ8JtwoCYogMOFukhBuIAT9oyXB8sDvn5oBaGeWutEMBBo7Lxf1iY3/01nhWFx4gaoOUwA0LPJ0MQz5lSAYBrsfoChcmxD4SXLgCByjzqZi8A/JgaDuKogbMtjuwZr9IR7Oa1zb96eef1Mbfk1NfHsdkMNMwIPwhArkeyAqR3+QNp6gzya4zsKl81733iShwrWpj/tr/99QfhX4rfdEzdEfLLhKU+R7o/+L676hQqD17e3RYU/9Qvb261C33fZVQZ9v8Mnq+uOIlz9ecuYGlyo8lEFAavfjokdnhWpgSbgPkr2z3MtY/+2L3d2V/sH1xL5eNw/w9IPoJkueubPDUPm3K25wIOw9uRRP09hRZbzzgjN3V8vvqez1F5XfOKjyDUr/WBEKR1yi5dfXPvx+tbCzYIf9aq3/nuJgdyz0dVdnTD6UXrKcgFHUUaBwqktH6+V+uyoMyWDU2VCO4zH27Xpgd1fU7yna9adGDY9VuIvmWIw/UlT2bDTP09fqg/yu0pLc2WUp4RynvbOWkvgfXMPrg+50sQ6Tnr+DL8c+32b/i/mPKvxY9XCelugc5drQ99BggiBYDH/Bpg6VO19uIlxUYrm7Ql/CIL0Ljc7dMj+bdD9ZX7PXyOPFOPtNhzn3qQvn2v+55CsLfceBt7m68ufUON5mQ3uxE2g/ih36AWzRvh+6GXiRpluAf7txaEf7Q/YKDIY6Fyb3soLoTc3+DSASf3emiDnEgkcKzl5RcOpeeUQvNcSOhN9sI+l/jvSZ9xPJKPaF1fD75H8J+ZhI/p9o/fTb1v8YCnAD6oFYKxFrJWKtRKyViLUSsVYi1krEWolYKxFrJWKtRKyViLUSsVYi1krEWolYKxFrJWKtRKyViLUSsVYi1sq/jbWS42mMPl2XxJICRrEXyAbFYAR7DdzAWPpu+MYtDAgI30D4BsI3EL6B8A2EbyB8A+EbCN9A+AbCNxC+gfANhG8gfAPhGwjfQPgGwjcQvoHwDYRvIHzjL8M3CO5d5m7hMoXrq0tykfgl3wrK4vqsLB6Kp67xQLzI/5LL8oszeG5hNEYIF0K4EMKFEC6EcCGECyFcCOFCCBdCuBDChRAuhHAhhAshXAjhQggXQrgQwoUQLoRwIYQLIVx/GcLFwDJb9FsI1zHJ8LeCXJc81m0Ecn1WqbBzvt7vh7Uu2aARpvlp4ubZC7r7X1aG+1r5E38wz/KhWMM/t5Yme6tUw6+KoL2pCr8u1UDt3emvaZrpx6pPQlMMhtMMTvMcAUYXxCHR9KjkM9gpMATJ8xR9qDr7uzzNFEtgBP16mdMEV4pnMFK4fIav4mw+1Ib9k43jpJLJL8wDaF5v/2Zxms9jL47soPq6VZqu083Le/0HqzhUDHsUXScYjCffWnsCjACj2Iu9v12hhD/t+0FXcJM2/27ZNIY6vQ+1z/p+67nOjz88113rqpHEteU0bJDvzeLExtjVOj7s+Dd76rpFcABJJ+XrTrBhZk9PT7gWABydwHrw8//Z//9wZwgWPN38ec+DhkHZ3HaeXMOn8SczGEW8pf3MoRLDF4Qo/8unzYHlVuTpdMxRWyMyBfnfaz74exRl8pcrCiXwGPPmEj2e/WZFIf+zolDspygK+LRDKI5oksEPw35qRXu6WvspkA6IBudx5j59pi78nKwzP3Kz7OnafnR+AbDReOoDCAEGWhc7sT9LL+f2dL5OXRXuqsAh2HmF6E/SV/I0ZOVJ8qVO95GO8jiB0VeKQJO/r6cQPoyhKrx2lbBSthE7Ljzi/wA=</diagram></mxfile>
2110.05419/main_diagram/main_diagram.pdf
ADDED
Binary file (51 kB).
2110.05419/paper_text/intro_method.md
ADDED
@@ -0,0 +1,105 @@
# Introduction

Constituency parsing is an important task in natural language processing, with many applications in downstream tasks such as semantic role labeling [@fei-etal-2021-better] and opinion mining [@xia-etal-2021-unified]. Named entity recognition (NER) is a fundamental task in information extraction, and nested NER has been receiving increasing attention due to its broader applications [@DBLP:conf/semco/Byrne07].

<figure id="fig:example" data-latex-placement="tb!">
<embed src="example2.pdf" style="width:100.0%" />
<figcaption>(a) an example non-binary constituency tree. (b) an example sentence with nested named entities. We show the span and pointing representations.</figcaption>
</figure>

Constituency parsing and nested NER are similar tasks since they both aim to predict a collection of nested and non-crossing spans (i.e., if two spans overlap, one must be a subspan of the other). Fig. [1](#fig:example){reference-type="ref" reference="fig:example"} shows example span representations of both tasks. The difference between the two tasks is that the collection of spans forms a connected tree in constituency parsing, whereas it forms several tree fragments in nested NER. However, we can add a node that spans the whole sentence to connect all tree fragments in nested NER to form a tree. Because of this similarity, several previous studies adapt methods from the constituency parsing literature to tackle nested NER [@finkel-manning-2009-nested; @wang-etal-2018-neural-transition; @TreeCRFNER]. In this work, we focus on constituency parsing, but our proposed method tackles nested NER as well.

The two main paradigms in constituency parsing are span-based and transition-based methods. Span-based methods [@stern-etal-2017-minimal; @kitaev-klein-2018-constituency; @TreeCRF; @xin-etal-2021-n *inter alia*] decompose the score of a constituency tree into the scores of constituent spans and use chart-based algorithms for inference. Built upon powerful neural encoders, they have obtained state-of-the-art results. However, they suffer from the high inference time complexity of exact algorithms or the error propagation of top-down approximate algorithms. In contrast, transition-based methods [@dyer-etal-2016-recurrent; @cross-huang-2016-span; @liu-zhang-2017-order *inter alia*] conduct a series of local actions (e.g., shift and reduce) to build the final parse in linear steps, so they enjoy lower parsing time complexities. However, they suffer from the error propagation and exposure bias problems.

Recently, @nguyen-etal-2021-conditional propose a sequence-to-sequence (seq2seq) model with pointer networks [@PointerNet]. They cast constituency parsing as a top-down splitting problem. First, they use neural encoders to obtain span representations, similar to span-based methods. Then they feed parent span representations into the neural decoder recursively following the order shown in Fig. [2](#fig:post){reference-type="ref" reference="fig:post"}(a)[^3]---which amounts to pre-order traversal---to output a series of splitting points (i.e., boundaries) via pointer networks, so that each parent span is split into two child spans. Notably, @nguyen-etal-2020-efficient propose a similar top-down pointing mechanism, but they design a chart-based parsing algorithm instead of adopting seq2seq modeling, and it has been shown to underperform @nguyen-etal-2021-conditional. Thanks to seq2seq modeling, @nguyen-etal-2021-conditional's model achieves competitive parsing performance with lower parsing complexity than span-based methods.

However, their model has two main limitations. First, when generating each constituent, its subtree features cannot be exploited since its subspans have not been realized yet [@liu-zhang-2017-order]. Thus it is difficult for the model to predict the splitting point of a long span due to the lack of its subtree information, which exacerbates the error propagation problem and undermines the parsing performance. Second, since each parent span can only be split into two, their parsing algorithm can only output binary trees and thus requires binarization.

<figure id="fig:post" data-latex-placement="tb!">
<embed src="traversal.pdf" style="width:100.0%" />
<figcaption>Illustration of pre-order and post-order traversal over the constituency tree shown in Figure <a href="#fig:example" data-reference-type="ref" data-reference="fig:example">1</a>(a). (a): pre-order traversal. (b): post-order traversal. We mark the generation order in the circles below spans and link two consecutively visited constituents by arrows. Note that in (a), binarization is assumed.</figcaption>
</figure>

In this work, we devise a novel pointing mechanism for *bottom-up* parsing using (almost) the same seq2seq backbone as @nguyen-etal-2021-conditional. Our model is able to overcome the two aforementioned limitations of @nguyen-etal-2021-conditional. The main idea is based on the observation that if we traverse a constituency tree in post-order (i.e., visiting a parent after its children), two consecutively visited constituent spans share a boundary. Fig. [2](#fig:post){reference-type="ref" reference="fig:post"}(b) shows an example: each newly visited constituent shares one boundary with the constituent visited immediately before it. Based on this observation, we propose to use a cursor to track the shared boundary and, at each step, leverage a pointer network to predict the next boundary for generating the next constituent span, updating the cursor to the right boundary of the new span. Our model generates one span at each step, thus needing only linear steps to parse a sentence, which is efficient. Besides, our model can leverage rich subtree features encoded in the neural decoder to generate parent constituent spans, which is especially helpful in predicting long spans. Finally, our model can output n-ary trees, enabling direct modeling of the original non-binary parse tree structures in treebanks and eliminating the need for binarization.

We conduct experiments on the benchmark PTB and CTB datasets for constituency parsing. On PTB, we achieve the state-of-the-art performance (96.01 F1 score) among all BERT-based models. On CTB, we achieve competitive performance. We also apply our method to nested NER and conduct experiments on three benchmark datasets: ACE2004, ACE2005, and GENIA. Our method achieves comparable performance to many tailored methods for nested NER, beating previous parsing-based methods. Our contributions can be summarized as follows:

- We propose a novel pointing mechanism for bottom-up n-ary tree parsing in linear steps.

- Our model achieves the state-of-the-art result on PTB in constituency parsing. We further show its application in nested NER where it achieves competitive results.

# Method

It is known that constituency parsing can be regarded as a top-down splitting problem where parent spans are recursively split into pairs of subspans [@stern-etal-2017-minimal; @shen-etal-2018-straight; @nguyen-etal-2020-efficient; @nguyen-etal-2021-conditional]. However, this formulation can only output binary trees. We extend it to cast constituency parsing as top-down segmentation, i.e., parent spans are recursively segmented into $\ge 2$ subspans, for the sake of outputting n-ary trees. To this end, we add some $\emptyset$ spans (we do not allow two adjacent $\emptyset$ spans, to eliminate ambiguities) so that each span is either a bottommost span or can be segmented by its subspans. For instance, in Fig. [2](#fig:post){reference-type="ref" reference="fig:post"}, some constituents are bottom-most spans, while the others are segmented by their subspans. We always include the whole-sentence span in order to cast other tasks, e.g., nested NER, to constituency parsing. We also collapse unary chains into atomic labels in constituency parsing, e.g., $\texttt{S->VP} \rightarrow \texttt{S+VP}$.

:::: table*
::: center
Initial configuration: $(c, A, p, S) = (0, \{1, 2, \dots, n \}, \texttt{null}, \emptyset)$. Goal: $(0, n) \in S$.

| Pointing action | Input | Output | Precondition |
|:------------------------------|:---------------|:----------------------------------------------------------------------------------|:---------------|
| [Left-point]{.smallcaps}-$a$  | $(c, A, p, S)$ | $\Rightarrow (c, A \setminus \{a, \dots, c-1\}, a, S \cup \{(a, c)\})$              | $0 \le a < c$  |
| [Right-point]{.smallcaps}-$a$ | $(c, A, p, S)$ | $\Rightarrow (a, A \cup \{p\} \setminus \{c, \dots, a-1\}, c, S \cup \{(c, a)\})$   | $c < a \le n$  |
:::
::::

A key problem for seq2seq constituency parsers is maintaining structural consistency, i.e., outputting valid trees. To solve this problem, our pointing system maintains a *parsing configuration*, which is a quadruple $(c, A, p, S)$ where:

- $c$: index of the cursor.

- $A$: set of indices of all candidate boundaries.

- $p$: the left boundary of the most recently created span, which is needed to maintain $A$.

- $S$: set of generated spans.

We can see from Fig. [3](#fig:model){reference-type="ref" reference="fig:model"} that in the beginning, the cursor $c$ lies at 0. At each step, $c$ points to another boundary $a$ from $A$ to form a span $(\min(c, a), \max(c, a))$. There are two cases:

- $c<a$: a new bottom-most span is generated.

- $a<c$: several consecutive spans are merged into a larger span. It is worth noting that we can merge $\ge 2$ spans in a single step, which allows our model to perform n-ary tree parsing.

In the first case, the new bottom-most span can combine with the previous span to form a larger span whose left boundary is $p$, so we push $p$ back to $A$ (except when $p=\texttt{null}$). In the latter case, the previous span is a subspan of the new span, and thus $p$ cannot be pushed back. In both cases, all indices $\min(c,a) \le i < \max(c,a)$ are removed from $A$ due to the post-order generation restriction; $p$ is updated to $\min(c,a)$ and $c$ is updated to $\max(c, a)$. The process stops when the whole-sentence span is generated. Table [\[tab:pointing-system\]](#tab:pointing-system){reference-type="ref" reference="tab:pointing-system"} formalises this process.
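
To make these updates concrete, below is a minimal Python sketch of the pointing system in Table [\[tab:pointing-system\]](#tab:pointing-system){reference-type="ref" reference="tab:pointing-system"}; the function names and the `choose_boundary` callback are ours, standing in for the neural pointer described later.

```python
def apply_pointing(c, A, p, S, a):
    """Apply one pointing action to the configuration (c, A, p, S),
    where the cursor at boundary c points to boundary a."""
    lo, hi = min(c, a), max(c, a)
    if a > c and p is not None:
        # Right-point: a new bottom-most span (c, a) is created, so the
        # previous span's left boundary p becomes a candidate again.
        A = A | {p}
    # Post-order restriction: boundaries strictly inside the new span
    # can never be pointed to afterwards.
    A = A - set(range(lo, hi))
    return hi, A, lo, S | {(lo, hi)}

def parse(n, choose_boundary):
    """Greedy decoding: choose_boundary(c, A) returns the highest-scoring
    accessible boundary at the current step."""
    c, A, p, S = 0, set(range(1, n + 1)), None, set()
    while (0, n) not in S:
        a = choose_boundary(c, A)
        c, A, p, S = apply_pointing(c, A, p, S, a)
    return S
```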

The oracle pointing representations shown in Fig. [1](#fig:example){reference-type="ref" reference="fig:example"} can be generated by running a post-order traversal of the tree (e.g., Fig. [2](#fig:post){reference-type="ref" reference="fig:post"}) and, for each traversed span, pointing the cursor from the boundary it shares with the previous span to its other boundary. If we do not allow two consecutive $\emptyset$ spans, the oracle is unique under our pointing system (we give a proof by contradiction in Appendix A.1).

<figure id="fig:model" data-latex-placement="tb!">
<embed src="pointer2.pdf" />
<figcaption>Demonstration of the generation process and the neural architecture. Black arrows point to candidate boundaries that are not selected in each step. </figcaption>
</figure>

Given a sentence $w = w_1, \dots, w_n$, we add `<bos>` (beginning of sentence) as $w_0$ and `<eos>` (end of sentence) as $w_{n+1}$. The oracle is $\{ q_i \rightarrow p_i, y_i\}_{i=1,\dots,m}$, where $y_i$ is the span label and we use $l_i = \min (q_i, p_i)$ and $r_i = \max (q_i, p_i)$ to denote the left and right boundaries of the $i$-th span, respectively.

We feed the sentence into BERT [@devlin-etal-2019-bert] and, for each word $w_i$, use the last subtoken embedding of the last layer as its dense representation $x_i$. Then we feed $x_0, \dots, x_{n+1}$ into a three-layer bidirectional LSTM [@hochreiter1997long] (BiLSTM) to obtain $c_0, \dots, c_{n+1}$, where $c_i = [f_i; g_i]$, and $f_i$ and $g_i$ are the forward and backward hidden states of the last BiLSTM layer at position $i$, respectively.

We use fencepost representation [@cross-huang-2016-span; @stern-etal-2017-minimal] to encode the $i$-th boundary lying between $x_i$ and $x_{i+1}$: $$b_i = [f_i; g_{i+1}]$$ then we represent span $(i, j)$ as: $$h_{i, j} = \text{MLP}_{\text{span}}(b_j - b_i)$$
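
As a rough PyTorch-style sketch of these two equations (tensor shapes and the MLP size are our assumptions, not the paper's exact configuration):

```python
import torch
import torch.nn as nn

hidden, k = 400, 500  # illustrative sizes
mlp_span = nn.Sequential(nn.Linear(2 * hidden, k), nn.LeakyReLU())

def boundary_reprs(f, g):
    # f, g: forward/backward BiLSTM states, each of shape [n + 2, hidden]
    # b_i = [f_i; g_{i+1}] encodes the i-th boundary, for i = 0, ..., n
    return torch.cat([f[:-1], g[1:]], dim=-1)  # [n + 1, 2 * hidden]

def span_repr(b, i, j):
    # h_{i,j} = MLP_span(b_j - b_i)
    return mlp_span(b[j] - b[i])
```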

We use a unidirectional one-layer LSTM network as the decoder: $$\begin{equation}
d_t = \text{LSTM}(d_{t-1}, h_{l_{t-1},r_{t-1}} ; E_{y_{t-1}}), t\ge 2
\label{eq:1}
\end{equation}$$ where $d_t$ is the hidden state of the LSTM decoder at time step $t$, $E$ is the label embedding matrix, and $;$ is the concatenation operation. For the first step, we feed a randomly initialized trainable vector $d_0$ and a special `<START>` embedding into the decoder to obtain $d_1$.
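
A sketch of one decoder step in Eq. [\[eq:1\]](#eq:1){reference-type="ref" reference="eq:1"} follows; the dimensions and helper names are assumptions of ours:

```python
import torch
import torch.nn as nn

span_dim, dec_dim, label_dim, num_labels = 500, 400, 100, 50  # illustrative
decoder = nn.LSTMCell(span_dim + label_dim, dec_dim)
label_emb = nn.Embedding(num_labels, label_dim)  # the matrix E

def decoder_step(h_prev_span, y_prev, state):
    # Input: previous span representation concatenated with its label embedding,
    # i.e., d_t = LSTM(d_{t-1}, [h_{l,r}; E_y])
    x = torch.cat([h_prev_span, label_emb(y_prev)], dim=-1)
    h, cell = decoder(x.unsqueeze(0), state)
    return h.squeeze(0), (h, cell)
```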

We use a deep biaffine function [@Biaffine] to estimate the pointing score $s^{t}_{i}$ of selecting the $i$-th boundary at time step $t$: $$\begin{align*}
d^{\prime}_t &= \text{MLP}_{\text{cursor}}(d_t) \\
b^{\prime}_i &= \text{MLP}_{\text{point}}(b_i) \\
s_{i}^{t} &=\left[b_{i}^{\prime} ; 1\right]^{\top} W_{\text{point}}\, d_{t}^{\prime}
\end{align*}$$ where $\text{MLP}_{\text{cursor}}$ and $\text{MLP}_{\text{point}}$ are multi-layer perceptrons (MLPs) that project decoder states and boundary representations into $k$-dimensional spaces, respectively; $W_{\text{point}} \in \mathbb{R}^{(k+1) \times k}$.
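
A corresponding minimal sketch of the biaffine pointer score (again with assumed dimensions):

```python
import torch
import torch.nn as nn

enc_dim, dec_dim, k = 800, 400, 500  # illustrative sizes
mlp_cursor = nn.Sequential(nn.Linear(dec_dim, k), nn.LeakyReLU())
mlp_point = nn.Sequential(nn.Linear(enc_dim, k), nn.LeakyReLU())
W_point = nn.Parameter(torch.randn(k + 1, k))

def pointing_scores(d_t, b):
    # d_t: decoder state [dec_dim]; b: boundary representations [n + 1, enc_dim]
    d = mlp_cursor(d_t)                                    # [k]
    bp = mlp_point(b)                                      # [n + 1, k]
    bp1 = torch.cat([bp, torch.ones(len(bp), 1)], dim=-1)  # append the bias 1
    return bp1 @ (W_point @ d)                             # s_i^t for every boundary i
```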

For a newly predicted span, we feed the concatenation of the span representation and the decoder state into another MLP to calculate the label score $e^{t}$: $$\begin{align*}
H &= \text{MLP}_{\text{label}}([ d_{t} ; b_{r_{t}} - b_{l_{t}} ]) \\
e^{t} &= HE^{\top}
\end{align*}$$ Note that we reuse the label embedding matrix from Eq. [\[eq:1\]](#eq:1){reference-type="ref" reference="eq:1"} to facilitate parameter sharing.

The training loss is decomposed into the pointing loss and the labeling loss: $$\begin{align*}
L &= L_{\text{pointing}} + L_{\text{labeling}} \\
L_{\text{pointing}} &= - \sum_{t=1}^{m} \log \frac{\exp\{s_{p_{t}}^{t}\}}{\sum_{j=0}^{n}\exp\{s_{j}^{t}\}} \\
L_{\text{labeling}} &= - \sum_{t=1}^{m} \log \frac{\exp\{e_{y_{t}}^{t}\}}{\sum_{j=1}^{|L|}\exp\{e_{j}^{t}\}}
\end{align*}$$ where $|L|$ is the number of labels. Note that in the pointing loss we normalize over all boundaries instead of only accessible boundaries, because we find it performs better in our preliminary experiments.
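
Both terms are plain softmax cross-entropies; a minimal sketch (argument names are ours; the summed reduction matches the formulas above):

```python
import torch.nn.functional as F

def training_loss(point_scores, gold_boundaries, label_scores, gold_labels):
    # point_scores: [m, n + 1]; normalized over *all* boundaries, as in the paper
    l_pointing = F.cross_entropy(point_scores, gold_boundaries, reduction="sum")
    # label_scores: [m, |L|]
    l_labeling = F.cross_entropy(label_scores, gold_labels, reduction="sum")
    return l_pointing + l_labeling
```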

Our model follows the description in the previous subsection for parsing. For each time step $t$, it selects the highest-scoring accessible boundary to generate the span, then selects the highest-scoring label of the generated span, and updates the parsing configuration (Table [\[tab:pointing-system\]](#tab:pointing-system){reference-type="ref" reference="tab:pointing-system"}).
2110.05877/record.json
ADDED
@@ -0,0 +1,32 @@
{
    "arxiv_id": "2110.05877",
    "month": "2021_10",
    "year": 2022,
    "conference": "ACL",
    "title": "OpenHands: Making Sign Language Recognition Accessible with Pose-based Pretrained Models across Languages",
    "arxiv_url": "https://arxiv.org/abs/2110.05877",
    "source": {
        "paper_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_10/main_diagram_database/2110.05877",
        "tex_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_10/tex_files_extracted/2110.05877",
        "paper_md": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_10/main_diagram_database/2110.05877/paper_text/paper.md",
        "metadata_json": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_10/main_diagram_database/2110.05877/metadata.json",
        "intro_method_from": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_10/main_diagram_database/2110.05877/paper_text/paper.md",
        "intro_method_from_kind": "markdown"
    },
    "files": {
        "main_drawio": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2110.05877/main_diagram/main_diagram.drawio",
        "main_png": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2110.05877/main_diagram/main_diagram.png",
        "main_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2110.05877/main_diagram/main_diagram.pdf",
        "intro_method_md": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2110.05877/paper_text/intro_method.md",
        "paper_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2110.05877/paper.pdf",
        "latex": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2110.05877/latex_source"
    },
    "status": {
        "copy_drawio": "exists",
        "copy_png": "exists",
        "diagram_pdf": "pdf_exists",
        "intro_method": "exists",
        "paper_pdf": "exists",
        "latex": "exists"
    }
}
2110.06257/record.json
ADDED
@@ -0,0 +1,32 @@
{
    "arxiv_id": "2110.06257",
    "month": "2021_10",
    "year": 2022,
    "conference": "ICLR",
    "title": "Causal discovery from conditionally stationary time-series",
    "arxiv_url": "https://arxiv.org/abs/2110.06257",
    "source": {
        "paper_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_10/main_diagram_database/2110.06257",
        "tex_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_10/tex_files_extracted/2110.06257",
        "paper_md": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_10/main_diagram_database/2110.06257/paper_text/paper.md",
        "metadata_json": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_10/main_diagram_database/2110.06257/metadata.json",
        "intro_method_from": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_10/main_diagram_database/2110.06257/paper_text/paper.md",
        "intro_method_from_kind": "markdown"
    },
    "files": {
        "main_drawio": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2110.06257/main_diagram/main_diagram.drawio",
        "main_png": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2110.06257/main_diagram/main_diagram.png",
        "main_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2110.06257/main_diagram/main_diagram.pdf",
        "intro_method_md": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2110.06257/paper_text/intro_method.md",
        "paper_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2110.06257/paper.pdf",
        "latex": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2110.06257/latex_source"
    },
    "status": {
        "copy_drawio": "exists",
        "copy_png": "exists",
        "diagram_pdf": "pdf_exists",
        "intro_method": "exists",
        "paper_pdf": "exists",
        "latex": "exists"
    }
}
2110.08350/record.json
ADDED
@@ -0,0 +1,32 @@
{
    "arxiv_id": "2110.08350",
    "month": "2021_10",
    "year": 2022,
    "conference": "ECCV",
    "title": "Disentangled Differentiable Network Pruning",
    "arxiv_url": "https://arxiv.org/abs/2110.08350",
    "source": {
        "paper_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_10/main_diagram_database/2110.08350",
        "tex_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_10/tex_files_extracted/2110.08350",
        "paper_md": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_10/main_diagram_database/2110.08350/paper_text/paper.md",
        "metadata_json": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_10/main_diagram_database/2110.08350/metadata.json",
        "intro_method_from": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_10/main_diagram_database/2110.08350/paper_text/paper.md",
        "intro_method_from_kind": "markdown"
    },
    "files": {
        "main_drawio": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2110.08350/main_diagram/main_diagram.drawio",
        "main_png": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2110.08350/main_diagram/main_diagram.png",
        "main_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2110.08350/main_diagram/main_diagram.pdf",
        "intro_method_md": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2110.08350/paper_text/intro_method.md",
        "paper_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2110.08350/paper.pdf",
        "latex": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2110.08350/latex_source"
    },
    "status": {
        "copy_drawio": "exists",
        "copy_png": "exists",
        "diagram_pdf": "pdf_exists",
        "intro_method": "exists",
        "paper_pdf": "exists",
        "latex": "exists"
    }
}
2111.15097/main_diagram/main_diagram.drawio
ADDED
@@ -0,0 +1 @@
<mxfile host="app.diagrams.net" modified="2022-03-05T12:19:09.360Z" agent="5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/98.0.4758.102 Safari/537.36 Edg/98.0.1108.62" version="16.6.1" etag="IyvDT1dnjugC4ElBcOEa" type="google"><diagram id="T8zZj2U4Ipr4i2PrreUk">7T1rc9s4kr/GVTsfgsL78TGJk7mr2p2bupmp3fmo2LKjHdnyyXJi76+/BkmQeJAUJUEU7UhxKRIIgSD63ehuXLCPd88/r2cPX/+xup4vLyi+fr5glxeUEqIF/GdbXsoWyaqG2/XiuurUNPy2+M+8asRV69Piev4YdNysVsvN4iFsvFrd38+vNkHbbL1efQ+73ayW4V0fZrfVHXHT8NvVbDlPuv1zcb35WrZqqpr2/5ovbr+6OxNpyit3M9e5Gvjx6+x69d1rYp8u2Mf1arUpP909f5wv7eK5dSl/97njaj2x9fx+M+QHtPzBt9nyqXq2al6bF/ewt+vV00PVbb7ezJ/blnj2ZRmvWDMFUj8YYMR8dTffrF+gixsIE1n+5sUtlxvke7O4ilc3/OotLCMcCUy5UopoJjGuADCr4Htb36xZAvhQrUL7irDtKwILcn89t/3xBfvw/etiM//tYXZlr34HhIe2r5s7GP+SwMebxXL5cbVcreH7/eoeOn24Xc+uF7A6UfPN6n7zeXa3WNp1+H1xBxhO8S/z7/D+v6u72X3VpSIHwuH79ezxazERUn35dbbZzNf3RQvFtrUTcj6EaC+ENA3g0wIdhnkKHYkPhwZPoPH544ThkWGxjUQiWG7HG73lpqKFFjKstjjjfgwO7n7Sg/yuLTfyywQcfwAjxmWvycIlw6ITTrYTgTwSEahkbefXIP2rr6v15uvqdnU/W35qWj+Eq9/0+ftq9VCt+b/nm81LtU6zp80qhAgszPrlX9Xviy9/2i+wCtXXy2f/4uVL9W0AfB43s/XmvdV6GrAWbZ8Xdg2qQa9djy/L1dVfv39d3JfNVaeGmuxi9IMV1m71tL6aBxwcbng7d7TRC3xJDoeh/lEpp2Ya49ON6aWbalWaxSbjEko2KtgbrrsSjkwJR/eraexwIDoL68ejHEZCQT8i5bg7H5t0nhebf3mfPcKBbw3d2C+ObH4cctMpuRHcizFZ6G2AAe6tU7WSgerrQfhxs179Vbsk9ly+dpCTLQAfvug7aAIsIUHRomqLHhKsxv51tYBZJJBzDgccjVDiRfUj33sSjVN7C9xANBqofNZkoAIr6ucdhigD/BKnQhTyZhGFaZIHU4gIMUUfD1FSl8nl6vv9DyHDOZNIYVO/dCjRORpRpotemX4sM3JEEZ9LfO8nrp2iGshr3osdhu9M/oxGfoiYhXTQ/z50m3p7fhi6Fc7pchJK7Xf4nLXvMbRv0uImIv1CnzKDmPaYfYbNH5I6jn4YIpREIUNx/drLHHZeoIOAMJIn6UyQfQTZ4n4i/f6nYxAkTf1RKXK8Xft4Cws0MiHKPHZPvOu1v4XMQ3/08eweN+MpYsoIBvIWTHHRE9kxRZFMmCJYqAPKWgc8Aq60oMaIexX07YkL5wjbwRw7irho85LJ5aZ6zgDk8v+eVu7Cu8diBd5DB8IfnpuL8OnW/v/shoEZlCOV7Qkegf61aWMlsYqXKoOz5eLWBi9cwTrPof2D1eYWV7Pl++rC3eL6unATtCmaoSo62Ow+zN2CMTLeK3K30ITlkBaWE7tM9wJ76vP67elhvv5lvnn380RBNCB85TR2gE6dqdSBNjvg0oijGnCXZ8DtSI40FfKtsUlZICd7Reg5TGZXAapSAdovP3PEybi7ZpeX/znLyxaImUBA8jSUUB6JWFO31lsPo1VYIbbVhdUWSJvDhbUlGOoN7P5Mn6OSloiOLRFUlKnsJgkb4MG6elp/q1WKjiVpgXMD2z990G6BcwPaPwPIdu7yHQyJ4ZFsA90T6/lytll8Cycx3GdBVOSzIHvHf4QDxUHfHR4LAO/sxev2YDs8ds+XRr74MKcHPpQD7usNYQM8Z28cQ3cI/hoHRWPMYpEvbDCGOodrxzi5MNTtrR8JQ2l+DA2wpgv3tvlq2/Gc9OJ5Nh56WgwlwkSxEXJfHN06Ui4sjXzVGvfOi0X6o8qM1AMi+srcy1Dbvp7fzJ4K2+oKdLDZ4t4aPx5m7Zmk2a9OUywQZp6vL9zWoU2gjK9da4pMi0nj8gkPUqtSl19kud7MrkLDtU1L9QxXWKVygf1GVu6FpwbuHw/vHmd3D8vF/S38dPUwXwMxre4fh9q94ebCDsYWzEnrD/SzU8K99vLVZULH5vHGWhVdRlsnKu1ggZEBKELaOFQW/GjLZSxB87Qc5NcgbX4NN8Zysf8Yv8xhDR83BSbCQ3+Bdaf4v60T42G1LLDIQyJ7o3y3/rAAhIXbj3bD39ez+8eH1SNgOsUfV/ff2DPrvRk0Wvj0e4t2dREt5zebFgqomnd2BXnuChX5roCELUfodqFkICwZ7rNQrpAiTdyOS8X2qYy3+O+FQkJnILQ0bnG6Yowrini0fO2iS6o2voSwX20gQ/4H2+qAPa4Y+2W1vpstR5dgRnz8rGQeCZaBokBYIUwwUVhzJjQZpNxIjIw2mhoiJAeEaHElZhFkqe/WF2Qpqw5bgCPdx217YlY5iGXi5NnudP3telFJD4DxT51oUze3TaVT4IzwFIUoevVPIZ7FMZ9ikDw+YPemQzTvsHeznH2ZL39dPS6qx0/2g/4edajHzsA5GAPTX2lmJNNMWekQyWaODOX1v1YFGCQSltY6lYw4ozAQ1AZJ5v3rSWMYzFRMJ1O5XnxrVfcsqN9VULP6XgG4Tv34JKzpF4tar4qAf/tr8fCuUmtCNfwVTL6dh9I3wUOzPkXEQ+vGgtQmvS+eQ7fa4jkawCE1qN1cCsaoIlQbmrJIaYALe/8ybOHytn27ETVzm3dyOhfT548fwMC5yKKgD7bvdrDjDEdcpjZvv8bOARNbkCeHls7b9tCGi8KDONh7WEpbsBAWcLUqsWUb6xrCaQ+a0z9mz5Oaz/76/ihT61Piv6xbZjV4stPX33Owgxa3jiCIkUauOEdL4BWTSLRUlZQMEa0pE9RwQpR0KSwH8Qfayx8G69vlguHC93pTeSntVfgdiIpSojBuQTpffptbkCRXupX2UWhuiGZ1Iprbb2rHprDXE98otEnJkG8lw7YtIKpRRXwKC0uMOWgw3XB1weaPrxdufuzkoTUve3J9W2CUI/mep9u4tscZHj48iMPcMQCS7pt+nN1fL65nm9q30mgk/+MZJWeAeUZtS2h3C8BUDoC17b/5vpJ9Q/g/bXF1DB6dICml6ttNf3z64pp+xv5tm/bOybxWtMsgb01cN10gpgwV7i/dCKYUSU4Vw1hKRrAWLbucxKBmCPjLIXm37XLuiVn9eOtj1Y5IdUwM
J3vNpUMj7Zxi5Wuy01vffpn9DRdGAkAKt376KTQtmmfD7U8l4tMdBEDethZHF9TfHCaIAheg5dJ+tncVFgUEINC2vqTu6yhjr2FoM0yJhfUV+2YXExoqbBSVWeG6lI9vr1ekLCpiFrWB4Xcqvt16n6vBS1gJB6109Gbp4dv7omuJ4l2j/xEMb3GpuYb9a8UaRR3S69X8Oju41fE70JYnKpmw7VGwYdtMyil6rNi2Fpy3ePKaHQfNJUu2TRVTto0hW7YXK8ZsL3qs2V4pmLNtd+zZNuJo8hUPHoJx9aUa2RquLCxfbnrafXs3zIuHh5ohy5jdpZI5N5cbnHZMufXa7LFpv/WmEFND8bUmCb8xJNSqX0LRZ6GbCN3QwCUaI2kUZvXfZIRud+DIEQSsOKttg2MGBOhpQlFpDGNSG1eArlbjOMLSSFr96emocd1RA0fAKHnGqFwYZQjyOVTqgzsRQol0jxUngNx7G3OKackEbAWhhWQaM6O0FGHiC+EcKW+RWzxxtsStEVRxoxTTirZsaDKCaMM/ZG/h08GgSjc406r1bwtUElZaaSEw/NNSiXCP2aSC/lSgoQkgWqpZHZBP7ipW7ZjGeeEncU676FWIuFEsxPVsrm+uEgECV+SVnn+5qa+4cnG0RsBds/xER0rfobl6/ahMsEDAkeqXiW6QKf2OUIyU5lxIeCeMcTFkFrnS7VwB9pGqisKjHk4zaso0E1OJmOtr3kolH/WnD5/tctrsmI29g0gKKb0qeqHAxXWtTklznHRVwgwimmIDmpk9XBQPmkU2ekl35lI58yMJ/O1a9Knkf/9RA8euZjlpJtWhwvRn6O8u/4/F2cKEf9FREGw3bpdZbFMiEPFslqpmTjY21F9n7shiux9PXqWqu7vYzqvcTgKlt0hWYxCw8fpVVcTMhtLpbmiqm74pyUo5RqCgEBvGjyV3Bf6n7/TQozIfgjDWvtFgkxoGciBPDA8pmnRqmfzrfA19N3VOwGCupOkXJuWBXCngQOooJgTFDGmpiVZSM6xYvG0EUlMTTLQo/9zMttTO2bnijVTISCGElIZJ7e7SP4lsfO6U53gMJxz6aghnsF6q6YeSSPLopSORiwDWDgqBZvCmIkzdYnHnIhcOgl8qQwzF3Ggt9JBZ5KIXme59HNeOi2TGtmJnp5YZ5ZxtmXmf0EGF66f0LNLmCAYeH5G4OOIGUFYxAeKoVrDqYn2iSLB0ipOQxyGuXlnUMYlstDXuEcsWTVmApkSzfjTtEkhTp8pttLVdSGWzL49GUbnliFGAD1i4l8yM6zSxL/nbti8ZZYgIzcHEFEZJFrKW6dqXctw9qdS+5PvxpCkrydOyLmUH4h7busSIea84VieTRLfH49qwI60M5gQbLYbMIhubS5naeOYlfYN+mSmYl0eklz7zcksARC56ERYdOMggZqjGjPMhs8hGLyNtE3YKGzmcaN6WCnxccbNNBc5DUpllAwgHRD1kN3lRfexNQ2NogOpa7CkfXjuqjyUzJor0/QyeEo1YY/Zlt/tOeZD9YN4eaEOva6t8RFdGy2lwB6P0YKt05F3PAI8G79uEeDTpIKSD3c2vEYnSXCGEUIJYryZLJ40Y37iyQaYB0WHxh1uiUjqMFD+NRyFhJBfS1qXlmDu/X5DGA6NwTZq/DGecqTSP5wzsw4DdHoI0CWCnmUBvFdiZYF3UOJVagBXEqCbCFaGst5UBFfzXdECd7iScQX0QqJVEngHgLIBJwDqtjnaGdT+sjUAAJgz0C7ZXfZayA7WgSOjUBz8FUKepHWdQ94KaAaiUUFppwYhhMtromC4DT6u5nSF9CKQnzL+PXgfOL87AzsUZhhfAt3tPBLR8wzhR3JUC3gGlTlOdQXVXbTsCRvEzRg3GKE0Qx5IQJpgiJNr7ZwyB6tm4CSZT7UONWpCInvEpF4eyYc+myRJwNs0EMGpQhsHrqnzhe559B3V0rHX1bI1vem8Nqqfyxc3NXF617nhcK/MF44t4x6PYCsiwwdeJqpxLpLABi4sTKokzWitMNQxxgrHhkil7JgftRrG99vcUBmhyzQ1lhICSGO9pS8SZ5lpozDnp293bcWSiBOLUYKBUDUxQq+jBhp60LQUIDoONYFoarjAZMP9tVQ322KjUYyc6XBywVzQyPeZIWO8m20tlPrSQ7S4bla+SNAkmCPQloxXTRGHWl12w69AcZB+8C0JsoAtW/Bi02fEAxyDOsTMlRk+4H5mid5epb1iCSg0A54IrCaqmYnHAsEYYKMxGjYEi6Q7sGEal/SOPQ6Tt8z8GkdJxidTXgXeIfT4ga+C0RHqSOLYyiH+ihMtBBbWH3BKgMKUd/nm71R51Gd5TAyalKUKRtT0VB5pSgofbKDZQVMDDCkdYe1JuNP/YabJFwc5IuSOn/OxJuQdkLZyYco8ZbD1V2hTSnoapsT3DHjMdHcF1gFlqadPTfFlMm70G42DSDKdvTkWZ4+YThTJ133IVO8WLj0yZ/366e6i+M3fH6rsYmXAjQdtRdGoaxNwraIlCxAihmTBWn608tTkErQ0vVAQejynBDGdkX2ruF7RbFPGM5DxuutO+KvIB6R/TJueTaNCTJuxeKQ13RsoQZTTBigpRJbUMp2xlgKgJk0QTFp4LZjQiUhF7yrqVo44i84rp8eh63NyuE2S5vDbTN90CmrLHKsFrwTjiIBYlAeTmhESB2FIhG/PsBOOwwlMp8WgrXQUDRVpKhln0EFvukpF6TpkjNkpuzys0P2NR+sopRmCk7VnjlWUo2XFIZsttMtJMGrdyDqLsxT8JNoC1MSjDBgtAsRByiiNAHBsgj5UAPG3JbbGxIlzA76WgkjvQB+EkoI0AUmjBtTZCG9ONxUPDSfSPk7P2juhcsAaTlMH6Y600QDtyEZWwZsAqsH3HuiW60cKaEVBfmeVJtCV0yCCpgFWBWlm+68NhbX6clLXMsJZYKmmEMYZHNgBVyKdIxzdPTtfmx8lYe5crP1FggcBUZAwAwQiXifQlyBO+LQfUnwjU9AzqnUFtj6S1uQrYHnxC3BJ6fkGGqWaGlF6Elnji7aA2qPbgwwicZwj+dGpDeHjg7sdiVydh+//9dNF1FHYRQPzWT8OmbWc617ZTcEJzZek3Rzy3nvj8z/qQY1ofdF0hvm3sPlXbI4BiTF60VYZic5ZzZSw2t3XWYtTl8qX1AOjC0spw/HN0lrPoOcvZv1bXXmibQt9Bz/UsKisnPIEabLz2E6hVxznSgWHWhkqttxFdt+EdtwkMs84Tq72FZWMeZf2m6qL2+wEIJ0hje0FTA5zbpLo5A76NQX8HO41JDVZ4SzIbI6gOS+QKg6jIwNtbDx878/bDeHsxil3EgjMXy5jw+XIBbIdyOeHT+5JLVUsqymEKMsNbv//kGkp+Wy77Tc36y6HvVverx4LIWjo/Vsy/7GrhFMyy+HbbfE7BVa2XBzKvpVlDt4puIdx6VWtZrvzQX5Lglw04swxOw8EdkL0eYQcHYK+DJ8w9cR4IdCdMPaHuN4WC3Yn2RLgH4r0R8J6Ij4R8LeYDQe+kdiDs/ek
0Aj/pevkStYarUwn/3YAXdPGg5qsDnkLg/VKEcG8Ug54+noLQPc3ZY3j5NphKMtNa/lbS3PtlpTg0Q7+E1ysFovvWkSLRjeQ9kxDbJsG3TCJSM7omUTVGoEsv1/QYX0p5SjNkGxty/Nd+/uyYsM/BWrVvp5Y0Kq6nmtjGQjmhqQLeqCgeocdqSnDpQOW8eYRKdRki37Yo1r5Wi9vVTebJyFgR9zXRWBG3+9hDdO0WFfWsie7iOQwDCUPPYdtZ6qfSPPcIG4oiYw6sJBsWTeZYDtw0PUIl2Xhj8kerLOssIr+C4oExRt0YuldQkQQbTjEDdyWaYB4HFQmOXNiPdb7RnqCiPTZGTVpyJE0ee1NsbMvON6eI2qoMkkuplcLpBsjJ2Nq4cR87Za/WLIxcsFcTNTXx4OaIZ8mDeNaRUwIpUA2TnGGNOcUiDF60YR4YM+BkQnHC+jhYe1ikUEYKxggWJq7aBShGmHYuML1nwLOQDCkK9Go3yaRypy+NHhhp0jCVN36scL9SSQxBvndzQsx4UNmSYzHjgUVMImb85oPw9uS2+m1xW2m35KUBKxvIwZCeA053zeO06SVKYjCJmRSGDTthJ2W2nCNYFiZh/aRm0YbGaLliBI9UwqS9SkI/CZ+0fEnNPoC1mpCFCLOFhRTfJmkHvjoq7zcLJUF10ibHrsjlfimhodClGNS1QlBWKtueySZCwQopreE2kmKjoqhCULmqtWNGcRWHk+Qk9DR2LM0TfVtalSpOrzXAYmGJ42QiwgEwJdoUgVsTUquIK0x3Dv7KFfxFMRC7EVTq0lDaK8zvCLFfBJ+rle9M1jYmXzBMbaVGE1G1Dd3lIK6kLN7TE3IHRGkTjgjnHAPjL99FDkCnkSDHKxp6ic9FQwfHFjEcohDoBUYAllV/bVkdDGmFOVHKJvXgWq3ykYiC/hDE+mfhFmkZ9CMiETkj0XAkAj1VcayZ4BKU1dhFV7hwpNGaF++kJfb4ZDg1Zr31S3rGqcE4pW26kATjhoEKqkRsO2wXdSdDqTELrl+eC67vqCRLwqVkVEgRm9MKgS0KCnD5zlty3E6GUmPWXL8UZ5QarI4LsIANB8lmrIMmRikKcpFisL9Z8S7ohFAqzZI9IkqdDxrJxaVA1SKefq4mJPhImox7RJSSZ5TahUtxY1N+CSsKE4UoVey3KOz+SOoLPB1KjVTG+jXGrky+xny4+1L74IPtl/5iuYKD5Wh5HtWKGiUixC3K7TCQv1QQQl3NgqxVro0WAgjG7lLEh4gZgzwDRLJdquUqw5GQWIAZo+z5ZGEEWlmRUzC4M9HWpbtnTIviAmmgVsUZrDSNJElB9kJraQxMhZFc1YouqgBqr3sTOs0+/T8=</diagram></mxfile>
2111.15097/main_diagram/main_diagram.pdf
ADDED
Binary file (46 kB).
2111.15097/paper_text/intro_method.md
ADDED
@@ -0,0 +1,123 @@
# Introduction

Generative adversarial networks (GANs) [@goodfellow2014generative] have obtained remarkable achievements on image generation tasks. A GAN consists of two networks, a generator (G) and a discriminator (D), that contest with each other in a zero-sum game. G learns to generate semantic images from real data distributions, while D distinguishes real data from generated data. Since G and D have conflicting optimization objectives, GAN training is unstable and prone to collapse. Therefore, many efforts have been made to manually enhance the architectures of GANs [@dcgan; @biggan], but this requires much professional knowledge. Recently, neural architecture search (NAS) has proven effective in automatically finding superior models in various tasks [@nas_survey; @automl_he], including GANs. The early NAS-GAN works [@autoGAN; @agan] search only the generator with a fixed discriminator to reduce search difficulty, but this may lead to a sub-optimal GAN. Although some recent works search both G and D, they suffer from the instability of GAN training. For example, AdversarialNAS [@Adversarialnas], the first gradient-based NAS-GAN, proposes an adversarial loss function to search G and D simultaneously, but the architectures of G and D are deeply coupled, which increases search complexity and the instability of GAN training. A subsequent gradient-based NAS-GAN work [@AlphaGAN] also demonstrates that simultaneously searching both G and D hampers the search for optimal GANs. DGGAN [@dggan] alleviates the instability by progressively growing G and D but takes 580 GPU days to search on the CIFAR-10 dataset [@cifar10].

In this paper, we propose an efficient two-stage **E**volutionary **A**rchitecture search framework for **G**enerative **A**dversarial **N**etworks (**EAGAN**) on the unconditional image generation task. First, to alleviate the instability of GAN training during the search, we decouple the search of G and D into two stages. In stage-1, we fix the architecture of the discriminator and search only generators. All generators are paired with the same discriminator, i.e., the candidate generators and the fixed discriminator are in a *many-to-one* relationship. In stage-2, the best generator of stage-1 is used to provide supervision signals for searching discriminators. Specifically, in stage-2, we create multiple copies of the best generator architecture of stage-1, and each generator copy is paired with a different discriminator and trained independently. Thus, the generators and candidate discriminators of stage-2 are in a *one-to-one* relationship. Because we evaluate the discriminators of stage-2 indirectly via IS (Inception Score [@IS]) and FID (Fréchet Inception Distance [@FID]) computed on the generators, the one-to-one strategy has a potential problem: if some generators suffer mode collapse at some point, the discriminators subsequently searched and paired with these generators will be evaluated unfairly. To solve this problem, we propose the *weight-resetting* strategy, where all generators inherit the weights of the best generator of the previous search round before a new search round starts. The results in Sec. [5.3](#sec:ablation){reference-type="ref" reference="sec:ablation"} show that our simple yet effective weight-resetting strategy can stabilize the GAN search. We summarize our contributions as follows.

1. We greatly reduce the instability of GAN training by decoupling the search of generator and discriminator into two stages, where stage-1 and stage-2 adopt the *many-to-one* and *one-to-one* training strategy, respectively.

2. We propose the *weight-resetting* strategy, which is simple yet effective to avoid mode collapse when searching discriminators in stage-2 and ensure fair evaluations of different discriminators.

3. EAGAN is efficient and takes 1.2 GPU days on the CIFAR-10 dataset to finish searching GANs. EAGAN achieves competitive results on the CIFAR-10 dataset and outperforms the prior NAS-GANs on the STL-10 dataset [@stl10].

# Method

NAS aims at automatic architecture design and has achieved remarkable results in various fields [@nas_survey; @automl_he]. It can be formulated as the following bilevel optimization problem:

$$\begin{equation}
\begin{array}{ll}
& \alpha^{*}=\arg \min_{\alpha} L_{\text{val}}\left(\alpha \mid w^{*}\right) \\
\text{s.t.} & w^{*}=\arg \min_{w} L_{\text{train}}(w \mid \alpha)
\end{array}
\end{equation}$$

where $L_{\text{train}}$ and $L_{\text{val}}$ denote the training and validation losses; $w$ and $\alpha$ denote the weights and the architecture of the neural network, respectively. This process aims to select the architecture $\alpha^*$ that performs best on the validation set, conditioned on the optimal network weights $w^*$ obtained on the training set. There are four main approaches in NAS: 1) Reinforcement learning (RL) based methods [@nas2016; @enas] train an RNN controller to generate neural networks; 2) Gradient-based methods [@darts] apply the softmax function to relax the discrete search space, allowing differentiable optimization of architectures; 3) Surrogate model-based optimization (SMBO) [@pnas_liu18] builds a surrogate model of the objective function to predict a searched model's performance, which can substantially improve search efficiency; 4) Evolutionary algorithm (EA) based methods [@amoebanet; @cars] maintain and evolve a large population of neural architectures to produce the Pareto-front architectures.
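
As an illustration of how gradient-based methods approximate this bilevel problem, below is a first-order alternating scheme in the spirit of DARTS [@darts]; all names are ours, and `model.loss` is a hypothetical stand-in, not an API from any particular library.

```python
def search_step(model, w_optim, a_optim, train_batch, val_batch):
    """Alternate one weight update (inner problem) with one
    architecture update (outer problem)."""
    # Inner problem: approximate w* with a single step on L_train(w | alpha)
    w_optim.zero_grad()
    model.loss(train_batch).backward()
    w_optim.step()

    # Outer problem: update architecture parameters alpha on L_val(alpha | w)
    a_optim.zero_grad()
    model.loss(val_batch).backward()
    a_optim.step()
```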

::: table*
| Method | Type | Search D? | Multi-objective? | Evaluation Metric(s) |
|:--------------------------------:|:--------:|:---------:|:----------------:|:----------------------------------------:|
| AGAN [@agan] | RL | × | × | IS |
| AutoGAN [@autoGAN] | RL | × | × | IS |
| E2GAN [@offgan] | RL | × | $\surd$ | IS+FID$\dagger$ |
| DEGAS [@DEGAS] | Gradient | × | × | Loss |
| AdversarialNAS [@Adversarialnas] | Gradient | $\surd$ | × | Loss |
| AlphaGAN [@AlphaGAN] | Gradient | $\surd$ | × | Loss |
| EGAN [@EGAN] | EA | × | $\surd$ | Loss |
| EAS-GAN [@EAS-GAN] | EA | × | × | Loss |
| COEGAN [@costa2019coevolution] | EA | $\surd$ | × | FID (G); Loss (D) |
| EAGAN | EA | $\surd$ | $\surd$ | Pareto-front (IS, FID, #size)$\ddagger$ |
:::
|
| 51 |
+
|
| 52 |
+
Due to the great success of NAS in searching neural networks, many works have also applied NAS to search GANs, summarized in Table. [\[tab:NAS_GAN\]](#tab:NAS_GAN){reference-type="ref" reference="tab:NAS_GAN"}. AGAN [@agan] and AutoGAN [@autoGAN] are among the first RL-based NAS methods to search GANs, but they only use IS as the reward to guide the search. E2GAN [@offgan] is rewarded by a linear combination of IS and FID. However, to avoid the notorious instability of GAN training, these early NAS-GAN methods only search generator (G) with a fixed discriminator (D) architecture, resulting in a sub-optimal GAN. AdversarialNAS [@Adversarialnas] proposes to search G and D simultaneously in a differentiable way. However, it results in highly coupled architectures of G and D. The ablation study in [@AlphaGAN] has demonstrated that simultaneously searching G and D would potentially increase the negative impact of inferior discriminators and hinder finding the optimal GANs. Liu et al. [@dggan] propose to progressively grow the architectures of G and D in an alternating fashion, but this is only a remedy to alleviate the issue of architecture coupling and causes huge computational costs (580 GPU days on the CIFAR-10 [@cifar10] dataset). COEGAN [@costa2019coevolution] is very relevant to our work, which also uses an evolutionary algorithm to search G and D in two separate groups of architectures (called populations), but the two populations' architectures are coupled during the search. To reduce the search difficulty, COEGAN only explores a simple search space and experiments on a small dataset (MNIST [@mnist]). The final results show that COEGAN fails to outperform the previous human-designed GANs. In summary, since coupling G and D is not conducive to searching for the optimal GAN, we decouple them into two stages.
|
| 53 |
+
|
| 54 |
+
The early NAS methods first retrain the searched models from scratch and then evaluate their performance [@nas2016; @amoebanet], which obtains accurate evaluation but consumes huge resources, e.g., [@amoebanet] took 3,150 GPU days to search. To improve search efficiency, the weight-sharing strategy [@enas] was proposed to allow all subnets to share weights within a super network, so they can be evaluated without retraining by inheriting the weights from SuperNet. In our work, we also adopt the weight-sharing method to search generators and discriminators from SuperNet-G $\mathcal{N}_G$ and SuperNet-D $\mathcal{N}_D$, respectively. To simplify the notations, we use $\mathcal{N}$ to refer to both $\mathcal{N}_G$ and $\mathcal{N}_D$. Denote the loss of the $i$-th subnet $\mathcal{N}_i$ as $L_i$, and the weights of $\mathcal{N}$ as $W$. The gradients of SuperNet loss $L$ with respect to $W$ is
|
| 55 |
+
|
| 56 |
+
$$\begin{equation}
|
| 57 |
+
\nabla_{W}L = \frac{1}{N}\sum_{i=1}^N\nabla_{W_i}L_i = \frac{1}{N}\sum_{i=1}^N\frac{\partial L_i}{\partial W_i}
|
| 58 |
+
\end{equation}$$
|
| 59 |
+
|
| 60 |
+
where $W_i$ is the weights of $\mathcal{N}_i$, and $N$ is the total number of subnets. However, it is not practical to accumulate all subnets' gradients in each batch. An alternative way is to use mini-batch subnets to update weights $W$. In our experiments, we find that randomly sampling one subnet (i.e., $N=1$) per batch can also work.
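
As a sketch of how this single-subnet estimate can be used in practice, the loop below samples one subnet per batch and updates only the weights that subnet activates inside the shared SuperNet. The `supernet(x, config)` routing interface and the `subnet_configs` list are assumptions made for illustration; `optimizer` stands for any gradient optimizer (e.g., a `torch.optim` optimizer) over the SuperNet's shared weights.

```python
import random

def train_supernet_epoch(supernet, subnet_configs, loader, optimizer, loss_fn):
    """One epoch of weight-sharing training with N = 1 subnet per batch."""
    for x, y in loader:
        config = random.choice(subnet_configs)  # uniform subnet sampling
        optimizer.zero_grad()
        loss = loss_fn(supernet(x, config), y)  # L_i of the sampled subnet
        loss.backward()                         # gradients land in the shared W_i
        optimizer.step()
        # Over many batches, these one-sample updates stochastically
        # approximate the averaged SuperNet gradient in the equation above.
```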

To ensure a fair comparison, we use the same search space as [@Adversarialnas], since it also searches both generators and discriminators. The search space is shown in Fig. [1](#fig:searchspace){reference-type="ref" reference="fig:searchspace"}.

{#fig:searchspace width="\\textwidth"}

**SuperNet-G** $\mathcal{N}_G$ comprises a fully-connected (FC) layer and three Up-Cells. Each cell contains five ordered nodes (0-4), where node 0 is the output of the previous cell. There are multiple candidate operations between two nodes, each represented by an edge, and only one operation is activated (solid edge). The edges $E_{G0}$ and $E_{G1}$ indicate up-sampling operations. The remaining edges ($E_{G2}$ to $E_{G6}$) are normal operations, where "None" indicates no connection between two nodes. We encode each edge by a one-hot sequence; for example, \[0,1,0\] for edge $E_{G0}$ indicates that the bilinear interpolation operation is activated. **SuperNet-D** $\mathcal{N}_D$ comprises three Down-Cells and an FC layer. The Down-Cell is the inverted structure of the Up-Cell. The edges $E_{D0}$ to $E_{D4}$ are normal operations, and $E_{D5}$ and $E_{D6}$ are down-sampling operations. Thus, searching the architectures of G and D is transformed into searching a set of one-hot sequences.
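
A minimal sketch of this encoding is given below. The candidate-operation lists are illustrative placeholders, not the paper's exact operation tables; only the encoding mechanism itself follows the description above.

```python
# Sketch of the one-hot edge encoding; the op lists are illustrative only.
UP_OPS = ["deconv", "bilinear", "nearest"]                 # e.g., edges E_G0, E_G1
NORMAL_OPS = ["none", "conv_1x1", "conv_3x3", "conv_5x5"]  # e.g., edges E_G2..E_G6

def encode(op, ops):
    """One-hot sequence for `op`; encode("bilinear", UP_OPS) -> [0, 1, 0]."""
    return [1 if o == op else 0 for o in ops]

def decode(onehot, ops):
    """Invert the encoding: the position of the 1 selects the active operation."""
    return ops[onehot.index(1)]

# An architecture is then just a list of one-hot sequences, one per edge:
arch = [encode("bilinear", UP_OPS)] + [encode("conv_3x3", NORMAL_OPS) for _ in range(5)]
```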

EAGAN comprises two stages, each consisting of two steps: *weights training* and *architecture evolution*. The *many-to-one* and *one-to-one* training strategies tailored to the two stages are detailed in Sec. [4.1](#sec:stage1){reference-type="ref" reference="sec:stage1"} and Sec. [4.2](#sec:stage2){reference-type="ref" reference="sec:stage2"}, respectively. Sec. [4.3](#sec:evolution){reference-type="ref" reference="sec:evolution"} describes the steps for evolving architectures, which are the same in both stages.

**Many-to-One GAN Training.** As shown in Fig. [2](#fig:EAGAN){reference-type="ref" reference="fig:EAGAN"} (left), in stage-1 we search generators (G) with a fixed discriminator (D) that has 0.91M parameters and the same architecture as that of [@Adversarialnas]. We adopt the *many(G)-to-one(D)* training strategy. Specifically, the fixed discriminator $\bar{D}$ is denoted by its architecture and weights variables, i.e., $\bar{D} \sim (\bar{\beta},w_{\bar{D}})$. During each round, we produce $P$ candidate generators to form the *population-G* $\mathcal{A}_G$, where all candidate generators share the weights $W_G$ of SuperNet-G, and each candidate $G_{i}$ is parameterized by its architecture and weights, i.e., $G_{i} \sim (\alpha_{i},w_{G_{i}})$, where $w_{G_{i}}=W_G(\alpha_{i})$. We then pair each candidate generator with the fixed discriminator $\bar{D}$ to form $P$ GANs, i.e., $\{(G_1,\bar{D}),\ldots,(G_P,\bar{D})\}$. Stage-1 can be formalized as follows:

$$\begin{align}
\alpha^{*} &=\arg \min_{\alpha_{i}} \{ V_{val}\left(\alpha_{i} \mid w_{G_{i}}^{*}, w_{\bar{D}}^{*}, \bar{\beta}\right), i\in\{1,\ldots,P\}\} \label{eq:stage1_eq1} \\
\text{s.t.} \quad w_{G_{i}}^{*} &=\arg \min_{w_{G_{i}}} \, E_{z \sim p(z)}\left[\log \left(1-\bar{D}\left(G_{i}(z)\right)\right)\right] \label{eq:stage1_eq2} \\
w_{\bar{D}}^{*} &=\arg \max_{w_{\bar{D}}} \sum_{i=1}^{P} E_{x \sim p_{\text{data}}(x)}[\log \bar{D}(x)]+ E_{z \sim p(z)} [\log (1-\bar{D}(G_{i}(z)))] \label{eq:stage1_eq3}
\end{align}$$

{#fig:EAGAN width="\\textwidth"}

where the inner problem (Eqs. ([\[eq:stage1_eq2\]](#eq:stage1_eq2){reference-type="ref" reference="eq:stage1_eq2"})$\sim$([\[eq:stage1_eq3\]](#eq:stage1_eq3){reference-type="ref" reference="eq:stage1_eq3"})) optimizes the weights of the $P$ GANs on the training set via the many-to-one strategy, and the outer problem (Eq. ([\[eq:stage1_eq1\]](#eq:stage1_eq1){reference-type="ref" reference="eq:stage1_eq1"})) selects the optimal architecture of G according to the value function $V_{val}$ on the validation set. The inner and outer optimizations are solved by iterative procedures, outlined in Alg. [\[alg:eagan\]](#alg:eagan){reference-type="ref" reference="alg:eagan"}. These $P$ GANs share the same discriminator and are trained for multiple epochs in each round. To compare generators fairly, for each training batch we uniformly draw a generator from the $P$ candidates and train it with the fixed discriminator (lines 4 to 10 in Alg. [\[alg:eagan\]](#alg:eagan){reference-type="ref" reference="alg:eagan"}). The many-to-one training mechanism brings two benefits. First, the fixed discriminator $\bar{D}$ is trained against various generators, which can be viewed as an ensemble method to some extent and prevents $\bar{D}$ from over-fitting and becoming much stronger than the generators. Second, different generators are trained with the same discriminator, so we can fairly compare their performance to find the optimal one. Besides, a generator that suffers mode collapse will not interfere with the other generators, because the selection step eliminates it from the population (see Sec. [4.3](#sec:evolution){reference-type="ref" reference="sec:evolution"}).
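
The batch-level logic of one many-to-one round (cf. lines 4 to 10 of Alg. [\[alg:eagan\]](#alg:eagan){reference-type="ref" reference="alg:eagan"}) can be sketched as follows. The interfaces are assumptions in PyTorch style: `generators` holds the $P$ candidates with weights inherited from SuperNet-G, `d_fixed` is the shared fixed-architecture discriminator, and `g_loss`/`d_loss` stand in for the objectives of Eqs. ([\[eq:stage1_eq2\]](#eq:stage1_eq2){reference-type="ref" reference="eq:stage1_eq2"}) and ([\[eq:stage1_eq3\]](#eq:stage1_eq3){reference-type="ref" reference="eq:stage1_eq3"}).

```python
import random

def many_to_one_round(generators, d_fixed, loader, g_opts, d_opt, g_loss, d_loss):
    """Per batch: uniformly draw one of the P candidate generators, update it
    against the shared fixed discriminator, then update the discriminator."""
    for real, z in loader:
        i = random.randrange(len(generators))  # uniform over the P candidates
        fake = generators[i](z)
        # Generator step: min over w_Gi of E[log(1 - D(G_i(z)))]
        g_opts[i].zero_grad()
        g_loss(d_fixed(fake)).backward()
        g_opts[i].step()
        # Discriminator step: max over w_D of E[log D(x)] + E[log(1 - D(G_i(z)))]
        d_opt.zero_grad()
        d_loss(d_fixed(real), d_fixed(fake.detach())).backward()
        d_opt.step()
```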

After stage-1, we obtain the optimal generator $G^*$ with architecture $\alpha^*$. In stage-2, we use it to guide the search for discriminators (D). There are two major challenges in searching D: the lack of evaluation metrics for discriminators and the instability of GAN training. Next, we describe our approaches to these two challenges.

**One-to-One GAN Training.** Unlike generators, discriminators are difficult to assess directly. For example, the accuracy of a discriminator does not reflect the overall performance of the GAN: high accuracy may indicate that the generator is too weak to fool the discriminator, while low accuracy may indicate that the generator has suffered mode collapse, with no way to tell the real cause apart. Some works [@Adversarialnas; @AlphaGAN; @costa2019coevolution] use the loss (e.g., Eq. ([\[eq:D_loss\]](#eq:D_loss){reference-type="ref" reference="eq:D_loss"})) to monitor the discriminator, but the loss is not a reliable monitoring metric because GAN training is a dynamic equilibrium process. An alternative is to *indirectly* assess a discriminator via the IS and FID metrics computed from a generator. We therefore cannot simply imitate the training strategy of stage-1 (e.g., many(D)-to-one(G)) in stage-2; otherwise, all discriminators would be paired with the same generator, whose IS and FID could not be attributed to any single discriminator, making the discriminators incomparable. To this end, we propose the *one-to-one* training strategy. Specifically, we create $P$ copies of $G^*$, each paired with a candidate discriminator from *population-D* $\mathcal{A}_D$. We thus obtain $P$ GANs, i.e., $\{(G_i,D_i), i\in\{1,\ldots,P\}\}$, where $G_i \sim (\alpha^*, w_{G_i})$ and $D_i \sim (\beta_i,w_{D_i})$. Each GAN is trained independently as a regular GAN via Eqs. ([\[eq:D_loss\]](#eq:D_loss){reference-type="ref" reference="eq:D_loss"})$\sim$([\[eq:GAN\]](#eq:GAN){reference-type="ref" reference="eq:GAN"}). Stage-2 can therefore be formalized as follows:

$$\begin{align}
\beta^{*} &=\arg \min_{\beta_{i}} \{ V_{val}\left(\beta_{i} \mid w_{G_{i}}^*, w_{D_{i}}^{*}, \alpha^*\right), i\in\{1,\ldots,P\}\} \label{eq:stage2_eq1} \\
\text{s.t.} \quad w_{G_{i}}^*, w_{D_{i}}^* &= \arg \min_{G_{i}} \max_{D_{i}} E_{x \sim p_{\text{data}}(x)}[\log D_{i}(x)] +E_{z \sim p(z)}[\log (1-D_{i}(G_{i}(z)))] \label{eq:stage2_eq2}
\end{align}$$
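
A sketch of one such round under assumed PyTorch-style interfaces: every candidate discriminator receives its own copy of $G^*$, and each $(G_i, D_i)$ pair is trained as an ordinary GAN. The helpers `make_opt` (builds an optimizer for a module) and `gan_step` (one standard GAN update per Eqs. ([\[eq:D_loss\]](#eq:D_loss){reference-type="ref" reference="eq:D_loss"})$\sim$([\[eq:GAN\]](#eq:GAN){reference-type="ref" reference="eq:GAN"})) are hypothetical, not interfaces from the paper.

```python
import copy

def one_to_one_round(g_star, discriminators, loader, make_opt, gan_step):
    """Pair each candidate discriminator with an independent copy of G*,
    then train every (G_i, D_i) as a regular GAN for one round."""
    generators = [copy.deepcopy(g_star) for _ in discriminators]  # P copies of G*
    for g, d in zip(generators, discriminators):
        g_opt, d_opt = make_opt(g), make_opt(d)
        for real, z in loader:
            gan_step(g, d, g_opt, d_opt, real, z)  # one standard GAN update
    return generators
```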

::: algorithm
$\bar{D}\sim(\bar{\beta},w_{\bar{D}}) \leftarrow$ Initialize a discriminator with weights $w_{\bar{D}}$ and fixed architecture $\bar{\beta}$;

$\mathcal{A}_G^{(0)}=\{G_1^{(0)},\ldots,G_P^{(0)}\}\leftarrow$ Warm-up($\mathcal{N}_G, \bar{D}$);

$\{(G_i^{(0)},\bar{D}), i\in\{1,\ldots,P\}\} \leftarrow$ Initialize $P$ GANs that share the same discriminator;

$G^*\sim(\alpha^*, w_{G^*})\leftarrow$ the best generator with architecture $\alpha^*$ and weights $w_{G^*}$;

$\mathcal{A}_D^{(0)}=\{{D_1^{(0)}},\ldots,{D_P^{(0)}}\}\leftarrow{}$ Warm-up($G^*, \mathcal{N}_D$);

$\{(G_i,{D_i^{(0)}}), i\in\{1,\ldots,P\}\} \leftarrow$ Initialize $P$ GANs, where $G_i$ is a copy of $G^*$;

$D^*\sim(\beta^*,w_{D^*})\leftarrow$ the best discriminator with architecture $\beta^*$ and weights $w_{D^*}$;
:::

**Weight-resetting.** The second challenge of stage-2 is that the one-to-one training strategy alone does not fully guarantee a fair comparison between different discriminators. Since the $P$ generators are trained independently, each generator has different weights after a round of one-to-one training, shown in different colors in Fig. [2](#fig:EAGAN){reference-type="ref" reference="fig:EAGAN"} (right). If some generators collapse because they are paired with unsuitable discriminators, the discriminators subsequently paired with these generators will receive unfair, biased estimates. To alleviate this problem, we propose the *weight-resetting* strategy: we first copy the weights of the best generator in the current round, and then initialize all generators in the next round with the copied weights. In the first round, all generators are initialized with the weights of the $G^*$ found in stage-1. In summary, the one-to-one training strategy pairs each discriminator with an independent generator, while the weight-resetting strategy ensures a fair comparison between different discriminators and alleviates the instability of GAN training.
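
A minimal PyTorch-style sketch of this step, assuming each generator exposes the usual `state_dict`/`load_state_dict` module interface and `scores` holds a per-generator quality measure where lower is better (e.g., FID):

```python
import copy

def weight_reset(generators, scores):
    """Copy the weights of the best generator of the current round into every
    generator for the next round, so all candidate discriminators start the
    next round from the same generator weights."""
    best = min(range(len(generators)), key=lambda i: scores[i])
    best_weights = copy.deepcopy(generators[best].state_dict())
    for g in generators:
        g.load_state_dict(best_weights)
```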

As shown in Fig. [2](#fig:EAGAN){reference-type="ref" reference="fig:EAGAN"}, after weights training, stage-1 and stage-2 perform the same steps to evolve generators and discriminators, respectively. To simplify notation, we use $\mathcal{N}$, $\mathcal{N}_i$, and $\mathcal{A}$ to denote the SuperNet, the $i$-th subnet, and the population of candidate generators (stage-1) or discriminators (stage-2), respectively.

**Selection.** This step corresponds to Eq. ([\[eq:stage1_eq1\]](#eq:stage1_eq1){reference-type="ref" reference="eq:stage1_eq1"}) of stage-1 and Eq. ([\[eq:stage2_eq1\]](#eq:stage2_eq1){reference-type="ref" reference="eq:stage2_eq1"}) of stage-2. In our work, we use the IS [@IS] and FID [@FID] metrics to evaluate the performance of each individual (i.e., subnet). Since FID is inversely correlated with IS, we adopt the *non-dominated sorting strategy* [@nsga2] as the value function to produce the Pareto-front individuals in each round. An individual $\mathcal{N}_i$ is said to be dominated by another individual $\mathcal{N}_j$ when Eq. ([\[eq:domination\]](#eq:domination){reference-type="ref" reference="eq:domination"}) is satisfied.

$$\begin{equation}
\begin{array}{ll}
\mathcal{F}_{k}\left(\mathcal{N}_i\right) \geq \mathcal{F}_{k}\left(\mathcal{N}_j\right) & \forall k \in\{1, \ldots, K\} \\
\mathcal{F}_{k}\left(\mathcal{N}_i\right)>\mathcal{F}_{k}\left(\mathcal{N}_j\right) & \exists k \in\{1, \ldots, K\}
\end{array}
\label{eq:domination}
\end{equation}$$

where $\mathcal{F}_k$ denotes the $k$-th objective (e.g., FID and $\frac{1}{IS}$[^2]). We split the population of $P$ individuals into a number of disjoint subsets (or ranks) $\Omega=\{\Omega_0, \Omega_1,\ldots\}$ by comparing how many times each individual is dominated by the other individuals; the number of subsets and their sizes may differ across search rounds. After non-dominated sorting, individuals in the same subset are regarded as equally important and better than those in any higher rank; for example, the individuals in subset $\Omega_0$ outperform those in all other subsets. Finally, we sequentially select $\frac{P}{2}$ individuals from lower to higher ranks.
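
A small, self-contained sketch of this selection step, with both objectives written for minimization (e.g., FID and $\frac{1}{IS}$); the example values at the end are made up for illustration:

```python
def dominates(fi, fj):
    """True if i dominates j under minimization (cf. Eq. (eq:domination)):
    i is no worse on every objective and strictly better on at least one."""
    return all(a <= b for a, b in zip(fi, fj)) and any(a < b for a, b in zip(fi, fj))

def non_dominated_ranks(objs):
    """Split the population into ranks Omega_0, Omega_1, ...: rank 0 is the
    Pareto front of the remaining individuals, then repeat on what is left."""
    remaining = list(range(len(objs)))
    ranks = []
    while remaining:
        front = [i for i in remaining
                 if not any(dominates(objs[j], objs[i]) for j in remaining if j != i)]
        ranks.append(front)
        remaining = [i for i in remaining if i not in front]
    return ranks

# Example with objectives (FID, 1/IS): keep P/2 = 2 survivors, low ranks first.
objs = [(30.0, 0.12), (25.0, 0.15), (40.0, 0.20), (25.0, 0.11)]
survivors = [i for front in non_dominated_ranks(objs) for i in front][:2]  # [3, 0]
```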

**Crossover&Mutation.** As detailed in Sec. [3.2](#sec:searchspace){reference-type="ref" reference="sec:searchspace"}, the architecture of each subnet is encoded by a set of one-hot sequences, where each one-hot sequence represents an edge and the position of the 1 indicates the candidate operation activated on that edge. Thus, the basic unit of crossover and mutation is the one-hot sequence. We take the $\frac{P}{2}$ Pareto-front individuals obtained from the selection step as parents. Then we repeatedly perform crossover and mutation on these parents with probabilities of 0.3 and 0.5, respectively, until we have generated $\frac{P}{2}$ new individuals. For crossover, we randomly choose two parents and exchange a single one-hot sequence (i.e., an edge). For mutation, we also randomly choose the one-hot sequence of an edge and change the position of the 1 in it.
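
Both operators are sketched below on the one-hot encoding from Sec. [3.2](#sec:searchspace){reference-type="ref" reference="sec:searchspace"}; the per-call probabilities mirror the 0.3 (crossover) and 0.5 (mutation) used above, and everything else is illustrative.

```python
import random

def crossover(parent_a, parent_b, p=0.3):
    """With probability p, exchange one randomly chosen edge (a one-hot
    sequence) between two parents; children are returned as copies."""
    a, b = [e[:] for e in parent_a], [e[:] for e in parent_b]
    if random.random() < p:
        k = random.randrange(len(a))
        a[k], b[k] = b[k], a[k]
    return a, b

def mutate(arch, p=0.5):
    """With probability p, pick one edge and move its 1 to another position,
    i.e., activate a different candidate operation on that edge."""
    a = [e[:] for e in arch]
    if random.random() < p:
        k = random.randrange(len(a))
        old = a[k].index(1)
        new = random.choice([j for j in range(len(a[k])) if j != old])
        a[k][old], a[k][new] = 0, 1
    return a
```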

2112.01036/record.json
ADDED
@@ -0,0 +1,32 @@
{
  "arxiv_id": "2112.01036",
  "month": "2021_12",
  "year": 2022,
  "conference": "CVPR",
  "title": "GANSeg: Learning To Segment by Unsupervised Hierarchical Image Generation",
  "arxiv_url": "https://arxiv.org/abs/2112.01036",
  "source": {
    "paper_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_12/main_diagram_database/2112.01036",
    "tex_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_12/tex_files_extracted/2112.01036",
    "paper_md": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_12/main_diagram_database/2112.01036/paper_text/paper.md",
    "metadata_json": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_12/main_diagram_database/2112.01036/metadata.json",
    "intro_method_from": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_12/main_diagram_database/2112.01036/paper_text/paper.md",
    "intro_method_from_kind": "markdown"
  },
  "files": {
    "main_drawio": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2112.01036/main_diagram/main_diagram.drawio",
    "main_png": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2112.01036/main_diagram/main_diagram.png",
    "main_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2112.01036/main_diagram/main_diagram.pdf",
    "intro_method_md": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2112.01036/paper_text/intro_method.md",
    "paper_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2112.01036/paper.pdf",
    "latex": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2112.01036/latex_source"
  },
  "status": {
    "copy_drawio": "exists",
    "copy_png": "exists",
    "diagram_pdf": "pdf_exists",
    "intro_method": "exists",
    "paper_pdf": "exists",
    "latex": "exists"
  }
}
2112.03258/main_diagram/main_diagram.drawio
ADDED
@@ -0,0 +1 @@
<mxfile host="app.diagrams.net" modified="2021-11-23T20:51:41.908Z" agent="5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.45 Safari/537.36" etag="Ct5C2ftQ2zPw9x2ZXM7l" version="15.8.4"><diagram id="qgh3YcXz90gdRl4K-Eil" name="Page-2">7L3bkttWsjT8NHM5Chx1uKRIWoanAZrT5NjUzR9qShtuUgf/VssE8PTfqsystdA6zHjvGHsmHHKEo9XdbBJYWIeqrMysv5TLN8OzX178/FP77uWr138pspfDX8rVX4oif/TkYfhiPxn5kyKrHvMn/S+3L/Wq9IPr2+mVfpjppx9uX756f++Fd+/evb67/fn+D4/v3r59dby797MXv/zy7nL/Zf/z7vX9T/35Rf/qkx9cH1+8/vSnP9y+vPuJP31cZ+nn37667X/yT84z/ebNC3+xfvD+pxcv311mPyrXfymXv7x7d8d/vRmWr17b6Pm41LfZtGn+/reX6yd/3RbLzdDeFn/lm33zv/mTeAu/vHp7939+67u/nZc/N6fvHv9891PeVNPu4dX5r/kTvvevL15/0IDpZu9GH8Fw3z/bP2/fYKif/vrql7vbMMBXL25evf7+3fvbu9t3b8Pvb97d3b17E17w2n7x9MXx3P/y7sPbl8t3r9/9grcq/wf/zd5j8fq2t7+9e/dz+OmL9z9zCvzP7fAqXPZTfOTCf5r5T8K/X764e/GXcsFvi2/e/9r/pXg6vAk3vPz+2654Pj6tbn4YPhyn7PbFt3/Pjqt3v16VL8uXY122Y/3r8c3x1/a0uLTLJ9PLN8fb5tvnr49vu59viupJc1p/aJdN/7J4fX75rA/fH8urU39pp+3Y7hZju9pfuqkfmtun757/8Prti2+34TXn+uq0L7uptf+Hza6dut05C68pXvzwj3L75kn1/XUzdstq2uzOdfi/aHd93k6LMby+b+0zV+F6Vovwt4syfFbVndqpWX3hs1df/LzL8c0/puc/flc8338zHZ/94/Tym+9+evns9a/fX39X7X54ff5hn//8fPn07vkP/8iaU/WmKX/6aTMeTq+erR99/+zYv/z2u59u3nZvbsrv7sLfZMc333w4FkN4j/27dllddJ3v211TXZ0W/H557o9F99MNXzP4a652iw/huvGav103/fM3r9/frLLbZvK/a/pj+ffxprh7HT7rw01Rv47XNA2Xw49/f9c8C+N7zvJ2dezDexXt+PR0tTv3V7u+b28Xd920DT/f2vd1GC/8/GrXhv/XQ7PMJntNu1vbz8ZmtQhft2X4Of7vbg+n8J6Pwu+y8H3WXYf3W+G97G9r+z7cZx3+PWxWeN8af7cMP5/sGo65fcYGrzuPzbPLXYt/r8tmtQ3Pr+kPUxO+rt800/qymbZZs2r7bnXou9vFZP8Ofxe+X/TtdOg3+P4Yvm/C/+H+dutwzcdL+Ps8XEvZLhdDO+HnVfiZXUfejYsMnznZdR+z727bMP7vwzUs7B7GDcZkPWAcpsMl/Kywaw5/d4f7C/fR7nRvK5tzi97GwMa0s2ub9n147bhZhrkariHca4br2O11T+s8/F0R/g/3dr7wvtdjuCZ7vzBXm3APh/C7xp6P/W6w+wn/F/a68Poc13m9GOya2tM+rAe75vBz+1s8F39O4XdLPJOez2Zvc4LPehXGJLxPO4Zrw3M8hJ839pzHzubJyuZAY/deYrxG/P0F1xA+A383+ufY9e7tfS/2rO3veU2YUwX/1t7vWHNOre199bn2d43NpwLz5GTzZIv5Zb/fYO42RXpO4RpX6wvmmI0r3js8k5M9kzAnrhfhWbcV7o/jOPBr+JvVceT47sN1nEc821U76lpKjpHPvXMZ5qc9nwuehc2B26cnG8NwPRc8v2VYZ3y+d124bjwPW2cnrJtKf1eE8Rg4dxqfO+G57tM9nQ7hb8I8D/PTriNcr82B3p5bGCs83/C7CWN+y//tteHaOI7LRYZndr2wcak1PjXvfWHvw3sa8dqJ82Yd5uvB72XEXAi/C2N6t7GxWNo42litw9gdJn61+Rr+bnXGdXW4vr39vLa13O22vm5Lzu8+rOd1uA5bG3YdYR2He7H7ucHasd81mM/dtY2trUn7nLi/ZJhPk633lvvHap9hT9Ec78Lehflp44K5jed14dyxtdH6XKnxnrdhXdpaPoVrwhzHmsMaD38zcu6GeWRr+XZRpTlu+0i4Ztu/sK9gz7xwPjc2z3kfS5/re8zH8Br8DcbXrhG/5z6Geb/C3w5hTDR2DdfMLZ7l0OI9uD/bvtmV7/ob+zf21LY/jNibtV818fM2HBP7P7N9Pnw/Yh9a2lzkPdn96jV478239t4H7OXNsxZr3veNMPa+H2TYf5YL7hXXWIO2N2mOLDCXw3OwdeDzrn51+zQ8R+wpk7/3RnsFzpHbBeYy97uG+zrfm+OHc8b2P3uPNd7Pvu/wPryfMNdsHld4D66LO67Jhf4WzyrnXtVUWieVrj3n1zZ8ffqweZblvPeFjbFdQ572oDWfffgMzBe8/6HX5+UYI+w3rc0h21s47nz+Bc4MG9uwbnGdGPcwTtojDtPZxy3sORhnnZlbfwbYt3F9tjesuPfZ1/C+F5wVvP8R8+30nd1P+GpnzLrXOHE+xbOi8TnK/dvuh/PXnm2pZxPW9bn39+1s/mB8v7H3H/jcw/wNz5fv0XOc7X3T51Qcn4XNyVLnmv7Hew8d35tzehevXfdk8cxR8xH3MW50xoU5Z+eLjx3uBXMMZ3pcV1jfimHC88SaqXi92CvG7tT4e0+tzliccfa99nrbp9J8wHyyfW7ws7llHBH2va3NY9tPs86emc1rWxs7rBnbbyyGtbHGfmj3v7F9nPsCYoP0ORiLEKNg38rDOA/cb45+VlQeq3B/OUx4bthH1pz7Nmexlo4eR4TxwXqN51GH+cxzsVsizrC9uOZ92bXb+bm2+WzzIGM8E65hCvvMcoE9ccO1wBhl95zriWtx4OfcnTo8w32O8y6sQ+y/S8WDYb1hHnDtZnyOPHtbnxvTWXv4frDYX/GrxdM9ztTJz/KjxSeIpWx92p7CdbKtGAMgFsi4/60nxbEYX3zuaa0zPbwOY3VQ7IB9xmK38Lmz2Fjn8obPMsyfMGYTrt3O9QFxbIg/7UzHGuL/tc7sTOsdsXh3wtjZHOaZ6HvdqD3/xPHRs/J1Y2dyjxhxqf3nGvuo9kCcEdxHeJ4w9rHzZMIZN/h63WDMt4PGLlzPAfM3/MxiGKzl8Dk5c4ve98WMMb991nFCDMlnf0HMPx177Te5f47F9OGe7L0Qg9i5zud99PgYc7rDudtctHZ47ts1Yd5hTYcxPOBswr56bbkI1uvo847naa/zmec03ovjNHKfs1g5PKcRsW+mMxZrafMjxizjHNzaPKkUC2Leprmw5jMbdeaPHt+uPTdCDMz9iL
E3z4cz7p2xpJ2je+Y3Ya/f2PpjzGm5wKQ1q/wAe0Dd4DzAGsW6YszQjNrj/DzLNHfvYtwU4wg8X7tGzBHeg2KGaa+4R/vnEtdisSjjzhVirh65xrTQv21/WzP+szmPuXHE+PIcZvxkc9XP8nD91avb5tfvb5Vrf/s05NR9/zzkw7vd0fKM0uai7oX74Mhr93ym0/rg3mhrD+vd5vGEc9rmycryCfu6rTSWiB+xj9pctPcKcQvzV+Z7GquYA7envz9sbh//pXj63W1XHXbvH18Vx/l3ZVc8nypDdcqngpBe/XL3avgIdPsXiFUeYbQ3w7NX7968uvtlDH+ndynz/EHNdxL6+Nfi0aMHVZWl//T7S4L2Hgmf/GmO6jmo90JoYh8/LCFm4R8Czf4XAJqDg18BtH8GoJ2zq9OhaKdDHQKRamMgzml/2zz76fWLH16+exkWwGZ1+BAWWAhUQuCyCknKKRwWu8be9+fnP75c3pT2PpawbL8AlH32M/ovv+/T8fmPXbjm56+3P3aX5z902f7NN9mLHx4/ad7+1L344e/PDSxrnuWvn/+4vfVF6wsgjI+AsedhLLLwN38fDz/U082zb7Ln1/0lAWOXsJn3fs2Xv10/nQ5huPmatb/mbgaa/dzcPn3z4ofh/ffXzcn/Lvxsevntd7++KPZPmjf1rzdv9ref30gOSM5bC3buJTy2QW9HbRZ3m1XrB1vOQNw22B6bABO6RQyeQ2IdA1AG9D2DQwIBFyVNTHoSGMID87T3z+HhPzrgxA1ZiUelzTcGHQySLJls/f0ZqCLRbP2wJVjCJDETiHPXKXhWkiRQEBs/kxsEw02ZwIomHlY4qBnoMQjHhm/Xui/S4Wavj9cVkvKtJ8YWEPI100Gfv50Fu/Z3x4sH4RwrbOAEHnFIMUi2w5BBQ99zDPvJE0AFxkV8PhMPg/g5dmDodRsExlsATBh/AIxrO7j9XnnoAwxwsGLrY1EJNOz5GSHQTwkwkmT+LRICjjtAq6ODCuGejgpI1p6AMNEnCJBHkOHWD7u23yhw84NTyZkONiS1JYGgVgn61hPXWp+rJKe362XgjOS9B+hiwa+NgwGbN3wvT3xGHtpPT3yGCwLMIcBB4qUESAc8EwqCPQa0IDBNACoCIgEHRwC8lhyEZ+OgKRLNMF8MJAzXYKC+BRwHByQLgH52X9eL3AEuJkEheJoIIlqCz7m0ZwLGQDskRL0/pwwJz0kJI8ZLwSADnxoAAfcFjqMlO7zXGgAdg+ACf8t5kcbXgA09JybfCwVga4G4R4KIIxIHuxfbX6aWiYKBV0wkkeQwWd4wwLcxzQQg2hwdHaxnwG5z0QDzvufcWF84Nxol9S0AXHv+Bu5ZAhz+1sApXDMSWQbR4VnuR+0lSr49YIvAre5X4JrPVwaVF1+HAGq5zxFMwBg4IHlQgoFADAlbAhAAGiooZYLJBN/BOOyfpUASJedtCm65z+VIFDhfCoJWABAs+I6JvY0H9ysH0tcKjPFsK4BaBKYGK7S0BE/vUgKOz9F1HLTvIXmxe6/1OUigCD5xX0HwTCAJgAv/Bl8BfIWEn2eTgaacbwl8WArgnA5+nfkMfMkcBOmQnPWlA0Ecl5b7y7hQktZ68YdJzIkJHv6ee9TEPR5nTJ3OiKM/exZbCASNep6ZQA7t6S3HFoG6QAecR+caY4dns9fc217m5x7XZOv7JOeWgRAE+iokvzi7rVjDz7bxQYJg+9Uc6ODZ7QmC3QPeL50Te52/bYlnRfDE9s9c+4TtDSXO4NV+CImuzWsApHxvS6577uucv7YOGCNgvR201y16FjQOdk4QjEAShXNzEPBCUMKACNsr8e8917cVEsKcJGBia+qs+98DRE0FDjyPgddroKgV3mzOnHuumR5FtpbFi4rFKdvXDCDFs+DvdKbpmReKQUKcpGe02xKAwhl/YNJoyf3IZFngIgEPFYRUQCl9nPA7xgr++RfsQ0iU4xk7oTi2YgLccv5XTIYb/lyxEQE2u28Dm21szjnPFIDCGYANnE2HSgU6FFPieuW+WydA/qxz+em/A1zg/RRZOEM5Nh1AYczpnsDnwfcWxDgbT35T7JDj+QBMOCoBZ/GPAB33Jd0PgOQO4FdTM94guNhyL0trAve2v2jf9NgurXPu8YPOh4ue3ZgAkUaFT9vPDPhD8h1+th35lXs4Y0gDzexeXgtcxv5pxYNJ693PgDvGHbj3Ie7d2C9Z2PI4mvtNOOd2LFAgxohfDwS2CRrrb777KLE/hozlxfLp+fvr77oQVYQ7OBfttUPSKE/wE7nyBaOvbbZnin4G7QperuMoL/3pvf6nnwhwaOQntjyXVBjYxD1URacV4q0+xUH9DAiJ+9fwZZClt7uo21UEWarGC1x+VjKWc1Br8DiWn6eYf1SRCXtcBLMJklth+rSNoL/FkBvlA1cGGv+nQZXc8ZEIqTx8+OBR+QmM8qT+FEZ5/PD3QlHyryjKv0ZRQpRiKEF4vSEZm7CjdWHGbq6b/sWzf/z8vPgp+/66qTbLKguRUhn+t/JVeO0h/Cy8z5vXr19m3/36KqyEsKI+hPfIN2Hns/cIO1Fmu03Yhz6E1VGFWTxuVoewSx9qZtwLQysscsnaVV+HyCEP72unUnV12pYhuwj/h99PIWI/9dUfTEMq/0AaUmbIxMZ2BkNXsKtjD8RpQfrBtgZE7WgHMwCU+NuInpzreAJ5RjgKfUA5oFWJ66DofI/X4z1uVdJn9sfTNGRQ/FvLePD+Fq2xbMZTVifXR6WWlIUXuF6V5fk7UXAIG4eTGyej7ZhABbplRH48mo4naMqgDoOyxsopI8wYYwnT7m3yE01RKtEElJT2ypr6PkbYOmuw26ayBGgdXm4O91Xp2ipSrJSFoOy17bWDh7WzKLBTh/e3kgoyNivJnEh5aUGlAEXpEk/ha6EidkKxfB1LB5ulMvUwjpvVuneoHmcmkKQzYHeUPa4RQaHszUj04BGNU048KqhBLQCa4PQOoFmWFYNi1KFU2TAyUwbSqdzNkksbv9qYswwVIwuVSNaIupHxrlS6R8mq58kXssjGqSKrxhGBApGxl+WNchbG8UZI2oal1inOu8lOcFFeVkDLckRTJ9C5Cpb5DsySwn13OjlVEq1F6yk6lSAZNW5VjkeG6GuJa40lcZY8Vso2FGExmjr0vMeD33euuMXmDSk3eC7riIy0c+rWaYHyJzNYp088PXn2uXGKWqScrAtl9Jnm/CR0AetYWfSAkqmhF454xDVvz9VLYQ3X/giEBNlft4vlRUcLSz3HEJUe65iJeoTNktiUMkKMc5YyxrVHReU8Im3Hp+9FNyg5/8JnI1PYC+lY2+tFQTiDjiL0VnO7x7zBnjGdPYPMlRHZZ9v95sgWl+F9uI6F8BiiAcphSeqMZXltqUiO5SvQUpStnUBfRIluc61S3a4X+nFWBNkSHUTUbMgSy30s/3YWrWcto++w5jw25mduEopz4RmAbCVL2QQRpO7WUXfM4QpZNmgjRC/D86lAa1qJVoZyol0Tsqgi3ENCIyZHd46ZKHl1N2p/5Gfac
4z7Mvdbe0+7NssKiXgQETuPnnUxEgZqFzPQzbXTZ5o+ZUSIjHPuG0QmFH3r9zi/WFk4efYGJLjSnlGDahH3b8v6iXK1np2xQuH7nNYPrlURua7heuGUgalxqgizuDruC2Fv0dgVQENtrYBG22eiow5CDi+ibtg8zDeib4ZrTFQMZAbYv2uhjPY/EXagmlqzUwOU2c5NIM54nutZFrjthbSxHH7L7IHU4TWodbaeb2z/sUrWtSMu+14ovT+jvFW2vFn9RGrF6UxUKlzrYRJ1Uud4q72vEy1atJZaa6vXua/zpO9n8YnRum6vpi/Qs7FewlmIuGjhsQLOBaMe6nwohIBY7DgR8bBSNc7rmoig3f/x3r60iRkaEQ/Oh7P200MP6thyUQtJYVmcSMgkJCJWc652P/2zfPSEnZLYJ6Ihw+ORCec+YsDF40w/FIhuliRg2utaEv9A9mRU5if6/qLZ7JmzRY0VRggEQdR+KmFDd8DKuZuVLetfwNcMO0akQdy8wspfiuwbMarzRbksyFgzMp+T7rCD3cuVl0/ezv5tGUkYpd8xF32YfVzdz/9VdT9W8u/lpY9/r7y0+JqX/uu8tC+uLM80ytXuWIFmdk8a05tU5aNKfhvyxgOq1GGPNET7Es6diZRwz+3a8F4hv90dR1Z6Dh/C+g37zDE3CQyr++cPXcjnjt9+9/pY/GN8+eYfT49hhA/l36/D3vTzTbi/V+uQZ37zunvxw7a/+eEfr18um7T2fSd723zL+v3zN0/Gm12YFcw+34d3ef231SLW76/C7rJJ2efPzbfh2n4Y+BplmFfLbFbbP982z57/fPPs8qS5bf3vwmh0IaN98svz8Fk3b558eH7dfGlHAkX9PJImqip2rBYjdqjw/U50PD+HE6IlJGpdaSerY2WVVfZL2FFqUWRHUvWBoE+sNhh6bRT09aUD2sv8wunCqSLeDOmzbZfEnp7QyxBfiQaKinwX8x1IHPhZI6qDtdNzgfYiZu09Fhjtud+oiky6JfLtIlHh9l4tA7VT8VfNCsOxFwuh9N08XKvOiKfvQScGpbBFvnaYGsWEW6J6J69o2FjY67eoIiuGLJ2+iupypHHO0OOVcnbkexbvAT0WLdbGxVkN/nwOzg4YiC0gbhs7Mh7spJla5i4X5QMlfmaxJhByu/+z/icdGnH3rbDck1ejVUmy53DLuaRqg/IVy9ucoRHyZiDue0qhwtwRS8RjyUw4cLmJecYsZ5pARR82OzFDnrXKpxH7xGo6471DHj9fsiXKIFiB6JRTS6Yz8ST1Z9T2TpUXpmLzqmIl0eU1qLo4/bzwqjbjM9JWicNsGQnYXGRsSbQ54hzrAvfKNZixcmG08YszYnjiW9XlWTt6jtMShwC91Vk6xIvOJjUh42Rn2MQhUcA5tqgebVYHf++JOZJX+MGcUeVoLXo6ZSm8DuIpJkFT9ahEtIJKe9obgNVgHmF+MM+lhGy0GC/k87XyLscEcsStqVo5KheeVEUbiEnMWCSY156P7ZW7YB5Ujaobiu0RwyqOnM/NC6UUwCj6djmvMq9d4lArDvRKz+DVpY6fPUQWUBhL5tOgezr1t3A6MXAdyx9WpPFz3z1Uvv/wfppMGA3ZDcxXRlXsQAtnhdgwVOaEqRo/q94i1t877uJ4Y4z77ZlpzQvDOarSaLgbnlnRxj3b5sJh0PqvSUU3XAr3PyXq9cHPkUjNZrQYPnO5iJXPjfJo5oxrfe84k3KaFfLrDGss3PvmWtTtnVf41yXXY5QSilq87iOzwJkPzHGEAawHPjP7PKwrriWcL7Zn2F5qX7dk43DPH+brFZXRJC0bgCuKTWHj3sUqMVglScqzlORsSSyXTLXDRPnOFhXUEJeo4nnJ9Jzj2dzhHJc0ZXTMteWzSuwOYKAbscBQVcLn8PNsjhubAvjJrLpKhsNan9EnnCmuGXsWyARql+Ohyk7mSqZa39BSWmbv/aV8Dzg4pMiW77lcJDLyvOJuFVCcEwWlq2fDEAwXKoQNOSaJeGPOYJNkgfmgxzCUbF10jjlrxp7FmDBl4DUFxj08R1aZW+0r3PPvy5729+IZ3X+meZMLH1Bl/dBLcnbn0rfuXk58+fX57N/H8vnb73sU9H5fonR5P5Eqn1QP3Kdhljx5/W6eOz353ZjR5dfc6bfU9EI2MYX4It+E/Sjs72W7OoZ4/F5NL2T6lTGY6w450dEYeUU3flzTaz5Y1h/23amFNcA+nMMHYO1Xp+ayWS3K1hjSpzZv7Xdg3PUhpwp77q4fw+dnIZ4qjV3UGcs65Geb3XYyxkbY16sQO/+JWdNnMEwR04zzGOM4Y90toryMGCniNMfvDVXJHfOkjDjFvBvsO5RqtMwLLh5bat8deL4YW2ceeyO/KyR50v5H1rbLeo3xNmd0CXut03kCvHJIuFlk/OVi4A1griDeOpKxYzJ/xiMXsFgkd0pY95lMW8TiMTbw+1ccfhT7+FxYLVLSWMi0OkPXcPadP4rje7CV5vuzYsmLpFlkoN2C7aSxa9O9XTujZ+1MjTlrk5LgyOY4z+qUxozymF6x3mkvxijitkpypEyWBoNY0hkl5WT0oX6IuBH8mBy1snDWJVSu6Vnv26JOtnFJ1G5BmdxqPQnNG8T2GfiM2kJnM9jNzpBGbYyfMyEeWDpDVkwaw66NpYIcdX+RRcOoWMby+UGx0yD7hUF1hFF4quMHYqWuJfePUv/Ewid7R9itxSBWv7E6EaTzxmSznMZyZKKeluuPC0M7R4zTzqVelNMyd0j1XYujLcYiEzLExM/auIY6SSdVy64SPrxlfJDq12DJtqxdz9b2McXUo8fZMeesk43DWlI+XAPjuVvVPWOdoqUsHzUy2W3YujKc2+7dsRdjdRr+vlLey/pBhpxMNh8bYA9m/XCWTOteXCysYe35Uuby7rDeq7nM1rCiDZl6Lk0umR+2zPGMyT2p7u25GnJHe56GUaC+UUiiXjm3y+SNzVxGPkWWLOuCszqZxoncCPLImA8mSWWh6yeizlhYkmYwEIf7bNz1pR19L95TYs98ADYqZHy3mXIFw3vKVgwtjus5rjfbk2nngL2wjHVx7D/fKM+TpcItWdGoEy491l0QhxKLmxgF7EF4/ac4Djh3kxz9GPEkqRrEFgQHgkwzshsHzqP43JUvKm82q46VrBJQU4n2GhUVBb1zCijVdyk36yyqX5/FUbHcwWvrzFOct+EKok08i7asp9/Gevps7448loISc/t7sEoL557Ee9gtiLWs1tz7pKZAzRJsP6tpH9w2xnkQPnbESSBhbTU3wfb0nKf3+mRDpvalSWz/WPPqWF0Z3D5FWFzYc7hezc4AuFZ4X2doyvKiJxsX85gMWOYNI1mx28hdQc5g++iPlPG2UmFBcZI4Mzqf41mf8TrdniPmoAPPQbCqpYDY9lLSDI4nOFai/HSmOtjStuLW7yVaYkjGa/tXo7igobKC+CZYvzgryPZGTs/5KZWPJNn/0epS/rj6KCt69PBzWVGefUYxWle/V1pUfU2L/nVadKhDKlCH8DHbmDJ6gmjzo7KSCT4t
FdiHtOVYh6/mpPax41o4SitbOtPVqa9DOpO3lgqFkCSEzhZCF5vrKnzfZ2H7Yuo1rW27C0c+Uos8bJHlJoT+LE/9yUtNCEPPA6k8vUMvIjvvecxGhw0/Io6RkL5J3w8Sf0aYl04L+/ieoFKR4F0IghtRspAqXdsLUhzR0khbUeEdQj7C1/Z8Z042ohpMSm0EoXvpTKlVpIbwd1D66+u5l0CtEI1A5P5eLifRTQs/uyG8JSpPU2tLlchxLZcn2z7pjiCojSG3BLJyZ1EK1Oj7Jjl9MFzXkQK3n5rhtIWVKKcNLv5gabBXmG7wZ8uQi24tSNcI8zYXF74lYQscxno6Eh4ZRq2OFFHuItVf6e6xVxmqMBowwq/JYfm9Q/W5XHUk5rhXxlCYvU3iixQ+ZslhD8f7lERjrdN8cp+XDK+amK5slqmMQTFTW0sMRFGZOauQRmYhpIWaNZ1seASSyGFp3bFS+F8x7G96p8l2gm5bdzaaFgrBLIU/yl1QMDuukeURo5wq/M1BhcS8cUqPQq5VdC/MlfJKCOfPqVf63rrL3MTnHuYuwqCt3B5sHa+jyFbEFBzhm2u6coimOEjQo+M+UX5EpwvX+5NciySIecaShVM/RWflV5TP1mnNxVCprVyoQxHWPaemwUUVFC2qhEz6cLGhK8eFokI+c0L1awqPDI4mTarG+n9r40DRdJfop4RGOO/tGiKFzQV7ETKPdMcmUafkIthK6Dx3o0G4OiuXSCSfnOhGCloRmr2VQ42g8sO0d4lNHh2wCJ+Ps/Knp8XJlW4EjVAC9F4ulVyDrVMvUbI89yo/+N4jQYroiSuFmSwZTnQV3MK90dN8UUmZQoXxJx352NN176h94ijhkdxRrGS+6/WMjiwRXc/C7JXot9dJDGtzsBOMo/2klBB61By4gwNVck+CG08ah7XoDlEkXzB9YTqWHKGw17iA0lPTWrRNzkdSqi8p5WmmOBeY+ky+b3n5wcseMYXE83BXJ1DsWKq4DfdygsD8olSjdJel7noRKY8tU0jAXiwtkrKK+QThZ3IbIj3ZxZbbnufkms5MKJ0xBTXRIc+ev9uaJrXX4DUJ5NLYG60AKV+R6LRb/3muscuToHOhOdGK8n4QhEWauKQCBWOBhnBfHE9AKh+XZkqlbXnjbp4ryAUyOne1RgUBZUTlPwqIAWOcSbm3sce8eKn9iw5iYU2obBYp4iznRBefXilXr72MjlQuTruKlEa5ZJ6a3svXXrIT5buaixI3ydEyp9CV0gTs3Xa+TIQxQB/lfkfXNpdPTN/850Vkj+qPjXmqLP8ksXr45DPlpt8tr6q/5lW/wYinuDrts3baWympCOvHaLvF/bzKYI4qxOIhdzodjJZXhX0QJan75abFh7BnFCZFC3nDJez9BlmCumxSMfuZlaMsJzMpGSlmVcin2tzKSiGmnDqSau08+RBi3zHsF+GamjLEnWO4tj9xuemYg4Ji+6jlC1OUiBXJnVFygmudKyQX11FmRVr36O6rVykmILXL6SpRsovyyiCI0igoGWOLw0xoqzORdAeI1B3OZKyPXCIJ4RUjtcuY24mys3aqhD13wYdnmR7g/TOHywkdNorfk3mBzuQEyy/dPbMX/AczjUHnip/RtdzfRpectNE9rnXKV6Wza4r0pORUWyvOILROWgNlFNHcwWVmyKEmh9o28f5Y+hAdxmU6dDheNfO4eugYE6sE1PCctvcxuJ5x5kAKAoxuLnSlJN1dsTDLGCtSQ/j8UOapU+lxq/wZzrIDYemG54lMeVq5W5NSChrVoDjE4io5SjaVxoxjAVi2T3NulMshDAwOnlPndGuemT8sfT4uiCesAAnniCtPMM5AqSca/EgmlVyL94WkaIMc4RG3EXIFRuE59wiXZ0LunAuQSLmc7UBqUHToPLiIvpwJuiWLW/QqbbH0OkZHU7nsHkTb85IqnArpHM54reY4RPOliSUrh7BxnxnMgEauNZW5QA9jybBxuDrT8xhiLOfUwZ2ceHktfefO48AFBGMnYQINeJZPTzQnWvd00odhViY55UX0VZsvojVa3I61XXseZGNNCnIyrpBR0UWxldNLPV8dYwmE+5rPr0HmFBnzEFA23eQIBk+igA3RvXF0Gti+p8O3YyQwFSv5jPCehkt46TynCVqv/TGaaaAcxf14q1g6lhnC+/Y5cSBSijaKDSEPWm29lMuciHOrUMl5EAVtFF5Vee7NMYBZg88FzRVSDERLdCf8aArTMVcfu2svETVu4jQ6HdzdQlWqsbNmSh0PgG9MynVjKYJjz/jZJWUyvhi87KW8Knfqn6je+N/NU8J+NKP8NtFsSH8/ATMC1tBj/RhdAI68KgndCBOEbG/lMTjNfXxPcHzD4+uNzkpKVlFynUjBVSkcDrVhvhtOYHSPkWUTXnPH/GAH2q3lQlpvjWMwMxdU249xLcznlXcmuaX2z5Rn8wy59bxHVAZJxN1wSXvA3YyW0Lej74Vrx7G8s0RBKe2Ca1b5MWODvdMFfc5PKnvlLgLieT6nZbd+naCWkNYhN37k8ZJ/r45OBxlYUkfZelSXhMzllZJ3V8yhzok6PyKe4HVM61lMsCc11qj5pIBWcjetk+nReqKZGWWowl8y3CNx0DqVb+/eA69beWkUrrY0CMQckZs05ZOVu7cnyrobzzTC5eburmtJKyPGqG4Tax9rnYOSMIri31G0RbmszE6cWqvfe/l+QKcG5copPmknp1tGJ/JkkpWT/to4Hj+6823EYFWe1Nndi7YT81ob25R/98JVgX8Xuq/MzUxkFaDzbhtj0c47rHgnFFBR7FnKJd/mZjI4ymIMBrxsa1ShIt0358eBMtSsi7IG0pRopLVVeVQ48qg1iI4Hya1fsVIR7wH3hTUMbFE09cFl0pLiOx4jenTbE6PeZjPTvLvZvlByruxznc8jZa4LyQoWPn8uLbGwkufZUYZ1sF7AvbaMZYG3daJSdbH2sJbrNr6Gz+rm2AZK+Oru4o7MyU371q15zpE64PNd4sZiTsHqlk6DXnjcrToFTIxGxkagtshgyI2zDp4LXJJcP3YYEdZ/ThQ51ZQYR5npFWwAOCbXcDe2/0uJH/nsYVQlmb51NZgOQ+OGWJEasyV+G/fUYy1TMJcZGV0o27jsiPFGDcMzyoJ1DuwrxZIVx+0fun/GL6yZJRNOrX3masSxtG81lySl3yvmWPexDgIqx1lnceMO/UWr8SGlLp550YhI+GLJ+sZBlDQYJllsVcNM8VrO4oxFPb66cD68lCs7z1jbmw/TOlGyIy2hGbqI5zeTm61euXkX6QakOk7eickoKYjvIEfoUleUsG89fY+5d4r1LrqoIy51s03UYXJ2L7Gx6/7zGFiZFfchsLp48qD4FAV78uhzftT17wWDPfwKg/0GGGw0xaop7UNYkW0sBd6t7/lRGyPp6nSvSVsWtqz8I3pBeA3Ufx9CKlB3VjoPS89Y1B1YZo0pVi38tgZutbHLWD7ZhtcfQvi0nwzuCr+bumn/J4a7Dhd5UlfRk9rdfpKbROGuJIIr6hgG8DWVYBRq4p1
aMO2po/eGUTreOrlCtGKu+fd8v75PJaFWIVjj6hcwMAnDtfOGP5kb/aNxgZe6pSZjaN8KajlPsex3q1LO5CUVpvqJrbxQ+SExXduo7lXJ22E+phKErpID09BFKkEvtnQ8tgCltPd+t43lao3xrEECU52kNEO5KPeUDN7Vq4MgK3nOeprmyrroMRAb/kgBSK9XNRtw1aBKxSg70r82sZxVAjpHP1d5gk+eejLNm5WMTvvePaTFoBt9XnWpyUnmJXkpg+E0Is/iOG5swsfjTq4006yMw5+NmidkTouheAYscEO1UKWSXmZ+EnLuyQRnyA1jrUYizUXhiIdEY2yycYoqKDz7+Mx2W5+DY/I9X/ucGRyO0/ccs8jKnakTlnStYSgnGglZjFBFgv5BBx01k9rLfWdLBjG9hEufN/JrLFXWnJKaGanCFO9hqTVwEmsRauut/vZYaN5XmuvycgWsk0e4zNc3UzyHPQlNWLovj3mF/QhlO/prTB2bhMFvg96NUqWz8V0hRmgpdj0pC4RXxgg/negSdIOU7uzKjRzhsGA5rclRigFvfFgI/kDaj5D25KpINZJBKhPDVuxPiR4g71Cym+1zAHFJ8Tz6M9wQkvLQiwq8iSrjDv66VAdsEjSdzxrAJepDmAM3zuIm+xZjfqBH7aD1LJWsvefLpMKzFI2s/7sIgU8oKxv9om68UaJ8qWNTp2UshXiTxUrl6CnCBEqrOirXR5WHCWmlPbVS+Jl3aha4WblXssEpchW7JSyxWR09vXYIOp4JXDvxfAKtI64jm4eTqzYQzlu5ppJyB1AT9zFvlnd0N6RCZ6Kg3dbLGXJQMh/71INB5W5BDJGFDTauIB6nHw2uPuW1+z3vJ+23kxrHlbFED3e9+ZxDClK3UjGjWQ/mFxuD0lvfUtjeGyrZ2Fd09QE84NCV3auguW3PvgBtHtnV12GeMWUK681SqUgl9BSjivGAXN3aGEPExpIX7rlQLBTYQ0IqC7VJpLi1zlIWTTCeFReqS/Z87s/a0htyyeO5xH6APdIgy/2k8+oSeySEcUn0EvmrR5UW0r3kPS7WNUtASJeHeM5LBewujGn/kKe9zc/YJ6PXeb8ePRZp43m7vqT1GxvG1an3hPaTKTr9+lmRNam0OI8X3GeXPQlEUQFdUOpepZZyd1rARYp7JF0HQW8g3UpnuNMz1vP12qtPBhVbhJMJQa9ijwynhbnD2CWpgOR+Rcph5TFA2IfUlCky7d0RBa4aokxlKn+VgCmxV+xFdbTXn0Fb6nT+dzwTCp5PTcXywBpnToJrWi/lUV3udCzzUx89fT9jbcBRAWvF4Ie91Bu4Pvu8XA3YjN4Fpd3cWzu8L10v2XgMSosU38V4NY+xwikq5WYOa1vvheDuLhWUEvOSi9N81ONB7mSu/tae00od819AV8mffJyql48fFJ+aS/nL/qBU/dHXVP039F7Pr07HoTv1mTH2bfvurP/ux4yVsSpCSJSFJTR1p+0II6rr+4yV7roKafjeGC31xlpBrbYZ2+HsP1gFZbMzBUCTheO5IDr2hc9effnz/tzqgIbh5g726akX6S6xMFtVn5x5P2NiulHxnSqiqk7sZ8eL0HiwCxa+5St8Uk9rpjelGwZ5JU3pYhaPO6Wy/Hwcr7M2TNvUw/2EMM6NVLUlnmtPsVv1C2TFbGbMvFrH9L9d0TQ4ije9KhSFnZYyWSjVEskE6ore5naPo5sUWnsTQR9Vpz7GFCf2NA2mWVGVLPCTSadDAGhV5IxUtmsYZgYcFCYuxUJGyqr+6L5tTwuxFtH/lGOzOkR1RceKxEgY66CWSgdVV48SGmvrd5b8at/LBpFwzG4hlB8GkAxVVRmfsY0hxFbYPorhQ7GZG914Wxz2EVU1FHPUBHXvmeb0vZtqH7yVwyqOW4X2G7RcvEttsLwiSZZQy9YJqjCC0e2VsUFMfYctIJxmFY6hl6ejepZgF6uKYVBVqZQU5kkcw7VCKG/tsFU6dihYaetrsXhrT0lYLUpsiGjutdsrnUYa4yLcWkyEUhaTFVnETPVvYnVZVetnNLZWz2bYTSZDN59rh5lhN+ZcjfcQoz2yU8QGp8IhVqxnvTMX6LkOU1LM+X2qCIRwHCwKVC/Qa5YGYxYCEZIqpRSQiSorr3hv9oOt/BoPk9o8iG2zieLYA+E89kHvZe4wzFIOiktPSNOqFJqu8xmDfFRlx/cymrFZyLiKkNjAlkixvYTe136GHrQwA4AwnmltJpNnY61kMGul2qhkSrWW+qFXj9XzEOedrSvMJ7GfJ/a3tfV3SHakakVijAn0a0ZlHuNM4TRVDmBPYY5kgqrGFEpvJbiNMCLsSDdie3AMD25IdUeDArVa8raEgo26pC5TyD83C1Pln/u9t47zcJ9mUtiXuP6Q/kEB1IwSRE8dVFsGwZiouBGb3Oxem8RmE2zbQZ2F1kG5xO9SCKDiqftlW5B2dBZA48qAiRXEI/dIpARgKc0UV43SBRPgL+y6Rhfmqx+2w8iEzJMou/7nZryAaNVbPLbHE5zSe7surP3uWmYaVLpNMgNx1UWlr2mPscrnlw10d7CrrNqlZPITQdmO9cie2oG16pPiS03kOXHlnqN2hLqJxCmT3FgnSGzcM7ZvvzgSBbmUfZ06uK8FFiq5omVJEXmr0cIq6UP4xM8+I/Mk1VY9O0rHLTJkSwHjJgIgIRgzADSM4JjsYAhkWiLNejUsJv8LmsQ8rD/Ol6oHjz7TavczflJRTf3vz5cef82XfkNpM+QsDdTKm9U5N0ZIdzqOHymnhytj358ORTgzQs6zHpRTzUubISawZjPHIqzMIuRKU8dSpp3LH8K+HnKh1sqa4T2OtcUCG8ujTlBhVxu03LV8ycqg4RpX4d/h8zamsLay6+78B5c7j38ou9+aHxgjPORJhcfVzgbZyMhVDPGLINiscbZDNNJvHWbOEsS89f1/EMuaedFJBqc4R7FT5enfa4dzJpjuXGunoalm1d0+fS/2yKw1IM0srFFMLKmuoglv6YwmlTNjY5fYDEfsm05qbTalASO8ZFkqtkG9MM4HNB5b2pLBvI8mTvz7dgb5ITZhSdhzyZMrkKEwYP45bw0Idk0fYdWZIpJNYEYxEmV+MjMm9Hui4k3tWMFWcmOmZF6I+FtxOEsw116OXscGCmJI5c6qd1V1F3OLQyzz6dk4e61I8XdsfVfq7B48duApumBOhkYDC7GTve97r1OwrWieunWGTuamk57H3qyospVRpcXPFWOrvTOjPE9yg+ApQufI/c9ugDKIWUXlG0vOUSmpUiXaJ7KU440ZUL5ROaGVGpztCAlzk0Wu5zN20cT40LMlqOXefWrgQ+iSBi+mJGCeV8l8Bm1GpbocwJTk2ipUvrNSTR5ZlCdrU4wmGoxBLZ9zVr+rVpjnQ72vhjCA0WPJXuXXpIhYZ84AZ5tbtuOFunYFRU8m1fXopVUpYZ2tXKQcsFFks3UWs8eosQ11S5NTqkPZkCGj4RRyYOboq6PvDYUwABrqkJnMkuktWMCY9yhre0n35C1bFyrf2Gf0dJYAG/S5DK
Gh+ExGulzznrt6G2CaP4s6QKZ7H1mtMzpHrlJFFpu+ILpC5GfPfNYS9+CM/kF5DXJAL2OToUkVb2qFCXZeL3Ys28CujjKfPZNGwGYiiZ2+RO49qHWD4zKDDNmmjdj6KoNdPIJWvuF7bZ6avJCprvJvYnI7VWYVcywoZKTudheNEjkly+I1TLxHV3Yf4/4RWyVTGZTfM2nFeGK8i9jcJaoEzn4esUHVSeblSy/RoHxWJGUZzigZJR7u0QZ0lomSwxbfrZdwyVx1HMby8FI5+yC1vUzsj8iTdSblUu2j3GnPd7Z337maHWNH0zDHm5gBsDzLNuNLZg7CpSpvyiSjd2U465LzYj0m8zeotXLsKSvkvKMrO4THsFQ00cTdKSeGk21ovjQJI7Sfg43fUdEw0sxrkWgxU0PaVTqv6WZxio3oqIZRG+2OzGPQEQzfaGdNPdtkEDWhTey1OQ4kl5eWpScZpWnuQlW0VYzTykwRscDkCq3N9eK+ORoMIf0s3/tZ73OcTFyOV+UmwBtvgody+V54BmgkUIfNDOgyd16wGCC2Rj3ZuQRlWTGj+txtvI2LzQWqzGTqd3ADwRkLVs1thAvLyUDnFVqNx5hOyqPYsKzzeOa0cKUilBIbZZ/c41uptBAvou00Sm1sp+zzDGVJqH8MA7OxfdbOG9ip1LnQvoe9H2ztThQMnpvN6HO9vZVKboquQLFBmJT6lTC9qru+R/fR+XPIpRrL5VwwufuDK6ba1Mo3I74lPI/7Yt6uRJ8y5xGeLfdcShQPFwmza3RtbaZ4MNvE9vQqz59k4seGfHIXwV5EY0xXBn0Z74CPzGEgaWFdx5XC04hIHXP8IVqms+CeU/u05ShM3oLuOCOUWNT1ccPe/wI84OHDaADwL1rGPsl/LzTgyVc04De05gmZ/t4Q8Txkk3nI0qcwzz4mOldXp6YKEc9glc9N2EHCjvKxj1rYIatsE6K2kAXmYdfIzE6WnjeVIapliLbNJ80i5kFtEj50dqqbHfW0Lzbm6eWW04Y6TOtwTYcq7NQhCtz+icnP26yjvW0BDDKRmweSbpVprJSpRw3VObU9OIF0yabyaFkxqyQYeZLVqFyVPYu+cs8epfPyzE6RGX/feHsI6vlGRmDWck7V06QVZatDoslZxERZ7clEAASZr0NzM+iYRn2Nu+1G1V+RgV2rY5H8PZ84+RDkscWliHkivWTSQd4jX3uLhI7aQukiQXyzzyiYVcX2PQWqqLvWT2z3aHJ/O51u5172nSOzpIYVU2Tpe2rHPethJS9WtiPCTwJjD+8tEuJot6uWdB3tstXWIrakcGtptQWyM2arthZWASTGjpN+OSMms4o7KRqc5B3Qq8oo7ag3o28v7kkHAi3Qpn306tkoCu2AFLRsOZFIdKxkTGt51ewdQbhEkuCpZRXo1jWaIhmytRR17Tva9YJEeCIxaoPIIOrPaEm9hHfWRUhDgfckGZgk3ZPIgsqoOpAS95f078Mlav2pZ85dD9mdjnOiuipMrIYgG1CbJ81ZtiFFNH1Wm6CQIapNls0lROTerorIipMrizQnhG65ZtuuUZ4FIBkzcqvdG+MwNU7UF5m/8Ta/M78N9+5Yp2r9atFHH8KoXZUPm+yv4fHASBTZr9ZnjdjEx5pVPBKo+W/LOu8R75kVN96qJkMlN2bcrmNDjJQsuzkPWeFfUpebNOap/WcknonI2Hp1dqXK4apXxLr3NQTWhPwtBlVPWYuK79+U0uXmLh5pnWQ7SaBwLb8xZrf91awNUUT2dq2IpEdHF0ZFkESS2O5brVdtPR81Fkk7LAJjqUzFtYNjEiochUqwVa7GgXt8rH6j6hyJ5VHTenICJyud3ZKedxgzvjezzNvF4IwBekC4ZwKYUM6QGTl2R66PJdGYsJ+pRa1ntwehoraHw+ewUjRdbzjWzPTUokznFezdLbZBFhHmysbbu61cFLMQwfo4y263eqYHr+cNG+gkHbXb0/pfAgCu07Xa9bSsUvuaHl0nnVoIzEipyjadCIt2tMxeRuzt8mjxNrJqK0X0vFYm808qttZWfj8lP6B9ZCUxX6AnEAigyiPEtOIYO/IJtMr2YkfYDZVCrbEgewga5Yl5B8+7NtYmW+1p9sykWffqulcB6ClSquW1rKjdX3HWnhVz6zw10U9ALQnItGELMmN6wA8DYhGib5hvfkY1HltkbIAan0Ues2y2DSx4pri1/VY+IGR0KIPUswZC4UTsUcyWUWIQ2mnfLu4hABIplDPys9CQl//53Cz3FCsmZ4+yz5FbH3+W3Pp7FWvL7Gt69hu6/wyWZnSnhZFOQ/p0MDJT2d7e6/5jxKwPG+tJvDOr6x6Wbe0nHVXXIdVrw++b2jjs3WlrZEl0BTE7tpC6hd+FbfS0rsORUNG2xKiih0u32k9hWy3QVQgarGa8shTN7NnC1g2bthPsty/HN/+Ynv/4XfF8/810fPaP08tvvgupzutfv7/+rrL07Id9/vPz5dO75z/8I2tO2upGpUXPjr3orG9uyu/uwt8oZRvCe+zftYnO+t70HillO/fHovvphq/xtOz91S5RXf923fTP37x+fxNSrmbyv2v6Y/n38aa4ex0+68NNUb+O13R/+y2pE2vUZZXHP61Gew8zxY9vso1bkZ1cx7MddexmctOXdkFHvAPIADS3o1KXslWRZpM0RWXj1mfT1lMT6cIaApkMXwH4NyxeDjNtEAlVU7SxLZKN8pHWLkvXlvb6W4S76oLC4loqRp1F2FOXQLf7mppMZEh9dWBz65/rpFLpc7xzSaOQuE361GuknbJpOjtgN7MdWPRe+G3YOYIpMuhCPUm3I7RJpeY09WDTEeFNS/uAXDYSIoqqu8sqjhsIfF1KvXufA66NlJWxW5WpKHtQ+OlWLnvvnoeO51feIRdd46Kmqo+a3uuFLGxbdqZI3ZpK3avggqMXaVW8tfCCNiUWWrRMq+99TouQ8MiCIAp6Fs4dRY41MNyK/iy8+H1uqOcTmLrvRUaY2XydpcNDUbxOIVjfd9He2VPMo2s5UQS+4f2icwkIj9YBShbjSPvs2DXr1JV3rW2lk7HwNAK9svLzuXaINkqt1k6yNl57tw/XvETgt4uawgM1MkwN75j2R8jDO+pS0xKtB92q20DlrRd+UPiN9si0MB5QdLhGuJ2bzQWK8LFTknftuvOUqEpWjn3ZxCKg5nu8frdIWbNrGFO8UfOXNuC37ArqQHrUE7NQ3Yt0y+u00CacDSzMc254IVC6WXV3guYJ2kiOP2w6nAg7zEIzdJUijNO7htpt62RbKILFLtoLVYLNBw+/ub4QXmewBCJUbmHemArE7LxMe8y1uo2ee+3BCJ9bEcY7wjUMcVGoOXiqVPqeQnKTW1ZvE2F3InlCxULX27v9uVtAglyzMbiAkM6g9MsIE6P2jby9jp2+CtfpmXAhpewoDHI8AI3ElgvSnza9unuNtLNfyNugUei+j53SQKqX7pMd6aJWPmeasHchAbRntIBstHYNYjNIhR192xWLeCB7r2gr7UVdjeMkeI6p5ZTECS2tdZgS7WSld
fIOd9ItjiLg0Kqx5GtJxudzZqdwkfFH2aByvifyO8cznoHRvnPwIjos7jmPCha9WhEe7FoO3i3VyRwi+KqL2HKuT4z2/BPTQglJVts4Fu4xYdd+w/Nr1PoczQ9AAoJacYMgEBKI494aC5WEBN3aShCyd0uzZ1QwHd8qRUZMImj5EMkMNl8tdeM5d5jtQWeHqi7ai/JOBHf5b1y0hmukVdGCv2crDkB5oNr6ei+b2Ol8n8gJp8Yt3ya1L6ilq3dC9/CRrWAufwR1G0dXtRl0zPVgnchMzLCZE9IJR170Vd2SAIXnGpuxozjgwjPUSOAt4w3uc5nm5p00qurChi6NiZbse8y4cKvX0QVKgpwGprkO50IAwXOS6bZS+dhhTUVfkOqLmR3gqAL9HeP0hSyXHAJZ+7mXu+4bZwmgOPgFxFSXdlUWD8jiDxDwvJDfSOe9kKjnjG51dr32laRx+CQkK3Pu625LCvgrfA9Rh2v1KcDZDmneAaoYGtffi/wH4gTFFrZfc1wnWP/yzF81fYydVj19PkzMcityI8/DeUuGQcXsrHX70VMv2LX1bo4kXywXDjsP6voHEhr3g0MfWwtw7VohO3PLQZ7PR12b3XsrDXEjEiRgsEIQjohFJLxITz1pLO6kOx8iLHWateWI8GmjOalO7PQKueisA7Gt03Vzf2koiPJ2DOrCpu53tjfb3hg7iqKdA+aoCdUat0uLJR35D7AUg3jyUKd/Rzto2UBSvNONOkchPkpkgZY+BBcSZeSxIjs0dkFsdY5A1Mb9yEgQt7Yn7BXvKw5Ed/p1LU23x6ojaf0NoEwREEmKhc3m1gkPY2zXQ1IjiBbR++fknfYIB7Uu4hrvt2MApGdnIVrqHGQ9fKDXie2DCerF/N8snUR7rBz6s/IpyILoxoZ9QaTeKI6C8MZgfJ6FEOzYWVJ1bBnAeQo4vHUikZ0rhbwGeLaTaJIrFjR/h0Jd3y6z+NH9T2rOz+8kKjqzm/kJlnH02CEBy0nD2Ls3EXpfu3Wq/A/2hL5j7C0hZ9zD9ve6EOJ5McZLZRzCo955Tu0iZKG3WjtZFTH6TLRU6gwanKAraHFQqxLYZBtBOZKssB+BkHrZuMhwktDx5CJDWDnOaBxb5f5rkXL2g7eXOUxqMcPzRn4cvF/di/tSxXgunaEgHiZBHjrA+n6MEqfO4CZ6XvDv3S9jL8idhEHLmcPzq/w8UBkLOZs9n80sh+X7+J7RXmIcJ0+qZKOq1lsrESxZap1coKjS0JBi8EOfPG8a78DqpV69p3cvVgwmq01C24B9L7LJzeDTYTHsNXAC62htsRiJxLNSmcU4hyn677j/gwi76/xeJ8N75f1+YvnPy4LsfKjcaBJhjxbE2Ht6wu6JiKnyb+P+HbOy+/6iPWZ0e1qWk3rF+WotozzDsZZ/IoequC+Z7wfJpZskT3KrSidh5Y3a2Wivr5RT5vPXb+ZlwNhBUq/5cX4dl1+fz/59LJ+//b4HMv57guPFk/IjcPxx/cCh6HkPyDz7DDruLez//eh4/hUd/w1SpsyaNrZmwRCi+c5WyyfNSs7VFSVHZTgZihDBlhvbAT4mL6HAVdUhai6AgO+OZYieLGK0Ro/VxiwFDF0P0Wm7QlOSsNtVbPoYsvfudMy7qR8Mmf8zWzzYzmQniKFxN7YDRoek9pIMnLdOTx02sRngYVTmOCVzWBmFx93sLDQ7NlNiI4mdaOpR5rMGSu1NlKLNgqJNRkJtJemuiDqtN3OaS3ucPIXIwBuh4DoYHQxqeCdbCZEPTg2ab4CAFSmgQPy0U6b+3TyFtm5Twcidzfo0dn36bMpIJiFBQtIk89kpa/bIfzqkSP8USUoD+8V7k8B9QZmBmbyf2Xsa0hprxriuSLYzgg52cxT4Zb+hKMKeY3IVExIvhAyEjqFxCdJEBzHYZUSHxLMjhKMiJdGRe6fzkoyxQuOR2JBRUb4kLr1ORFHTKVUZ4ljDyJ5ylSSg9azNIxy6hLldQ2q8hqh2kDQFCK2qBZOigZHSDmZQboju7kh0AqP0yhvCEFVf36vuCM3IY2PK64g0KNO3SKOZgACHbBlIHmwgDG1Yu73ISEsRGDYLbYaNhv2tk9ZkdE7UkvKTxiOvMVKbITCWWf6ohgogEJLQBDdARcPIVCk3MwKJo5CybGiF5rSyHtleEsHBMgoj/h0mWZvURNK26iuPvvEjo1pv2LN2UkFaQyc0hpgk2QJK4ki8rEpgIaGITtYIdh1HSgeQCTbeUFKoyYIoJlB7/xw0Usg9g2dFpxkUYZesYB1IhDmJkI2mnXgGmWcilOadKffB/GR2m3qE93KtjNEV14qkjWqoNAid0H7UZPf3qr3vE5KjiNg0GzfMN8+ggFCcuUfSpPvSROIIpH4k1Nwm6alHlJLpOCpyl+STyTi/cyRbsqnZHJcs5qz9qEXjEHt2qki4+y3c4lrO+5yVKmWj6XO4j8lRz6utRJgbOqBR8mJVsZp70VHkKpt/+0io7SKSCWTFK7NjmgtbCvPhfunIcuMNTrhGIHdMmQ0NAo5y4DtWsvSBe6OIRHKVBBIZ0RNfx/EMc1JprCbH568MdMv1ot+7E2CKyIGmjpLlmXQsVihkYQNpiiRAkmOFuTiJdMznl0GiibM3NmQmysnsmm6GJ7gk41laFj0z6h91jkSXSq2/wSuRsr4YJdcsOUZ7PYdwPUS00vkSz2bM/YRCLhfe6772PRroOT+jjCQuVpuMYO1VWsqI3cqHTcIoqVpSZtal5lW2RxaywejV5BOVCcYgTw3lqIkCW4avPftW1YfdWo0+9rmyeCHmdjbdN4wgmgESIGXHt044tTNcFj12LUCGzmpELDuOWOnsp4SObCNCbfv+zEX6AsSC9zyRMLzo5ZDtZ1mfiKpObltnQrMHNCATut+qIbTWY5H2O5Bf7+hcKBkwztwo+WGMh6pRXMsln4HtHUdVyJ2YCduvSpm9xT7uoIjss3UEBETlI6tceJ8DG9OJ9Lihy6uTcrVGG80BoLIXIRFad32qViWX11p7tjePdQmRk9l0hp5jHMd5IJkwq92GetWoCKDiGSs7oxDZCyvaQmXRvLkthQQDUehkoiKUsExrhci+x5ZAtUmw1fNaeNyQtySqo+EbY562l4yPyPuPIMiNROzNmbLVs9C+dZ32KzpdrzO3taGMNDYEHDZc+1VsmpNECL2cainnJckZzu1WDeM8ipUuNZdqfI9URSChqxuScytvrDZDZKfu2/848lBmHyEPT6rytymmnvxeoEPxFXT4DaBDfWXJfgiwwgExhQlZtKYJve83OZgaKiSjdQg2qs3uXLcr64TyUYfUU/+hOxmwcM7NdwteKpMF2ltrCzGF4OlCn5QQ1JwMEvzCZ6++/Hl/ZjACrGxLwHdb1xQPafM7lgqs71oqCAQfL7xsYz93m/g72QDPyoEzDWv0SaRv1yZ230YndLLHUR5dE8J25QUh3lk3UFpq0+MvatcvShaKjrpXJjVJBeXUEL92bfDq7kiQgId8CLZvZH3P0tG807iVUhFgqdzHA04BnsqFB7eWH6GAYRmo
1r+9VQRt9U3rTO+3yctb3RQ9WwqABUyQMtgYEyofveVACJK8lYT8Zs5K5M8pII+dYbfeWSq2OEh0gbPD3VM8iNiB3H9WNdE+XqV5lQuUwHv7CiZQk4JvWJcn+l+XlAEX+FTI0nvD5CtrZPNtQZEr2KRPz7zNQceS5YWADEqVsGqmeiklot7lioobWJUDDFFZvRa91LXw6AaPz2S3MHSTbEX56SJocGDZMwaax17s+UpBGdV/p8VcAZM6eboaDeANVInRV7Wb3Arfk0MoTiKtDM/XaWKnvWh7e082qKoaVdafBCytpM669k67bnN+dk/QyRXT3bz7HYG12DFeYMrIUj0pCN2OTnIbPCOoHOTHICe6a6dmrVWSWZfNRx39VKotZQVOgEv+flS7GcDBsucmJo1bUSDO3jFV5SDRfa8X8m8ydVBq98Ixxjynr18c89YVOGxrAGXE1v1QhxigrdY+XmqxsO0TXaWZlWda98mZqAY8QjFpSfQGnecXAD4NcNwgOaYPBoHK6COTiUIzKchWedUUQRGoUoLRy+7fEh62XjHAXdSYSdQl2++0RzOw1LihbYAS4UKA5h28ZMyjOQEhddxHpr37dmTtuPC/dW8QBdrrQSUseliu4jkwMOHHHslSNVXM0SsW5W8CKAUCejyjg6+9AhTkJfdllHxB+TZw0wGrdS2KStUSQBlAX6JS1deT/24U1S2jz4931mSiLxqF6AOtK6B6UZZk3R+9KAq3ZbczbJYk6vzb+r7JdbqKFOXM23K44rA7UYGltiIjlTYAeQefk/Q8hneFz70p0UFBb8i1Z1buwSA6uneHJih/Sucsaevwcqq1B0AV1S7ZiiKcv05Nv7D7ne19R4KDbJFie12dAEIoDAt574Ciy4SElDSUR2OrjKOvzWE2dgJJtw64kcpPDzeok6JvSqRmzX1C92ynEu+fcwW2/Spnx7Hj/pl3PEudRkqKns1bm/cEeqkAln9I5x0KMe/7gUkd/aVjnDJr+0LABDQd96W9k0JM1OStklwBfnoNY4z0XDbpHJZSkWrvjzojesJW6lxnPLAELQXxhLx3yk2cV7YH9YXAQHrUrJyqilYW+dxblcCAq+QMpH16JxXmABDKldknJZi7KJtwyrVTyVMbLCSk54+8b9clzzAA8qQ+cK90b7c7guxt2cTOpk0vaje6P1JxuS5wtoPmhnUkb18HQfqRSTfeW8D9UUpH7954oEfdyqgl9EzvRA21/YheS1v5Nbk/m2IQPE+NHf2mMqf2+hnNWK7RGQ9ZjIpna7VQanpSi1tvB8FC2Ok1r48eRJF2ynNqi2fqSkeBOTWo1DMqP+Ps1NGzFVBrz/cwqg3a9UJ+Xtzz4/3gfXoHsGuMCcA7i/FICd+wwDRImT/QyWDhXVsvAvOy7sv0igwx3HQskt9S40piqlevKb8RFRIxsfsqqyWZnGDQ2Vfd5iUj+s/TKfKP6RSP88820sizz9Ep8ke/F7JRfkU2fpPY8Lq6dBadhSjbkIkQ9dXdeE9sWHVjVXUna4x5zjchI7POF93tR8jGbvEhrLKQXZ6zEA3kraEcIDj1HywbD7tDZqXize54wYkEB1gIGOswu8fwN+MGO9fxw2Z1sDJazfdqrKvGn9gLprcTu7IdxwjrDcUDFKpILAcSq8RYKvMN7IYhKgJ1/yLhJjd+EeQVxbLHcMuoqkouoiixTYCwIbhCNmzRLHth05+kIDntTG8OwtQzUlrfuyOZCMMlI2tEUoymk3hsoDs/rnnyyFUeMlMXszCSDdkMyj0YzOfEIsU++pbQQ9wj0SOb+qlsKvfHTEKMVBbxxpNLRTgcV47V9Yxo57s0ifJj/J5iM0ZuuyjEHDuWx2JTUXioRAi7V3bdSBTqHUFcoLMlSkFvgipmyh51EC7PPYISLO7a+iQWvHbBCE/zqyQkIvl2t9ZJY1lQz8xP0aiaQHojM4tSGOUhGll4CUzafHjTiBi9VdnO6Rx7ZaEqWUNcuMi8WRmzYjkhnjxyOfdO4GR0sPf+1e6GOTCaQBbr5O2M6Ar6W4uEfYyCS7jNonx+8H7RhegDF5ZMj3LJ28vdci+BESID27sS5cYFvlPrXTjYbx3dKYhUkCpg5cBWWcYxF12oQBShRqlCP0Z5M8VSyc1K/u9EYEYTZHhDNZYz9rmi47p1/4jkyDnNslGUmXTNpL2wkaaTW4HueCYJtIJRDr1eTt7cDCW1Wp9HgQR9fWrPAIhurEdHbegrs50LvSb3rLEs8kYonzrGsMsCyohriUr2k2gyo7yhMt8TOEfgpXLR/uXeLI66Zoz21rHZpLxurHyVK/LN4hqKGbIc/7w0RQ+aPvp7zOY2s5QzIzRS3Uo1MhSpWpmbl/G5RiYXFTniqGZsbGZ37VnEupojM0kY0EsEvfUy6OTu3UmkLGdciaW6VcxcSMwnOpbF7CqKLJq4j3QRQd6WHk3P9r5CxGdvBlh6qftq5w1Ht3FO8loTlaNdeSPUBpl1pJsxCy5FWcpbPh/uVbcx22KTvuuYNbvARR2Tto4qZO66ySamQFHM3yhHpk+flJo0OStHWrchNN3LW9Kq+it32dU+L4Qlc6To/rg1bC5JisrUEcUvVS1g5oO9U6JJ/h5dZpCt839mrCd2QMI6ASJ8nmV4awqViPxlpDvJ2wQ0N9Bz2LSVSHBsAG0eXhtVEkS5EDIFUwO7h0RNOYmYH7tnLXqJnySKhPvnOHNfHSSmtn4b7KhDoV6lZrfynem5zyZEZhQNbhQpf5TYuNYaHdKzB200CtKFCEdxTEt0S0IZCZAsRknVllF0sNQN6NbRxcYbVY4SJxTq5iOEc9b8eblwFJIddVDSbsf52BGNAm3OETZ2VAJ9Up56dMgtVYkoWoju3EHa6S5a96S8jdxbXNxjCAkRPp/rXNvIpiciLWdlm5E+I2fqhiKmZ5eKyLFn8oZsyU16NXONphDFfl6JVmNIRbbxPi67KFSSyN4rCc3MY2zBOcmzyEUNWRJtvyed5j/tmfOxn+njrHhQfFqez/P80zT28e9mmVN9zWJ/g6PpCLfSZEkzq8sfiyv0HDnX4XchOurDSRROyftigBChVFmY++a4NXXTvu6mQ04yl1e3Qwa72xYhQx3tRAynizmWuu2OSfk+WB1/E1Ze2P3y8PtLiJr/xFnrNgfemrJWJ+axb5x2A9aw9t5r645+1O1FRNMJJzNPdstSZz294ATJSH6KVip+Og1eP1aEw94bMVPDrgQsvL32GiNl0CATp0gO/TvcKZRZx1YZnDsKrt3/v7IauE7SATYLsx6EjJxAtO1pibK+dzKI/HbRe7EPgp9AyPC9exWI5ZT9T16LPLtDIIj0itAGWTqUXcS2k2jCo9iUSXnNrS/o8nYc/QRoae8BB0yc7MjwXeaFU4a+8pFodVakvsVrIPO3euGKPRNoF3GQdPfgTn10rpTQAIgGI16X0/Wxb2fMwM+p1ffEnh609TjQygfyc5ys4bS7wBF0c60W0jjRGrdhmITdAwmhVHKbLGQmO13J6VDPAxLdFT1F4ry/N3p8KOod5b5JZ0+6xrI
ONynLYcbprblPbI+u6KJnb4N2UFRfeI1UfU/L5M67T/MMv99SNDLqlBXR+L5IZK269NH7ttEVU5m2RTvKPrzHwJ0jJ1a/oCXEVk7AFqUIwbhX84q1Z11XXJPOvend1qVjDxiiCuhPia+StEoaHkU9INjOLWTgFLrZubx/rfrDgs9f9gbqlzB4rwReT1Om6EfIzwp7yySp5V03n9uympHjIue4rEXYr6ZxJ2XagUwuh9+W6bM5t10Sq/qwPau5uIZ1GXfVRX9F51UwYm1Jbr/DXrgCoT/vJBZhrfUQybIkBjfuQJzu6eT9SNso6+T10aKM6/Gp14qQhSHC434yRnnnuIj9c7txnj0nlERE27su9WaNnfI62gqUyojz2A/UnW2XJNEepjVrNrR38XUxtsmVN5NgiWgZhANbksBv0/yVXYKQOdqjuBWEMt5aWauvb1lAHFKWQn5C3vq6V39OQ19Yx0RmTv4SJPnaKzmv6NzLGs4AAv4kLtoK/LVZe/u+v2dLw/pkPjtXbS9112bYC214XQP3l8VFNiEXnkVmdSBX0vvdCd0yTTVY9KOhuMVkw3YO2Z7HeY49CmNGt8x6o4xfkv6hc95RzFzWM5u2tdAUt46LPUIdQZyEIgKd5fXzeYuMLTSl8XNziqjXcjFHZEQslmSdyEjROc9sR9dfoW6sncsmSNkYkQVy6aYm9lrDmTQTEfaeDY1CQCadm3cdxTwDkL5bZZJeKzx5XZt2HP8FNbsye/Lgo3Tn0ZOHn0t3PucQ6i0e//3ZTv012/nX2c7x8hHzOOzIVcg4WuvsWMDg80RZ8n3msRldVcaAnDqLyKY+ZDn7nNkLmBTZ1WlfhZ3HdkuLwEfJpv+sBp85zItg3gnjIz8VBmdjSCYnRqqv/qMMbEyS4h181qkbVTILLBRhuMGSTibUIaKRlnZ5Z885ZkN/46VHD2T/uBR3Vv8gi3ZJrIwMTvNzPhaeLdysjvLOjlgqOj+x7iRGBneoi1jGjPAkOetiJ2zgUi5ppTwHkibg8c4ugzmFmHu12JWD8LVB2ZY6yMmLGZii4ZnJaIomQDxx1f1L8tIF+w7A1EfdwXkKZW54IbaVRQU4wVrvnAVsFTt6GVmqGJNGzBAzXlKWJSnivf4BK+8/3JSJoXZ2U73IhCL+LcPAKLdBJySrx7CHwulAk1IaH2aU8G2tljTFzPBaUR9PQpqvMDNJJkOU9Ai/9AhWRlmGez67XMSioYTr+umJEXPbe8YJ9t6pcZyXmXDsOwQJU92R9Q1Wfexcx7oPolHWig7ec2RoVimrozzaGaVbzgfWMuz6xFpmvcQQg8N09hqcjx1NGVdndjbHXG2UxSTzszZKipEZ58oIdN1bReix6/0UOwAmaaH3+3BGnuToTS8TvFSDiBnr2g0l9b3NU9V73IgQdQga+Tibk1JMGsWiOxp9+JFRyOTNa7FVNMKUOVMHczsZBzHKZs1j9NpNlGynXgqSh7tJ8UzymzIuSRmTdYNL+rfKLNoBDCgy7S80J4wRWK1onkoJ71oaMeI2Mp5obGzGOBex42SKZqxqw6xhHIvMDV3LxAwu0rwDZ2H2O0TfXJO7M3qKkHG47908EJYHOzGLEZEzIuW8QPdioAlEFyIqMLIuAeNC9VlZu1Ena6bGTjREZqcI2rtRUjoaxp9ZVXgO3PPBwIuGafncu957MLQJJZI0f+1e/gO7y6pHzojXX8SCJjPeEBhKVhkBxw535s2vGvN1NtCwD573NLQ+RUajTJ4ozY8mjKhr73W2iMl6u4idePm8acxrteGUAegMgdHZQUjW+aJ5orUJRibN9qzm68ZpUo/ETOraO27u/TycMebt/r5kxrSXWan6vbv5rLgf8Qwjm/MiVEWMdHQ/FDdGckNIeX1vOsjoWuz201b3ib1ziOfkCd1MC5eRHyavETdCJ2SQdhs7MnOPmWZ1dNWBqUwAX8a74uWt23Rg3zx6TIL4hPyYs2qLVGKQ8/DNf77mUnwiiKyzT1KQR5+1Yap/rxzk4dcc5DcoIi1OzzszTQy5RjgTLp9WXmxNVHVnzQsmq8xY9/mmaq8/5Q2aKeVm1Vdh/zLlo1kNGeL9oV1tq3CO1OE8GLtpXbSrQ0Y1VHXpdtYj7lCZSqK1vyM36EOIMQr2rbPP7S2H+RN3lW9pirnD2UlULXLccO7Omgfg3OyvXBkZmeytIz3gpd3v38Ququ2to2R73zuVI3mvmYXvY+IYom/LpB5SlThvhVDXob3HHWwyoSlCjNXRU58rhDzG5c5/JKKkeD38+4aVAuyf4HdYLHHLuDda2BAtK4UMwQ6gu3WD2BgvFG2swBwVU+4d2Y0db/n5MnvFNZznXDHlVx5/uEE94mUi7buDbEnAS6mUczo6OsY9PXJ/xJ2812MryvHdyP7SuHx/KR7NqvFKSw70fWQjAplwXsRFuXNO5wY97bbkekaVnZtOHnvxKamijZyvXvZQMMHEuedG+DLxhiGiDHQv4sbpPEWn7qjChQIqmj9711uLO6Riuk1VHnWkRq5ssYPHClBC3Ro/Y+s2KhU5UYgDY2UOdhan/Rhjq9O58H93tE+oZFFDBRa7S6NRCrmxdh/ouSfrA6oMO+UNnpO17AUozuS6aGQEK9sNR1Erzl3EmTirN7JoaGkYOahSkPk6UcXRVGxTwgOgbiwdnfXqGfNZUyiC/2sxOu1/UPGTET/iWFbhhLTafHHe5J3fz829xhRbmmo+a1XpPCoHk+k5DPCB+uMakIOozyIqVCM6jVP5hthqMeP7tupRJUNqcFRlI0fEN2ujdcm2aqKRt6v7nOvmhpqt52Y0YndD+mnve1r8vfCYQVUl2W8dvIM8UP0W1hzKK2jeXjRuwSQFJ9U052ym3LqjjREqxawoJBPwC/ldqFBNXt1Rf0TmKjBfVjf2pcd93nRmn7hcyiOJzB8ymdqWvmYTwj1rIsNKu/PPh+7kXNh1teF+qc9q5xWfzCtBqEBy7eZuHryhMm3UvuTqI33+d6zyGQ4B9ZVbz0g9nZR1mefexHAaViW1F1M9uOZ1I1cDH7gkT3zL7ucnKOFLfg7mrXe6Ru5x37CUnba5f6amLbYmOxhLt+xETfszy+lQxVSeDOXs1Q4xv1vGoU/qTFk3eaXAxnij3n7tuPD+i3nEDHfR/qTWvHZ176DGCG4IO0Qud+pZ6PM69S4lE8F7emZqsCAVFs11N86hU07jOEeyZmxTL9lUCeY1Yi04btlGtWOXTJ5nKsm+F0+RndHZ6EV59nZIufcWyqfYf414RyaOnM2DjJZz3j+bti/MmTpVNtlwQ3Y4sWo968k9uF1XO69orlxVZzoYVwS3ML2OHEniXpcmNZ7pZXWpHLD1JjFUxE2Ra3eZVeaqqEbd6SwyfGwkZuhWYMIgo51VOwFjKHUNA5gPrDhWiIOWtLKaK3zbZF0oXAxWhd6BXqr9rTjn0RLImQ69qvyVnveY+MXYn+YVblUz18IG2rSurj3WU8/eZYxBWDGMz2ZdxPjS1bGKY+fzIVpYQT199rPOO9VnrsJWNVaWmG7x6E4Utv/2uXCNMu
LGrgMwI2xwiVvpIFqd/VInx77De12D4dseo/5dWJCxMQ6Z47rJ1hR4Zn4/tmP83qhXPMdy6z1xc/LNz+AMbxhXeoMi75M8bFBdxzWUqOqzrtHPqung6MeGGjJJd155S6sxxfJHxmApv3DcUni/LFz9Wncwmp9z2ifpW/LYoGg6SE1pZ3of+aNX3gjI8wDNF8ZxbCzC3qZqGgbOcDQjJy58u1BTjC350adWuDxjNRn5x8YfHM++l6Leze0n8akZt1m9zB0h1MiFDLMtGWvECtHvNrLLxoVyAMTyjGUQO6w1TmebD7RnjDocwwT9c9j4zRk6URfjdrlJWcx9/pb1KrKVzlrH0aVhBA60k7Uc7mHrMfHYiDdPPL8R3tQ7/3oQJxiYfMcezpn2aTYgw769Zi5GTQTGuqMd7BQ1ATKVV+MpZ/zYs+V+sOvFkPBaC5vXpT1oTYye7jb+Veuolw5i1gTo5ApzZ4XFpm6+n6G+cJXM9amcdwaDMPZNZBZ5M4uZZsN0aDvv99uWHrPo+n3PLLRvwtjdDfEjm5GMHMSCyhuol4GjhWsPwM4wqze75xhTQfPyn8b0Hj3+2OPs0YNP1cBPHn9ODPy7gXqPPgH1Vu/evXz96pt3v7x59csnAF8YknBNT3+6exM+ZpWHf/7y6v3t9OIGLzDU7cWHu3fhJ6/06xdC7Y5h5ML7fQrnvbl9+dL++OnP727f3uH+6qd/qe1B/M/t69eOCL5999Ze9P7ul3fnVx/9ENChgYC4gP959/buWhfw+PeUdz/OH5QfdZPNs/wz+u7yMzit/+zf/0Qff+GJ/vJs0X19nl9+nnX98Dc+z+IPfZ5P/jXs/urty8Uvv7y7pCGcP9B7o8m/ffWyf/VvGbvZsNSfGRX/2S+vXr+4u/31/md+bqj0Cd/b3EmPpirqB4+ePHr86ElelnX25COGVl5nD8Iii/999P7v33345fhKb5mexv/6U7Lio2d89+KX/tXdJ2+MxxzH6Dc9+Xfli+/evuj//79W46NscRkef/v/ZX/9DW0vrGry8xcXxPHd27evjndazOFn2b9loVQ23B8NzufPsrIsHriR5j2e3OPfa7V8de38DTWqbXllCp3VsUIdaVoUrfFC7tWo2jy8xppc1xtr+hVyu/C66SN1UIitKnM4GuHaGXKkdjpn4EqcGuOiFZuQ45nLzibE0eG9DHf+/Gevvvh5fxjHrjtt/8j6FFnyq6O7dTou7Tw28iJGz2tg3S6+lRpNXS+cKyOL8Vg38lYfg3CsIfGAZFks3Fi/y6Q5FQbWCI84ODt5FNdmovJrxuNbHdzxyJvw9dQLA8+/SElzcY4aXULhMnaJuOm48KZwheJ5+zrPh5zZHD0wblaNNzOevEGoXCp72rHTGQh1DLUkgFtWxJiJZSqPT7wW5R5ttJqOjZk9F1Z+FN0g6agYx9frQG1yuSQX6Z62/B5vkY5f/jykG1ctwps1y50ztrHwOqU30FpFFvk9fFj+B7U8NNBawNneGnfhdK2/v9fzMjYeXk+qUU7uIeCtHuj0mThV0GazHlZ7Y0BveUNcAjiPsFKw/N0Dw8ei0tfSMQnyvJyZn5ofyoEWXB6vw/hake+LryGOIXJzNY9NnFXHZIzHN7ofRcTOVntXa4nPenDcL7XvYZ0j28jv4UC/hThWm13kDDGfneQUZjynmatlIy6WXdcN25VMPoamg5avygCuH+u4Xkcau1tvXLwWZnmcmrkHSppzxL2EBcz4p6jrwHuAXCs1nlazv6VqitM3xCG8vvtsZnu+mjdlXnijUFmfq20EVTIzJVDkCcqTpPf9ptY8jPjkffvy5MKn+5TCoxEOuzUr+CHVQNaOQd6xua/7KnBed8SRqSkP738D3b1de8QBC7SfES9gI0US58KxFGeRqh0bk3u4RlKuRPyamArbPaCNinHTYj20Fm5I/JgqjFgbO0xqBsk6CK+bStJKWCjXRMSqWyiA3IG3jXU91A4H1RBnjcv3quu3o8YUrY5mHIroIZNchMlnBO7qXABgQrFufaefZTwLDvw8+ozoOX2sc597W3cnVP8KVKhmn+RVSDigOJq4XOjEWc9Zqq41uagKn88cOHg3t3436+weY2755O3s3xZdhav8rWnol+P6L8MKWf3g4cMnVZaXVVaWDz3/U6RtLfnK7OGTx1nIVMon+afZaV4+eFhmxaMqyx5mdf3kc6p8a+tXPAq/zaq6yn8vD/2vEv3fEIy3IwPic9mZxP50GK1gYHb1M6O5zIzn7otWFh/CgWPW9oWRGrqpDwF1zwLo6kvvucD7/JE2+IfsD5StwMC9W1HGwdLANlLTO1FwG4rMvXf64PZGKh1MpNv2TnMp0jENSr2g6N7LBghNjCqlcHSMPblxFJ9dApN7GE6K+sFDPKdPIFy5b+S+dsGd7EG26qi0lZEwaDIFrecg8hRNJgoQa5Xr7lI5VxThVaRk0eYHtArIVdz0k/dB0aeulyU1lmramRC+9+P9Avh+pbK4upO4/ZCud6A9DMq06nzidkRnSlhA/ThIjM7+0TT4Xosy0ss8YU1a3agufYkyXjjFWBS2S6NGADIJtZCw8BK+evtWek2JvuqRunVQ2BUt8cZ516Bo8u3iX5qdk1bhYkiz+1GoLoG6WYddPDWTvY5SAbegs5JfpJJMGjeYBSuk8U5SU8uetHb0MySQ0JiSqWgtpW5z22SfM6r71G2iL7Xe7/gkGz/2eZ+i/Q978UrWg1KXS3xi//kulrtBeWG3JFK3rOzuVAXS+bhbleqSVyoUid0n1ZGqiAJelZ1lETbvyhZNmmk0vu1nEgKsjw2pTsMmddBjb/uQPtj7bzSunVJJrm+Wqrx83XmparXw8HomVEbKHNNClTmx3tkB6eiGGW4vdD8cBt0eaUUZO/rt3CT/MLeOM6pb7lABbY1gjXdx6inL1Za2HCCulgHBgLB4qTDPU3vSNCe3Z5xL6Px7pXLcS0YaFLCcaxTZn/7NpbX/S9BUlA+qosrL/GFdPnpc1MW9oKmuHj6ows/Lsn4SAqiHDz+Jmh49fpAX2eO8LB89yR+W1WdaDeV5/qB4WDx59DCEV0/C19+r3/GnJbmvYdNnw6am3uwstLFuQWFx7z7m2RumeMhaa0Vs2idzOTr19r4/P//x5fKmtPdppqtTW5uLtmk0LYwKgYLaMFa1cTxCWDV2k+mZzC0JNfsPYWO955qERTftP3Sn7bSxQ+l0HMNmXmz+4K5Df2y4tUG4tRfD9diLEUXfkFXs8ZNtvP05Qxlto1LNihnX3FMtRX+HO7GgZ8rPRqwEsqmTQ2RD/6B4JKxHhURFJ4dOetyf+9RvqJU3e0uFm9jvDR3xRqFsgzd4pVf7mtcsxgjyTc965YLHUKtBrqmtlH2NyL7qo3Ojh2q3jiCuvZHk5N4iYvG4o6F7W+AexdIE8zqiLzzihOqthTT0juT5e+ZyKpycxcGwdFvA8REsGTxP+pBcO0ukcRZWBpTOQhtk9UDjxIpxZu3ZXWepsMKR0bgz56A+FiNUenpmCT3dJ7bXRJWlIyOdMznRdHTPo
5/oZgl2LX1/gITiWHcFhx198pXSvaC/U/ocMqw20YdpkdSko1DMXVSYXtqlQsjdopiHA12co7HZLnCKFDaiES/m0IaOmFBY0OnZWHFbsnluwcyXG+RZvQ2o1OY8bOjsSxaXwvBtYmiyp1Th19A5ZuLozMyB0JWL8kIa2x/l/cHQzj1HMq1VR55TOLL0Nbj2flkaAzGGgLqJCQa3yxCO/ZubDv6fgobqwcPs0aPHWRkihyqGBaOXgx8+KB+XISzI88chpMg+rW8+qh88qss6f5hlVVY+Kh79x6CW/Gvh87cEDZer09aKjLnZGLarYx4CgskM+2dYS1jq1dSuzPDDWhHaa7fjJ6YhU3hvMwo5Hcs2ZEkhMKitEGpb3JUFB2H77qZjbQWv8Fk1SH7LKgtBhAUYRWeYze5QoVg1tdbi0IxK6m7XMgA5NX+kOC8ckX9ks8KwaUF+QPO0SKWuGzdgIgV+atyanodyrq7ntunmLrVyk7BNaqgVG/vR7GvhBnaiCQPid7ruQOzATL3Owhxk/0DJeGxM1nm5Ihx20TY+SrYApQ+gO6/cWr25uMyqVYfdWblTObE9fdk6nES3Zodce596LjnkmJhl89atJ1hyPR110KKJnDdPo4Q7/t0+5px2ECAnxN+0VrIYUw67Ll2mtrmWTG5mpJfaGGy9USPlSaKoi/qMrxveo7qU0+ag/fadTKOAs0BGeJgab0KZeyMxmVVODJoWkhlYCfng1NFL45ReYEFbyhOj1cM6j4HXMhq00dzqdpHLWlnNjvaR1kvK69qxI5WomjnNXp8JOcvlo6DVg0OWs2UdoXGMhpiU9zQRs4gUbnZn70nPj9JF4konHPLJ2gZlRXveDSWC1x544u9d9jK1nrPjHjFHch7ILIy4JINzuxmE9yRTxknjYbR8jGur569gBGXiQy+zTmArlEL62BrudGAX9f+GA//hgyeP6nDkP8ofVo/y+5WVvK4eVFWZ5Y8eF+WjR9VnQIL8QZY/fmIa+/LR4+pz/Yi/gCP8+8/7r7WV33DeNyUAgLDTblZ9FnaWcN4e6/sgwTakzNs8pDtGOqq6aZ9tzPz+PtEprIhqhIB+tbfXZZ2BAhCXVGY2dAnva0J+a0dGkOB0NDDgEnbMahN2bCM/GWrdrbYfLGXuwt9sdn2IAbbh9+2f+KzvQAXaZzzriY020dJL1kAuoznFlguxSahkl25JJBlMxEKj6bFZvKQWL62DC6y5TLFFzKwp8tol6pR2k3bB+oAbhUY7E0ooYXTKpK/29gY6z1VOt+ZsaoQKycxWybEluntJ8LbExbk/Zzqv71L94jyJXlIrQS8p++sFIsC0N4/mvbded2loW4Q6TtvLZij3ppSScdk1znFkxSKSrs/b14AORmmOW7Sp3O/0l1rn4oVnQ+PX7ZZQUYbWSorTstXHyLE02afbYrFhMWov4yIm4zKaJDgESsmRZy4lg6NailzmMhO9ZvIYofW4ayd7NLS5WEs+GWVJMttl7BhNZCX/3cjEM531Dm6cfZz1TBAXsfGsbGZYY2JzXcPybyI9hmCZm8AS+GCT2UNsHk3qkQFXGzZaVrNOSKBr1nQOaE7bejsZNetrd5AQl7ILuACkS+2SWM9KtYssgWdbrwuoGbXsjdQuRS2bGGcu3baqT/Jul/qBlgdQLMYTImqEZ3Xo1fgUpqOQuhkIgpY9BHbYWumf0UKMbGqR5SY27BBk5i1/ZRcrclvBJ60mFZOemiJG7TThNd/9N1QznjwoqxCE1NmTvH5yXzZUP3iSPy7z8uHjx4/r7OFnnEkfPnxQPK7K4mGdlXX48hkCSJ5VD/KHefa4qrO6rvOi+r3ilK+mQb8hTtlPIQbJwkozQbQRqjMjsX2MSzDWOJM4Hc7hzepYbT5nGgQj074ys5+wk2YU1housQjxj8UbFse0eQeDii999uKLn/enLWjkbBm2QOuyLtmA5qQe2y4Dqrrb1JGauDo6+cz31igFDk8gSYnVsFUgKaTvakvFffWkVlsn5WWn+PuYc/LrPra+Ul0fu1u0ARQlUJYVZiUZAXz+vxbOsZbE2+SS7RA/m6D6QLnzdi7JVDuuZAHQ0BJO9M2F2r9Z/rrlGQzuxV7160ZnNltv8bqPsqOhgXe0dARle0ZnXzXe3k/2gU7hFpVQLZH0+aM+3+OTQWbZzsWpWIg5616bBJg7NZE2JW7HAexJBZ+hvRUmFKmSBzZ9oN0Cc27a8+Wg1Y645pJSaT8HScUH1oF2em2vJsfGNajVjFtN7Bvxfgxnsq9nYSKk2rPmf3SLUVlOtR4Tuf1ubDpA/kjjPCGNJxsOb1hYG1XYkZXOXljM1q0uvGA30UQ80nEzl1ej6JPo6ckqa4pWF5kXscTRihR64ihtNHiX/Nfwhjrek6z7WGBQS0bKxknxt9Z5slD0vELSB9p7ROun1i2gvIgD2fvcpjEWdtzGMLVeY7HDMSvaW7pdRhF/51YqHldiXfeiBadCZMxviMVkWidoSD3H2TrJ72Xvw69YO2f/nFLz9E52IGgsPYvFXeIwyBph1GsnWruiEc3A+LWlNRnkG8bP+UlWDa0sMf9fe9fa3Lh1Jf8RC08+Psoi46J3ABZXohP6W8xsYUXGSSqWQwK/fu853X0AaTST2d2aTGpqPqhkayQ+QAC3b59+OFbFfQK821WYkkOgwWsgew2EDWEdStfn+TkL+bYPiH1Qd/FShZ3vp44dJOGK4UZkEG0cpm2JysYtMHjNew60TevfodDdsaXXCt50XHDdUyK/RjSN/R20Sp0NxVgf9/4+hrja9IWMibHoCMbGyj7xpGHwKD+fnN9+TVpMCnnWiKB5UfSDIhDU3pKTFcZvEKHT71j+c8Swj4Nf54l93SJXPtH3HMfrbYwIZa3qHTg9x912r+mg9Xqaxv2eJF/PqcliUUR6fsUxP/C846D0pZbo0OmaUFwAY5pvLmnH/pb73KNK359jaHlmpItr7VTkMLmH958qTcc1hefaK3Lsys874/fbvwNnmS1mdVbl+XxRz1d1/SoXtFitZulfF2VRV4u6rt6Xg2er2WqegH6ZL4plvpxXb8wos7SnSHuFfJV+fbn6bJzlt63AJ3gz+3dn6yu4VK13gyfIvk43u5eUpUH20qB66x7LJt89NmX73lbATmzzNKaLeNhmCagm2J9g/frOh29vPs/6w4/91cL+3pOsjKIzimTg8qsEGcjGQTOuKXMdlPiJZO9m/D2mlIXWw6EONB5OVVVjO+s+aC327iBRkK3d7yKBSo4lSq3VnQON0HQU2cfS/gBNyg70oPQfz2Nj8pZjrf2YEOSaki7DqGfDLqQ7dcNEglDIs3lrVUeckrfUaKw0q7YXzcutDFOvkMZtmqQDH49tmAGR0bFGyu6q5Q0p9yfI7vs7jjFPU1g7SoMhSa/CDccE0dYTI6djO6WwHphc2bDNlelfPqo9MR3n4Fua3SPStBpIx0nz2u97p9+NTRN9uBYHUaWe2OZOP6bnuLQW7rYt+4zY/O2dUGOnHGCWa7LqgPTnSBcPDZiOk7Zl1GNFeg+OlUO5G3V6
pVKWkD55weeJ7re+7eUOO44p8D3TrJ7g5gTlzg4yJnXjs/G2BTbDH4Y4/r6d3ghiDTumurpG7/zi/WBJj36/w8DWYrVj9KDG9XWB45dja6QMbnNI8r3JWInA/TZS/ifjb52zjwHxb2rE5WfC6yGaukGj06lH6bo7AdtIRPLtaD6ecyc8L2wkcLc+yD7gDb41En4btYJL1zeYG9Tl9T6KP9k1VDuUd33YBWmYE+n72J6L3jo7B1tSFYSNcti6E3xMaYvt6HMbTvDjKFf3xwiH5LhFQpMFOidjxEPI+2AJkXENeq8iWn63nVKY23BbHyKhdPtCZq8x+f5KiiKL79KePkZCc8ktwXMbqc5+XdAO4k0l0RbMFOhbaBylMxwuE4q/Yau5uZ99dJDhnuCQOZ1fP32Usu68r6X1dx35mRU3eJnuxmGAWHve34BOCRpFlAOIHLx/D8q6KIvZcpUgZFZU1SLL85dauvl8OVuU9SIr5lVCo6s3Yq9Wsyr943KVUGyxWq7e4qzz2WJVLpaLvMyqos4WnwmnfqOsP0VKlxDayfLo09e+by3743wpX+nvi/Q7V8/4MMnbcCitN+t9nNr9ZhnF6SrLLEfRMv8a9yebNn8zpLtP3iasann6jWXbW1PdfVXuHrsifWWWU51WY8tKtrbislmn5zxvyoS48ta0/+fDVzxeT2tl4XeUe91RgrLqx7auvWw+oHEjiWCjhBDezz1tcrSdKX3krG32MYvHxXh9mPz3rRlT5m8YWXvqQYWU5g3S533LjX5etf6Intih/QivM+7Hppd2zKe/y6ibdusbbV0ldMyRrOu4dode0knCB/tkQW/1TCoNGdsuZGyePj1aESHfe94FrjiROncq8crjSpsYk8CBH4dYk7DOKeXQZWDS/DPB3NPMG9BXA5MshujSNJqEVO8O+FB0aKnuSVI93rpCjJIBi1KaRrsq1kuXAogOKpymG2IEnjkWcIwWOAyJyDaGeCTVZ9jVaVemyaNbuBsbXPx5tcoR16LPVzQlpZ311I8/piUgpUWWNTUFkGIDxY/zun7xs6D3O312fmxbyD2Hsa/Zxh0/egONUnDRQOMpstgrPY3PrQRW4UOOSjomwmJvg72ZywlGzf9JaeGizZC+68dL6bouMai0n2wmtDnT7DGK6u/Y771VostkL+MpqjXO8U3G/aA3OzSQ51V+XP0z60DXPTKlGxKbPkYp9xr1gwLF63qBk4EVSfNN7IPXEffZfurANHC7jk0aYxJW7wHXsShhi7b0FOwhmqnFdtKAQanLjekXovNAFT5NEybQUPCKyjN8HsnhR0iCsrCjkq7Haz/EvZHHuI9jrRap8F4k/PiXL04TFlU+yxerLAGwejGvisXilZehmmV5vVhV86osltXyrT7T2bKuq2yZF6sq/eobMaTLxaxeZVmW/mOVEN1HYvz+X/jrEwIPv+GvbZ0wR8I+lrtmVoWuaM7N1bDVNDaiSTipGU5V+rfMuoO8/9Tx0AsPZP/OeMBzukv471yG1pkQN+d/4HnuPvLYXy3WGtyinnHPOmkedQ4m7tdmydYIYvQsbrQfvVH2x+bPGMNcyc/I0j+oqQKJbN5+J1l4NBeRXxjHjcB1PXFdNP61Y7+6N6+iycASuxzXSZZZcVR+Ixc1jA1qB8oBN5JA+FidO1I0LuB1DVr3HJs9KOHb1ySOpE5KmhvEPdDXWAA3KonOjvcmgxyDiXQDpAtHl8ZFZIZepydf73gP30lisb74CD89Rx081v0dU8E6cQdKhGeDj63NSt0KO4H7IuFh7bRGjM24o6WeYySt7Rfu1KOxCZL4nvEj5o98RKtA+kyFVUtwW8YnxnhVko2Bo8ZsB6/ejWlUhfN48s0O7HMfzyt5Y8nBjHJGvs8bsXgm+wh4SsNSLhFwvB1SwgkvpjE5+SRfxxFP4vy1JBbiaWkV+RFjzAccz+PQiAkpgnvB/mUIHIljWHI8XhDjPFv7zdh7fkJb4YO46K14xqxFe6LLFBQVEYlosGXkzudgnzNIRMh2VTvnPWphwk0iamXNdEbwgHXL68vPU/Bd/STmQXjCz8UdbUnRDLJGcxj4b0qIrP0TvPr1w42OjSUdDna9H/us4r0i42c6KAmPnLrdf3iXb5hsSPlzj9fqjbh/CMvOIG5wHLsedC5FkqBwMi03gcfci419CPcjzi/iWoCE6fpFc7eKajkrltmqKvNlmRV5+TLhFoPWLF/U5TwBJOWmT2WXy9m8zudFuVws5/Vy8YYZdLWYZfUqX6XfSY/0+eri8+wbhPqEUevtnSVjne2COdWIse3KVxDK3KCWsFU0NlgcDlljI9T3KKzGXB25//1gqVvuMi3pAfjA89x98LG/2lHr4C6N4eSFxu09KRBRKloKVUwdTo3tOHZAcKWPU+nEKF64DM5wKzhN8SB1+kbBoooxyETBaDk0GMMtLUq/sH2sqeRCcdeklBhOzwbJVR40e+rGYpAI16XiJpK8BhXrMiCTLrlNN97+D93EWk8VoC1/W5QWrVGOODpR/PjU4/sRbOJSijQlp5QYFCoqBElPo8pSt39b0qoGdIYnXtk4uUFhS8lyKXdrelEzHruIiIwnqu2YaAb6yVVz1wndFxBYVE/AKi8/RCAlHZSM+Lh0Ufi1pgsFMArUoVMaKDCXsuk4INoDUGMjd0ofoymlVjlVdBJUorrQk6xUzIigSoWE9rGUjaGnT1SfaSROiQDVS3KCAkYznasNWusStBYpDlfANRytmmp4HL8fOpYIGZXBkZ5Tpj5Ko5PkxkLRcB1zzBvUaIvz3c7HiteWVL6TkmpXtY7hxaEAPnEEJwqR6Wr3dPWilKfiFkdl5v6+WsIKh4DjuTfSrE6d+efAayOukT7UvKEe9a1LQfiQMRxZ8LofP+O4bu0csHPYY0cAf1Aa6FsDh6unEhRpF1Qq3tMPHJcbpQh1afsCAk+UbqSu9RpbbZ/w2TMOhJ+hHU9ARi/iZPk4FMjY2vmYunmSerzrp9EoVNNV2pL58xFioyxng2JyuIYAmy2b4ImO9PNRqkovheU5iK3fH748rfXxseI/t+z+87HiMptVy2WeZ4u6KparzwbJym+Q7NMg2aFqH21SeLGGQbUATKeK9bvzqdpZ69f5dGvT/qq1RrDXAR2mfEsLQrPe9gkE9N5GwHbKd8ZwPV6KZn1KPzvmaTHuAdPeeO71h5/vK4Zprj04IuN7CHVXjl4670xzE0yY/R41lfLuM/aPHtkhh5DMUFU8wTSzg6JpcJYD4nioXswMMnaLyoAy6dB2BVXB1yOzpu3AK040XoiByZCVDCYoZfaM4MpzxyVzzynYhoYMh2g1E7gAt9yE46El5SQfHGEceK48XmeopgTtop8032oai+W+b8H43aTq0vLA0MUsRPUeCImOvzZMsBtNdgPi0vKIHH0EgFyp5jJYAcMxAyliWSdbNC5jEYYxCKK/i0kfINnuXgzlZaJQ2yjQA719ZmD1cyCWarejjYzavoMx5BApZy1TuPyzQHdswWU9jAZ6HkLFK81DmU938XOqkhRmwc6H9H2H37+
CXfNAX5qS0FveiAFLv78LNuyOjJGnp+mY5+zX9J9NJrKlT6QGTAk9TOYJXa40oaAP4Cy120Vd8DQ6q/+XzLEZtGggaM6HMZeeaW+7tdR3zoqNXSIwFcmY0odib30XE8mxO+LI7/iZP+7DCDvBZEbAyVWM82QKmI2G8BOCTO/Rpd2iU3IIqMZgGU4YJ2lvMcFHPyAmglSp6XU26Kv1Xmx2vq7Rg+KpaZx6gj1DUMnOp7Vh5jfDCTsDNtzmIVk+1BGPVEXe6x6i6e+xi2ubMM63SPfxPT0uFQA2bURnaAG1YwQpZzx3e6bZudnO+1oiXTAYc70v/D4mt5jQP2mbs9U9pBDkbqbsnCag/Z16Mnsw7ToHTtH/8WFmcX/10FybFqO3I+d1WPKe1nNLQua560Kx/Bj9jaXOQdxvpMp4rZj7N9Kv/a+g5KqazVf5clVnWZ1V888GJb8FxH6Kp9qGmlnae+Y7y7634llbIF5AyYMxc4WFtSZQkydI+GbJVQJZJm7LXGT2mBYDllzt7i0nLi38Vkia9sDpZlY1VqT8uP0tgZs83ahry4pJF1flsZ6Pm98siz/dDMt08qfXdUjw9OvOfjH26pBR7D3xwUHgTRhUqd6dwtxejEkMb6I+fH99CU+2V+VfebULSz443EDuxVr1SDI+KGh0g4HqOrLdroSQXgM0impcvIasNfcjXmpCt8HhGZgx+H1phnix8x9OFArtVQOt5YLQqYkdPdhDF+YQQvt7J7u2HeiJyzWIAUz2Gq6ok3dI8YTMEh4PZxwo+p5m5FFmvAkGsb0XFJRwfE8f+ImD5qiDAqP4wPw7HLNSeXTtOXzvEtn3hJwIVpXQ35cGE61vqq3EhPBNPyPgVkaEI1jZh/Ay1mSbJvlr5h09EGZHBTcGrSEoU7XxRhX0yFjTUNih/57ivy0zgRpW3hBG+mfWuSe1xaBs0PmnPBuez4OWOiRCdfhsCddj+ItgYQ3LbuP5fSfmayDcBPRxOA12DNky8p42ysDrAQlhInkXPQWbYfTeuwhSGTMMBW4UNCxoO4yD0YvYTIbKju+BfQNiCHt65nP0PDRj18SwZ424/MXp8x0w+DYzzY6CS2xvOg7vN/y+pZff85Vk8Kkc1qr/op92QETeQLxWwvSrvOPwFG9VWUZ42Y3TB7DGGa+50SzyMLmOet2XnHFTNfbzmJW4IbzSfQesMD2/qg9Px6ljNX3Dz2v0+0Y3A6G+bwdl0vDX2Om93rbItawwLYARgz0R/W4MAH4OU4fnZmFS0kTWlUNHDmUdVucIUvb7LbIilAmJ/ANmAx6V4ZnpXOC2HflEZwlmDy+qwsAWT8R2AV/xM17zfs/UVlQ5nvh8KFI0g5a9F79Hu4c7ts075WA5fKaJ5V6QfNySqAtG+Ue6p9Lb/8xAccB0r4fbogflHBkSVw3u6Sl/hqhEVXQwu/D+A2EATVdkrm+ss7txC0o//rFjltI47eoJ4fl5jkKMbY2tDTIG7D4QkP1euZa+nYv7Mu/DHUUcLuBIx22SQeVCnQzXr9c33iZC1hzHkplRT3c8V064HiDcqSDi8VyIyoUVPfK/xjXpwpwB2wrv2Q9iWyWv24sKxBddILY2MJx6nBbula02Vq4xryGEzJjmZMz4UJap8t+s1u/LM9/VbFkWy9WynKfvi+WrRos6ny3n8/Qfy3k5L5dvGGrqWV0tFnlZZ8usqudvhFWuytmiyqss/UJu3pzPJej8Rn1/iqGmTPuVsjWFwOMp7S+6tN843t6nvruqteTkc8Lj522W0JR3hb1WIyTU2Sc0M6Q9Sdr7HK7tuSP1vbkZxW3P06zTpj7d0R3J9VXtBh37t/UhPcfFpFWWAWV5UZZHPRiVnlDONe1tvuI9S2MGzPWelA0oizHLBpWYDU3OnBqXxBgwSER5gqi1cUpshg5iK96HGz0GqdFtznVhUoAhsdVYnYjc3TB2DPi6xN7ITK87YVEpLM40LpwPnTq9ZPB2PL0GBgItmXa3WJftvdc0SA4xHR8FfSrJGAJDy4CkfQKmusRpR/UweU4g79GBoSi8L5VR7Jiun5goBhicwoSD0pGrxgHKXxF2NWxhZpWfOQmnaehKDHPj6MGmvi9ob5Y8VJwG+1pBg73vW/D52joFLNH6fsCrWwtmag4+bf2+GZRpRWo5MqDwGI06oTLl1rQ0iiOj+yL6d8y7ljBW+6VBx0AjBjenGo7W+u10ok95eyhQ2r/QDNFjb4IMn64bK1PZEwcqOeM5yT2IjOlb4s5mei4UYfJZH0RhEhMdgJWigAPiz5bqjyP2iuU4zY8eOx6/7W1yHWF/r6rgadkEVA3KH6PZjQoE4GqnXbFX1mjCrz+Zxye5TEeONTbsdPOc8dzVS/3ds8pTWJmtz+AKHO+4+uZdg2uOYijmxjXy4xzv5/lZtL3T0MqmhVrFjeFj8IHUL2awO0KxhBEYK7eR2wbFxfYjQtFN5obmYTQ8EXMPPB/tHObI7MT7RVzT/Ix5D/Fr98cvT+EuZnmCMMvVfLXI0wq7yF86kP9Zb+pyNVvVizLLLElnVRVv+F/+Rfxu/Q0ufVq0d1NaPUcC9xZrWTa2yXiv/+tiqS0JCm3L3bq7poXxfbh0TnAp3cjThjLBHqeLB7gMuvTzzuHYLi2w7nMeTq68f/O51x9+vq9XKdC4gOoATwAoE/lcct7+npFNwT4wUBSDL0X3EDn9rCkXop+1bKEr6kECwFPHugouedjm+cRuTeEh9PmINbTtst+GOwmeQHWQQjUoxuWkkHeFE/cbpLku0Cuk0SdlXFIrX2A73EnQVWGrapBhkwn6eHwaJuQRMQkB6B0n3hv36ABe2Rl0GFDXyLhKLTOPEtwBju60tT1PvAv476DAIitI2SEPvE0rEjpBLdagTDrUMN3j8klBqZZ5o699EpsDDgXtGuLaFv4JKRJMtFaBivTHYO2EUwB9gxju6EDbRfYIA5pBS0+oFVTB4ri1iKsExOg97tEhjHdiofKXNa6oEkGeEqhy1PE2qLFEpxd62QQvckyqt6Qajq44gOfhxZQd0Hr0ivZR5WGfBUcKu/sxzYP+EtH9rAI5dhPfvp3PVLS45+VK6howybxNiEbsR9jvPqgC33+CSBFbBvrrn9WZXm9VUUxIhNce8apDi7ykYqSIqOQIb68i5qfxfZcpFZq1D9rWCMpvtZXpBaHhc/KY2x6Vur5tujG+vI+xwiQCEPGXo5CTFBfuAU+qEWp4j6Fo3Kugr4z/3OsariPu9jEgmXr2ekTUb/UZWR3sr83YNwgPOegudQC6b3qsYDko20pbByo9/g1SWvLFzEICF1WR8E1elMXLMMG6Xs2q2v5hYf8oymnqcSlnq3yRFau8WFbZvKrfYJX+RY1ny28o6RP0lLkRRtaHmO4/hQ2krfjsVbB4WqerhFQu13Sel21ay3dmSXnPJez0wG+7x0ttXwnlFOkaZoGaXfdGFG1urY1Kz4airOTMK5wdRTXrTXruU0JMl6v9e0Jq19MvPw4//eGH4qfD74bT9z+e//S7H9IW5c//SJijMvLo94eEjO6/e/
7p9z9mgT96bme+P3XEQL/8XP7wnP6GhNItPYbjHZE/v7bnfeCj/3j4bjimww6SSaRRukeM+Ohv6bX98sff335Nx+asv0s/G9Lz/eOPxWG1/aX+x8+/HJ4+4MEz80QBjyq2SeM9UVtNVJTQHwylGu+pip1GVDNrJ+7fzkPYvaCpX+CyzGj3dJwnuCxG6aMZRhl+UX+yUWYCxhvM8sIW727MheHWWcYW0i05afsSGXfbmhS+PJ3PqgHBmALUy0e8jKWPugcdx82VWS1QTYkCQoxuzjFQIQUhK1Z638bSM/2R50Jd99C4+J71FXqP122oqo7yNj6PVMwmcseIf0EJPb3KyhAN51l+7heXx7pE9O0dttrw91akiyxe+NdWmZFS3MEk0HO0/KKHc8dyC+C1zjBl7f7n9V7b9wkuuoxVO65cPSiz0cbIMPyg57SE45/UVY9jE/Hx/tgXUjxHmplO7Ps8dJMY6n6M+HXsN+DvmNeSsJDn3Ul1eMYI3Mbdx+HAbErLXOukJCtEVzITTspTYehRCQkaR9IF/5s3aJ5Q5u2U53Lek+posq1izBE9z5GUn+PAliHBOBJX3XWssZtU+YEqpHyiZoZc7SN+j3vfcAxm3w/TWiLvGo46vkdh5IjUh598Ukdn+FsZK04vOU7aFHE9no8Rx2yfAWvnb9y32XlAUw1y8/Q4xEnyBT9jlG5eZypy1aELqrhT/mH05NJn3qj32LHZkce/GUhtI7PxHDWI2aSHNm9HSoyYcgusx15oV356tqp7sAtk9NmezXIU9teJpz7fvoz57rk3RXUNMplyYtUcOal7Haue6mNSvtP4alDlOpaUZTDDUXu0bYcqv71/Jy1LBfDWaOrJiNav6T7uReeT6DqqMffwuPt31RhM84cwZojxKSUKuLd23bRAZ5q3pPND+QZUwlZYHw4df8Y92gSvR+UAaF543yNz8taooAfXFkbH9MxTjpRPqpFKVV8p74FrTkbzYd9gzJ7hvNjb/bHYymAY1+So6FWmaYvPVddewTVreOXvT3/ffiTXMZ1lRvBnnoB1zyuU/gLuarwYseEupxGr0d+BTYAtD0em/OLD52q2KKvFqszLvMrnc7UDyQo/r2fL1WKeLxPIL5dv9CLn2TL9Rj1frNKeYpmV8/d3CXmZzYosL+t6XuarcvXZ0hzzb7Hjn2q8qr3d8Gzl7JfBFKyvYsdzIxbb86FMXxYjXrUWKvxaLdtbU2ICDoMpbhOIO3dXBwNuyNqX9ppsut3aZuHsDtK3n3v9wef7ejcO5q5OAEEOUBIPQ4A0BzpHzZlk9ZhI/hvp1DBnnjrWObf2z4JWEs75snFRcWBaxawRoJMkKDtZpg7mPoiifOoadjB3r+c4ijyJTlwQlq6NGxwAmZXsyW/mmFOTaPM0gPVxSrzeZEmBG9YXoJxE2AANGDpubQPQYm4LEEwL2U6aN3VMr9E/DGtON4Yxn6WH26r7EV28Axc9zbA1x8dmofS+pdggoKPoSJ1g+4owlBt63OA11ADuMxKVGbpOQNYxGHIgid6T/OsJFklMy0Li76vATNBtTBNNs2kx4XRuZQ2SJuGsOfuGYJQz28cg7Djnl31oW6krkoAyE7BuXQ3vYfFILlhfOAO+g4443QN2INsR5m7HE/049r4YfrhhGCFmjSR0XePWDmOXJUm9mkHMPG+2JF39/Mm3k3AtaRhJDI8hNmdaf7gho458OpQog5wEgdpNeqcjqB7XbjMS7IN6Sf28vXGz7Z1RChhvxh6TnpryZwVevYuOIicrobXEuX7j8OFGp7dv3neCISA4rwzaSoD1u1+dEI3Nj+vrKhEFJN7Hzi3X6t1dFeQTXSxPd2PXTfRr2nv7T3YEYZ6s5zlSc+4BRbwOQLy7nrBW6sK70N56AHjNa48buG4SiulueJ6HHpZVjjp6aIox84eG1TeyvIe27DmjruMZHdi+YSTpv81IetusX5viQZ/l7kVw1lQzzP63NTrDfQMLzQsJlz/B1gfNhoV14dgNes9NxvfMEC/pbPS5X3C/Bll98w46pXHgc7o1XzCMUrb8YjmrsqIqF/WqWJbF+8P2T0n7ns/q9MfLfLnM5otF9bnEifm3cfunqBPrd+ejuZ2GtG4mXNPUTbquXsdN7h6q9O9byzMqPKp72A9pvXxNJPcJP7ly0cbq7dmI5842e8U7KBqr1olqUx0efTOeHvdlRhIIart//JZ+1rcWSXne5BZFaU3cXzFGvLrlGH1mtXwy6T4Ufp6d45j97bWF1NNA1APWRydWhACCIGhqwy/AYe5N8lBtw6M7hns3KJQZGLjL4OJNRb1SJgKAIXPqfXMSaJqQIiJ1TGU6iuQcwyzDzh7lLhUafLE2yz5MAkwB2s/RDz7soeNf495pxCSTi2xLnkkbhsfZl/E4KODxkGpgjw6Y5J5B0ST2x3VAXdTAirHmn2OImjf3MVwlZrzjesH+RxL+8NvcuUZu53bvk+OnCCq2YGckEMl3diO5mgkXtj09SfBgcFjcSSQAko+B2IjN94Grnx/hN8HaFt47rI8HrDvQhTFxpsOehFhgh9SoG8KRHZvWJNQGL/xAj3Wu8263DkGAd2uym3Bg+tKVogrDNiyA2QizGwGOhKG1hr+nIHlHUcN+1A3ic80USRG6XXjqVAQkj1vRqtjlscGg9+m7c6vzMnw/m0ma2KSHLkpktjGwcfKXQ54GQwaGtG+YesRu9TUDSUU+gdDvSOq+6GDlUGkcDEG8UVCIcgU5b499kMclV0B3o0iL8DZOAzEN3474Gp4NDl3wWiqS5SXJsyvfR8n9WKlyJ3xmrlntWQZAv9OGw6aIg7DnKaG1bJjupeG+9pLYU4lGa5AYxedqOgaT6lj4vpMhqRm7UklcR/C/unSvwl38jJBChRo1fqGwqUF6V88hnBGPSODqg2CfhqVX4+AgwvB5Te557aMbHtpslPu08hAiuPym+IqjSNYnkphPkz3+WrrlTafn3SntbJD/1QjihpEK21yaWQ7qCvkIJ0IkxMD047Dk3RgKf2tV9oP0KB/E0ZOrztIe9ybeW0HUXnF9bgolAe6CL5jGQVDQ4SKOY8d+XRYNnDoGtPqeQn2o/pkplsLP1//88sKLDxCqU0T8prb0BSLO69kqAeZFnkBxviw+H2P6TVrxiYxplxBxl1m0VLp682bYvEbE6Y5Q1WnnX6Yr3KKjEtpt0or12q9zSghya7vChOwOCTlvy52P5quEhC/pce15Trede3nsjvmh57774PN9tQLUzIdT581tInQY42VQ8gBeb01hqnwp0YW9kUiSUTbHEDr4EAq4GWHtDwyOBp7NuVaTq9wSi+7lK79R2DaEiHSgrwL36dH3P3JRPQauG/rutz1w/JZxK/SVRqyUPM7bnvdzDbhvI1/8mpv1gXMHgQBwFLF7xXJBDSvtvi48PvDLsRnXTUQlMX/BO+cfAy9oKKo1rwxOY7jo/wsdc+NrHPPeM3DbfeAbvn/sX4KjZoj1KLDoegpqahyjvQphVKQyimSRPMl8gLHsr31SfoR7GqqJeG8idDWMc6DfF
a/TPf/nk7BkAR/GHvzs+lKwCAmCCnA1EEk+cb3SHsR7yoOPY8j5hhwvBKLAm5uOohtxb8LnSvV8ZpIjueYoHohiP8Pw0/OuRanhs4of/dj6QHrj782LhQbx+eBi7XGDw5O4AwNZ52DhWw6ftw9Rcfwd62Tggn1EmumzoXh4YGmSd4tTJH0TJlRfeLrWlVbK3JApH7eNZFmJHOiXikRU7Fu3EltApLTuGFfE9NenO6Wa3phL8TyGsQPDNT2EWyzY7MYyH55nEGUwEfXE0h/uaR+ZFIxzys6n3Llp+N6i/JJ7uhsEOC6WtX1J5qJgiFUpeNijyMfP46ZrIxg/ijmruC8iBTkH/zzJOVEcHO9B2OuGMDn6ylVMAQ7WhRdZ4GYXYjcQr+Ma9nvLiB/9swB3MRBfhqdpW3LfVSk6r41MBI+w69ozi1Pt+b0zfYMciTWxv8QVZ+ylIfA+Or621+txdB7JZhwx+XucZyi9ZOIsC7p6je93Y1JrH946m+usL/LXq9yLGRksm7QviGk4p+I9/J6cv5//f4ag3Y/f0T5H7mVf9prjnOG5Fn5LzQY2xaQ0oKOZIf3Odx8rpDx7wgATerY8U6KiC7toZjpPphqUWJ8KhYvtojpqMw2o6zGtgvxQq20kqjCQDA59VB0bE7N7CFYDkpunYAa8gi0mbsOYk71jwkvLiQmnLNoV2/NIvqVqOLtTcseyRcgkrsQadwTYEHilhZwPu8I9WBo6VzUR2ZG98V26/ezLpwjk2azKq9LU2EWdzeev8nOrtKWoIL8YldpTT1w5s79dlItqlc3L+o3ueOunz8sySxuY5aLOlv+HVqj0v3//61+fJ//2/d//+Lf/bv76p/+y3/gf</diagram></mxfile>
2112.03258/main_diagram/main_diagram.pdf
ADDED
Binary file (41 kB).
2112.03258/paper_text/intro_method.md
ADDED
@@ -0,0 +1,113 @@
# Introduction

Humans have an outstanding ability to easily communicate and express abstract ideas and emotions through sketch drawings. Generally, a sketch comprises several strokes, where each stroke can be considered as a group of points. In automatic sketch image generation, the objective is to generate recognizable sketches that are closely related to real-world visual concepts. Here, the focus is to learn more canonical and mundane interpretations of everyday objects.

Different from the standard sketch generation problem discussed above, *creative* sketch generation [@ge2020creative] involves drawing more imaginative and previously unseen depictions of everyday visual concepts (see Fig. [\[fig:intro_big\]](#fig:intro_big){reference-type="ref" reference="fig:intro_big"}(a)). In this problem, creative sketches are generated according to externally provided random input strokes. An example of creative sketch generation is the doodling activity, where diverse, yet recognizable sketch images are generated through unseen compositions of everyday visual concepts. Automatic generation of creative sketches can largely assist the human creative process, *e.g.*, inspiring further ideas by providing a possible interpretation of the user's initial sketches. However, such a creative task is more challenging than mimicking real-world scenes in sketch images. This work investigates the problem of creative sketch generation.

<figure id="fig:ganvsformer" data-latex-placement="t!">
<div class="center">
<img src="figures/intro2.jpg" />
</div>
<figcaption>A visual comparison of creative sketch images generated by DoodlerGAN <span class="citation" data-cites="ge2020creative"></span> (top row) and the proposed DoodleFormer (bottom row) for the same initial random input strokes. We show examples from both the Creative Birds (a) and Creative Creatures (b) datasets. DoodlerGAN suffers from topological artefacts (<em>e.g</em>., more than one head-like region in the third bird sketch from the left) and disconnected body parts (<em>e.g</em>., the fifth sketch from the left in creatures). Further, the creative sketches generated by DoodlerGAN have less diversity in terms of size, appearance and posture. The proposed DoodleFormer alleviates the issues of topological artefacts and disconnected body parts, generating creative sketches that are more realistic and diverse.</figcaption>
</figure>

Recently, Ge *et al*. [@ge2020creative] addressed the creative sketch image generation problem by proposing a part-based Generative Adversarial Network called DoodlerGAN. It utilizes a part-specific generator to produce each body part of the sketch. The generated body parts are then sequentially integrated with the externally provided random input to obtain the final sketch image. Although DoodlerGAN utilizes a part-specific generator for creating each body part of the sketch, it does not comprise an explicit mechanism to ensure that each body part is placed appropriately with respect to the rest of the parts. This leads to topological artifacts and connectivity issues (see Fig. [1](#fig:ganvsformer){reference-type="ref" reference="fig:ganvsformer"}). Further, DoodlerGAN struggles to generate diverse sketch images, which is an especially desired property in creative sketch generation.

In this work, we argue that the aforementioned problems of topological artefacts, connectivity and limited diversity can be alleviated by imitating the natural *coarse-to-fine* creative sketch drawing process, where the artist first draws the holistic coarse structure of the sketch and then fills in the fine details to generate the final sketch. Drawing the holistic coarse structure of the sketch first helps to appropriately decide the location and the size of each sketch body part to be drawn. To imitate such a coarse-to-fine creative sketch generation process, we look into a two-stage framework where the global as well as local structural relations among different body parts are first captured at a coarse level, followed by obtaining the fine-level sketch. The coarse-to-fine framework is expected to further improve the diversity of the creative sketch images by explicitly modeling the variations in the location and size of each sketch body part to be drawn.

We propose a novel two-stage encoder-decoder framework, DoodleFormer, for creative sketch generation. DoodleFormer decomposes the creative sketch generation problem into the construction of a holistic coarse sketch composition followed by injecting fine details to generate the final sketch image. To generate realistic sketch images, we introduce graph-aware transformer (GAT) encoders that effectively encode the local structural relations between different sketch body parts by integrating a static adjacency-based graph into the dynamic self-attention block. We further introduce a probabilistic coarse sketch decoder that utilizes Gaussian mixture models (GMMs) to obtain diverse locations of each body part, thereby improving the diversity of the output sketches (see Fig. [1](#fig:ganvsformer){reference-type="ref" reference="fig:ganvsformer"}).

We evaluate the proposed DoodleFormer by conducting extensive qualitative, quantitative and human-based evaluations on the recently introduced Creative Birds and Creative Creatures datasets. Our DoodleFormer performs favorably against DoodlerGAN on all three evaluations. For instance, in the human-based evaluation, DoodleFormer sketches were judged to be drawn by a human 86% of the time, to have better stroke integration 85% of the time, and to be more creative 82% of the time, compared with DoodlerGAN. Further, DoodleFormer outperforms DoodlerGAN with absolute gains of 25 and 23 in terms of Fréchet inception distance (FID) on Creative Creatures and Creative Birds, respectively. In addition to sketch generation based on externally provided random initial strokes, we validate the effectiveness of DoodleFormer for generating creative sketches based on text inputs and on incomplete sketch images provided by the user, as well as for generating complete house layouts given coarse-level bubble diagrams. DoodleFormer achieves impressive performance for text-to-sketch generation, sketch completion (see Fig. [\[fig:intro_big\]](#fig:intro_big){reference-type="ref" reference="fig:intro_big"} (b) and (c)) as well as house layout generation (see Fig. [5](#fig:hsgan){reference-type="ref" reference="fig:hsgan"}).

# Method

**Motivation:** To motivate our framework, we first distinguish two desirable properties to be considered when designing an approach for creative sketch generation.

***Holistic Sketch Part Composition:*** As discussed earlier, DoodlerGAN employs a part-specific generator to produce each body part of the sketch. However, it does not utilize any explicit mechanism to ensure that the generated part is placed in an appropriate location relative to the other parts, thereby suffering from topological artifacts and connectivity issues (see Fig. [1](#fig:ganvsformer){reference-type="ref" reference="fig:ganvsformer"}). Here, we argue that explicitly capturing the holistic arrangement of the sketch parts is desired to generate realistic sketch images that avoid topological artifacts and connectivity issues.\
***Fine-level Diverse Sketch Generation:*** Creative sketches exhibit a large diversity in appearance, posing a major challenge when generating diverse, yet realistic fine-detailed sketch images. The existing DoodlerGAN struggles to generate diverse sketch images since it typically ignores the noise input in the sketch generation process [@ramasinghe2021train]. Although DoodlerGAN attempts to partially address this issue by introducing heuristics in the form of randomly translating the input partial sketch, the diversity of the generated sketch images is still far from satisfactory (see Fig. [1](#fig:ganvsformer){reference-type="ref" reference="fig:ganvsformer"}). Instead, we argue that explicit probabilistic modeling within the framework is expected to further improve the diversity of the generated sketch images.\
**Overall Framework:** The proposed two-stage DoodleFormer framework combines the two aforementioned desired properties by decomposing the creative sketch generation problem into first capturing the holistic coarse sketch composition and then injecting fine details to generate the final sketch. The overall architecture of the proposed two-stage DoodleFormer is shown in Fig. [2](#fig:method){reference-type="ref" reference="fig:method"}. DoodleFormer comprises two stages: *Part Locator* (PL-Net) and *Part Sketcher* (PS-Net). The first stage, Part Locator (PL-Net), learns to explicitly capture the holistic arrangement of the sketch parts conditioned on the externally provided random initial stroke points $\mathcal{C}$ represented in a vector form. PL-Net comprises graph-aware transformer (GAT) block-based encoders to capture the structural relationship between different regions within a sketch. To the best of our knowledge, we are the first to introduce a GAT block-based transformer encoder for the problem of creative sketch image generation. Instead of directly predicting the box parameters as deterministic points from the transformer decoder, we further introduce probabilistic coarse sketch decoders that utilize GMM modelling for box prediction. This enables our DoodleFormer to achieve diverse, yet plausible coarse structures (bounding boxes) for sketch generation. The second stage, Part Sketcher (PS-Net), creates the final sketch image with appropriate line segments based on the coarse structure obtained from PL-Net. PS-Net also comprises GAT block-based encoders, as in PL-Net, along with a convolutional encoder-decoder network to generate the final rasterized sketch image.

Our carefully designed two-stage DoodleFormer architecture possesses both desired properties (holistic sketch part composition as well as fine-level diverse sketch generation) and creates diverse, yet realistic sketch images in a coarse-to-fine manner (see Fig. [3](#fig:onecol){reference-type="ref" reference="fig:onecol"}). Next, we describe PL-Net (Sec. [3.1](#Sec:PL-Net){reference-type="ref" reference="Sec:PL-Net"}) and PS-Net (Sec. [3.2](#Sec:PS-Net){reference-type="ref" reference="Sec:PS-Net"}) in detail.
<figure id="fig:method" data-latex-placement="t!">
<div class="center">
<embed src="figures/mainfig.pdf" style="width:100.0%" />
</div>
<figcaption>The proposed DoodleFormer comprises two stages: Part Locator (PL-Net) and Part Sketcher (PS-Net). (a) The first stage, PL-Net, takes the initial stroke points <span class="math inline">𝒞</span> as the conditional input and learns to return the bounding boxes corresponding to each body part (the coarse structure of the sketch) to be drawn. PL-Net contains two graph-aware transformer (GAT) encoders (<span class="math inline"><em>E</em><sub><em>b</em></sub></span>, <span class="math inline"><em>E</em><sub><em>c</em></sub></span>) and a probabilistic coarse sketch decoder utilizing GMM modelling for the coarse box prediction. Within the decoder, the bounding box parameters are predicted by the location-predictor (<span class="math inline">ℋ<sub><em>x</em><em>y</em></sub></span>) and size-predictor (<span class="math inline">ℋ<sub><em>w</em><em>h</em></sub></span>) modules. (b) The second stage, PS-Net, then takes the predicted box locations along with <span class="math inline">𝒞</span> as inputs and generates the final sketch image <span class="math inline">$\bar{\bm{I}}_{im}$</span>. Following the design of <span class="math inline"><em>E</em><sub><em>b</em></sub></span> and <span class="math inline"><em>E</em><sub><em>c</em></sub></span>, PS-Net also comprises GAT block-based encoders (<span class="math inline"><em>Ē</em><sub><em>b</em></sub></span>, <span class="math inline"><em>Ē</em><sub><em>c</em></sub></span>). Further, PS-Net contains a convolutional encoder-decoder network (<span class="math inline">ℛ<sub><em>E</em></sub></span>, <span class="math inline">ℛ<sub><em>D</em></sub></span>) and a mask regressor to generate the rasterized high-quality sketch image <span class="math inline">$\bar{\bm{I}}_{im}$</span>.</figcaption>
</figure>

As discussed above, PL-Net takes the initial stroke points $\mathcal{C}$ as the conditional input, and learns to return a coarse structure capturing the holistic part composition of the desired sketch. The *encoders* in PL-Net contain graph-aware transformer (GAT) blocks to encode the structural relationship between different parts (holistic sketch part composition), leading to realistic sketch image generation. The *decoder* in PL-Net utilizes GMM modeling for box prediction, enabling the generation of diverse sketch images.

<figure id="fig:onecol" data-latex-placement="t!">
<div class="center">
<img src="figures/ablation_visual.jpg" style="width:100.0%" />
</div>
<figcaption>A visual comparison in terms of progressively integrating one contribution at a time, from top to bottom, for common initial strokes. Compared to the single-stage baseline (first row), the two-stage framework (without the GAT block and probabilistic modeling in the decoder) generates the sketch in a coarse-to-fine manner. As a result, the two-stage framework (second row) produces a more complete sketch where each body part is placed at an appropriate location relative to the other parts. The introduction of the GAT block (third row) in the encoders of the two-stage framework improves the realism of the generated sketches by capturing the structural relationship between different parts (<em>e.g</em>., the tenth image from the left, where there is a discontinuity between the beak and the head of the bird). Further, the introduction of probabilistic modelling in the decoder of the two-stage framework (last row) improves the diversity (<em>e.g</em>., appearance, size, orientation and posture) of the generated sketch images. Our final two-stage framework (last row) produces realistic and diverse sketch images.</figcaption>
</figure>

PL-Net consists of two graph-aware transformer (GAT) block-based encoders $E_b$ and $E_c$, which are used to obtain contextualized representations of the coarse (holistic) structure $\mathcal{B}$ and the conditional input $\mathcal{C}$, respectively.

:::: wrapfigure
r0.5

::: center
{width="50%"}
:::
::::

To encode the identity $t$ of each body part present in a sketch, we define $\bm{v}_{t} \in \mathbb{R}^{d}$ as a learned part embedding. We concatenate $\bm{v}_{t}$ with a feature representation obtained from $\bm{b}_{t} \in \mathcal{B}$ (the box location and size information ($x_t, y_t, w_t, h_t$) of each body part). This concatenated feature is then used as an input to the encoder $E_b$. The conditional input strokes $\mathcal{C}$ are passed through a linear layer before being input to the encoder $E_c$. We add special $cls$ tokens [@devlin2018bert] at the beginning of the input sequences to the encoders ($E_b$ and $E_c$). The output of this token is considered as the contextualized representation of the whole sequence. Further, we add fixed positional encodings to the input of each attention layer to retain information regarding the sequence order.
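For illustration, this input construction can be sketched as follows. This is a minimal PyTorch sketch under assumed shapes; the class name, the fusion layer and the embedding sizes are our illustrative assumptions, not the released implementation.

```python
import torch
import torch.nn as nn

class BoxSequenceEmbedder(nn.Module):
    """Hypothetical builder of the input token sequence for the box encoder E_b."""

    def __init__(self, num_parts: int, d: int):
        super().__init__()
        self.part_emb = nn.Embedding(num_parts, d)     # learned part embedding v_t
        self.box_proj = nn.Linear(4, d)                # feature of b_t = (x, y, w, h)
        self.fuse = nn.Linear(2 * d, d)                # merges [v_t, box feature]
        self.cls = nn.Parameter(torch.zeros(1, 1, d))  # special cls token

    def forward(self, boxes: torch.Tensor, pos_enc: torch.Tensor) -> torch.Tensor:
        # boxes: (B, P, 4); pos_enc: (P + 1, d) fixed positional encodings
        B, P, _ = boxes.shape
        ids = torch.arange(P, device=boxes.device).expand(B, P)
        tokens = self.fuse(torch.cat([self.part_emb(ids),
                                      self.box_proj(boxes)], dim=-1))
        tokens = torch.cat([self.cls.expand(B, -1, -1), tokens], dim=1)
        return tokens + pos_enc                        # (B, P + 1, d)
```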
Next, we introduce our GAT block, used in both encoders ($E_b, E_c$) to encode the holistic structural composition of the (sketch) body parts.

**Graph-aware Transformer (GAT) Block:**[]{#sec:gat label="sec:gat"} The structure of our GAT block is shown in Fig. [\[fig:gatr\]](#fig:gatr){reference-type="ref" reference="fig:gatr"}(c). Each GAT block consists of a graph-aware multi-headed self-attention (MHSA) module followed by a feed-forward network [@khan2021transformers]. Given the queries $\bm{Q}$, keys $\bm{K}$, and values $\bm{V}$, the standard self-attention module [@vaswani2017attention] computes the attention according to the following equation (also shown in Fig. [\[fig:gatr\]](#fig:gatr){reference-type="ref" reference="fig:gatr"}(b)), $$\begin{equation}
\label{eq_13}
\bm{\alpha} = \textrm{softmax}\left(\frac{\bm{Q}\bm{K}^T}{\sqrt{d}}\right).
\end{equation}$$

While the standard self-attention module is effective for learning highly contextualized feature representations, it does not explicitly emphasize local structural relations. However, creative sketches are structured inputs with definite connectivity patterns between sketch parts. To model this structure, we propose to encode an adjacency-based graph implemented with spectral graph convolution [@kipf2016semi]. Our proposed GAT block combines the definite connectivity patterns from the learned adjacency graph with the dynamic attention from the self-attention block. Let us consider a graph where each node $i$ is associated with a structure-aware representation $\bm{n}_i$ and a corresponding neighbour set $\mathcal{N}_r(i)$. To represent the neighbour set $\mathcal{N}_r(i)$ for each node $i$, we define an adjacency matrix $\bm{A}$ where each entry indicates whether two nodes $i$ and $j$ are adjacent. The edge weight $e_{ij}$ between two adjacent nodes $i$ and $j$ is given by, $$\begin{equation}
\label{eq_7}
e_{ij} = \bm{W}_b^{T} \textrm{ReLU}\left(\bm{W}_a \left[\bm{n}_i , \bm{n}_j\right]\right) \; \forall j \in \mathcal{N}_r(i),
\end{equation}$$ where $\bm{W}_a$ and $\bm{W}_b$ are learned parameters and $[\cdot,\cdot]$ is a concatenation operator. We set $e_{ij} = 0$ $\forall j \notin \mathcal{N}_r(i)$. For each GAT block $l$, the spectral graph convolution operation is, $$\begin{equation}
\label{eq_9}
\bm{n}_i^{(l+1)} = \textrm{ReLU} \left(\bm{n}_i^{(l)} + \sum_{j \in \mathcal{N}_{r}(i)}^{}e_{ij}\bm{W}_{c}\bm{n}_j^{(l)}\right),
\end{equation}$$ where $\bm{W}_c$ is a learned matrix. Our main intuition is that the adjacency matrix representing the neighbourhood graph structure is static: it is computed over the connected components of the graph, predetermined for each input, symmetric and generally sparse. In contrast, the attention learned from the self-attention layer is dynamic, can be dense and is generally non-symmetric. We propose to combine these two complementary representations by calculating the attention weight $\alpha_{ij}$ for nodes $j \in \mathcal{N}_r(i)$ as follows, $$\begin{equation}
\label{eq_8}
\alpha_{ij} = \frac{e_{ij}\exp (\varphi_{ij})}{\sum_{j \in \mathcal{N}_{r}(i)}^{} e_{ij}\exp (\varphi_{ij})},\; \text{s.t. } \; \varphi_{ij} \in \frac{\bm{Q}\bm{K}^T}{\sqrt{d}},
\end{equation}$$ where $\varphi_{ij}$ is an element of the standard attention matrix.
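To make the edge-weight and combined-attention equations above concrete, a minimal single-head PyTorch sketch is given below. The shapes and parameter names, as well as the omission of the multi-head split and of the spectral graph convolution step, are simplifying assumptions of ours, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def graph_aware_attention(n, Q, K, V, A, W_a, W_b):
    """Single-head sketch. n, Q, K, V: (P, d); A: (P, P) 0/1 adjacency;
    W_a: (h, 2d) and W_b: (h,) realize the small edge-weight MLP."""
    P, d = n.shape
    # Edge weights: e_ij = W_b^T ReLU(W_a [n_i, n_j]) for adjacent nodes, 0 otherwise.
    pair = torch.cat([n.unsqueeze(1).expand(P, P, d),
                      n.unsqueeze(0).expand(P, P, d)], dim=-1)  # (P, P, 2d)
    e = F.relu(pair @ W_a.T) @ W_b                              # (P, P)
    e = e * A                                                   # static, sparse graph term
    # Combined attention: static edge weights gate the dynamic logits phi = QK^T / sqrt(d).
    phi = (Q @ K.T) / d ** 0.5
    num = e * torch.exp(phi)
    alpha = num / num.sum(dim=-1, keepdim=True).clamp_min(1e-9)
    return alpha @ V                                            # attended node features
```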
The special token ($cls$) output from $E_c$ is then utilized as an input to a Prior-Net for approximating the conditional prior latent distribution. Similarly, the $cls$ token outputs of both $E_b$ and $E_c$ are provided as input to a Recog-Net for approximating the variational latent distribution. Both the Prior-Net and the Recog-Net are parameterized by multi-layer perceptrons (MLPs) to approximate the prior and variational latent normal distributions. During training, we sample the latent variable $\bm{z}$ from the variational distribution and provide it as input to the probabilistic coarse sketch decoder.

The probabilistic coarse sketch decoder within our PL-Net utilizes probabilistic modelling to generate diverse coarse structures. The decoder comprises two modules: a location-predictor $\mathcal{H}_{xy}$ and a size-predictor $\mathcal{H}_{wh}$. Here, the location-predictor $\mathcal{H}_{xy}$ estimates the center coordinates ($x_t,y_t$) of the bounding boxes around body parts, while the size-predictor $\mathcal{H}_{wh}$ predicts their width and height ($w_t, h_t$). Both modules consist of multi-headed self- and encoder-decoder attention mechanisms [@vaswani2017attention]. The encoder-decoder attention obtains the key and value vectors from the output of the encoder $E_c$. This allows every position in the decoder to attend to all positions in the conditional input sequence. The part embedding $\bm{v}_{t}$ from the encoder is used as a query positional encoding in each attention layer of the decoder. Over multiple consecutive decoding layers, the decoder modules produce respective output features $\bm{f}_t^{xy} \in \mathbb{R}^{d}$ and $\bm{f}_t^{wh}\in \mathbb{R}^{d}$ that lead to the distribution parameters of the bounding box associated with each body part, representing the coarse structure of the final sketch to be generated.

To enhance the diversity of the generated sketch images, we model the box predictions from each decoder module by Gaussian mixture models (GMMs) [@bishop1994mixture; @graves2013generating]. Different from the conventional box prediction [@carion2020end; @zhu2020deformable], which directly maps the decoder output features to deterministic box parameters, our GMM-based box prediction is modeled with $M$ normal distributions $\mathcal{N}\left(\cdot \right)$, where each distribution is parameterized by $\theta_{k}$ and a mixture weight $\pi_{k}$, $$\begin{equation}
\label{eq_4}
p(\bm{b}_t|\mathcal{C}, \bm{z}) = \sum_{k=1}^{M}\pi_{k,t}\mathcal{N}\left(\bm{b}_t; \theta_{k,t}\right), \; \textrm{ with } \sum_{k=1}^{M}\pi_{k,t} = 1.
\end{equation}$$ The GMM parameters can be obtained by minimizing the negative log-likelihood over all $P$ body parts in a sketch, $$\begin{equation}
\label{eq_10}
\mathcal{L}_{b} = -\frac{1}{P}\sum_{t=1}^{P}\log\left( \sum_{k=1}^{M}\pi_{k,t}\mathcal{N}(\bm{b}_t; \theta_{k,t})\right) .
\end{equation}$$ Here, we simplify the quadrivariate distribution of the GMMs in Eq. [\[eq_10\]](#eq_10){reference-type="ref" reference="eq_10"} by decomposing it into two bivariate distributions as $p(\bm{b}_t|\mathcal{C},\bm{z})=p(x_t,y_t|\mathcal{C}, \bm{z})p(w_t,h_t|x_t, y_t,\mathcal{C}, \bm{z})$. The parameters of these bivariate GMMs are obtained by employing linear layers and appropriate normalization on the outputs $f_t^{xy}$, $f_t^{wh}$ of $\mathcal{H}_{xy}$ and $\mathcal{H}_{wh}$, respectively. In addition to the GMM parameters, these linear layers also estimate the presence of a body part using an indicator variable, which is trained with a binary cross-entropy loss $\mathcal{L}_{c}$.
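As a hedged sketch of this GMM-based loss, the bivariate mixture over the box centres $(x_t, y_t)$ can be written as follows; for brevity the sketch assumes diagonal covariances and drops the correlation term of the full bivariate Gaussian, and the $(w_t, h_t)$ head is analogous.

```python
import torch
from torch.distributions import Categorical, Independent, MixtureSameFamily, Normal

def gmm_box_nll(pi, mu, sigma, targets):
    """pi: (P, M) mixture weights (already softmaxed); mu, sigma: (P, M, 2);
    targets: (P, 2) ground-truth box centres (x_t, y_t)."""
    components = Independent(Normal(mu, sigma), 1)    # M diagonal bivariate Gaussians
    mixture = MixtureSameFamily(Categorical(probs=pi), components)
    return -mixture.log_prob(targets).mean()          # -(1/P) * sum_t log p(b_t | C, z)
```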
**PL-Net Loss Function ($\mathcal{L}_{PL}$):** The overall loss function $\mathcal{L}_{PL}$ used to train the PL-Net is the weighted sum of the reconstruction loss $\mathcal{L}_{rec}$ and the KL divergence loss $\mathcal{L}_{KL}$, $$\begin{equation}
\label{eq_11}
\mathcal{L}_{PL} = \mathcal{L}_{rec} + \lambda_{KL}\mathcal{L}_{KL}.
\end{equation}$$ Here, the reconstruction loss term is $\mathcal{L}_{rec} = \mathcal{L}_b + \mathcal{L}_c$. The KL divergence loss term $\mathcal{L}_{KL}$ regularizes the variational distribution [@sohn2015learning] from the Recog-Net to be closer to the prior distribution from the conditional Prior-Net, and $\lambda_{KL}$ is a scalar loss weight.

Our carefully designed PL-Net architecture, presented above, provides a coarse structure of the sketch that is used to generate a diverse, yet realistic final sketch image in the second stage (PS-Net) of the proposed two-stage DoodleFormer framework. Next, we present the PS-Net, which takes the coarse structure of the sketch along with the initial partial sketch $\mathcal{C}$ as inputs and generates the final sketch image.

Our PS-Net comprises two graph-aware transformer (GAT) block-based encoders $\Bar{E}_b$ and $\Bar{E}_c$, following the design of the encoders ${E}_b$ and ${E}_c$ in the PL-Net. Here, the encoder $\Bar{E}_b$ produces a contextualized feature representation of the bounding box $\bm{b}_t$ associated with each body part. Similarly, the encoder $\Bar{E}_c$ outputs a contextualized feature representation of the initial stroke points $\mathcal{C}$. Both contextualized feature representations from $\Bar{E}_b$ and $\Bar{E}_c$ are then concatenated and passed through a linear layer to obtain $\bm{u}_t$.

The initial stroke points $\mathcal{C}$ are converted to their raster form $\bm{I}_\mathcal{C}$ and passed through a convolutional encoder $\mathcal{R}_E$ that outputs a spatial representation $\bm{g} = \mathcal{R}_E(\bm{I}_\mathcal{C})$. Consequently, $\bm{g}$ and $\{\bm{u}_t\}_{t=1}^{P}$ are provided as input to a convolutional decoder $\mathcal{R}_D$ for generating the final sketch image $\bar{\bm{I}}_{im}$, $$\begin{equation}
\label{eq_5}
\bar{\bm{I}}_{im} = \mathcal{R}_D\left( \bm{g}, \{\bm{u}_t\}_{t=1}^{P}\right).
\end{equation}$$ The decoder network $\mathcal{R}_D$ utilizes the ResNet [@he2016deep] architecture as a backbone. To introduce diversity in the generated images, zero-mean unit-variance multivariate random noise is added to $\bm{g}$ before passing it to the decoder network. For fine-grained shape prediction, we utilize a mask regressor [@sun2019image; @sun2020learning] with up-sampling convolutions, followed by a sigmoid transformation, to generate an auxiliary mask for each bounding box. The predicted masks are resized to the sizes of the corresponding bounding boxes, and are then used to compute the instance-specific and structure-aware affine transformation parameters in the normalization layer of the decoder $\mathcal{R}_D$.
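A minimal sketch of this diversity-inducing noise injection, assuming placeholder shapes for $\mathcal{R}_D$ and its inputs:

```python
import torch

def decode_with_noise(R_D, g, u):
    """g: (B, C, H, W) spatial code from R_E; u: (B, P, d) per-part features."""
    g_noisy = g + torch.randn_like(g)  # zero-mean, unit-variance noise for diversity
    return R_D(g_noisy, u)             # final rasterized sketch image
```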
The training of PS-Net follows the standard GAN formulation, where the PS-Net generator $\mathcal{G}$ is followed by additional discriminator networks $\mathcal{D}_{im}$, $\mathcal{D}_{part}$, and $\mathcal{D}_{app}$ to obtain image-level ($\mathcal{L}_{im}$), part-level ($\mathcal{L}_{part}$), and appearance ($\mathcal{L}_{app}$) adversarial losses [@he2021context; @sun2019image], respectively. The loss function is then given by, $$\begin{equation}
\label{eq_6}
\mathcal{L}_{PS} = \mathcal{L}_{im} + \lambda_p \mathcal{L}_{part} + \lambda_a \mathcal{L}_{app},
\end{equation}$$ where $\lambda_p$ and $\lambda_a$ are the loss weight hyper-parameters.

The introduction of the GAT block in the PL-Net and PS-Net encoders contributes towards the generation of realistic sketch images, whereas the effective utilization of probabilistic modelling in the PL-Net decoder leads to improved diversity. In summary, our two-stage DoodleFormer generates diverse, yet realistic sketch images (see Fig. [3](#fig:onecol){reference-type="ref" reference="fig:onecol"}).
2112.04728/record.json
ADDED
@@ -0,0 +1,32 @@
{
  "arxiv_id": "2112.04728",
  "month": "2021_12",
  "year": 2022,
  "conference": "AAAI",
  "title": "Reducing Catastrophic Forgetting in Self Organizing Maps with Internally-Induced Generative Replay (Student Abstract)",
  "arxiv_url": "https://arxiv.org/abs/2112.04728",
  "source": {
    "paper_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_12/main_diagram_database/2112.04728",
    "tex_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_12/tex_files_extracted/2112.04728",
    "paper_md": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_12/main_diagram_database/2112.04728/paper_text/paper.md",
    "metadata_json": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_12/main_diagram_database/2112.04728/metadata.json",
    "intro_method_from": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_12/main_diagram_database/2112.04728/paper_text/paper.md",
    "intro_method_from_kind": "markdown"
  },
  "files": {
    "main_drawio": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2112.04728/main_diagram/main_diagram.drawio",
    "main_png": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2112.04728/main_diagram/main_diagram.png",
    "main_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2112.04728/main_diagram/main_diagram.pdf",
    "intro_method_md": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2112.04728/paper_text/intro_method.md",
    "paper_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2112.04728/paper.pdf",
    "latex": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2112.04728/latex_source"
  },
  "status": {
    "copy_drawio": "exists",
    "copy_png": "exists",
    "diagram_pdf": "pdf_exists",
    "intro_method": "exists",
    "paper_pdf": "exists",
    "latex": "exists"
  }
}
2112.05261/record.json
ADDED
@@ -0,0 +1,32 @@
{
  "arxiv_id": "2112.05261",
  "month": "2021_12",
  "year": 2022,
  "conference": "ICML",
  "title": "Equivariant Quantum Graph Circuits",
  "arxiv_url": "https://arxiv.org/abs/2112.05261",
  "source": {
    "paper_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_12/main_diagram_database/2112.05261",
    "tex_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_12/tex_files_extracted/2112.05261",
    "paper_md": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_12/main_diagram_database/2112.05261/paper_text/paper.md",
    "metadata_json": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_12/main_diagram_database/2112.05261/metadata.json",
    "intro_method_from": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_12/main_diagram_database/2112.05261/paper_text/paper.md",
    "intro_method_from_kind": "markdown"
  },
  "files": {
    "main_drawio": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2112.05261/main_diagram/main_diagram.drawio",
    "main_png": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2112.05261/main_diagram/main_diagram.png",
    "main_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2112.05261/main_diagram/main_diagram.pdf",
    "intro_method_md": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2112.05261/paper_text/intro_method.md",
    "paper_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2112.05261/paper.pdf",
    "latex": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2112.05261/latex_source"
  },
  "status": {
    "copy_drawio": "exists",
    "copy_png": "exists",
    "diagram_pdf": "pdf_exists",
    "intro_method": "exists",
    "paper_pdf": "exists",
    "latex": "exists"
  }
}
2112.05883/main_diagram/main_diagram.drawio
ADDED
The diff for this file is too large to render.
2112.05883/paper_text/intro_method.md
ADDED
@@ -0,0 +1,27 @@
# Introduction

Self-supervised video representation learning has recently received great attention owing to its success in learning informative spatiotemporal features from unlabeled videos. These methods commonly take inspiration from the human visual understanding system and devise various pretext tasks rooted in certain video attributes, e.g., speed or playback rate [@benaim2020speednet; @wang2020self; @chen2021rspnet; @yao2020video], arrow of time [@wei2018learning], and motion and appearance statistics [@wang2019self]. However, these attributes are temporally invariant and coarse-grained over the input video clips. For example, speediness is mostly constant for a given clip instance. This limits the methods' potential in extensively exploring the fine-grained features of videos [@wang2021unsupervised]. To learn both coarse- and fine-grained features within a self-supervision framework, in this work we exploit an essential yet under-explored property of videos, namely, "video continuity".

<figure id="fig:intro" data-latex-placement="t">
<embed src="figs/intro.pdf" style="width:48.0%" />
<figcaption>Illustration of continuity perception. Observing the long-jump video after a manual clip cut-off, a human can easily identify the discontinuity between "takeoff" and "landing" and infer the "action-in-the-air" of the athlete.</figcaption>
</figure>

*Video continuity* suggests that objects are represented as the same persisting individuals over time and motion across consecutive frames [@yi2008spatiotemporal]. Our choice of using video continuity for designing a self-supervision strategy is motivated by research findings in the cognitive sciences [@spelke1995spatiotemporal; @yi2008spatiotemporal], which claim that temporal continuity is essential for a correct and persisting understanding of the visual environment. In fact, based on years of visual experience, human beings can easily detect discontinuity in videos, if any. Furthermore, humans are often capable of inferring the high-level semantics associated with the missing section at the discontinuous point. For example, in Fig. [1](#fig:intro){reference-type="ref" reference="fig:intro"}, after we manually cut off some portions of the long-jump video, one can easily notice the discontinuity between "takeoff" and "landing" and infer the missing section corresponding to the "action-in-the-air" of the athlete. We hypothesize that enabling neural networks to master this exercise of detecting discontinuity and estimating the high-level semantics of the missing sections will empower the model to obtain high-quality spatiotemporal representations of videos. This hypothesis is motivated by the following observations. Effective video embedding requires learning both short- and long-ranged features of videos. These features can be temporally rich motion patterns and spatially rich context information, which are complementary to each other [@huang2021self; @wang2020self]. Solving the continuity-aware tasks requires the model to learn those features comprehensively. Fig. [2](#fig:method){reference-type="ref" reference="fig:method"}(a) gives an illustration of the continuity-aware tasks used in this work. First, identifying **whether the clip is continuous or not**, i.e., continuity justification, requires a global or long-term view of the motion consistency across the clip. Inferring a clip to be discontinuous based on a local perception of motion irregularity is insufficient (e.g., a continuous running video could have a local motion irregularity due to a sudden acceleration by the runner). Second, finding **where the discontinuity occurs**, i.e., discontinuity localization, necessitates a local fine-grained grasp of dramatic motion changes along the video stream. Third, estimating **what is missing** semantically, i.e., missing section approximation, requires the model to have a high-level understanding of both the motion patterns and the context information in the neighbouring segments.

Following this thread, we propose a Continuity Perception Network (CPNet) that solves the novel continuity-aware pretext tasks in Fig. [2](#fig:method){reference-type="ref" reference="fig:method"}(a) to learn effective spatiotemporal representations in a self-supervised manner. We assume there are no or few shot transitions in the source videos, and the discontinuity in the clips refers to the break-point manually created within the same scene (shot). Specifically, given the continuous and discontinuous clips, CPNet is trained to solve two discriminative tasks, continuity justification and discontinuity localization, which drive the model to perceive the global and local motion patterns of the video sequence. For the task of missing content estimation, instead of explicit reconstruction in RGB space, we formulate it as a contrastive learning task and estimate in the feature space. As shown in Fig. [1](#fig:intro){reference-type="ref" reference="fig:intro"}, since the discontinuous clip encircles its inner missing section in the source video, their motions are more similar to each other than those of two temporally further disjoint clips, even from the same video. We first use a triplet loss [@schroff2015facenet] to pull the features of the discontinuous clip and its inner missing section closer together than those of a disjoint continuous clip from the same video. Further, based on the observation that clips from the same video have similar appearance compared to those from different videos, we use an additional context-based contrastive loss [@wang2020self; @chen2021rspnet] as a regularizer to pull the features of clips from the same video together. This contrastive learning scheme promotes the features of the discontinuous clip to approximate those of its inner missing section, and encourages the model to learn both fine-grained motion changes and context information in the video. CPNet learns video representations by jointly solving these three continuity-aware pretext tasks.
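As a hedged illustration, the triplet component of this objective can be sketched in PyTorch as follows; the margin value and the feature extractor producing the clip embeddings are assumptions, not the reported settings.

```python
import torch.nn.functional as F

def missing_section_loss(f_disc, f_missing, f_disjoint, margin=0.5):
    """f_disc: (B, d) discontinuous-clip features; f_missing: (B, d) features
    of the inner missing section; f_disjoint: (B, d) features of a temporally
    disjoint continuous clip from the same video."""
    # Pull (discontinuous clip, missing section) closer than (discontinuous
    # clip, disjoint clip) by at least the margin.
    return F.triplet_margin_loss(f_disc, f_missing, f_disjoint, margin=margin)
```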
We carry out extensive experiments and demonstrate the superiority of CPNet in learning more effective video representations. CPNet outperforms prior art on multiple downstream tasks including action recognition, video retrieval and action localization. Moreover, discontinuity localization proves to be the most effective pretext task in CPNet, and incorporating it into other typical self-supervised learning methods brings significant performance gains.

Our major contributions are summarized as follows:

- To the best of our knowledge, this is the first work that explicitly exploits video continuity to obtain supervision signals for self-supervised video representation learning.

- We propose CPNet to solve the novel continuity-aware pretext tasks and promote the model to learn coarse- and fine-grained motion and context features of videos.

- We conduct comprehensive ablation studies and experiments on multiple downstream tasks to validate the utility of the proposed CPNet; these include state-of-the-art or competitive performance on action recognition and video retrieval tasks, and evidence of its complementary nature to other self-supervised learning methods.

<figure id="fig:method" data-latex-placement="th">
<embed src="figs/method.pdf" style="width:95.0%;height:41.0%" />
<figcaption>Illustration of the continuity-aware pretext tasks (a) and the Continuity Perception Network (b). CPNet is composed of a three-branch architecture solving the continuity justification, discontinuity localization, and missing section approximation tasks.</figcaption>
</figure>
2112.07917/record.json
ADDED
@@ -0,0 +1,32 @@
{
  "arxiv_id": "2112.07917",
  "month": "2021_12",
  "year": 2022,
  "conference": "ACMMM",
  "title": "SPTS: Single-Point Text Spotting",
  "arxiv_url": "https://arxiv.org/abs/2112.07917",
  "source": {
    "paper_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_12/main_diagram_database/2112.07917",
    "tex_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_12/tex_files_extracted/2112.07917",
    "paper_md": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_12/main_diagram_database/2112.07917/paper_text/paper.md",
    "metadata_json": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_12/main_diagram_database/2112.07917/metadata.json",
    "intro_method_from": "/home/zling/lzl/ICML2026/Build_Dataset/data/2021_12/main_diagram_database/2112.07917/paper_text/paper.md",
    "intro_method_from_kind": "markdown"
  },
  "files": {
    "main_drawio": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2112.07917/main_diagram/main_diagram.drawio",
    "main_png": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2112.07917/main_diagram/main_diagram.png",
    "main_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2112.07917/main_diagram/main_diagram.pdf",
    "intro_method_md": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2112.07917/paper_text/intro_method.md",
    "paper_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2112.07917/paper.pdf",
    "latex": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2112.07917/latex_source"
  },
  "status": {
    "copy_drawio": "exists",
    "copy_png": "exists",
    "diagram_pdf": "pdf_exists",
    "intro_method": "exists",
    "paper_pdf": "exists",
    "latex": "exists"
  }
}
2202.06200/main_diagram/main_diagram.drawio
ADDED
@@ -0,0 +1 @@
<mxfile host="Electron" modified="2021-10-22T07:49:43.992Z" agent="5.0 (Macintosh; Intel Mac OS X 11_2_1) AppleWebKit/537.36 (KHTML, like Gecko) draw.io/14.9.6 Chrome/89.0.4389.128 Electron/12.0.16 Safari/537.36" version="14.9.6" etag="YaDzz1gQkvwt4hJJBCXi" type="device"><diagram id="4HHdxF6sFevDd1s14i3u">7V1bc6M4Fv41rtp5CIXu4jFxOjNTNdPVNT2zl6ct2iYJtbbxYtJJ9tevxM2AZJBtCZyk3dWOLYOA73ycm47EDM3XLz+n4fbx92QZrWbQX77M0O0MwiuCgfgjW16LFgZo0fCQxsuiCewbvsb/i8pGv2x9ipfRrrVhliSrLN62GxfJZhMtslZbmKbJc3uz+2TVPuo2fCiP6O8bvi7CVaRs9o94mT0WrRyyffsvUfzwWB0Z0KD4ZR1WG5dd7B7DZfLcOBb6NEPzNEmy4tP6ZR6tJHgVLsUJ3R34tT6xNNpkJjvAYofv4eqpvLbyvLLX6mLT5GmzjOT2/gzdPD/GWfR1Gy7kr89CvKLtMVuvxDcgPu6yNPlPNE9WSZrvjfz8Vf9SwSUu9CZcxQ8b8TktkLq5j1eras9NshEHuFmGu8f80LLv+1W8/Xt5GvLzL2V7eQlRmkUvB2EANbiClVGyjrL0VWxS7gARgx7lfv1CRQ+vlQBJSYTnvbgBYB4qt3tsSBvRctuwZNlDfbi9IMSHUhZ6uaA3JhebspDn1UIfIRX9XGBMRR9wC+jjYfTFDkLbRMPIh7ttoYLu4xcJWQtOIYhrNJ/fXVuCDiDYhg5S4FXKtYFepdeayEELwJERgbvLX7aAI2xS4OibZRzmHeACOiZwbFTGzcXLFnAMTQocHwbuZJviEjcIBEyKeQ5U81wRs2UdbNjmwCXljEG1ehdDQj0cdEFFno9VXImvdXua7eegW/n4E8Nr1bogH3jVbjW8AfegCi8JvACq6DaazwIXuAe3ciWR3nVMkyzM4kQ6mFfSWbNEYAoFgYU5KP/hjm4F0ENMARsBj7ZeWEX+0DZnicEg7HLB8QkFhAAR6mEvIHLZAjKIv96bgCAXOB4UEKDICxovfjmyMonWNstrmffZg6j4MYdAb4pnH3LL/QSy6es/xRffoz6rGv6VN/g+rRpuJcJ+/e21+vYSZ8XOGATl92LfIKDl9/2u8ktzzy9RGgukorTZd7exgCFaVsmsKueUPKWLqqnEKgvTh6giCjpe/Qas9+5mVHd3E41rS3octPL4X5JYnFaDuZT2HhyD+uBVrwUEZUewkR5T+ma8/8K40ncBpdJ3TuMaOjNmm4TTbpndZjUx5DRo8ZkczeZTiXuC3zApccnAwcnpxIUo6O078F0S1ySdMSJxjVUxI6StiilwQd4qn3kueXk/f6hT8voDDh07g7x5LHlU3xbJ6zSldJG+H+So3/dT0yuTeXsGeav3Jh0IhD+5lw7t3A1V1uUSpGOSG3Oq+EFT8QNTxd9U+r0K38QzYRrlDt+Yckew9+AwOMOlxv3uOkEOlTs0yS46DhYZ5Z1gkZny1ENBx0HhxzsoJ8eKValFk9h1yYM5sVk/sYlLYkOO+w4O+DmxIuj3Wnzqktgmmd3xXG5TzZtbthajEQ6OZvSp5OUnkLff6yXYJXkH3GJNTGfuckvm9/XtlLzQvVN3jOtGrLluBPa6bsLv9tm+6ghejCNnUoDkPKlKO0nVgJobStpQKlfCxA6YSZ1JtKhpKvSO8v/6NQ12qmlov6bh52iaoF/TcKeaZvLBAmHxAGrzmmLjQMUT7lHLXPpsxMECbofZAw4gcuoAkt7bCmjsnDmzWW/UBJjqXFpktm6wgK7yAs4kP837sBQk/e+TrC+++VMQYCd++Bw9i/c/knW42f/YuieqRtnT1S6vx74WGyC4fWnuQR/yv9S/T8Wvv/17xm5uvvwxY7czWXdTnIy4juJ8yo27d5+wtlnHoHfqUaVJjhfh6rpsX8fL5Sq34ZE4tfBb3pOk9FbCmgNNbmbkVnb1lCXF6WsqmMqb3YrdZ4FHgr1h79h9LGIc4vfZfRBo6I4sFC9pKw0d0wTiAzS5mpH5KnkQ7/dpuBBsEZ+il+3fxB+xm/n7T5Ji+d67p3XRenI3Npi6iu7fCFGVWmvJXEAaA/2qIsaOKhK1tZyFKHZbyblDTLsP1/HqteCa+Clcb3N0EMLib1ayd5OzNy3Z295GQ872PWFDEwoZPi0kw8RJovy9h2p1c3Hl75mBwK8L+fcDPNgDaoVJrQCtK0XdoMJUStGWKjxqNwvvH093MsWsy2IHRV8CV/pSN9jyfvTlNk0EszLxP5HcQr30+tj6EgWiTR0FhDpLbUNfIpNRlPc1CCtOojeTRy+qfhVNPxrgIYBnnfSdcZaD+nzWzt/5yGmao5nSqNh9VBlZf7KOuUzWIThQ/wjPGdMaGBbADlMa1WRGR7NgLSplJKuju5OSqrlGDYljKgd7FaEDpJaRnnTbu502bBMw3YwYzSRryj0WqIAFFrAyyARbmQBnM600+dw3NJhlPDrwOTq2Kpy7nz9/vjION9pGyt68xltwfWvrjuDdTIzepQA+0PgQxOPQgnQN6nYfBJbbgxdcrrhResqzxgIaZwABVL2gR8EGwQ2qPy2xqe1jAR2/yjmIDZf2CI+2kKYK9CgwXkSZJtBhd9ZNfAjTcg/Wpq6aDHA1mxtNNCl5YrzxVHjji4iAR8cbgKnwPi6gXazC3S5e9MW08HAwaGCsDKO4NFoJ4Xxvd3/MaDXseHvE74BpHslRMtCVvcANGwRu71Ja2Ja0lK4QJp7PnQlMFzgWrnV82Kd3mEf+dmpiGfBOdLGoVeG+ES0WuT5VDj//TbJFIgn9r5E45UzQUgYigizfknQ3nH9WTly0xQcuZrRBpQrUVHNyRw7TaDMCmnC3SqwvxL0pk2+HU+s649hOUhhZrv6oAgQ08FD3nkIaA5bXwToyYjbXFWstwXbM8m6t/LjGYQiC+TyPOhJx7DiTYGFr9bCIekQzalZl+lqhCfa4JleGLaTHsc11ypwJol42aS8Ie4XJSK72sK/y6MgEaVJxFWhNYXDfs5CKwzZXP7MhDqSTxqc7nGsjF9IIpDQ6IqC+h3nfghtEKxAL4hhhAmxP1qPWP1aQJVAZLcVsaCkTDbDEAq4GOZEsjcPNw0G7eNA/7kSCV4Gvw76DNP0UoBtuC2lOPIaVmsYadMo0tREHsvuejTFQrEuJnOl/VSsRV23AzNVcLA47ZT3+dcOr1fp1WyHxJHvdRn2+5ttw8VpaW375Emai0zyrIa/fFknzutxhL5BBT6MDrKyQapA4Wjyl3xsj6nWcnGyjzZ+P8WZIEYhd7mJ5Bre+Ik7ByPnc9xvLMV5wQI0AV512QjzeKK3uLDlong+RyqeTEmnW0QwE2UIm4Wtjs7IgqOdaEKsD+Pp4rLU6t/hQdHtq
GE90WbLJRufAex6dQxC3HGjViXA5UEd0CbbJBA3ftaA59dh0gn5LC8HbwF+ZkKCrCnG6+DvRZWlGKqstAkpf+mRXJfqyt7w69eya225qtK65HSg1/2kGh6rRy01Wy0TYQIdlusOu5gUV6vq60FNd8VfrQVlxNSeYETg41euDTRHwsYd65gFiTUjsig1OHzEwclaHqfkyzIGHJ8jqkHeULdNVZWLCJsmWkREqiEbDFSKVrzTwONjjqikPd4OrQaXQqFnIeXANrq0lePIHGRzMQiJOdBNlXKYh6QiVQr01hDafvZGvsn84xwvYJCqYjvhUA+cQM6kqlPRWDbFwKgGbAGJdwuGtQoyZcMz3EHeTgFJHwN4Z8G4gHuGZA6NBTLiHGo/TO35s0w3EBgUYI9u+Obwm1rwM4HWnJKOA1EHoWAbP6UPghnlcjdNbgRQgZQACQziNlZs2gLOLK1ZmMIsoYwqNMG30ZpmsVL3/uT+NMTu8osbHLSiVC8BkT2kogfpRUvr2S0qZrxiHAw/Nc1lVSg0C+tNVmMu0CCAd1xtzdaxHt3oKsAAbcxqXu4TNp3xC2JzG2m6fjdyFTTO71tljag3i51EjD6ueBzBhpMuYgxnEzuPCazOa9qUTNzHAlxY52zVEFLWG6bpJN900fN0KC4HHbGA9wbitvjbgr12UXv2aRWvV3ftV+mDhIpfZ2QO6b2lYX9YDqV6fNsuiW8MV2zAmxz1erWd6pVkd0H5dpubDR/pXVC9O6IJrSzHqFBopojEuJVW7qtLl9mdqMpOEhSXZgw8jexAQa7LvetkWZW8yJP7jvj/2vg+4Vz237Hzxo272zKL4TQL9H+I/UvxyLIpDZUzbAhO4s4XxuEnu4gcTjmSCHNhpLtoPbDEBYxGa2SKD+Jom0kvfby7iq8ffk2Ukt/g/</diagram></mxfile>
2202.06200/main_diagram/main_diagram.pdf
ADDED
Binary file (64.3 kB).
2202.06200/paper_text/intro_method.md
ADDED
@@ -0,0 +1,145 @@
# Introduction

In the age of information explosion, recommender systems occupy an important position in discovering users' preferences and delivering online services efficiently [23]. As a classic approach, collaborative filtering (CF) [10, 24] is a fundamental technique that can produce effective recommendations from implicit feedback (expressions, clicks, transactions, *etc.*). Recently, CF has been further enhanced by powerful graph neural networks (GNN) [9, 31]: the interaction data is modeled as a graph (*e.g.*, the user-item interaction graph) and a GNN is applied to learn effective node representations for recommendation, an approach called *graph collaborative filtering*.

Despite this remarkable success, existing neural graph collaborative filtering methods still suffer from two major issues. First, user-item interaction data is usually sparse or noisy, so reliable representations may not be learned, since graph-based methods are potentially more vulnerable to data sparsity [33]. Second, existing GNN-based CF approaches rely on explicit interaction links to learn node representations, while high-order relations or constraints (e.g., user or item similarity) cannot be explicitly utilized to enrich the graph information, even though they have been shown to be essentially useful in recommendation tasks [24, 27, 35]. Although several recent studies leverage contrastive learning to alleviate the sparsity of interaction data [33, 39], they construct contrastive pairs by randomly sampling nodes or corrupting subgraphs, and lack consideration of how to construct contrastive learning tasks tailored to the recommendation task [24, 27, 35].

Besides direct user-item interactions, there exist multiple kinds of potential relations (*e.g.*, user similarity) that are useful to the recommendation task, and we aim to design more effective contrastive learning approaches for leveraging such relations in neural graph collaborative filtering. Specifically, we consider node-level relations *w.r.t.* a user (or an item), which are more efficient to model than graph-level relations. We characterize these additional relations as the *enriched neighborhood* of nodes, defined in two aspects: (1) **structural neighbors** refer to nodes structurally connected by high-order paths, and (2) **semantic neighbors** refer to semantically similar neighbors that may not be directly reachable on the graph. We aim to leverage these enriched node relations to improve the learning of node representations (*i.e.*, encoding user preferences or item characteristics).

Figure 1: Comparison of existing self-supervised learning approaches (e.g., SGL [33]), which neglect the correlation among users (or items), and the proposed neighborhood-enriched contrastive learning approach (our approach).

To integrate and model the enriched neighborhood, we propose Neighborhood-enriched Contrastive Learning (NCL for short), a model-agnostic contrastive learning framework for recommendation. As introduced above, NCL constructs node-level contrastive objectives based on two kinds of extended neighbors; Figure 1 compares NCL with existing contrastive learning methods. However, node-level contrastive objectives usually require pairwise learning for each node pair, which is time-consuming for large neighborhoods. Considering this efficiency issue, we learn a single representative embedding for each kind of neighbor, so that the contrastive learning for a node can be accomplished with two representative embeddings (one structural, one semantic).

To be specific, for structural neighbors, we note that the output of the k-th GNN layer aggregates the information of k-hop neighbors. Therefore, we utilize the k-th layer output of the GNN as the representation of a node's k-hop neighbors. We design a structure-aware contrastive learning objective that pulls together the representation of a node (a user or an item) and the representative embedding of its structural neighbors. For the semantic neighbors, we design a prototypical contrastive learning objective to capture the correlation between a node (a user or an item) and its prototype. Roughly speaking, a prototype can be regarded as the centroid of a cluster of semantically similar neighbors in representation space. Since the prototypes are latent, we further propose to use an expectation-maximization (EM) algorithm [19] to infer them. Our experiments show that incorporating these additional relations largely improves the original GNN-based approaches (and outperforms existing contrastive learning methods) for implicit-feedback recommendation. Our contributions are threefold:

- We propose a model-agnostic contrastive learning framework named NCL, which incorporates both structural and semantic neighbors to improve neural graph collaborative filtering.
- We propose to learn representative embeddings for both kinds of neighbors, so that contrastive learning is performed only between a node and the corresponding representative embeddings, which largely improves efficiency.
- Extensive experiments on five public datasets demonstrate that our approach is consistently better than a number of competitive baselines, including GNN-based and contrastive learning-based recommendation methods.

# Method

As a fundamental recommendation technique, collaborative filtering (CF) aims to recommend relevant items that users might be interested in based on observed implicit feedback (e.g., expressions, clicks and transactions). Specifically, given the user set $\mathcal{U} = \{u\}$ and item set $I = \{i\}$, the observed implicit feedback matrix is denoted as $\mathbf{R} \in \{0,1\}^{|\mathcal{U}|\times|I|}$, where each entry $R_{u,i} = 1$ if there exists an interaction between user u and item i, and $R_{u,i} = 0$ otherwise. Based on the interaction data $\mathbf{R}$, the learned recommender system predicts potential interactions for recommendation. Furthermore, Graph Neural Network (GNN) based collaborative filtering methods organize the interaction data $\mathbf{R}$ as an interaction graph $\mathcal{G} = \{\mathcal{V}, \mathcal{E}\}$, where $\mathcal{V} = \{\mathcal{U} \cup I\}$ denotes the set of nodes and $\mathcal{E} = \{(u,i) \mid u \in \mathcal{U}, i \in I, R_{u,i} = 1\}$ denotes the set of edges.

In general, GNN-based collaborative filtering methods [9, 31, 32] produce informative representations for users and items based on an aggregation scheme, which can be formulated in two stages:

$$z_{u}^{(l)} = f_{\text{propagate}}(\{z_{v}^{(l-1)} \mid v \in \mathcal{N}_{u} \cup \{u\}\}),$$

$$z_{u} = f_{\text{readout}}([z_{u}^{(0)}, z_{u}^{(1)}, \dots, z_{u}^{(L)}]), \tag{1}$$

where $\mathcal{N}_u$ denotes the neighbor set of user u in the interaction graph $\mathcal{G}$ and L denotes the number of GNN layers. Here, $\mathbf{z}_u^{(0)}$ is initialized by the learnable embedding vector $\mathbf{e}_u$. For user u, the propagation function $f_{\text{propagate}}(\cdot)$ aggregates the (l-1)-th layer representations of its neighbors to generate the l-th layer representation $\mathbf{z}_u^{(l)}$; after l iterations of propagation, the information of l-hop neighbors is encoded in $\mathbf{z}_u^{(l)}$. The readout function $f_{\text{readout}}(\cdot)$ then summarizes all of the representations $[\mathbf{z}_u^{(0)}, \mathbf{z}_u^{(1)}, \dots, \mathbf{z}_u^{(L)}]$ to obtain the final representation of user u for recommendation. The representations of items are obtained analogously.

In this section, we introduce the proposed Neighborhood-enriched Contrastive Learning method in three parts. We first introduce the base graph collaborative filtering approach in Section 3.1, which outputs the final representations for recommendation along with the intermediate representations used for structural neighbors. Then, we introduce the structure-contrastive and prototype-contrastive strategies in Section 3.2 and Section 3.3, respectively, which integrate the neighbor relations into contrastive learning so as to properly coordinate with collaborative filtering. Finally, we propose a multi-task learning strategy in Section 3.4 and present a theoretical analysis and discussion in Section 3.5. The overall framework of NCL is depicted in Figure 2.

Figure 2: Overall framework of our proposed neighborhood-enriched contrastive collaborative filtering method.

As mentioned in Section 2, GNN-based methods produce user and item representations by applying the propagation and prediction functions on the interaction graph $\mathcal{G}$. In NCL, we utilize GNN to model the observed interactions between users and items. Specifically, following LightGCN [9], we discard the nonlinear activation and feature transformation in the propagation function:

$$z_{u}^{(l+1)} = \sum_{i \in \mathcal{N}_{u}} \frac{1}{\sqrt{|\mathcal{N}_{u}|\,|\mathcal{N}_{i}|}}\, z_{i}^{(l)}, \qquad z_{i}^{(l+1)} = \sum_{u \in \mathcal{N}_{i}} \frac{1}{\sqrt{|\mathcal{N}_{i}|\,|\mathcal{N}_{u}|}}\, z_{u}^{(l)}. \tag{2}$$

After propagating with L layers, we adopt the weighted sum as the readout function to combine the representations of all layers and obtain the final representations:

$$z_u = \frac{1}{L+1} \sum_{l=0}^{L} z_u^{(l)}, \qquad z_i = \frac{1}{L+1} \sum_{l=0}^{L} z_i^{(l)}, \tag{3}$$

where $z_u$ and $z_i$ denote the final representations of user u and item i, respectively. With the final representations, we adopt the inner product to predict how likely user u would interact with item i:

$$\hat{y}_{u,i} = \mathbf{z}_u^\top \mathbf{z}_i, \tag{4}$$

where $\hat{y}_{u,i}$ is the prediction score for user u and item i.

To capture the information from interactions directly, we adopt the Bayesian Personalized Ranking (BPR) loss [22], a well-designed ranking objective for recommendation. Specifically, the BPR loss enforces the prediction scores of observed interactions to be higher than those of sampled unobserved ones. Formally, the BPR objective is:

$$\mathcal{L}_{BPR} = \sum_{(u,i,j)\in\mathcal{O}} -\log \sigma(\hat{y}_{u,i} - \hat{y}_{u,j}), \tag{5}$$

where $\sigma(\cdot)$ is the sigmoid function, $\mathcal{O} = \{(u, i, j) \mid R_{u,i} = 1, R_{u,j} = 0\}$ denotes the pairwise training data, and j denotes a sampled item that user u has not interacted with.

By optimizing the BPR loss $\mathcal{L}_{BPR}$, NCL can model the interactions between users and items. However, high-order neighbor relations within users (or within items) are also valuable for recommendation; for example, users are likely to buy the same products as their neighbors. Next, we propose two contrastive learning objectives to capture the potential neighborhood relationships of users and items.

Existing graph collaborative filtering models are mainly trained with the observed interactions (e.g., user-item pairs), while the potential relationships among users or items cannot be explicitly captured by learning from the observed data alone. To fully exploit the advantages of contrastive learning, we propose to contrast each user (or item) with its structural neighbors, whose representations are aggregated through the layer propagation of the GNN. Formally, the initial feature or learnable embedding of a user/item is denoted by $z^{(0)}$ in the graph collaborative filtering model [9], and the final output can be seen as a combination of the embeddings within a subgraph that contains multiple neighbors at different hops. Specifically, the l-th layer output $z^{(l)}$ of the base GNN model is the weighted sum of the l-hop structural neighbors of each node, since there is no feature transformation or self-loop during propagation [9].

Considering that the interaction graph $\mathcal{G}$ is bipartite, propagating information with the GNN-based model an even number of times naturally aggregates information from homogeneous structural neighbors, which makes it convenient to extract the potential neighbors within users or items. In this way, we can obtain the representations of homogeneous neighborhoods from the outputs of even-numbered layers (e.g., 2, 4, 6) of the GNN model. With these representations, we can efficiently model the relation between users/items and their homogeneous structural neighbors. Specifically, we treat each user's own embedding and the corresponding output of an even-numbered GNN layer as a positive pair. Based on InfoNCE [20], we propose the structure-contrastive learning objective to minimize the distance between them:

$$\mathcal{L}_{S}^{U} = \sum_{u \in \mathcal{U}} -\log \frac{\exp(\mathbf{z}_{u}^{(k)} \cdot \mathbf{z}_{u}^{(0)} / \tau)}{\sum_{v \in \mathcal{U}} \exp(\mathbf{z}_{u}^{(k)} \cdot \mathbf{z}_{v}^{(0)} / \tau)}, \tag{6}$$

where $\mathbf{z}_u^{(k)}$ is the normalized output of the k-th GNN layer, with k an even number, and $\tau$ is the temperature hyper-parameter of the softmax. In a similar way, the structure-contrastive loss on the item side, $\mathcal{L}_{S}^{I}$, can be obtained as:

$$\mathcal{L}_{S}^{I} = \sum_{i \in I} -\log \frac{\exp(\mathbf{z}_{i}^{(k)} \cdot \mathbf{z}_{i}^{(0)}/\tau)}{\sum_{j \in I} \exp(\mathbf{z}_{i}^{(k)} \cdot \mathbf{z}_{j}^{(0)}/\tau)}. \tag{7}$$

The complete structure-contrastive objective function is the weighted sum of the above two losses:

$$\mathcal{L}_S = \mathcal{L}_S^U + \alpha \mathcal{L}_S^I, \tag{8}$$

where $\alpha$ is a hyper-parameter that balances the weight of the two losses in structure-contrastive learning.

The structure-contrastive loss explicitly mines the neighbors defined by the interaction graph. However, it treats the homogeneous neighbors of users/items equally, which inevitably introduces noisy information into the contrastive pairs. To reduce the influence of noise from structural neighbors, we consider extending the contrastive pairs by incorporating *semantic neighbors*, which refer to nodes that are unreachable on the graph but share similar characteristics (item nodes) or preferences (user nodes).

Inspired by previous work [16], we identify the semantic neighbors by learning a latent prototype for each user and item. Based on this idea, we further propose the prototype-contrastive objective to explore potential semantic neighbors and incorporate them into contrastive learning, so as to better capture the semantic characteristics of users and items in collaborative filtering. In particular, similar users/items tend to fall into neighboring regions of the embedding space, and the prototypes are the cluster centers that represent groups of semantic neighbors. Thus, we apply a clustering algorithm to the embeddings of users and items to obtain their prototypes. Since this process cannot be optimized end-to-end, we learn the proposed prototype-contrastive objective with the EM algorithm. Formally, the goal of the GNN model is to maximize the following log-likelihood function:

$$\sum_{u \in \mathcal{U}} \log p(\mathbf{e}_u \mid \Theta, \mathbf{R}) = \sum_{u \in \mathcal{U}} \log \sum_{\mathbf{c}_i \in C} p(\mathbf{e}_u, \mathbf{c}_i \mid \Theta, \mathbf{R}), \tag{9}$$

where $\Theta$ is the set of model parameters, $\mathbf{R}$ is the interaction matrix, and $\mathbf{c}_i$ is the latent prototype of user u. The optimization objective for items is defined similarly.

After that, the proposed prototype-contrastive learning objective, based on InfoNCE [20], is to minimize the following function:

$$\mathcal{L}_{P}^{U} = \sum_{u \in \mathcal{U}} -\log \frac{\exp(\mathbf{e}_{u} \cdot \mathbf{c}_{i}/\tau)}{\sum_{\mathbf{c}_{j} \in C} \exp(\mathbf{e}_{u} \cdot \mathbf{c}_{j}/\tau)}, \tag{10}$$

where $\mathbf{c}_i$ is the prototype of user u, obtained by clustering over all user embeddings with the K-means algorithm, which yields k clusters over all users. The objective on the item side is defined analogously:

$$\mathcal{L}_{P}^{I} = \sum_{i \in I} -\log \frac{\exp(\mathbf{e}_{i} \cdot \mathbf{c}_{j} / \tau)}{\sum_{\mathbf{c}_{t} \in C} \exp(\mathbf{e}_{i} \cdot \mathbf{c}_{t} / \tau)}, \tag{11}$$

where $\mathbf{c}_j$ is the prototype of item i. The final prototype-contrastive objective is the weighted sum of the user objective and the item objective:

$$\mathcal{L}_P = \mathcal{L}_P^U + \alpha \mathcal{L}_P^I. \tag{12}$$

In this way, we explicitly incorporate the semantic neighbors of users/items into contrastive learning to alleviate data sparsity.

In this section, we introduce the overall loss and the optimization of the proposed prototype-contrastive objective with the EM algorithm.

**Overall Training Objective**. Since the main target of collaborative filtering is to model the interactions between users and items, we treat the two proposed contrastive losses as supplementary and leverage a multi-task learning strategy to jointly optimize the traditional ranking loss and the proposed contrastive losses:

$$\mathcal{L} = \mathcal{L}_{BPR} + \lambda_1 \mathcal{L}_S + \lambda_2 \mathcal{L}_P + \lambda_3 \|\Theta\|_2, \tag{13}$$

where $\lambda_1$, $\lambda_2$ and $\lambda_3$ are hyper-parameters that control the weights of the two proposed objectives and the regularization term, respectively, and $\Theta$ denotes the set of GNN model parameters.

**Optimize $\mathcal{L}_P$ with the EM algorithm**. As Eq. (9) is hard to optimize directly, we derive its lower bound (LB) by Jensen's inequality:

$$LB = \sum_{u \in \mathcal{U}} \sum_{\mathbf{c}_i \in C} Q(\mathbf{c}_i \mid \mathbf{e}_u) \log \frac{p(\mathbf{e}_u, \mathbf{c}_i \mid \Theta, \mathbf{R})}{Q(\mathbf{c}_i \mid \mathbf{e}_u)}, \tag{14}$$

where $Q(\mathbf{c}_i \mid \mathbf{e}_u)$ denotes the distribution of the latent variable $\mathbf{c}_i$ when $\mathbf{e}_u$ is observed. The target is then redirected to maximizing this function over $\mathbf{e}_u$ once $Q(\mathbf{c}_i \mid \mathbf{e}_u)$ is estimated, and the optimization is formulated as an EM algorithm.

In the E-step, $\mathbf{e}_u$ is fixed and $Q(\mathbf{c}_i \mid \mathbf{e}_u)$ is estimated by running the K-means algorithm over the embeddings $\mathbf{E}$ of all users. If user u belongs to cluster i, the cluster center $\mathbf{c}_i$ is the prototype of the user, and the distribution is estimated by a hard indicator: $\hat{Q}(\mathbf{c}_i \mid \mathbf{e}_u) = 1$ for $\mathbf{c}_i$ and $\hat{Q}(\mathbf{c}_j \mid \mathbf{e}_u) = 0$ for the other prototypes $\mathbf{c}_j$.

In the M-step, the target function can be rewritten with $\hat{Q}(\mathbf{c}_i \mid \mathbf{e}_u)$:

$$\mathcal{L}_{P}^{U} = -\sum_{u \in \mathcal{U}} \sum_{\mathbf{c}_{i} \in C} \hat{Q}(\mathbf{c}_{i} \mid \mathbf{e}_{u}) \log p(\mathbf{e}_{u}, \mathbf{c}_{i} \mid \Theta, \mathbf{R}). \tag{15}$$

We assume that the distribution of users over the clusters is an isotropic Gaussian, so the function can be written as:

$$\mathcal{L}_{P}^{U} = -\sum_{u \in \mathcal{U}} \log \frac{\exp(-(\mathbf{e}_{u} - \mathbf{c}_{i})^{2}/2\sigma_{i}^{2})}{\sum_{\mathbf{c}_{j} \in C} \exp(-(\mathbf{e}_{u} - \mathbf{c}_{j})^{2}/2\sigma_{j}^{2})}. \tag{16}$$

Since $\mathbf{e}_u$ and $\mathbf{c}_i$ are normalized beforehand, $(\mathbf{e}_u - \mathbf{c}_i)^2 = 2 - 2\,\mathbf{e}_u \cdot \mathbf{c}_i$. We further assume that each Gaussian distribution has the same standard deviation, which is absorbed into the temperature hyper-parameter $\tau$. The function therefore simplifies to Eq. (10).
2203.04723/record.json
ADDED
@@ -0,0 +1,32 @@
{
  "arxiv_id": "2203.04723",
  "month": "2022_03",
  "year": 2022,
  "conference": "ACL",
  "title": "Language Diversity: Visible to Humans, Exploitable by Machines",
  "arxiv_url": "https://arxiv.org/abs/2203.04723",
  "source": {
    "paper_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2022_03/main_diagram_database/2203.04723",
    "tex_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2022_03/tex_files_extracted/2203.04723",
    "paper_md": "/home/zling/lzl/ICML2026/Build_Dataset/data/2022_03/main_diagram_database/2203.04723/paper_text/paper.md",
    "metadata_json": "/home/zling/lzl/ICML2026/Build_Dataset/data/2022_03/main_diagram_database/2203.04723/metadata.json",
    "intro_method_from": "/home/zling/lzl/ICML2026/Build_Dataset/data/2022_03/main_diagram_database/2203.04723/paper_text/paper.md",
    "intro_method_from_kind": "markdown"
  },
  "files": {
    "main_drawio": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2203.04723/main_diagram/main_diagram.drawio",
    "main_png": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2203.04723/main_diagram/main_diagram.png",
    "main_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2203.04723/main_diagram/main_diagram.pdf",
    "intro_method_md": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2203.04723/paper_text/intro_method.md",
    "paper_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2203.04723/paper.pdf",
    "latex": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2203.04723/latex_source"
  },
  "status": {
    "copy_drawio": "exists",
    "copy_png": "exists",
    "diagram_pdf": "pdf_exists",
    "intro_method": "exists",
    "paper_pdf": "exists",
    "latex": "exists"
  }
}
2203.07881/record.json
ADDED
@@ -0,0 +1,32 @@
{
  "arxiv_id": "2203.07881",
  "month": "2022_03",
  "year": 2022,
  "conference": "ECCV",
  "title": "LiP-Flow: Learning Inference-Time Priors for Codec Avatars via Normalizing Flows in Latent Space",
  "arxiv_url": "https://arxiv.org/abs/2203.07881",
  "source": {
    "paper_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2022_03/main_diagram_database/2203.07881",
    "tex_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2022_03/tex_files_extracted/2203.07881",
    "paper_md": "/home/zling/lzl/ICML2026/Build_Dataset/data/2022_03/main_diagram_database/2203.07881/paper_text/paper.md",
    "metadata_json": "/home/zling/lzl/ICML2026/Build_Dataset/data/2022_03/main_diagram_database/2203.07881/metadata.json",
    "intro_method_from": "/home/zling/lzl/ICML2026/Build_Dataset/data/2022_03/main_diagram_database/2203.07881/paper_text/paper.md",
    "intro_method_from_kind": "markdown"
  },
  "files": {
    "main_drawio": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2203.07881/main_diagram/main_diagram.drawio",
    "main_png": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2203.07881/main_diagram/main_diagram.png",
    "main_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2203.07881/main_diagram/main_diagram.pdf",
    "intro_method_md": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2203.07881/paper_text/intro_method.md",
    "paper_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2203.07881/paper.pdf",
    "latex": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2203.07881/latex_source"
  },
  "status": {
    "copy_drawio": "exists",
    "copy_png": "exists",
    "diagram_pdf": "pdf_exists",
    "intro_method": "exists",
    "paper_pdf": "exists",
    "latex": "exists"
  }
}
2203.12000/record.json
ADDED
@@ -0,0 +1,32 @@
{
  "arxiv_id": "2203.12000",
  "month": "2022_03",
  "year": 2022,
  "conference": "IJCAI",
  "title": "Text Transformations in Contrastive Self-Supervised Learning: A Review",
  "arxiv_url": "https://arxiv.org/abs/2203.12000",
  "source": {
    "paper_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2022_03/main_diagram_database/2203.12000",
    "tex_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2022_03/tex_files_extracted/2203.12000",
    "paper_md": "/home/zling/lzl/ICML2026/Build_Dataset/data/2022_03/main_diagram_database/2203.12000/paper_text/paper.md",
    "metadata_json": "/home/zling/lzl/ICML2026/Build_Dataset/data/2022_03/main_diagram_database/2203.12000/metadata.json",
    "intro_method_from": "/home/zling/lzl/ICML2026/Build_Dataset/data/2022_03/main_diagram_database/2203.12000/paper_text/paper.md",
    "intro_method_from_kind": "markdown"
  },
  "files": {
    "main_drawio": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2203.12000/main_diagram/main_diagram.drawio",
    "main_png": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2203.12000/main_diagram/main_diagram.png",
    "main_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2203.12000/main_diagram/main_diagram.pdf",
    "intro_method_md": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2203.12000/paper_text/intro_method.md",
    "paper_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2203.12000/paper.pdf",
    "latex": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2203.12000/latex_source"
  },
  "status": {
    "copy_drawio": "exists",
    "copy_png": "exists",
    "diagram_pdf": "pdf_exists",
    "intro_method": "exists",
    "paper_pdf": "exists",
    "latex": "exists"
  }
}
2203.16220/record.json
ADDED
@@ -0,0 +1,32 @@
{
  "arxiv_id": "2203.16220",
  "month": "2022_03",
  "year": 2022,
  "conference": "CVPR",
  "title": "Target-Aware Dual Adversarial Learning and a Multi-Scenario Multi-Modality Benchmark To Fuse Infrared and Visible for Object Detection",
  "arxiv_url": "https://arxiv.org/abs/2203.16220",
  "source": {
    "paper_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2022_03/main_diagram_database/2203.16220",
    "tex_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2022_03/tex_files_extracted/2203.16220",
    "paper_md": "/home/zling/lzl/ICML2026/Build_Dataset/data/2022_03/main_diagram_database/2203.16220/paper_text/paper.md",
    "metadata_json": "/home/zling/lzl/ICML2026/Build_Dataset/data/2022_03/main_diagram_database/2203.16220/metadata.json",
    "intro_method_from": "/home/zling/lzl/ICML2026/Build_Dataset/data/2022_03/main_diagram_database/2203.16220/paper_text/paper.md",
    "intro_method_from_kind": "markdown"
  },
  "files": {
    "main_drawio": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2203.16220/main_diagram/main_diagram.drawio",
    "main_png": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2203.16220/main_diagram/main_diagram.png",
    "main_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2203.16220/main_diagram/main_diagram.pdf",
    "intro_method_md": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2203.16220/paper_text/intro_method.md",
    "paper_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2203.16220/paper.pdf",
    "latex": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2203.16220/latex_source"
  },
  "status": {
    "copy_drawio": "exists",
    "copy_png": "exists",
    "diagram_pdf": "pdf_exists",
    "intro_method": "exists",
    "paper_pdf": "exists",
    "latex": "exists"
  }
}
2204.03840/record.json
ADDED
@@ -0,0 +1,32 @@
{
  "arxiv_id": "2204.03840",
  "month": "2022_04",
  "year": 2022,
  "conference": "ICLR",
  "title": "DATA-DRIVEN EVALUATION OF TRAINING ACTION SPACE FOR REINFORCEMENT LEARNING",
  "arxiv_url": "https://arxiv.org/abs/2204.03840",
  "source": {
    "paper_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2022_04/main_diagram_database/2204.03840",
    "tex_dir": "/home/zling/lzl/ICML2026/Build_Dataset/data/2022_04/tex_files_extracted/2204.03840",
    "paper_md": "/home/zling/lzl/ICML2026/Build_Dataset/data/2022_04/main_diagram_database/2204.03840/paper_text/paper.md",
    "metadata_json": "/home/zling/lzl/ICML2026/Build_Dataset/data/2022_04/main_diagram_database/2204.03840/metadata.json",
    "intro_method_from": "/home/zling/lzl/ICML2026/Build_Dataset/data/2022_04/main_diagram_database/2204.03840/paper_text/paper.md",
    "intro_method_from_kind": "markdown"
  },
  "files": {
    "main_drawio": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2204.03840/main_diagram/main_diagram.drawio",
    "main_png": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2204.03840/main_diagram/main_diagram.png",
    "main_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2204.03840/main_diagram/main_diagram.pdf",
    "intro_method_md": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2204.03840/paper_text/intro_method.md",
    "paper_pdf": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2204.03840/paper.pdf",
    "latex": "/home/zling/lzl/ICML2026/Build_Dataset/dataset/2204.03840/latex_source"
  },
  "status": {
    "copy_drawio": "exists",
    "copy_png": "exists",
    "diagram_pdf": "pdf_exists",
    "intro_method": "exists",
    "paper_pdf": "exists",
    "latex": "exists"
  }
}